Background: Rare Diseases (RDs), defined as diseases affecting no more than 5 out of 10,000 people, are often severe, chronic and life-threatening. A major problem is the delay in diagnosing RDs. Clinical decision support systems (CDSSs) for RDs are software systems that support clinicians in diagnosing patients with RDs. Given their clinical importance, we conducted a scoping review to determine which CDSSs are available to support the diagnosis of patients with RDs, whether these CDSSs are available for use by clinicians, and which functionalities and data they use to provide decision support.
Methods: We searched PubMed for CDSSs in RDs published between December 16, 2008 and December 16, 2018. Only English-language original peer-reviewed journal articles and conference papers describing a clinical prototype or routine use of a CDSS were included. For data charting, we used the data items “Objective and background of the publication/project”, “System or project name”, “Functionality”, “Type of clinical data”, “Rare Diseases covered”, “Development status”, “System availability”, “Data entry and integration”, “Last software update” and “Clinical usage”.
Results: The search identified 636 articles. After title and abstract screening, and after assessing the eligibility criteria for full-text screening, 22 articles describing 19 different CDSSs were identified. Three types of CDSSs were classified: “analysis or comparison of genetic and phenotypic data”, “machine learning” and “information retrieval”. Twelve of nineteen CDSSs use phenotypic and genetic data, followed by clinical data, literature databases and patient questionnaires. Fourteen of nineteen CDSSs are fully developed systems and therefore publicly available. Data can be entered or uploaded manually in six CDSSs, whereas for four CDSSs no information on data integration was available. Only seven CDSSs allow further ways of data integration. Thirteen CDSSs do not provide information about clinical usage.
Conclusions: Different CDSSs for various purposes are available, yet clinicians have to determine which is best for their patient. To allow more precise usage, future research has to focus on data integration, clinical usage and the updating of clinical knowledge in CDSSs for RDs. It remains to be seen which of the CDSSs will be used and maintained in the future.
Congenital diaphragmatic hernia (CDH) is a relatively common and life-threatening birth defect, characterized by incomplete formation of the diaphragm. Because CDH herniation occurs at the same time as preacinar airway branching, normal lung development becomes severely disrupted, resulting almost invariably in pulmonary hypoplasia. Despite various research efforts over the past decades, the pathogenesis of CDH and associated lung hypoplasia remains poorly understood. With the advent of molecular techniques, transgenic animal models of CDH have generated a large number of candidate genes, thus providing a novel basis for future research and treatment. This review article offers a comprehensive overview of genes and signaling pathways implicated in CDH etiology, whilst also discussing strengths and limitations of transgenic animal models in relation to the human condition.
Ubiquitination, and its control by deubiquitinating enzymes (DUBs), mediates protein stability, function, signaling and cell fate. The ovarian tumor (OTU) family DUB OTULIN (FAM105B) exclusively cleaves linear (Met1-linked) poly-ubiquitin chains and plays important roles in auto-immunity, inflammation and infection. OTULIN regulates Met1-linked ubiquitination downstream of tumor necrosis factor receptor 1 (TNFR1), toll-like receptor (TLR) and nucleotide-binding and oligomerization domain-containing protein 2 (NOD2) receptor activation and interacts with the Met1 ubiquitin-specific linear ubiquitin chain assembly complex (LUBAC) E3 ligase. However, despite extensive research efforts, the receptor and cytosolic roles of OTULIN and the distributions of multiple Met1 ubiquitin-associated E3-DUB complexes in the regulation of cell fate remain controversial and unclear. Beyond this, novel ubiquitin-independent OTULIN functions have emerged, highlighting an even more complex role of OTULIN in cellular homeostasis. For example, OTULIN interferes with endosome-to-plasma membrane trafficking, and the OTULIN-related pseudo-DUB OTULINL (FAM105A) resides at the endoplasmic reticulum (ER). Here, we discuss how OTULIN contributes to cell fate control and highlight novel ubiquitin-dependent and -independent functions.
In the application of range of motion (ROM) tests there is little agreement on the number of repetitions to be measured and the number of preceding warm-up protocols. In stretch training a plateau in ROM gains can be seen after four to five repetitions. With an increasing number of repetitions, the gain in ROM is reduced. This study examines the question of whether such an effect occurs in common ROM tests. Twenty-two healthy sport students (10 male, 12 female) with an average age of 25.3 ± 1.94 years (average height 174.1 ± 9.8 cm; weight 66.6 ± 11.3 kg and BMI 21.9 ± 2.0 kg/m2) volunteered for this study. Each subject performed five ROM tests in a randomized order, measured either via a tape measure or a digital inclinometer: the tape measure was used to evaluate the Fingertip-to-Floor test (FtF) and the Lateral Inclination test (LI); retroflexion of the trunk modified after Janda (RF), the Thomas test (TT) and a shoulder test modified after Janda (ST) were evaluated with a digital inclinometer. In order to show general acute effects within 20 repetitions, we performed an ANOVA/Friedman test with multiple comparisons. A non-linear regression was then performed to identify a plateau formation. The significance level was set at 5%. In seven out of eight ROM tests (five tests in total, with three tests measured on both the left and right sides), significant flexibility gains were observed (FtF: p < 0.001; LI-left/right: p < 0.001/0.001; RF: p = 0.009; ST-left/right: p < 0.001/p = 0.003; TT-left: p < 0.001). A non-linear regression with random effects was successfully applied to FtF, RF, LI-left/right, ST-left and TT-left, indicating a gradual decline in the amount of gained ROM. An acute effect was observed in most ROM tests, characterized by a gradual decline of ROM gain. For those tests, we can state that the acute effect described in the stretching literature also applies to the performance of typical ROM tests.
Since non-linear behavior was shown, it is up to the practitioner to weigh measurement accuracy against the expenditure of time and effort. Researchers and practitioners should consider this when applying ROM assessments to healthy young adults.
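A gradual decline in per-repetition ROM gain of the kind described above can be modelled, for example, as an exponential approach to a plateau. The sketch below is illustrative only (it is not the authors' actual random-effects model; the function name, grid-search approach and synthetic data are assumptions): it fits rom(n) = a - b*exp(-c*n) by grid-searching the rate c and solving for a and b with linear least squares.

```python
import numpy as np

def fit_plateau(reps, rom, c_grid=np.linspace(0.05, 2.0, 40)):
    """Fit the plateau model rom(n) = a - b*exp(-c*n).

    For each candidate rate c, the model is linear in (a, b), so we
    solve a small least-squares problem and keep the best-fitting c."""
    best = None
    for c in c_grid:
        # Design matrix columns: intercept (-> a) and exp(-c*n) (-> -b)
        X = np.column_stack([np.ones(len(reps)), np.exp(-c * reps)])
        coef, *_ = np.linalg.lstsq(X, rom, rcond=None)
        rss = np.sum((rom - X @ coef) ** 2)
        if best is None or rss < best[0]:
            best = (rss, coef[0], -coef[1], c)
    _, a, b, c = best
    return a, b, c

# Synthetic example: ROM plateaus at a = 12 cm, rate c = 0.5 per repetition
reps = np.arange(1, 21)
rom = 12.0 - 8.0 * np.exp(-0.5 * reps)
a, b, c = fit_plateau(reps, rom)
```

With noiseless synthetic data the fit recovers the generating parameters; with real test-retest data one would add a residual term and, as in the study, random effects per subject.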
Aims: Preventing hospitalization by detecting early evidence of heart failure (HF) decompensation in an outpatient setting can improve patients' quality of life and reduce costs of care. The purpose of this study was to assess the value of cardiac acoustic biomarkers (CABs), a combination of cardiohaemic vibrations synchronized with ECG signals, and heart rate (HR) for detecting HF decompensation during the first 3 months after hospital discharge for HF.
Methods and results: Patients with an ejection fraction ≤35% (HFrEF) and hospitalized for decompensated HF were enrolled in a prospective observational study. All subjects wore a wearable cardioverter‐defibrillator (ZOLL LifeVest®, Pittsburgh, PA, USA) that is capable of recording CABs and HR. The primary endpoint of the study was the first HF event, defined as HF readmission or HF emergency room visit. From June 2017 through August 2019, 671 patients with HFrEF were enrolled. Eighty‐one patients (12.1%) had a total of 112 HF events. The algorithm detected HF events with a median of 32 days (interquartile range = 11‐45) in advance of the first HF event. The algorithm had a sensitivity of 69%, specificity of 60%, positive predictive value of 19%, and a negative predictive value of 94%. Of note, the baseline (first 7 days post‐enrolment) algorithm using CABs and HR was superior to New York Heart Association classification in detecting patients more likely to have HF decompensation (sensitivity and specificity of 61% and 68% vs. 46% and 55%, respectively).
Conclusions: This prospective international registry showed that an algorithm incorporating CABs and HR data detected HF events 30 days in advance of the event in patients with HFrEF during the first 3 months after hospital discharge. Therefore, integrating CAB technology into clinical practice may prevent HF rehospitalizations.
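The predictive values reported above follow directly from the sensitivity, the specificity and the event rate via Bayes' rule. A minimal sanity check (variable and function names are illustrative, not from the study):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Compute PPV and NPV from sensitivity, specificity and prevalence."""
    tp = sensitivity * prevalence                # true-positive fraction
    fp = (1 - specificity) * (1 - prevalence)    # false-positive fraction
    tn = specificity * (1 - prevalence)          # true-negative fraction
    fn = (1 - sensitivity) * prevalence          # false-negative fraction
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Reported figures: sensitivity 69%, specificity 60%,
# event rate 81 of 671 patients (12.1%)
ppv, npv = predictive_values(0.69, 0.60, 81 / 671)
# ppv ≈ 0.19, npv ≈ 0.93, close to the reported 19% and 94%
# (small differences reflect rounding in the published figures)
```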
Cholinesterase alterations in delirium after cardiosurgery: a German monocentric prospective study
(2020)
Objectives: Postoperative delirium (POD) is a common complication after elective cardiac surgery. Recent evidence indicates that a disruption in the normal activity of the cholinergic system may be associated with delirium.
Design: Prospective observational study.
Setting: Single-centre at a European academic hospital.
Primary and secondary outcome measures: In our study, the enzyme activities of acetylcholinesterase (AChE) and butyrylcholinesterase (BChE) were determined preoperatively as well as on the first and second postoperative day. The confusion assessment method for the intensive care unit was used to screen patients for the presence of POD.
Results: A total of 114 patients were included in the study. POD was associated with a decrease in BChE activity on postoperative day 1 (p=0.03). In addition, patients who developed POD, had significantly lower preoperative AChE activity than patients without POD (p<0.01). Multivariate analysis identified a preoperatively decreased AChE activity (OR 3.1; 95% CI 1.14 to 8.46), anticholinergic treatment (OR 5.09; 95% CI 1.51 to 17.23), elevated European System for Cardiac Operative Risk Evaluation (OR 3.68; 95% CI 1.04 to 12.99) and age (OR 3.02; 95% CI 1.06 to 8.62) to be independently associated with the development of POD.
Conclusions: We conclude that a reduction in the acetylcholine hydrolysing enzyme activity in patients undergoing cardiac surgery may correlate with the development of POD.
Evoked potentials in the amplitude-time spectrum of the electroencephalogram are commonly used to assess the extent of brain responses to stimulation with noxious contact heat. The magnitudes of the N- and P-waves are used as a semi-objective measure of the response to the painful stimulus: the higher the magnitude, the more painful the stimulus has been perceived. The strength of the N-P-wave response is also largely dependent on the chosen reference electrode site. The goal of this study was to examine which reference technique excels in both practical and theoretical terms when analyzing noxious contact heat evoked potentials (CHEPS) in the amplitude-time spectrum. We recruited 21 subjects (10 male, 11 female, mean age of 55.79 years). We applied seven noxious contact heat stimuli using two temperatures, 51°C and 54°C, to each subject. During EEG analysis, we aimed to identify the referencing technique which produces the highest N-wave and P-wave amplitudes with as little artifactual influence as possible. For this purpose, we applied the following six referencing techniques: mathematically linked A1/A2 (earlobes), average reference, REST, AFz, Pz, and mathematically linked PO7/PO8. We evaluated how these techniques impact the N-P amplitudes of CHEPS based on our data from healthy subjects. Considering all factors, we found mathematically linked earlobes to be the ideal referencing site when displaying and evaluating CHEPS in the amplitude-time spectrum.
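"Mathematically linked earlobes" means re-referencing every channel to the average of the A1 and A2 earlobe signals. A minimal numpy sketch of that operation (channel layout, array shapes and function name are illustrative assumptions, not the study's pipeline):

```python
import numpy as np

def rereference_linked_ears(eeg, a1, a2):
    """Re-reference EEG data to mathematically linked earlobes:
    subtract the mean of the A1 and A2 signals from every channel.

    eeg: (n_channels, n_samples) array; a1, a2: (n_samples,) arrays."""
    linked = (a1 + a2) / 2.0
    return eeg - linked  # broadcasts the reference over all channels

# Toy example: two channels, four samples
eeg = np.array([[1.0, 2.0, 3.0, 4.0],
                [0.0, 1.0, 0.0, 1.0]])
a1 = np.array([0.5, 0.5, 0.5, 0.5])
a2 = np.array([1.5, 1.5, 1.5, 1.5])
reref = rereference_linked_ears(eeg, a1, a2)  # linked reference is 1.0
```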
Background: Anemia is the most important complication during major surgery, and transfusion of red blood cells is the mainstay to compensate for life-threatening blood loss. Therefore, accurate measurement of hemoglobin (Hb) concentration should be provided in real time. Blood gas analysis (BGA) provides rapid point-of-care assessment using smaller sampling tubes compared to central laboratory (CL) services. Objective: This study aimed to investigate the accuracy of BGA hemoglobin testing as compared to CL services. Methods: Data of the ongoing LIBERAL trial (Liberal transfusion strategy to prevent mortality and anemia-associated ischemic events in elderly non-cardiac surgical patients, LIBERAL) were used to assess the bias of Hb levels measured by BGA devices (ABL800 Flex analyzer®, GEM series® and RapidPoint 500®), with CL as the reference method. For this, we analyzed pairs of Hb levels measured by CL and BGA within two hours. Furthermore, the impact of various confounding factors, including age, gender, BMI, smoker status, transfusion of RBCs, intraoperative hemodilution and co-medication, was elucidated. In order to ensure adequate statistical analysis, only data from participating centers providing more than 200 Hb pairs were used. Results: In total, three centers including 963 patients with 1,814 pairs of Hb measurements were analyzed. Mean bias was comparable between the ABL800 Flex analyzer® and GEM series® (-0.38 ± 0.15 g/dl), whereas the RapidPoint 500® showed a smaller bias (-0.09 g/dl) but greater median absolute deviation (± 0.45 g/dl). In order to avoid interference between the different standard deviations caused by the different analytic devices, we focused on two centers using the same BGA technique (309 patients and 1,570 Hb pairs). A Bland-Altman analysis and LOWESS curve showed that the bias decreased with smaller Hb values in absolute numbers but increased relatively.
Smoker status showed the greatest reduction in bias (0.1 g/dl, p<0.001), whereas BMI (0.07 g/dl, p = 0.0178), RBC transfusion (0.06 g/dl, p<0.001), statins (0.04 g/dl, p<0.05) and beta blockers (0.03 g/dl, p = 0.02) showed a slight effect on bias. Intraoperative substitution of volume and other co-medications did not influence the bias significantly. Conclusion: Many interventions, such as substitution of fluids, coagulation factors or RBC units, rely on the accuracy of laboratory measurement devices. Although BGA Hb testing showed a consistently stable difference to CL, our data confirm that different BGA devices are associated with different biases. Therefore, we suggest that hospitals assess their individual bias before implementing BGA as a valid and stable supplement to CL. However, based on the finding that the bias decreased with smaller Hb values, which in turn are used for transfusion decisions, we expect no unnecessary or delayed RBC transfusions and no major impact on the LIBERAL trial performance.
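The Bland-Altman comparison used in this kind of method-agreement study reduces to a few lines: the bias is the mean paired difference (BGA minus CL) and the 95% limits of agreement are bias ± 1.96 SD of the differences. A sketch on synthetic Hb pairs (the data, seed and names are illustrative, not the trial data):

```python
import numpy as np

def bland_altman(bga, cl):
    """Return the mean bias and 95% limits of agreement for paired data."""
    diff = np.asarray(bga) - np.asarray(cl)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Synthetic Hb pairs (g/dl): BGA reads ~0.38 g/dl lower than CL on average
rng = np.random.default_rng(0)
cl = rng.uniform(7.0, 14.0, size=500)
bga = cl - 0.38 + rng.normal(0.0, 0.15, size=500)
bias, (lo, hi) = bland_altman(bga, cl)  # bias close to -0.38
```

In the actual Bland-Altman plot, the differences are additionally plotted against the pairwise means, which is what reveals the Hb-dependent trend (here captured with a LOWESS curve).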
Background: A meta-analysis of observational studies concluded that soft drinks may increase the risk of depression, while high consumption of coffee and tea may reduce the risk. The objectives were to explore the associations between the consumption of soft drinks, coffee or tea and: (1) a history of major depressive disorder (MDD) and (2) the severity of depressive symptom clusters (mood, cognitive and somatic/vegetative symptoms). Methods: Cross-sectional and longitudinal analyses based on baseline and 12-month follow-up data collected from four countries participating in the European MooDFOOD prevention trial. In total, 941 overweight adults with subsyndromal depressive symptoms aged 18 to 75 years were analyzed. History of MDD, depressive symptoms and beverage intake were assessed. Results: Sugar-sweetened soft drinks were positively related to MDD history rates, whereas soft drinks with non-nutritive sweeteners were inversely related, for the high vs. low categories of intake. Longitudinal analysis showed no significant associations between beverages and the mood, cognitive and somatic/vegetative clusters. Conclusion: Our findings point toward a relationship between soft drinks and past MDD diagnoses depending on how they are sweetened, while we found no association with coffee and tea. No significant effects were found between any of the studied beverages and the depressive symptom clusters in a sample of overweight adults.
The incidence of FIX inhibitors in severe hemophilia B (SHB) is not well defined. Frequencies of 3-5% have been reported, but most studies to date were small, included patients with different severities, and lacked prospective follow-up for inhibitor incidence. The study objective was to investigate inhibitor incidence in patients with SHB followed up to 500 exposure days (ED), the frequency of allergic reactions, and the relationship with genotypes. Consecutive previously untreated patients (PUPs) with SHB enrolled into the PedNet cohort were included. Detailed data were collected for the first 50 ED, followed by annual collection of inhibitor status and allergic reactions. Presence of inhibitors was defined by at least two consecutive positive samples. Additionally, data on the factor IX gene mutation were collected. In total, 154 PUPs with SHB were included; 75% were followed until 75 ED, and 43% until 500 ED. Inhibitors developed in 14 patients (7 high-titre). The median number of ED at inhibitor manifestation was 11 (IQR 6.5-36.5). Cumulative inhibitor incidence was 9.3% (95% CI 4.4-14.1) at 75 ED, and 10.2% (5.1-15.3) at 500 ED. Allergic reactions occurred in 4 (28.6%) inhibitor patients. Missense mutations were the most frequent (46.8%) overall but were not associated with inhibitors. Nonsense mutations and deletions with large structural changes comprised all mutations among inhibitor patients and were associated with an inhibitor risk of 26.9% and 33.3%, respectively. In an unselected, well-defined cohort of PUPs with SHB, the cumulative inhibitor incidence was 10.2% at 500 ED. Nonsense mutations and large deletions were strongly associated with the risk of inhibitor development. The PedNet Registry is registered at clinicaltrials.gov (identifier: NCT02979119).