Background and Objectives: Delirium is a common and serious complication of cardiac surgery. Despite scientific efforts, there are no parameters that reliably predict postoperative delirium. In the pathophysiology of delirium, natriuretic peptides (NPs) interfere with the blood–brain barrier and may thereby promote delirium. We therefore aimed to assess whether NPs can predict postoperative delirium and long-term outcomes. Materials and Methods: To evaluate the predictive value of NPs for delirium, we retrospectively analyzed data from a prospective, randomized study for serum levels of atrial natriuretic peptide (ANP) and the precursor of C-type natriuretic peptide (NT-proCNP) in patients undergoing coronary artery bypass grafting (CABG) with or without cardiopulmonary bypass (off-pump coronary bypass grafting; OPCAB). Delirium was assessed by a validated chart-based method. Long-term outcomes were assessed 10 years after surgery by telephone interview. Results: The overall incidence of delirium in the total cohort was 48%, regardless of the surgical approach (CABG vs. OPCAB). Serum ANP levels >64.6 pg/mL predicted delirium with a sensitivity (95% confidence interval) of 100% (75.3–100) and a specificity of 42.9% (17.7–71.1). Serum NT-proCNP levels >1.7 pg/mL predicted delirium with a sensitivity (95% confidence interval) of 92.3% (64.0–99.8) and a specificity of 42.9% (17.7–71.1). Neither NP predicted postoperative survival or long-term cognitive decline. Conclusions: We found a positive correlation between delirium and preoperative serum levels of ANP and NT-proCNP. A well-powered prospective study might establish NPs as biomarkers indicating the risk of delirium and postoperative cognitive decline in patients at risk for postoperative delirium.
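The way a single serum-level cutoff yields the reported sensitivity and specificity can be sketched in a few lines. This is not the study's analysis code; the function name and the data values below are invented for illustration only.

```python
# Hypothetical sketch: deriving sensitivity and specificity from a serum-level
# cutoff, analogous to the ANP > 64.6 pg/mL criterion. All data are invented.

def sens_spec(levels, delirium, cutoff):
    """Classify as 'predicted delirium' when the serum level exceeds `cutoff`."""
    tp = sum(1 for x, d in zip(levels, delirium) if d and x > cutoff)
    fn = sum(1 for x, d in zip(levels, delirium) if d and x <= cutoff)
    tn = sum(1 for x, d in zip(levels, delirium) if not d and x <= cutoff)
    fp = sum(1 for x, d in zip(levels, delirium) if not d and x > cutoff)
    return tp / (tp + fn), tn / (tn + fp)  # (sensitivity, specificity)

# Invented ANP values (pg/mL) and observed delirium status:
anp = [70.2, 80.1, 65.0, 90.3, 70.0, 60.0, 75.5, 40.1]
delirium = [True, True, True, True, False, False, True, False]
sensitivity, specificity = sens_spec(anp, delirium, cutoff=64.6)
```

In this toy sample every delirium case lies above the cutoff (sensitivity 1.0) while one non-case does not (specificity 2/3), mirroring the pattern of high sensitivity and modest specificity reported above.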
Background: Liver fibrosis in human immunodeficiency virus (HIV)-infected individuals is mostly attributable to co-infection with hepatitis B or C. The impact of other risk factors, including prolonged exposure to combined antiretroviral therapy (cART) is poorly understood. Our aim was to determine the prevalence of liver fibrosis and associated risk factors in HIV-infected individuals based on non-invasive fibrosis assessment using transient elastography (TE) and serum biomarkers (Fibrotest [FT]).
Methods: In 202 consecutive HIV-infected individuals (159 men; mean age 47 ± 9 years; 35 with hepatitis-C-virus [HCV] co-infection), TE and FT were performed. Repeat TE examinations were conducted 1 and 2 years after study inclusion.
Results: Significant liver fibrosis was present in 16% and 29% of patients, respectively, when assessed by TE (≥ 7.1 kPa) and FT (> 0.48). A combination of TE and FT predicted significant fibrosis in 8% of all patients (31% in HIV/HCV co-infected and 3% in HIV mono-infected individuals). Chronic ALT, AST and γ-GT elevation was present in 29%, 20% and 51% of all cART-exposed patients and in 19%, 8% and 45.5% of HIV mono-infected individuals. Overall, factors independently associated with significant fibrosis as assessed by TE (OR, 95% CI) were co-infection with HCV (7.29, 1.95-27.34), chronic AST (6.58, 1.30-33.25) and γ-GT (5.17, 1.56-17.08) elevation and time on dideoxynucleoside therapy (1.01, 1.00-1.02). In 68 HIV mono-infected individuals who had repeat TE examinations, TE values did not differ significantly during a median follow-up time of 24 months (median intra-patient changes at last TE examination relative to baseline: -0.2 kPa, p = 0.20).
Conclusions: Chronic elevation of liver enzymes was observed in up to 45.5% of HIV mono-infected patients on cART. However, only a small subset had significant fibrosis as predicted by TE and FT. There was no evidence for fibrosis progression during follow-up TE examinations.
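The two non-invasive cutoffs used above can be expressed as a simple rule. This is an illustrative sketch, not the study's code; in particular, reading the "combination of TE and FT" as requiring both tests to be positive is our assumption.

```python
# Cutoffs as reported above; the combination rule (both tests positive) is an
# assumption made for illustration.
TE_CUTOFF_KPA = 7.1   # transient elastography: significant fibrosis if >= 7.1 kPa
FT_CUTOFF = 0.48      # Fibrotest: significant fibrosis if > 0.48

def combined_significant_fibrosis(te_kpa, ft_score):
    """Flag significant fibrosis only when TE and FT both exceed their cutoffs."""
    return te_kpa >= TE_CUTOFF_KPA and ft_score > FT_CUTOFF
```

Requiring agreement between both modalities is stricter than either test alone, which is consistent with the combined criterion flagging only 8% of patients versus 16% (TE) and 29% (FT) individually.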
Background & Aims: Thrombopoietin receptor agonists are a new class of compounds licensed for the treatment of immune thrombocytopenic purpura. They are currently being studied in patients with thrombocytopenia in advanced liver disease or under therapy for hepatitis C. There are indications that the risk of developing portal vein thrombosis in patients with advanced liver cirrhosis might be increased under therapy with thrombopoietin receptor agonists. We report the case of a patient with Child class B liver cirrhosis and concurrent immune thrombocytopenic purpura who developed portal vein thrombosis under therapy with the thrombopoietin receptor agonist romiplostim.
Methods: A 50-year-old woman with hepatitis C virus-associated immune thrombocytopenic purpura and Child class B liver cirrhosis presented to our emergency department with rapidly evolving hydropic decompensation and general malaise. For immune thrombocytopenic purpura, she had been started on the thrombopoietin receptor agonist romiplostim nine months earlier.
Results: During hospitalization, platelet counts above 330,000/μl were measured and partial portal vein thrombosis was diagnosed on imaging studies. The thrombotic event was assumed to be associated with the romiplostim treatment for immune thrombocytopenic purpura via an excessive elevation of the platelet count. After anticoagulation with heparin and cessation of romiplostim treatment, complete recanalisation of the portal vein was achieved.
Conclusions: We conclude that romiplostim should be used with caution in patients with hepatitis C-associated immune thrombocytopenic purpura and advanced liver cirrhosis, as the risk of thrombotic complications may increase significantly.
Background. Transcatheter aortic valve implantation (TAVI) is currently recommended for patients with severe aortic stenosis at intermediate or high surgical risk. The decision process during TAVI evaluation includes a thorough benefit-risk assessment, and knowledge about long-term benefits and outcomes may improve the management of patients' expectations. Objective. To evaluate patients' perceived health status and self-reported long-term outcomes more than 5 years after TAVI. Methods and Results. Demographic and procedural data were obtained for all patients treated with TAVI at our institution from 2006 to 2012. A cross-sectional survey was conducted among the patients still alive, measuring health status, including the EQ-5D-5L questionnaire, and clinical outcomes. 103 patients (22.8%) were alive at a median follow-up of 7 years (5.4–9.8). 99 (96%) of the 103 patients were included in the final analysis. The mean age at follow-up was 86.5 ± 8.0 years, and 56.6% were female. Almost all patients (93.9%) described an improvement in their quality of life after receiving TAVI. At late follow-up, the mean utility index and EQ-VAS score were 0.80 ± 0.20 and 58.49 ± 11.49, respectively. Mobility was the most frequently reported limitation (85.4%), while anxiety/depression was the least frequently reported (19.8%). With respect to functional class, 64.7% were in New York Heart Association (NYHA) class III or IV, compared to 67.0% prior to TAVI (p = 0.51). Self-reported long-term outcomes revealed mainly low long-term complication rates. A total of 74 hospitalizations were reported after TAVI, 43% of them for cardiovascular reasons. Among cardiovascular rehospitalizations, new pacemaker implantation was the most frequently reported (18.9%), followed by cardiac decompensation and coronary heart disease (15.6%). Conclusion. The majority of patients described an improvement in health status after TAVI.
More than five years after TAVI, the patients' perceived health status was satisfactory, and the incidence of clinical events and hospitalizations was low.
Extensive black shale deposits formed in the Early Cretaceous South Atlantic, supporting the notion that this emerging ocean basin was a globally important site of organic carbon burial. The magnitude of organic carbon burial in marine basins is known to be controlled by various tectonic, oceanographic, hydrological, and climatic processes acting on different temporal and spatial scales, the nature and relative importance of which are poorly understood for the young South Atlantic. Here we present new bulk and molecular geochemical data from an Aptian–Albian sediment record recovered from the deep Cape Basin at Deep Sea Drilling Project (DSDP) Site 361, which we combine with general circulation model results to identify driving mechanisms of organic carbon burial. A multimillion-year decrease (i.e., Early Aptian–Albian) in organic carbon burial, reflected in a lithological succession of black shale, gray shale, and red beds, was caused by increasing bottom water oxygenation due to abating hydrographic restriction via South Atlantic–Southern Ocean gateways. These results emphasize basin evolution and ocean gateway development as a decisive primary control on enhanced organic carbon preservation in the Cape Basin at geological timescales (> 1 Myr). The Early Aptian black shale sequence comprises alternations of shales with high (> 6 %) and relatively low (∼ 3.5 %) organic carbon content of marine sources, the former being deposited during the global Oceanic Anoxic Event (OAE) 1a, as well as during repetitive intervals before and after OAE 1a. In all cases, these short-term intervals of enhanced organic carbon burial coincided with strong influxes of sediments derived from the proximal African continent, indicating closely coupled climate–land–ocean interactions. 
Supported by our model results, we show that fluctuations in weathering-derived nutrient input from the southern African continent, linked to changes in orbitally driven humidity and aridity, were the underlying drivers of repetitive episodes of enhanced organic carbon burial in the deep Cape Basin. These results suggest that deep marine environments of emerging ocean basins responded sensitively and directly to short-term fluctuations in riverine nutrient fluxes. We explain this relationship by the lack of wide and mature continental shelf seas that could have acted as a barrier or filter for nutrient transfer from the continent into the deep ocean.
Long non-coding RNAs (lncRNAs) contribute to cardiac (patho)physiology. Aging is the major risk factor for cardiovascular disease, with cardiomyocyte apoptosis as one underlying cause. Here, we report the identification of the aging-regulated lncRNA Sarrah (ENSMUST00000140003), which is anti-apoptotic in cardiomyocytes. Importantly, loss of SARRAH (OXCT1-AS1) in human engineered heart tissue results in impaired contractile force development. SARRAH directly binds to the promoters of genes downregulated after SARRAH silencing via RNA–DNA triple helix formation, and cardiomyocytes lacking the triple-helix-forming domain of Sarrah show an increase in apoptosis. One of the direct SARRAH targets is NRF2, and restoration of NRF2 levels after SARRAH silencing partially rescues the reduction in cell viability. Mice overexpressing Sarrah show better recovery of cardiac contractile function after acute myocardial infarction (AMI) than control mice. In summary, we identified the anti-apoptotic, evolutionarily conserved lncRNA Sarrah, which is downregulated by aging, as a regulator of cardiomyocyte survival.
Purpose: Amblyopia with eccentric fixation, especially when not diagnosed early, is a therapeutic challenge, as the visual outcome is known to be poorer than in amblyopia with central fixation. Consequently, treatment after late diagnosis is often denied. Electronic monitoring of occlusion gives us the opportunity to gain a first focused insight into age-dependent dose response and treatment efficiency, as well as the shift of fixation, in this rare group of paediatric patients. Methods: In our prospective pilot study, we examined amblyopes with eccentric fixation during 12 months of occlusion treatment. We evaluated their visual acuity, recorded patching duration using a TheraMon® microsensor, and determined their fixation with a direct ophthalmoscope. The dose-response relationship and treatment efficiency were calculated. Results: The study included 12 participants with strabismic and combined amblyopia aged 2.9–12.4 years (mean 6.5). The median prescribed occlusion was 7.7 h/day (range 6.6–9.9) and the median daily received occlusion was 5.2 h/day (range 0.7–9.7). At study end, the median acuity gain was 0.6 log units (range 0–1.6) and the residual interocular visual acuity difference (IOVAD) 0.3 log units (range 0–1.8). There was neither a significant acuity gain nor a reduction in IOVAD after the 6th month of treatment. Children younger than 4 years showed the best response, with the lowest residual IOVAD at study end. The efficiency calculation showed an acuity gain of approximately one line per 100 h of patching in the first 2 months and half a line after 6 months. Treatment efficiency declined significantly with age (p = 0.01). Foveolar fixation was achieved after a median of 3 months (range 1–6). Three patients (> 6 years) did not gain central fixation. Conclusion: Eccentric fixation is a challenge to therapeutic success. Based on electronic monitoring, our study quantified for the first time the reduction of treatment efficiency with increasing age in amblyopes with eccentric fixation.
Despite some improvement in patients up to 8 years of age, older patients showed significantly lower treatment efficiency. In younger patients with good adherence, central fixation and a low residual IOVAD could be attained after a median of 3 months despite poor initial acuity. Hence, the necessity of early diagnosis and intensive occlusion should be emphasized.
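The efficiency metric described above (acuity lines gained per 100 h of electronically recorded patching) can be sketched as follows. The function name and example values are ours, not the study's; one line is taken as 0.1 log units.

```python
# Hypothetical sketch of the dose-response/efficiency calculation: acuity gain
# (in lines, i.e. 0.1 log units each) per 100 h of recorded occlusion.
# Example values are invented.

def lines_per_100h(gain_log_units, patched_hours):
    """Treatment efficiency: lines of acuity gained per 100 h of patching."""
    lines_gained = gain_log_units / 0.1
    return lines_gained * 100.0 / patched_hours

# e.g. a gain of 0.1 log units (one line) after 100 h of patching:
eff_early = lines_per_100h(0.1, 100)   # one line per 100 h
eff_late = lines_per_100h(0.05, 100)   # half a line per 100 h
```

Normalising gain by received (rather than prescribed) dose is what electronic monitoring makes possible, since the median received occlusion (5.2 h/day) fell well short of the median prescription (7.7 h/day).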
Background: Biological psychiatry aims to understand mental disorders in terms of altered neurobiological pathways. However, for one of the most prevalent and disabling mental disorders, Major Depressive Disorder (MDD), patients only marginally differ from healthy individuals at the group level. Whether Precision Psychiatry can solve this discrepancy and provide specific, reliable biomarkers remains unclear, as current Machine Learning (ML) studies suffer from shortcomings pertaining to methods and data, which lead to substantial over- as well as underestimation of true model accuracy.
Methods: Addressing these issues, we quantify classification accuracy at the single-subject level in N = 1,801 patients with MDD and healthy controls, employing an extensive multivariate approach across a comprehensive range of neuroimaging modalities in a well-curated cohort, including structural and functional Magnetic Resonance Imaging and Diffusion Tensor Imaging, as well as a polygenic risk score for depression.
Findings: Training and testing a total of 2.4 million ML models, we find accuracies for diagnostic classification between 48.1% and 62.0%. Multimodal data integration of all neuroimaging modalities does not improve model performance. Similarly, training ML models on individuals stratified based on age, sex, or remission status does not lead to better classification. Even under simulated conditions of perfect reliability, performance does not substantially improve. Importantly, model error analysis identifies symptom severity as one potential target for MDD subgroup identification.
Interpretation: Although multivariate neuroimaging markers increase predictive power compared to univariate analyses, single-subject classification – even under conditions of extensive, best-practice Machine Learning optimization in a large, harmonized sample of patients diagnosed using state-of-the-art clinical assessments – does not reach clinically relevant performance. Based on this evidence, we sketch a course of action for Precision Psychiatry and future MDD biomarker research.
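For accuracies in the 48–62% range reported above, a basic sanity check is whether a classifier beats chance at all. The sketch below is our illustration, not the study's pipeline: an exact one-sided binomial test against a 50% chance level for a balanced two-class sample.

```python
# Illustrative sketch (not the study's code): exact one-sided binomial test of
# whether n_correct out of n_total classifications exceed chance performance.
from math import comb

def p_above_chance(n_correct, n_total, p_chance=0.5):
    """One-sided exact binomial p-value: P(X >= n_correct) under chance."""
    return sum(comb(n_total, k) * p_chance**k * (1 - p_chance)**(n_total - k)
               for k in range(n_correct, n_total + 1))

# e.g. on a hypothetical balanced sample of 100 subjects:
p_62 = p_above_chance(62, 100)   # 62% accuracy: above chance (p < 0.05)
p_50 = p_above_chance(50, 100)   # 50% accuracy: indistinguishable from chance
```

Statistical significance against chance is a far weaker bar than clinical utility, which is the distinction the interpretation above draws: accuracies can exceed 50% reliably and still fall short of clinically relevant performance.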