Background: Patients with chronic hepatitis C virus (HCV) infection and active or previous hepatitis B virus (HBV) are at risk of HBV reactivation (HBV-R) during direct-acting antiviral (DAA) therapy. Recent reports suggest that HBV-R may even occur several months after completion of DAA therapy. The aim of this study was to assess the risk of HBV-R in patients with resolved HBV after successful DAA therapy during long-term follow-up (FU).
Methods: Among 848 patients treated for chronic HCV, all patients with resolved HBV and long-term FU data were eligible for inclusion. Patients were HBV DNA/hepatitis B surface antigen (HBsAg)–negative at the end of therapy (EOT) and were followed for up to 52 weeks thereafter. Patients underwent regular alanine transaminase (ALT) testing, and additional HBV DNA/HBsAg testing was performed at FU week 12, end of FU, and in case of an ALT increase above the upper limit of normal (>ULN).
Results: A total of 108 patients were followed up for a mean (range) of 41.5 (24–52) weeks after EOT. None of the patients experienced reverse HBsAg seroconversion or reappearance of HBV DNA. One patient received a liver transplantation; 1 patient was diagnosed with de novo hepatocellular carcinoma, and 2 patients died. Eighteen patients (16.7%) had increased ALT levels (grade 0/1). Of those, the majority were male (72.2%) and significantly more patients had cirrhosis (66.7% vs 36.2%, P = .015) or received ribavirin as part of their treatment regimen (86.7% vs 46.8%, P = .041). None of these were associated with HBV-R.
Conclusions: Our results indicate that the risk of HBV-R in patients with resolved HBV treated with DAAs for HCV is low during long-term follow-up.
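The group comparisons reported above (e.g. cirrhosis in 66.7% vs 36.2% of patients, P = .015) are tests of proportions on a 2×2 contingency table. A minimal sketch using Fisher's exact test; the counts below are invented for illustration, not the study's actual data:

```python
# Comparing a binary trait (e.g. cirrhosis) between patients with and
# without ALT elevation using Fisher's exact test on a 2x2 table.
from scipy.stats import fisher_exact

def compare_proportions(a, b, c, d):
    """Table layout: [[trait & event, no trait & event],
                      [trait & no event, no trait & no event]]."""
    odds_ratio, p_value = fisher_exact([[a, c], [b, d]])
    return odds_ratio, p_value

# Hypothetical counts: 12 of 18 ALT-elevated patients had cirrhosis,
# vs 33 of 90 patients without ALT elevation.
or_, p = compare_proportions(12, 6, 33, 57)
print(f"OR = {or_:.2f}, p = {p:.4f}")
```

With these invented counts the sample odds ratio is (12 × 57) / (6 × 33) ≈ 3.45.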
Background: Intracerebral haemorrhage growth is associated with poor clinical outcome and is a therapeutic target for improving outcome. We aimed to determine the absolute risk and predictors of intracerebral haemorrhage growth, develop and validate prediction models, and evaluate the added value of CT angiography.
Methods: In a systematic review of OVID MEDLINE (with additional hand-searching of relevant studies' bibliographies) from Jan 1, 1970, to Dec 31, 2015, we identified observational cohorts and randomised trials with repeat scanning protocols that included at least ten patients with acute intracerebral haemorrhage. We sought individual patient-level data from corresponding authors for patients aged 18 years or older with data available from brain imaging initially done 0·5–24 h and repeated fewer than 6 days after symptom onset, who had baseline intracerebral haemorrhage volume of less than 150 mL, and did not undergo acute treatment that might reduce intracerebral haemorrhage volume. We estimated the absolute risk and predictors of the primary outcome of intracerebral haemorrhage growth (defined as >6 mL increase in intracerebral haemorrhage volume on repeat imaging) using multivariable logistic regression models in development and validation cohorts in four subgroups of patients, using a hierarchical approach: patients not taking anticoagulant therapy at intracerebral haemorrhage onset (who constituted the largest subgroup), patients taking anticoagulant therapy at intracerebral haemorrhage onset, patients from cohorts that included at least some patients taking anticoagulant therapy at intracerebral haemorrhage onset, and patients for whom both information about anticoagulant therapy at intracerebral haemorrhage onset and spot sign on acute CT angiography were known.
Findings: Of 4191 studies identified, 77 were eligible for inclusion. Overall, 36 (47%) cohorts provided data on 5435 eligible patients. 5076 of these patients were not taking anticoagulant therapy at symptom onset (median age 67 years, IQR 56–76), of whom 1009 (20%) had intracerebral haemorrhage growth. Multivariable models of patients with data on antiplatelet therapy use, data on anticoagulant therapy use, and assessment of CT angiography spot sign at symptom onset showed that time from symptom onset to baseline imaging (odds ratio 0·50, 95% CI 0·36–0·70; p<0·0001), intracerebral haemorrhage volume on baseline imaging (7·18, 4·46–11·60; p<0·0001), antiplatelet use (1·68, 1·06–2·66; p=0·026), and anticoagulant use (3·48, 1·96–6·16; p<0·0001) were independent predictors of intracerebral haemorrhage growth (C-index 0·78, 95% CI 0·75–0·82). Addition of CT angiography spot sign (odds ratio 4·46, 95% CI 2·95–6·75; p<0·0001) to the model increased the C-index by 0·05 (95% CI 0·03–0·07).
Interpretation: In this large patient-level meta-analysis, models using four or five predictors had acceptable to good discrimination. These models could inform the location and frequency of observations on patients in clinical practice, explain treatment effects in prior randomised trials, and guide the design of future trials.
Funding: UK Medical Research Council and British Heart Foundation.
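The modelling approach described above, a multivariable logistic regression reporting per-predictor odds ratios and discrimination as a C-index (for a binary outcome, equivalent to the ROC AUC), can be sketched on simulated data. Everything below is invented; the effect directions merely mirror the reported odds ratios:

```python
# Sketch: logistic-regression model of haemorrhage growth with odds ratios
# and a C-index, on simulated (not study) data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
onset_to_scan_h = rng.uniform(0.5, 24, n)   # time from onset to baseline imaging
baseline_vol_ml = rng.uniform(1, 150, n)    # baseline haemorrhage volume
antiplatelet = rng.integers(0, 2, n)
anticoagulant = rng.integers(0, 2, n)

# Simulated outcome: growth more likely with early imaging, larger volume,
# and antithrombotic use (directions chosen to match the abstract).
logit = (-2.0 - 0.05 * onset_to_scan_h + 0.02 * baseline_vol_ml
         + 0.5 * antiplatelet + 1.2 * anticoagulant)
growth = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([onset_to_scan_h, baseline_vol_ml, antiplatelet, anticoagulant])
model = LogisticRegression(max_iter=1000).fit(X, growth)
odds_ratios = np.exp(model.coef_[0])        # per-unit odds ratios
c_index = roc_auc_score(growth, model.predict_proba(X)[:, 1])
print(odds_ratios, round(c_index, 2))
```

The C-index here is computed in-sample for brevity; the study validated its models in separate cohorts.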
To search for novel strategies to enhance the tumor necrosis factor-related apoptosis-inducing ligand (TRAIL)-induced apoptosis pathways in glioblastoma, we used the B-cell lymphoma 2/Bcl2-like 2-inhibitor ABT-737. Here we report that ABT-737 and TRAIL cooperate to induce apoptosis in several glioblastoma cell lines in a highly synergistic manner (combination index <0.1). Interestingly, the concerted action of ABT-737 and TRAIL to trigger the accumulation of truncated Bid (tBid) at mitochondrial membranes is identified as a key underlying mechanism. ABT-737 and TRAIL cooperate to cleave BH3-interacting domain death agonist (Bid) into its active fragment tBid, leading to increased accumulation of tBid at mitochondrial membranes. Coinciding with tBid accumulation, the activation of Bcl2-associated X protein (Bax), loss of mitochondrial membrane potential, release of cytochrome-c and second mitochondria-derived activator of caspase (Smac) into the cytosol and caspase activation are strongly increased in cotreated cells. Of note, knockdown of Bid significantly decreases ABT-737- and TRAIL-mediated Bax activation and apoptosis. Also, caspase-3 silencing reduces ABT-737- and TRAIL-induced Bid cleavage and apoptosis, indicating that a caspase-3-driven, mitochondrial feedback loop contributes to Bid processing. Importantly, ABT-737 profoundly enhances TRAIL-triggered apoptosis in primary cultured glioblastoma cells derived from tumor material, underlining the clinical relevance. Also, ABT-737 acts in concert with TRAIL to suppress tumor growth in an in vivo glioblastoma model. In conclusion, the rational combination of ABT-737 and TRAIL cooperates to trigger tBid mitochondrial accumulation and apoptosis. This approach presents a promising strategy for targeting the apoptosis pathways in glioblastoma, which warrants further investigation.
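The synergy claim (combination index < 0.1) rests on the Chou-Talalay combination index, CI = d1/Dx1 + d2/Dx2, where Dx1 and Dx2 are the single-agent doses producing a given effect and d1, d2 the combination doses producing the same effect. A worked sketch with invented doses:

```python
# Chou-Talalay combination index: CI < 1 synergy, CI = 1 additivity,
# CI > 1 antagonism. Doses below are illustrative, not from the study.

def combination_index(d1, d2, Dx1, Dx2):
    """d1, d2: combination doses; Dx1, Dx2: equi-effective single-agent doses."""
    return d1 / Dx1 + d2 / Dx2

# Hypothetical example: each drug alone needs 10 units for a given level of
# apoptosis, but 0.4 units of each suffice in combination.
ci = combination_index(0.4, 0.4, 10.0, 10.0)
print(ci)  # 0.08 (CI < 0.1: strong synergy)
```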
Purpose: The diagnosis of abusive head trauma (AHT) is complex and neuroimaging plays a crucial role. Our goal was to determine whether non-neuroradiologists with standard neuroradiology knowledge perform as well as neuroradiologists with experience in pediatric neuroimaging in interpreting MRI in cases of presumptive AHT (pAHT).
Methods: Twenty children were retrospectively evaluated. Patients had been diagnosed with pAHT (6 patients), non-abusive head trauma (NAHT; 5 patients), metabolic diseases (3 patients), and benign enlargement of the subarachnoid spaces (BESS; 6 patients). The MRI scans were assessed blindly, i.e., without clinical history, by 3 non-neuroradiologists and 3 neuroradiologists from 2 different institutions.
Results: In the blinded assessment, neuroradiologists achieved higher sensitivity and positive predictive value for the diagnosis of pAHT (89%) than non-neuroradiologists (50%). Neuroradiologists correctly chose pAHT as the most probable diagnosis in 16 of 18 readings; non-neuroradiologists did so in only 9 of 18. In our series, the most important misdiagnosis for pAHT was NAHT (twice by neuroradiologists and 5 times by non-neuroradiologists). Only victims of motor vehicle accidents were blindly misdiagnosed as pAHT; no NAHT from ordinary household accidents was misdiagnosed as pAHT. Neuroradiologists correctly ruled out pAHT in all cases of metabolic diseases and BESS.
Conclusion: MRI in cases of suspected AHT should be evaluated by neuroradiologists with experience in pediatric neuroimaging. Neuroradiologists looked beyond the subdural hemorrhage (SDH) and were more precise than non-neuroradiologists in the assessment of pAHT and its differential diagnoses. Non-neuroradiologists appear to judge whether pAHT is present mainly on the basis of the presence or absence of SDH.
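The reader-performance figures above reduce to confusion-matrix arithmetic over the 18 pAHT reads (3 readers × 6 pAHT cases). A small sketch; mapping the two NAHT misreads to the false-positive count is our assumption:

```python
# Sensitivity and positive predictive value from reader counts.

def sensitivity(tp, fn):
    """Proportion of true cases correctly identified."""
    return tp / (tp + fn)

def ppv(tp, fp):
    """Proportion of positive calls that are true cases."""
    return tp / (tp + fp)

# Neuroradiologists: 16 of 18 pAHT reads correct (TP = 16, FN = 2);
# assuming the 2 NAHT reads called pAHT are the false positives (FP = 2).
neuro_sens = sensitivity(16, 2)     # ~0.89
neuro_ppv = ppv(16, 2)              # ~0.89
non_neuro_sens = sensitivity(9, 9)  # 0.50
print(round(neuro_sens, 2), round(neuro_ppv, 2), non_neuro_sens)
```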
This article gives an overview of developments in European regulatory and Health Technology Assessment (HTA) of new cancer drugs. As background, it refers to an overview article by Bergmann et al. [1], which pointed out the status and limitations of the current system. The authors discussed possible steps to improve the interface between regulators and HTA bodies but stated that this alone will not be sufficient to overcome heterogeneous assessments between HTA agencies. A major challenge for the foreseeable future will be to overcome the heterogeneity of patient access decisions of pharmaceutical payers across Europe, which is due to (i) considerably different scientific approaches and methodologies in the more or less formal evaluation of cost-effectiveness; (ii) differing health priorities across countries, reflecting historically developed cultural differences and values or different unmet medical needs; and (iii) different economic strengths among nations, regions, and locales that necessarily drive health-care budgetary decisions. The authors argue that addressing this requires a science-based common position on methodology and greater commitment by politicians and health-care decision makers to ensure equal access for patients across the EU to anti-tumour medicines. ...
Purpose: The coronavirus disease 2019 (COVID-19) poses major challenges to health-care systems worldwide. This pandemic demonstrates the importance of timely access to intensive care and, therefore, this study aims to explore the accessibility of intensive care beds in 14 European countries and its impact on the COVID-19 case fatality ratio (CFR).
Methods: We examined access to intensive care beds by deriving (1) a regional ratio of intensive care beds per 100,000 population (accessibility index, AI) and (2) the distance to the closest intensive care unit. The cross-sectional analysis was performed at a 5-by-5 km spatial resolution, and results were summarized nationally for 14 European countries. The relationship between AI and CFR was analyzed at the regional level.
Results: We found national-level differences in access to intensive care beds. The AI was highest in Germany (AI = 35.3), followed by Estonia (AI = 33.5) and Austria (AI = 26.4), and lowest in Sweden (AI = 5) and Denmark (AI = 6.4). The average travel time to the closest hospital was longest in Croatia (25.3 min by car) and shortest in Luxembourg (9.1 min). Subnational results illustrate that capacity was associated with population density and national-level inventories. The correlation analysis revealed a negative correlation between ICU accessibility and COVID-19 CFR (r = −0.57; p < 0.001).
Conclusion: Geographical access to intensive care beds varies significantly across European countries and low ICU accessibility was associated with a higher proportion of COVID-19 deaths to cases (CFR). Important differences in access are due to the sizes of national resource inventories and the distribution of health-care facilities relative to the human population. Our findings provide a resource for officials planning public health responses beyond the current COVID-19 pandemic, such as identifying potential locations suitable for temporary facilities or establishing logistical plans for moving severely ill patients to facilities with available beds.
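The accessibility index above is simply beds per 100,000 population, and the headline result is a Pearson correlation between regional AI and CFR. A sketch of both computations; the regional bed counts, populations, and CFR values are invented for illustration:

```python
# Accessibility index (ICU beds per 100,000 population) and its correlation
# with the case fatality ratio, on hypothetical regional data.
import numpy as np

def accessibility_index(icu_beds, population):
    """ICU beds per 100,000 population."""
    return icu_beds / population * 100_000

print(accessibility_index(300, 850_000))  # AI for one hypothetical region

# Pearson correlation between regional AI and CFR (values invented):
ai = np.array([35.3, 33.5, 26.4, 12.0, 6.4, 5.0])
cfr = np.array([0.02, 0.03, 0.04, 0.08, 0.10, 0.12])
r = np.corrcoef(ai, cfr)[0, 1]
print(round(r, 2))  # negative: lower access, higher fatality
```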
Background: Health care accessibility is known to differ geographically. With this study we focused on analysing accessibility of general and specialized obstetric units in England and Germany with regard to urbanity, area deprivation and neonatal outcome using routine data.
Methods: We used a floating catchment area method to measure obstetric care accessibility, the degree of urbanization (DEGURBA) to measure urbanity and the index of multiple deprivation to measure area deprivation.
Results: Accessibility of general obstetric units was significantly higher in Germany compared to England (accessibility index of 16.2 vs. 11.6; p < 0.001), whereas accessibility of specialized obstetric units was higher in England (accessibility index for highest level of care of 0.235 vs. 0.002; p < 0.001). We further demonstrated higher obstetric accessibility for people living in less deprived areas in Germany (r = −0.31; p < 0.001), whereas no correlation was present in England. There were also urban–rural disparities, with higher accessibility in urban areas in both countries (r = 0.37–0.39; p < 0.001). The analysis did not show that accessibility affected neonatal outcomes. Finally, our computer-generated model of obstetric care provider demand in terms of birth counts showed a very strong correlation with actual birth counts at obstetric units (r = 0.91–0.95; p < 0.001).
Conclusion: In Germany, obstetric care appears to be focused on general obstetric units, yielding higher accessibility than in England. For specialized obstetric care, Germany concentrates on the highest-level units, whereas in England obstetric care is more evenly balanced between the different levels of care, with larger units on average, leading to higher accessibility.
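The Methods name a floating catchment area method for measuring accessibility. One common variant of that family is the two-step floating catchment area (2SFCA): step 1 computes a provider-to-population ratio for each unit over the population within its catchment; step 2 sums, for each area, the ratios of all reachable units. A sketch with invented capacities, populations, and travel times (the study's exact variant and parameters are not specified here):

```python
# Two-step floating catchment area (2SFCA) sketch on toy data.
import numpy as np

def two_step_fca(capacity, population, travel_time, threshold):
    """capacity[j]: obstetric unit size; population[i]: area population;
    travel_time[i, j]: minutes from area i to unit j; threshold: catchment size."""
    within = travel_time <= threshold
    # Step 1: provider-to-population ratio for each unit.
    demand = within.T @ population   # population that can reach each unit
    ratio = np.divide(capacity, demand,
                      out=np.zeros_like(capacity, dtype=float),
                      where=demand > 0)
    # Step 2: each area's accessibility is the sum of reachable unit ratios.
    return within @ ratio

cap = np.array([50.0, 20.0])                  # two units
pop = np.array([1000.0, 500.0, 2000.0])       # three areas
tt = np.array([[10, 40], [25, 15], [60, 20]])  # minutes
print(two_step_fca(cap, pop, tt, threshold=30))
```

Area 1 reaches both units within 30 minutes, so its score is the sum of both ratios; areas 0 and 2 each reach only one unit.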
NADH:ubiquinone-oxidoreductase (complex I) is the largest membrane protein complex of the respiratory chain. Complex I couples electron transfer to vectorial proton translocation across the inner mitochondrial membrane. The L-shaped structure of complex I is divided into a membrane arm and a matrix arm. Fourteen central subunits are conserved throughout species, while some 30 accessory subunits are typically found in eukaryotes. Complex I dysfunction is associated with mutations in the nuclear and mitochondrial genome, resulting in a broad spectrum of neuromuscular and neurodegenerative diseases. Accessory subunit NDUFS4 in the matrix arm is a hot spot for mutations causing Leigh or Leigh-like syndrome. In this review, we focus on accessory subunits of the matrix arm and discuss recent reports on the function of accessory subunit NDUFS4 and its interplay with NDUFS6, NDUFA12, and assembly factor NDUFAF2 in complex I assembly.
Background: Dual-source dual-energy computed tomography (DECT) offers the potential for opportunistic osteoporosis screening by enabling phantomless bone mineral density (BMD) quantification. This study sought to assess the accuracy and precision of volumetric BMD measurement using dual-source DECT in comparison to quantitative CT (QCT).
Methods: A validated spine phantom consisting of three lumbar vertebra equivalents with 50 (L1), 100 (L2), and 200 mg/cm3 (L3) calcium hydroxyapatite (HA) concentrations was scanned employing third-generation dual-source DECT and QCT. While BMD assessment based on QCT required an additional standardised bone density calibration phantom, the DECT technique operated by using a dedicated postprocessing software based on material decomposition without requiring calibration phantoms. Accuracy and precision of both modalities were compared by calculating measurement errors. In addition, correlation and agreement analyses were performed using Pearson correlation, linear regression, and Bland-Altman plots.
Results: DECT-derived BMD values differed significantly from those obtained by QCT (p < 0.001) and were found to be closer to true HA concentrations. Relative measurement errors were significantly smaller for DECT in comparison to QCT (L1, 0.94% versus 9.68%; L2, 0.28% versus 5.74%; L3, 0.24% versus 3.67%, respectively). DECT demonstrated better BMD measurement repeatability compared to QCT (coefficient of variance < 4.29% for DECT, < 6.74% for QCT). Both methods correlated well to each other (r = 0.9993; 95% confidence interval 0.9984–0.9997; p < 0.001) and revealed substantial agreement in Bland-Altman plots.
Conclusions: Phantomless dual-source DECT-based BMD assessment of lumbar vertebra equivalents using material decomposition showed higher diagnostic accuracy compared to QCT.
Background: Because endomyocardial biopsy has a low sensitivity of about 20%, it can be targeted to myocardium that shows late gadolinium enhancement (LGE) on cardiovascular magnetic resonance (CMR). However, the important question of how the topography of CMR findings relates to histological findings has not yet been investigated. The current study therefore used an animal model of myocarditis.
Results: Experimental autoimmune myocarditis was induced in 10 male Lewis rats; 10 rats served as controls. On day 21, animals were examined by CMR to compare the topographic distribution of LGE with histological inflammation. Sensitivity, specificity, and positive and negative predictive values of LGE for diagnosing myocarditis were determined for each myocardial segment. These diagnostic values varied widely depending on the topographic distribution of LGE and inflammation as well as on the CMR sequence used. Sensitivity of LGE was up to 76% (left lateral myocardium) and positive predictive values were up to 85% (left lateral myocardium), whereas sensitivity and positive predictive value dropped to 0–33% (left inferior myocardium).
Conclusions: The topographic distribution of LGE and histological inflammation appears to influence sensitivity, specificity, and positive and negative predictive values. Nevertheless, a positive predictive value of LGE of up to 85% indicates that endomyocardial biopsy should be performed "MR-guided". LGE seems to have greater sensitivity than endomyocardial biopsy for the diagnosis of myocarditis.