Exposure to locusts, which belong to the phylum Arthropoda, is an underestimated health problem, especially among workers in research facilities exposed to laboratory animals. We describe a rare case of an occupational immediate-type reaction to locusts with a possible cross-reactivity between desert locust (Schistocerca gregaria) and migratory locust (Locusta migratoria).
Background: SARS-CoV-2 has massively changed the care situation in hospitals worldwide. Although tumour care should not be affected, initial reports from European countries suggested a decrease in skin cancer cases during the first pandemic wave, and only limited data are available thereafter.
Objectives: The aim of this study was to investigate skin cancer cases and surgeries in a nationwide inpatient dataset in Germany.
Methods: Comparative analyses were performed in a prepandemic (18 March 2019 until 17 March 2020) and a pandemic cohort (18 March 2020 until 17 March 2021). Cases were identified and analysed using WHO International Classification of Diseases (ICD) codes and German operation and procedure classification (OPS) codes.
Results: Comparing the first year of the pandemic with the same period 1 year before, a persistent decrease of 14% in skin cancer cases (n = 19 063) was observed. The largest decrease of 24% was seen in non-invasive in situ tumours (n = 1665), followed by non-melanoma skin cancer (NMSC) with a decrease of 16% (n = 15 310) and malignant melanoma (MM) with a reduction of 7% (n = 2088). Subgroup analysis showed significant differences in the distribution of sex, age, hospital carrier type and hospital volume. There was a decrease of 17% in surgical procedures (n = 22 548), which was more pronounced in minor surgical procedures with a decrease of 24.6% compared to extended skin surgery including micrographic surgery with a decrease of 15.9%.
Conclusions: Hospital admissions and surgical procedures for skin cancer patients in Germany have decreased persistently since the beginning of the pandemic. The higher decrease in NMSC cases compared with MM might reflect a prioritization effect. Further evidence from tumour registries is needed to investigate the consequences of the therapy delay and to identify the upcoming challenges in skin cancer care.
Introduction. Balapiravir (R1626, RG1626) is the prodrug of a nucleoside analogue inhibitor of the hepatitis C virus (HCV) RNA-dependent RNA polymerase (R1479, RG1479). This phase 2, double-blind international trial evaluated the optimal treatment regimen of balapiravir plus peginterferon alfa-2a (40KD)/ribavirin.
Material and methods. Treatment-naive genotype 1 patients (N = 516) were randomized to one of seven treatment groups in which they received balapiravir 500, 1,000, or 1,500 mg twice daily, peginterferon alfa-2a (40KD) 180 or 90 µg/week and ribavirin 1,000/1,200 mg/day, or peginterferon alfa-2a (40KD)/ribavirin alone. The planned treatment duration with balapiravir was reduced from 24 to 12 weeks due to safety concerns.
Results. The percentage of patients with undetectable HCV RNA was consistently higher in all balapiravir groups from week 2 to 12. However, high rates of dose modifications and discontinuations of one/all study drugs compromised the efficacy assessment and resulted in similar sustained virological response rates in the balapiravir groups (range 32-50%) and the peginterferon alfa-2a (40KD)/ribavirin group (43%). Balapiravir was discontinued for safety reasons in 28-36% of patients (most often for lymphopenia) and the percentage of patients with serious adverse events (especially hematological, infection, ocular events) was dose related. Serious hematological adverse events (particularly neutropenia, lymphopenia) were more common in balapiravir recipients. Two deaths in the balapiravir/peginterferon alfa-2a/ribavirin combination groups were considered possibly related to study medication.
Conclusion. Further development of balapiravir for the treatment of chronic hepatitis C has been halted because of the unacceptable benefit-to-risk ratio revealed in this study (ClinicalTrials.gov NCT00517439).
Background: Cirrhosis is known to have a high prevalence and mortality worldwide. However, in Europe, the epidemiology of cirrhosis is possibly undergoing demographic changes, and etiologies may have changed due to improvements in standard of care. The aim of this population-based study was to analyze the trends and the course of liver cirrhosis and its complications in recent years in Germany.
Methods: We analyzed data from all hospital admissions in Germany within diagnosis-related groups from 2005 to 2018. The diagnostic records of cirrhosis and other categories of diseases were based on ICD-10-GM codes. The primary outcome measure was in-hospital mortality. Trends were analyzed through Poisson regression of the annual number of admissions. The impact of cirrhosis on overall in-hospital mortality was assessed through a multivariate multilevel logistic regression model adjusted for age, sex, and comorbidities.
Findings: Of the 248,085,936 admissions recorded between 2005 and 2018, a total of 2,302,171 (0.94%) were admitted with a diagnosis of cirrhosis, mainly as a comorbidity. Compared with other chronic diseases, patients admitted with cirrhosis were younger, mainly male, and had the highest in-hospital mortality rate. A diagnosis of cirrhosis was an independent risk factor for in-hospital mortality with the highest odds ratio (OR: 6.2 [95% CI: 6.1–6.3]) among all diagnoses. The prevalence of non-alcoholic fatty liver disease increased fourfold from 2005 to 2018, while alcoholic cirrhosis remained about 20 times more prevalent than other etiologies. Bleeding was found to be decreasing over time, but ascites remained the most common complication and was increasing.
Interpretation: This nationwide study demonstrates that cirrhosis represents a considerable healthcare burden, as shown by the increasing in-hospital mortality, also in combination with other chronic diseases. Alcohol-related cirrhosis and complications are on the rise. More resources and better management strategies are warranted.
While patients with chronic hepatitis C virus (HCV) infection are treated in order to prevent liver-related morbidity and mortality, we rely on sustained virological response (SVR) as a virological biomarker to evaluate treatment efficacy in clinical practice as well as in drug development. However, conclusive evidence for the clinical benefit of antiviral therapy, or for the validity of SVR as a surrogate marker, as derived from trials randomizing patients to a treatment or control arm, is lacking. In fact, the Hepatitis C Antiviral Long-term Treatment Against Cirrhosis (HALT-C) trial recently showed an increased mortality rate among interferon-treated patients compared to untreated controls. Consequently, the recommendation to treat patients with chronic HCV infection was challenged.
Here, we argue that the possible harmful effect of long-term low-dose pegylated interferon monotherapy, as observed in the HALT-C trial cohort, cannot be extrapolated to potentially curative short-term treatment regimens. Furthermore, we discuss SVR as a surrogate biomarker, based on numerous studies indicating an association between SVR and improvements in health-related quality of life, hepatic inflammation and fibrosis, and portal pressure, as well as a reduced risk of hepatocellular carcinoma (HCC), liver failure and mortality.
The hepatitis C virus (HCV) was discovered in the late 1980s. Interferon (IFN)-α was proposed as an antiviral treatment for chronic hepatitis C at about the same time. Successive improvements in IFN-α-based therapy (dose finding, pegylation, addition of ribavirin) increased the rates of sustained virologic response, i.e. the rates of curing HCV infection. These rates were further improved by adding the first available direct-acting antiviral (DAA) drugs to the combination of pegylated IFN-α and ribavirin. An IFN-free era finally started in 2014, yielding rates of sustained virologic response over 90% in patients treated for 8 to 24 weeks with all-oral regimens. Major challenges however remain in implementation of these new treatment strategies, not only in low- to middle-income countries, but also in high-income countries where the price of these therapies is still prohibitive. Elimination of HCV infection through treatment in certain areas is possible but raises major public health issues.
Purpose: Filler injections for aesthetic purposes are very popular, but can have far-reaching and irreversible consequences. This report describes the course of a patient with devastating complications after glabellar hyaluronic acid injection, their pathomechanism, management and outcome.
Observations: A healthy, 43-year-old woman underwent her first hyaluronic acid injection in the glabella and immediately thereafter went blind in her left eye. Massaging of the injection area and observation were performed before she presented at our hospital with swelling of the left forehead and upper lid, ptosis, complete ophthalmoplegia and blindness. Immediate massaging of the globe and systemic therapy including acetylsalicylic acid, tinzaparin sodium and cortisone were initiated, and hyaluronidase injections in the injection area were performed. In the further course, the patient developed necrotic and hemorrhagic skin and mucosal lesions, lagophthalmos, anterior and posterior segment ischemia and globe hypotonia with consecutive globe deformation. At the 2.5-month follow-up, lid swelling, lagophthalmos and ptosis had resolved and keratopathy had improved, but blindness, skin lesions and strabismus with reduced eye motility were still present, and madarosis and early enophthalmos were detected.
Conclusions and Importance: The outcome of ophthalmic artery occlusion after hyaluronic acid filler injection is poor. Sufficient knowledge of facial anatomy, the performance of filler injections and the management of complications is essential for the practitioner. Patients should be informed about the potential, even if rare, risks of these procedures.
Purpose: The use of a non-diffractive extended-depth-of-focus (EDOF) intraocular lens (IOL) with slight myopia of −0.5 D on the non-dominant eye increases spectacle independence and has good subjective tolerance, with optical phenomena comparable to those of a monofocal IOL. This case report describes the course of a myopic patient who underwent refractive lens exchange, did not tolerate mini-monovision and therefore underwent IOL exchange.
Observations: A healthy, 62-year-old male with myopia of approximately −5 D underwent refractive lens exchange with a non-diffractive EDOF-IOL in both eyes, with slight myopia targeted on the non-dominant left eye (mini-monovision). The operation was performed without any complications, and postoperative treatment followed the clinic's standard protocol. Two weeks postoperatively, the patient presented with an uncorrected distance visual acuity of 0.0 logMAR, a subjective refraction of −0.25/−0.25/142° and a corrected distance visual acuity of 0.1 logMAR on the right eye. On the left eye, distance visual acuity was 0.4 logMAR with a subjective refraction of −0.5/−0.75/9° (intended mini-monovision) and a corrected distance visual acuity of 0.0 logMAR. Binocular distance visual acuity was 0.0 logMAR. The patient complained about optical phenomena in dim light while driving a car and subjectively reduced visual acuity. After an IOL exchange on the left eye with implantation of the same type of non-diffractive EDOF-IOL aimed at emmetropia, the patient was symptom-free and reported no further subjective complaints.
Conclusions: Despite the satisfactory subjective and objective visual outcomes demonstrated in multiple studies, the subjective perception of a mini-monovision with a non-diffractive EDOF-IOL can vary between individuals. A preoperative assessment of the patient's needs and tolerance of a mini-monovision is crucial for a satisfying postoperative outcome.
Purpose: To report a case of autoimmune keratitis in a patient with Mycobacterium tuberculosis (MBT) infection.
Methods: An 84-year-old male with pulmonary tuberculosis (TB) was admitted with chronic, non-healing bilateral ulcerations of the inferior peripheral cornea associated with stromal and subconjunctival nodules.
Results: Clinical examination revealed circumscribed peripheral corneal ulceration with whitish nodules in the adjacent stromal and subconjunctival tissue. Microbiological cultures of the corneal tissue were negative for MBT and other microbial pathogens; however, enzyme-linked immunosorbent assay (ELISA) of blood and corneal samples showed significantly elevated levels of IgM and IgA against MBT. In addition to systemic anti-tuberculosis therapy, the patient was treated topically with Polyspectran® eye drops, dexamethasone eye drops, and Bepanthen® ointment for 2 weeks. Both eyes showed dramatic improvement after 2 weeks.
Conclusion: The present report demonstrates that MBT is able to initiate a delayed autoimmune response within the corneal tissue during an intensive phase of anti-tuberculosis treatment.
Objectives: Stenosis of the biliary anastomosis predisposes liver graft recipients to bacterial cholangitis. Antibiotic therapy (AT) is performed according to individual clinical judgment, but the optimal duration of AT remains unclear.
Methods: All liver graft recipients with acute cholangitis of Tokyo criteria grade 1 or 2 after endoscopic retrograde cholangiography (ERC) were included. The outcome of patients treated with short AT (≤6 days) was compared with that of patients treated with long AT (≥7 days). Recurrent cholangitis (RC) within 28 days was the primary end point.
Results: In total, 30 patients were included, with a median of 313 days (range 34–9849) from liver transplantation to the first proven episode of cholangitis. Among 62 cases in total, 51/62 (82%) were graded as Tokyo-1 and 11/62 (18%) as Tokyo-2. The overall median duration of AT was 6 days (range 1–14), with 36 cases (58%) receiving short AT and 26 (42%) receiving long AT. RC was observed in 10 (16%) cases, without a significant difference in the occurrence of RC between short and long AT cases. CRP and bilirubin were significantly higher in patients with long AT, while low serum albumin and low platelets were associated with a risk of RC.
Conclusion: An antibiotic course shorter than 7 days shows good results in selected patients treated with ERC for post-transplantation biliary strictures.
Classic Hodgkin lymphoma (cHL) is usually characterized by a low tumour cell content, the tumour cells being derived from crippled germinal centre B cells. Rare cases have been described in which the tumour cells show clonal T-cell receptor rearrangements. From a clinicopathological perspective, it is unclear whether these cases should be classified as cHL or as anaplastic large T-cell lymphoma (ALCL). Since we recently observed differences in the motility of ALCL and cHL tumour cells, we aimed here to obtain a better understanding of T-cell-derived cHL by investigating its global proteomic profile and motility. In a proteomics analysis, when only motility-associated proteins were considered, T-cell-derived cHL cell lines showed the highest similarity to ALK− ALCL cell lines. In contrast, T-cell-derived cHL cell lines presented very low overall motility, similar to that observed in conventional cHL. Whereas all ALCL cell lines, as well as T-cell-derived cHL, predominantly presented an amoeboid migration pattern with a uropod at the rear, conventional cHL never presented with uropods. The migration of ALCL cell lines was strongly impaired upon application of different inhibitors. This effect was less pronounced in cHL cell lines and almost absent in T-cell-derived cHL. In summary, our cell line-derived data suggest that, based on proteomics and migration behaviour, T-cell-derived cHL is a neoplasm that shares features with both cHL and ALCL and is not an ALCL with low tumour cell content. Complementary clinical studies on this lymphoma are warranted.
OBJECTIVE: The role of supraglottic airway devices in emergency airway management is highlighted in international airway management guidelines. We evaluated the application of the new generation laryngeal tube suction (LTS-II/LTS-D) in the management of in-hospital unexpected difficult airway and cardiopulmonary resuscitation.
METHODS: Over a seven-year period, we evaluated patients treated with a laryngeal tube who had an unexpected difficult airway (Cormack-Lehane grade 3-4) during routine anesthesia, who underwent cardiopulmonary resuscitation, or who underwent cardiopulmonary resuscitation outside the operating room and had a difficult airway. Successful placement of the LTS-II/LTS-D, sufficient ventilation, time to placement, number of placement attempts, stomach content, peripheral oxygen saturation/end-tidal carbon dioxide development (SpO2/etCO2) over 5 minutes, subjective overall assessment and complications were recorded.
RESULTS: In total, 106 adult patients were treated using an LTS-II/LTS-D. The main indication for placement was a difficult airway (75%, n=80), followed by cardiopulmonary resuscitation (25%, n=26), with an overlap between both in 18% of patients (n=19). In 94% of patients (n=100), users placed the laryngeal tube on the first attempt. In 93% of patients (n=98), the tube was placed within 30 seconds. A significant increase in SpO2 from 97% (0-100) to 99% (5-100) was observed in the whole population and in cardiopulmonary resuscitation patients. The average initial etCO2 of 39.5 mmHg (0-100 mmHg) decreased significantly to an average of 38.4 mmHg (10-62 mmHg) after 5 minutes. A comparison of cardiopulmonary resuscitation patients with non-cardiopulmonary resuscitation patients regarding gastric contents showed no significant difference.
CONCLUSIONS: LTS-D/LTS-II use for in-hospital unexpected difficult airway management provides a secure method for primary airway management until other options such as video laryngoscopy or fiber optic intubation become available.
Background: Spondylodiscitis is a potentially life-threatening infection of the intervertebral disk and adjacent vertebral bodies, with a mortality rate of 2–20%. Given the aging population, the increase in immunosuppression, and intravenous drug use in England, the incidence of spondylodiscitis is postulated to be increasing; however, the exact epidemiological trend in England remains unknown.
Objective: The Hospital Episode Statistics (HES) database contains details of all secondary care admissions across NHS hospitals in England. This study aimed to use HES data to characterise the annual activity and longitudinal change of spondylodiscitis in England.
Methods: The HES database was interrogated for all cases of spondylodiscitis between 2012 and 2019. Data for the length of stay, waiting time, age-stratified admissions, and ‘Finished Consultant Episodes’ (FCEs), which correspond to a patient's hospital care under a lead clinician, were analysed.
Results: In total, 43,135 FCEs for spondylodiscitis were identified between 2012 and 2022, of which 97.1% were in adults. Overall admissions for spondylodiscitis rose from 3 per 100,000 population in 2012/13 to 4.4 per 100,000 population in 2020/21. Similarly, FCEs increased from 5.8 per 100,000 population in 2012/13 to 10.3 per 100,000 population in 2020/21. The largest increases in admissions from 2012 to 2021 were recorded for those aged 70–74 (117% increase) and those aged 75–79 (133% increase), and, among those of working age, for those aged 60–64 years (91% increase).
Conclusion: Population-adjusted admissions for spondylodiscitis in England have risen by 44% between 2012 and 2021. Healthcare policymakers and providers must acknowledge the increasing burden of spondylodiscitis and make spondylodiscitis a research priority.
Metabolic adaptation and signal integration in response to hypoxic conditions are mainly regulated by hypoxia-inducible factors (HIFs). At the same time, hypoxia induces ROS formation and activates the unfolded protein response (UPR), indicative of endoplasmic reticulum (ER) stress. However, whether ER stress affects the hypoxia response remains ill-defined. Here we report that feeding mice a high-fat diet causes ER stress and attenuates the response to hypoxia. Mechanistically, ER stress promotes HIF-1α and HIF-2α degradation independently of ROS, Ca2+, and the von Hippel-Lindau (VHL) pathway, involving GSK3β and the ubiquitin ligase FBXW1A/βTrCP. Thereby, we reveal a previously unknown function of the GSK3β/HIFα/βTrCP1 axis in ER homeostasis and demonstrate that inhibition of the HIF-1 and HIF-2 response, as well as genetic deficiency of GSK3β, affects proliferation and migration and sensitizes cells to ER stress-promoted apoptosis. Vice versa, we show that hypoxia affects the ER stress response mainly through the PERK arm of the UPR. Overall, we discovered previously unrecognized links between the HIF pathway and the ER stress response and uncovered an essential survival pathway for cells under ER stress.
Ferroptosis is an iron-dependent form of cell death triggered by disturbed membrane integrity due to an overproduction of lipid peroxides. Induction of ferroptosis comprises several alterations, e.g. altered iron metabolism, responses to oxidative stress, or lipid peroxide production. At the physiological level, transcription, translation, and microRNAs modulate the abundance and/or activity of the building blocks that shift the balance towards or away from ferroptosis. Ferroptosis contributes to tissue damage in the case of, e.g., brain and heart injury, but may be desirable for overcoming chemotherapy resistance. For a more complete picture, it is crucial to also consider the cellular microenvironment, which during inflammation and in the tumor context is dominated by hypoxia. This graphical review visualizes the basic mechanisms of ferroptosis, categorizes general inducers and inhibitors of ferroptosis, and focuses on microRNAs, iron homeostasis, and hypoxia as regulatory components.
Optogenetic stimulation of inhibitory interneurons has become a commonly used strategy for silencing neuronal activity. This is typically achieved using transgenic mice expressing excitatory opsins in inhibitory interneurons throughout the brain, raising the question of how spatially extensive the resulting inhibition is. Here, we characterize neuronal silencing in VGAT-ChR2 mice, which express channelrhodopsin-2 in inhibitory interneurons, as a function of light intensity and distance from the light source in several cortical and subcortical regions. We show that light stimulation, even at relatively low intensities, causes inhibition not only in brain regions targeted for silencing but also in their subjacent areas. In contrast, virus-mediated expression of an inhibitory opsin enables robust silencing that is restricted to the region of opsin expression. Our results reveal important constraints on using inhibitory interneuron activation to silence neuronal activity and emphasize the necessity of carefully controlling light stimulation parameters when using this silencing strategy.
Pattern recognition applied to whole-brain neuroimaging data, such as functional Magnetic Resonance Imaging (fMRI), has proved successful at discriminating psychiatric patients from healthy participants. However, predictive patterns obtained from whole-brain voxel-based features are difficult to interpret in terms of the underlying neurobiology. Many psychiatric disorders, such as depression and schizophrenia, are thought to be brain connectivity disorders. Therefore, pattern recognition based on network models might provide deeper insights and potentially more powerful predictions than whole-brain voxel-based approaches. Here, we build a novel sparse network-based discriminative modeling framework based on Gaussian graphical models and L1-norm regularized linear Support Vector Machines (SVMs). In addition, the proposed framework is optimized in terms of both predictive power and reproducibility/stability of the patterns. Our approach aims to provide better pattern interpretation than voxel-based whole-brain approaches by yielding stable brain connectivity patterns that underlie discriminative changes in brain function between the groups. We illustrate our technique by classifying patients with major depressive disorder (MDD) and healthy participants in two fMRI datasets (event-related and block-related), acquired while participants performed a gender discrimination task and an emotional task, respectively, during the viewing of emotionally valent faces.
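The two stages of a framework of this kind — estimating each subject's sparse connectivity with a Gaussian graphical model, then classifying subjects with an L1-regularized linear SVM — can be sketched roughly as follows. This is an illustrative outline on synthetic data; the dimensions, labels, and hyperparameters are placeholders, not the study's actual settings:

```python
# Hypothetical two-stage sketch: Gaussian graphical model per subject,
# then a sparse (L1) linear SVM over connectivity edges (synthetic data).
import numpy as np
from sklearn.covariance import GraphicalLasso
from sklearn.svm import LinearSVC

rng = np.random.default_rng(42)
n_subjects, n_timepoints, n_regions = 40, 120, 10

features, labels = [], []
for i in range(n_subjects):
    ts = rng.normal(size=(n_timepoints, n_regions))  # one subject's region time series
    gl = GraphicalLasso(alpha=0.1).fit(ts)           # sparse inverse covariance estimate
    prec = gl.precision_
    iu = np.triu_indices(n_regions, k=1)
    features.append(prec[iu])                        # upper-triangle edges as features
    labels.append(i % 2)                             # placeholder group label

# The L1 penalty drives most edge weights to zero, keeping only a sparse,
# more interpretable set of discriminative connectivity edges.
clf = LinearSVC(penalty="l1", dual=False, C=0.1, max_iter=5000).fit(
    np.asarray(features), np.asarray(labels))
n_selected = int(np.count_nonzero(clf.coef_))
print(f"{n_selected} of {len(iu[0])} edges retained by the sparse model")
```

With real fMRI data, the retained nonzero coefficients would correspond to the connectivity edges driving the group discrimination, which is the interpretability advantage the network-based approach claims over voxel-wise features.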