Background: Computerized virtual patients (VP) have spread into many areas of healthcare delivery and medical education. They provide various advantages such as flexibility in the pace and place of learning, a high degree of teaching reproducibility, and cost effectiveness. However, the educational benefit of VP as a supplement to, or an alternative to, traditional teaching formats remains unclear. Moreover, there are no randomized controlled studies investigating the use of VP in a dental curriculum. Therefore, this study investigates VP as an alternative to lecturer-led small-group teaching in a curricular, randomized and controlled setting.
Methods: Randomized and controlled cohort study. Four VP cases were created according to previously published design principles and compared with lecturer-led small group teaching (SGT) within the Oral and Maxillofacial Surgery clerkship for dental students at the Department for Cranio-, Oral and Maxillofacial Plastic Surgery, Goethe University, Frankfurt, Germany. Clinical competence was measured prior (T0), directly (T1) and 6 weeks (T2) after the intervention using theoretical tests and a self-assessment questionnaire. Furthermore, VP design was evaluated using a validated toolkit.
Results: Fifty-seven students (VP = 32; SGT = 25) agreed to participate in the study. No competence differences were found at T0 (p = 0.56). The VP group outperformed the SGT group at T1 (p < 0.0001). At T2 there was no difference between the two groups (p = 0.55). Both interventions led to a significant growth in self-assessed competence. The VP group felt better prepared to diagnose and treat real patients and regarded VP cases as a rewarding learning experience.
Conclusions: VP cases are an effective alternative to lecturer-led SGT in terms of learning efficacy in the short and long term as well as self-assessed competence growth and student satisfaction. Furthermore, integrating VP cases within a curricular Oral and Maxillofacial Surgery Clerkship is feasible and leads to substantial growth of clinical competence in undergraduate dental students.
Diagnosing and treating acute severe and recurrent antivenom-related anaphylaxis (ARA) is challenging and reported experience is limited. Herein, we describe our experience of severe ARA in patients with neurotoxic snakebite envenoming in Nepal. Patients were enrolled in a randomised, double-blind trial of high vs. low dose antivenom, given by intravenous (IV) push, followed by infusion. Training in ARA management emphasised stopping antivenom and giving intramuscular (IM) adrenaline, IV hydrocortisone, and IV chlorphenamine at the first sign(s) of ARA. Later, IV adrenaline infusion (IVAI) was introduced for patients with antecedent ARA requiring additional antivenom infusions. Preantivenom subcutaneous adrenaline (SCAd) was introduced in the second study year (2012). Of 155 envenomed patients who received ≥ 1 antivenom dose, 13 (8.4%), three children (aged 5−11 years) and 10 adults (18−52 years), developed clinical features consistent with severe ARA, including six with overlapping signs of severe envenoming. Four and nine patients received low and high dose antivenom, respectively, and six had received SCAd. Principal signs of severe ARA were dyspnoea alone (n=5 patients), dyspnoea with wheezing (n=3), hypotension (n=3), shock (n=3), restlessness (n=3), respiratory/cardiorespiratory arrest (n=7), and early (n=1) and late laryngeal oedema (n=1); rash was associated with severe ARA in 10 patients. Four patients were given IVAI. Of the 8 (5.1%) deaths, three occurred in transit to hospital. Severe ARA was common and recurrent and had overlapping signs with severe neurotoxic envenoming. Optimising the management of ARA at different health system levels needs more research. This trial is registered with NCT01284855.
Purpose: To examine whether applying case management in general practices reduces thromboembolic events requiring hospitalization and major bleeding events (combined primary outcome). Secondary endpoints were mortality, frequency and duration of hospitalization, severe treatment interactions, adverse events, quality of anticoagulation, health-related quality of life and intervention costs, patients' assessment of chronic illness care, self-reported adherence to medication, general practitioner (GP) and healthcare assistant (HCA) knowledge, patient knowledge and satisfaction with shared decision-making.
Methods: Cluster-randomized controlled trial undertaken at 52 general practices in Germany with adult patients with a long-term indication for oral anticoagulation. The complex intervention included training for healthcare assistants, information and quality circles for general practitioners and 24 months of case management for patients. Assessment was after 12 and 24 months. The intention-to-treat population included all randomized practices and patients, while the per-protocol analysis included only those that received treatment without major protocol violations.
Results: The mean (SD) age of the 736 patients was 73.5 (9.4) years and 597 (81.1%) had atrial fibrillation. After 24 months, the primary endpoint had occurred in 40 (11.0%) intervention and 48 (12.9%) control patients (hazard ratio 0.83, 95% CI 0.55 to 1.25; P = .37). Patients' perceived quality of care, their knowledge, and HCAs' knowledge had improved significantly at 24 months. The other secondary endpoints did not differ between groups. In the intervention group, hospital admissions were significantly reduced in patients who received treatment without major protocol deviations.
Conclusions: Even though the main outcomes did not differ significantly, the intervention appears to have positively influenced several process parameters under "real-world conditions".
Lipoxygenases (LOXs) catalyze the stereo-specific peroxidation of polyunsaturated fatty acids (PUFAs) to their corresponding hydroperoxy derivatives. Human macrophages express two arachidonic acid (AA) 15-lipoxygenating enzymes classified as ALOX15 and ALOX15B. ALOX15, which was first described in 1975, has been extensively characterized and its biological functions have been investigated in a number of cellular systems and animal models. In macrophages, ALOX15 functions to generate specific phospholipid (PL) oxidation products crucial for orchestrating the nonimmunogenic removal of apoptotic cells (ACs) as well as synthesizing precursor lipids required for production of specialized pro-resolving mediators (SPMs) that facilitate inflammation resolution. The discovery of ALOX15B in 1997 was followed by comprehensive analyses of its structural properties and reaction specificities with PUFA substrates. Although its enzymatic properties are well described, the biological functions of ALOX15B are not fully understood. In contrast to ALOX15 whose expression in human monocyte-derived macrophages is strictly dependent on Th2 cytokines IL-4 and IL-13, ALOX15B is constitutively expressed. This review aims to summarize the current knowledge on the regulation and functions of ALOX15 and ALOX15B in human macrophages.
Introduction: Epoxyeicosatrienoic acids (EETs) are able to enhance angiogenesis and regulate inflammation, which is especially important in wound healing under ischemic conditions. Thus, we evaluated the effect of local EET application on ischemic wounds in mice.
Methods: Ischemia was induced by cauterization of two of the three supplying vessels of the mouse ear. Wounding was performed on the ear three days later. Wounds were treated with either 11,12- or 14,15-EET and compared to untreated control and normal wounds. Epithelialization was measured every second day. VEGF, TNF-α, TGF-β, matrix metalloproteinases (MMP), tissue inhibitors of metalloproteinases (TIMP), Ki67, and SDF-1α were evaluated immunohistochemically in wounds on days 3, 6, and 9.
Results: Ischemia delayed wound closure (12.8 ± 1.9 days (standard deviation, SD) for ischemia and 8.0 ± 0.94 days for control). Application of 11,12- and 14,15-EET ameliorated the deteriorated wound healing on ischemic ears (7.6 ± 1.3 days for 11,12-EET and 9.2 ± 1.4 days for 14,15-EET). Ischemia did not change VEGF, TNF-α, TGF-β, SDF-1α, TIMP, MMP7 or MMP9 levels significantly compared to control. Local application of 11,12- as well as 14,15-EET induced a significant elevation of VEGF, TGF-β, and SDF-1α expression as well as proliferation during the whole phase of wound healing compared to control and ischemia alone.
Conclusion: In summary, EETs improve the impaired wound healing caused by ischemia, as they enhance neovascularization and alter the inflammatory response in wounds. Thus, elevating the levels of lipid mediators such as 11,12- and 14,15-EET in wounds might be a successful strategy for ameliorating deranged wound healing under ischemia.
This is a randomized trial (ATHENA study) in de novo kidney transplant patients comparing everolimus with concomitant tacrolimus or cyclosporine (CsA) against mycophenolic acid (MPA) with tacrolimus in an unselected population. In this 12-month, multicenter, open-label study, de novo kidney transplant recipients were randomized to everolimus with tacrolimus (EVR/TAC), everolimus with CsA (EVR/CsA) or MPA with tacrolimus (MPA/TAC), with similar tacrolimus exposure in the two tacrolimus-based arms. Non-inferiority of the primary end point (estimated glomerular filtration rate [eGFR] at month 12), assessed in the per-protocol population of 338 patients, was not shown for EVR/TAC or EVR/CsA versus MPA/TAC. In 123 patients with TAC levels within the protocol-specified range, eGFR outcomes were comparable between groups. The mean increase in eGFR during months 1 to 12 post-transplant, analyzed post hoc, was similar with EVR/TAC or EVR/CsA versus MPA/TAC. The difference in the incidence of treatment failure (biopsy-proven acute rejection, graft loss or death) versus MPA/TAC was not significant for EVR/TAC but was significant for EVR/CsA. Most biopsy-proven acute rejection events in this study were graded mild (Banff IA). There were no differences in proteinuria between groups. Cytomegalovirus and BK virus infections were significantly more frequent with MPA/TAC. Thus, everolimus with TAC or CsA showed comparable efficacy to MPA/TAC in de novo kidney transplant patients. Non-inferiority of renal function, as pre-specified, was not shown, but the mean increase in eGFR from months 1 to 12 was comparable to MPA/TAC.
Purpose: Artificial intelligence (AI) has accelerated novel discoveries across multiple disciplines including medicine. Clinical medicine suffers from a lack of AI-based applications, potentially due to lack of awareness of AI methodology. Future collaboration between computer scientists and clinicians is critical to maximize the benefits of transformative technology in this field for patients. To illustrate, we describe AI-based advances in the diagnosis and management of gliomas, the most common primary central nervous system (CNS) malignancy.
Methods: Presented is a succinct description of foundational concepts of AI approaches and their relevance to clinical medicine, geared toward clinicians without computer science backgrounds. We also review novel AI approaches in the diagnosis and management of glioma.
Results: Novel AI approaches in gliomas have been developed to predict the grading and genomics from imaging, automate the diagnosis from histopathology, and provide insight into prognosis.
Conclusion: Novel AI approaches offer acceptable performance in gliomas. Further investigation is necessary to improve the methodology and determine the full clinical utility of these novel approaches.
Background: Iron deficiency anemia is common in pregnancy with a prevalence of approximately 16% in Austria; however, international guideline recommendations on screening and subsequent treatment with iron preparations are inconsistent. The aim of this study was to find out how often pregnant women take iron-containing supplements, and who recommended them. As hemoglobin data were available for a sub-group of women, hemoglobin status during pregnancy and associated consumption of iron-containing medications were also recorded.
Methods: This cross-sectional study was conducted at the Mother-Child-Booklet service center of the Styrian Health Insurance Fund in Graz, Austria. A questionnaire containing seven questions was developed. Absolute and relative numbers were determined, and corresponding 95% confidence intervals calculated using bootstrapping techniques.
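The bootstrapped confidence intervals mentioned in the methods can be illustrated with a short sketch. This is not the study's code: the percentile method shown is only one of several bootstrap CI variants, and the count 218/325 below is a hypothetical figure inferred from the reported 67%.

```python
import numpy as np

def bootstrap_ci(successes, n, n_boot=10_000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for a proportion."""
    rng = np.random.default_rng(seed)
    data = np.zeros(n)
    data[:successes] = 1.0  # 1 = respondent reported taking a supplement
    # Resample n observations with replacement, n_boot times, and take the
    # empirical 2.5th/97.5th percentiles of the resampled proportions.
    boots = rng.choice(data, size=(n_boot, n), replace=True).mean(axis=1)
    return np.quantile(boots, [alpha / 2, 1 - alpha / 2])

# 218 of 325 corresponds to the reported 67% (hypothetical exact count).
lo, hi = bootstrap_ci(218, 325)
```

With a sample of this size, the interval spans roughly ±5 percentage points around the observed proportion.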
Results: A total of 325 women completed the questionnaire; 11% had been diagnosed with anemia before becoming pregnant and 67% reported taking iron-containing compounds. The women reported taking 45 different products, but 61% took 1 of 3 different supplements. Overall, 185 (57%) women had not been diagnosed with anemia before becoming pregnant but reported taking an iron-containing supplement, and 89% of the women took supplements on the recommendation of their physician. Of the 202 women whose hemoglobin status was assessed, 92% were found not to be anemic.
Conclusion: Overall, 67% of pregnant women took iron-containing compounds, irrespective of whether they were deficient in iron; in most cases their physicians had recommended them. No standardized procedure, not even in guidelines, is available on which to base the decision whether to take iron during pregnancy. As most guidelines only recommend taking iron supplements in cases of anemia, the high percentage of women taking them in Austria is hard to comprehend.
Background: Computer-assisted implant planning has become an important diagnostic and therapeutic tool in modern dentistry. This case report emphasizes the possibilities in modern implantology combining virtual implant planning, guided surgery with tooth and implant supported templates, immediate implant placement and loading.
Case presentation: A straightforward approach was followed for a mandible presenting with hopeless lower incisors. Diagnosis, decision making and treatment approach were based on clinical findings and detailed virtual three-dimensional implant planning. Extraction of the hopeless mandibular incisors, immediate and guided placement of six standard implants, and immediate loading with a provisional fixed dental prosthesis (FDP) were performed, fulfilling the patient's functional and esthetic demands. The final computer-assisted design/computer-assisted manufacturing (CAD/CAM) FDP with a titanium framework and composite veneering was delivered after 6 months. At the 1-year recall the FDP was free of technical complications. Stable bony conditions and a healthy peri-implant mucosa could be observed.
Conclusions: Computer-assisted implantology, including three-dimensional virtual implant planning, guided surgery, and CAD/CAM fabrication of provisional and final reconstructions, allowed for a concise treatment workflow with predictable esthetic and functional outcomes in this mandibular full-arch case. The combination of immediate implant placement and immediate loading was considerably more complex and required a high level of organization between implantologist, technician and patient. After the use of a first tooth-supported surgical template with subsequent extraction of the supporting teeth, a second surgical template stabilized on the previously inserted implants helped to transfer the planned implant positions to the extraction sites with a guided approach.
Background: Previous trials of PCSK9 (proprotein convertase subtilisin-kexin type 9) inhibitors demonstrated reductions in major adverse cardiovascular events, but not death. We assessed the effects of alirocumab on death after index acute coronary syndrome.
Methods: ODYSSEY OUTCOMES (Evaluation of Cardiovascular Outcomes After an Acute Coronary Syndrome During Treatment With Alirocumab) was a double-blind, randomized comparison of alirocumab or placebo in 18 924 patients who had an acute coronary syndrome 1 to 12 months previously and elevated atherogenic lipoproteins despite intensive statin therapy. Alirocumab dose was blindly titrated to target achieved low-density lipoprotein cholesterol (LDL-C) between 25 and 50 mg/dL. We examined the effects of treatment on all-cause death and its components, cardiovascular and noncardiovascular death, with log-rank testing. Joint semiparametric models tested associations between nonfatal cardiovascular events and cardiovascular or noncardiovascular death.
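As a rough illustration of the log-rank testing used for the mortality comparisons, here is a minimal two-sample log-rank sketch. The data are invented, the implementation is a textbook version rather than the trial's analysis code, and the p-value uses the 1-degree-of-freedom chi-square identity p = erfc(√(χ²/2)).

```python
import math
import numpy as np

def logrank_test(time_a, event_a, time_b, event_b):
    """Two-sample log-rank test (textbook version, illustrative only).

    time_*: follow-up times; event_*: 1 = death observed, 0 = censored.
    Returns the chi-square statistic (1 df) and its p-value.
    """
    times = np.concatenate([time_a, time_b]).astype(float)
    events = np.concatenate([event_a, event_b]).astype(int)
    in_a = np.concatenate([np.ones(len(time_a), bool),
                           np.zeros(len(time_b), bool)])
    obs_a = exp_a = var = 0.0
    for t in np.unique(times[events == 1]):          # each distinct event time
        at_risk = times >= t
        n, n_a = at_risk.sum(), (at_risk & in_a).sum()
        dying = (times == t) & (events == 1)
        d, d_a = dying.sum(), (dying & in_a).sum()
        obs_a += d_a
        exp_a += d * n_a / n                         # expected deaths in group A
        if n > 1:                                    # hypergeometric variance term
            var += d * (n_a / n) * (1 - n_a / n) * (n - d) / (n - 1)
    stat = (obs_a - exp_a) ** 2 / var
    return stat, math.erfc(math.sqrt(stat / 2))      # chi2(1 df) survival function

# Invented example: group A dies earlier than group B.
stat, p = logrank_test([1, 2, 3, 4, 5], [1] * 5, [6, 7, 8, 9, 10], [1] * 5)
```

Identical survival in both groups yields a statistic of zero and p = 1, while complete separation of event times yields a small p-value.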
Results: Median follow-up was 2.8 years. Death occurred in 334 (3.5%) and 392 (4.1%) patients, respectively, in the alirocumab and placebo groups (hazard ratio [HR], 0.85; 95% CI, 0.73 to 0.98; P=0.03, nominal P value). This resulted from nonsignificantly fewer cardiovascular (240 [2.5%] vs 271 [2.9%]; HR, 0.88; 95% CI, 0.74 to 1.05; P=0.15) and noncardiovascular (94 [1.0%] vs 121 [1.3%]; HR, 0.77; 95% CI, 0.59 to 1.01; P=0.06) deaths with alirocumab. In a prespecified analysis of 8242 patients eligible for ≥3 years follow-up, alirocumab reduced death (HR, 0.78; 95% CI, 0.65 to 0.94; P=0.01). Patients with nonfatal cardiovascular events were at increased risk for cardiovascular and noncardiovascular deaths (P<0.0001 for the associations). Alirocumab reduced total nonfatal cardiovascular events (P<0.001) and thereby may have attenuated the number of cardiovascular and noncardiovascular deaths. A post hoc analysis found that, compared to patients with lower LDL-C, patients with baseline LDL-C ≥100 mg/dL (2.59 mmol/L) had a greater absolute risk of death and a larger mortality benefit from alirocumab (HR, 0.71; 95% CI, 0.56 to 0.90; Pinteraction=0.007). In the alirocumab group, all-cause death declined with achieved LDL-C at 4 months of treatment, to a level of approximately 30 mg/dL (adjusted P=0.017 for linear trend).
Conclusions: Alirocumab added to intensive statin therapy has the potential to reduce death after acute coronary syndrome, particularly if treatment is maintained for ≥3 years, if baseline LDL-C is ≥100 mg/dL, or if achieved LDL-C is low.
Clinical Trial Registration: URL: https://www.clinicaltrials.gov. Unique identifier: NCT01663402.
Previous research indicates that individuals with anxiety disorders show an overgeneralization of conditioned fear compared with healthy participants. Fear generalization is therefore considered a key mechanism in the development of anxiety disorders. However, systematic investigations of the variance in fear generalization are lacking. The current study thus aims at identifying distinctive phenotypes of fear generalization among healthy participants. To this end, 1175 participants completed a differential fear conditioning phase followed by a generalization test. To identify patterns of fear generalization, we used a k-means clustering algorithm based on individual arousal generalization gradients. Subsequently, we examined the reliability and validity of the clusters and phenotypical differences between subgroups on the basis of psychometric data and markers of fear expression. Cluster analysis reliably revealed five clusters that systematically differed in mean responses, differentiation between conditioned threat and safety, and linearity of the generalization gradients, though mean response levels accounted for most variance. Remarkably, the patterns of mean responses were already evident during fear acquisition and corresponded most closely to psychometric measures of anxiety traits. The identified clusters reliably described subgroups of healthy individuals with distinct response characteristics in a fear generalization test. Following a dimensional view of psychopathology, these clusters likely delineate risk factors for anxiety disorders. As crucial group characteristics were already evident during fear acquisition, our results emphasize the importance of average fear responses and differentiation between conditioned threat and safety as risk factors for anxiety disorders.
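The k-means step described above can be sketched as follows. This is a minimal NumPy re-implementation applied to synthetic arousal gradients (two made-up response shapes over seven generalization stimuli), not the study's pipeline, which clustered real gradients into five clusters; k = 2 is used here only to keep the demo transparent.

```python
import numpy as np

def kmeans(X, k, n_init=10, n_iter=100, seed=0):
    """Minimal k-means with random restarts, keeping the lowest-inertia fit."""
    rng = np.random.default_rng(seed)
    best = (None, None, np.inf)
    for _ in range(n_init):
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iter):
            # Assign every gradient to its nearest centre, then move centres.
            dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dist.argmin(axis=1)
            new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
            if np.allclose(new, centers):
                break
            centers = new
        inertia = ((X - centers[labels]) ** 2).sum()
        if inertia < best[2]:
            best = (labels, centers, inertia)
    return best[0], best[1]

# Synthetic arousal gradients over 7 stimuli (CS+ ... CS-): one steeply
# declining group and one flat, high-responding group (assumed shapes).
rng = np.random.default_rng(1)
steep = np.linspace(8, 2, 7) + rng.normal(0, 0.3, (30, 7))
flat = np.full(7, 7.0) + rng.normal(0, 0.3, (30, 7))
X = np.vstack([steep, flat])
labels, centers = kmeans(X, k=2)
```

With clearly separated gradient shapes like these, the recovered labels split the two groups cleanly, mirroring how gradient linearity and mean level separate phenotypes in the study.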
During erythropoiesis, haematopoietic stem cells (HSCs) differentiate in successive steps of commitment and specification to mature erythrocytes. This differentiation process is controlled by transcription factors that establish stage- and cell type-specific gene expression. In this study, we demonstrate that FUSE binding protein 1 (FUBP1), a transcriptional regulator important for HSC self-renewal and survival, is regulated by T-cell acute lymphocytic leukaemia 1 (TAL1) in erythroid progenitor cells. TAL1 directly activates the FUBP1 promoter, leading to increased FUBP1 expression during erythroid differentiation. The binding of TAL1 to the FUBP1 promoter is highly dependent on an intact GATA sequence in a combined E-box/GATA motif. We found that FUBP1 expression is required for efficient erythropoiesis, as FUBP1-deficient progenitor cells were limited in their potential for erythroid differentiation. Thus, the finding of an interconnection between GATA1/TAL1 and FUBP1 reveals a molecular mechanism that is part of the switch from progenitor- to erythrocyte-specific gene expression. In summary, we identified a TAL1/FUBP1 transcriptional relationship, whose physiological function in haematopoiesis is connected to proper erythropoiesis.
Background: Transcatheter aortic valve replacement (TAVR) is a therapeutic option for patients with aortic valve stenosis at increased surgical risk. Telomeres are an established marker for cellular senescence and have served to evaluate cardiovascular diseases including severe aortic valve stenosis. In our study, we hypothesized that telomere length may be a predictor for outcome and associated with comorbidities in patients with TAVR.
Methods and results: We analyzed leucocyte telomere length from 155 patients who underwent TAVR and correlated the results with 1-year mortality and severe comorbidities. The cohort was subdivided into 3 groups according to telomere length. Although a trend toward an association between longer telomeres and a lower EuroSCORE could be found, telomere length was not associated with survival, aortic valve opening area or cardiovascular comorbidities (peripheral, coronary or cerebrovascular disease). Interestingly, long telomeres were significantly correlated with a reduced left ventricular ejection fraction (LVEF).
Conclusion: In elderly patients with severe aortic valve stenosis, leucocyte telomere length did not predict post-procedural survival. The correlation between long telomere length and reduced LVEF in these patients deserves further attention.
Circadian oscillations in circulating leukocyte subsets including immature hematopoietic cells have long been appreciated, but the origin and nature of these alterations remain elusive. Our analysis of wild-type C57BL/6 mice under constant darkness confirmed circadian fluctuations of circulating leukocytes and clonogenic cells in blood and spleen but not bone marrow. Clock-gene-deficient Bmal1−/− mice lacked this regulation. Cell cycle analyses in the different hematopoietic compartments excluded circadian changes in total cell numbers, rather favoring shifting hematopoietic cell redistribution as the underlying mechanism. Transplant chimeras demonstrated that circadian rhythms within the stroma mediate the oscillations independently of hematopoietic-intrinsic cues. We provide evidence of circadian CXCL12 regulation via clock genes in vitro and were able to confirm CXCL12 oscillation in bone marrow and blood in vivo. Our studies further implicate cortisol as the conveyor of circadian input to bone marrow stroma and mediator of the circadian leukocyte oscillation. In summary, we establish hematopoietic-extrinsic cues as causal for the circadian redistribution of circulating mature/immature blood cells.
Discovery of key whole-brain transitions and dynamics during human wakefulness and non-REM sleep
(2019)
The modern understanding of sleep is based on the classification of sleep into stages defined by their electroencephalography (EEG) signatures, but the underlying brain dynamics remain unclear. Here we aimed to move significantly beyond the current state-of-the-art description of sleep, and in particular to characterise the spatiotemporal complexity of whole-brain networks and state transitions during sleep. In order to obtain the most unbiased estimate of how whole-brain network states evolve through the human sleep cycle, we used a Markovian data-driven analysis of continuous neuroimaging data from 57 healthy participants falling asleep during simultaneous functional magnetic resonance imaging (fMRI) and EEG. This Hidden Markov Model (HMM) facilitated discovery of the dynamic choreography between different whole-brain networks across the wake-non-REM sleep cycle. Notably, our results reveal key trajectories to switch within and between EEG-based sleep stages, while highlighting the heterogeneities of stage N1 sleep and wakefulness before and after sleep.
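A full Hidden Markov Model on fMRI data is beyond a short example, but the Markovian bookkeeping at its core, namely estimating transition probabilities between inferred whole-brain states, can be sketched as follows. The state sequence is synthetic, and the three states are hypothetical stand-ins for wakefulness, N1 and N2.

```python
import numpy as np

def transition_matrix(states, n_states):
    """Row-normalized empirical transition probabilities of a state sequence."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):  # count consecutive transitions
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    # Rows with no outgoing transitions stay all-zero instead of dividing by 0.
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# Synthetic sequence of inferred whole-brain states
# (0, 1, 2 as hypothetical stand-ins for wake, N1 and N2).
seq = [0, 0, 0, 1, 1, 2, 2, 2, 1, 2]
P = transition_matrix(seq, n_states=3)
```

In an actual HMM the states themselves are latent and inferred jointly with this matrix (e.g. by Baum–Welch); the sketch only shows the transition estimation that characterizes trajectories between states.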
Platelet function (PF) plays a pivotal role in both hemostasis and thrombosis, and manual light transmission aggregometry (LTA) is considered the standard of care for platelet function testing but is an error-prone and time-consuming procedure. We aimed to test the agreement of the automated Sysmex CS-2100i analyzer (Siemens, Germany) against the APACT 4004 (Elitech, France) regarding maximum aggregation (MA), velocity (VEL), and lag phase (LagP) of platelet aggregation in samples derived from healthy participants and patients with hemostaseologic disorders. In total, 123 patient-derived samples were investigated, including 42 patients with acetylsalicylic acid and/or clopidogrel intake and 20 patients with other hemostaseologic disorders. Both MA and VEL showed good or excellent intermethod correlation. Agreement between the testing methods was only partially achieved, and values were indicative of a systematic bias toward lower measurements below a threshold of 50% MA with the CS-2100i compared to the APACT 4004. All patients with impaired PF in the APACT 4004 were successfully identified with the CS-2100i, and reference values for automated LTA are provided. In conclusion, automated LTA with the CS-2100i is a highly standardized and reliable PF testing method and represents a decisive step in the simplification of platelet function testing in clinical routine.
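Method agreement of the kind reported here (a systematic bias between two analyzers) is commonly quantified with a Bland–Altman analysis. A minimal sketch with invented maximum-aggregation pairs, not study data, follows.

```python
import numpy as np

def bland_altman(a, b):
    """Bland–Altman bias and 95% limits of agreement between two methods."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()                    # mean difference (systematic bias)
    half_width = 1.96 * diff.std(ddof=1)  # 1.96 * SD of the differences
    return bias, bias - half_width, bias + half_width

# Invented maximum-aggregation (%) pairs, CS-2100i vs. APACT 4004, mimicking
# a negative bias at low aggregation values (not study data).
cs2100i = [12, 25, 40, 55, 70, 85, 90]
apact = [18, 32, 45, 57, 71, 86, 89]
bias, lower, upper = bland_altman(cs2100i, apact)
```

A negative bias with most differences inside the limits of agreement would correspond to the pattern the abstract describes below 50% MA.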
Introduction: Dravet syndrome (DS) is a rare developmental and epileptic encephalopathy. This prospective, multicenter study estimated costs, cost-driving factors and quality of life (QoL) in patients with DS and their caregivers in Germany.
Methods: A validated 3–12-month retrospective questionnaire and a prospective 3-month diary assessing clinical characteristics, QoL, and direct, indirect and out-of-pocket (OOP) costs were administered to caregivers of patients with DS throughout Germany.
Results: Caregivers of 93 patients (mean age 10.1 ± 7.1 years, range 15 months–33.7 years) submitted questionnaires, and 77 completed prospective diaries. The majority of patients (95%) had experienced at least one seizure during the previous 12 months, and 77% had experienced a status epilepticus (SE) at least once in their lives. Over 70% of patients had behavioural problems and delayed speech development, and over 80% had attention deficit symptoms and disturbances of motor skills and movement coordination. Patient QoL was lower than in the general population, and 45% of caregivers had some form of depressive symptoms. Mean direct health care costs per three months were €6,043 ± €5,825 (median €4,054, CI €4,935–€7,350) per patient. Inpatient costs formed the single most important cost category (28%, €1,702 ± €4,315), followed by care grade benefits (19%, €1,130 ± €805), anti-epileptic drug (AED) costs (15%, €892 ± €1,017) and ancillary treatments (9%, €559 ± €503). Total indirect costs were €4,399 ± €4,989 (median €0, CI €3,466–€5,551) in mothers and €391 ± €1,352 (median €0, CI €195–€841) in fathers. In univariate analysis, seizure frequency, experience of SE, nursing care level and severe additional symptoms were associated with total direct healthcare costs; severe additional symptoms remained the single independently significant explanatory factor in multivariate analysis.
Conclusions: This study, covering a period of up to 15 months, revealed substantial direct and indirect healthcare costs of DS in Germany and highlights the relatively low patient and caregiver QoL compared with the general population.
Blunt thoracic trauma (TxT) deteriorates clinical post-injury outcomes. Ongoing inflammatory changes promote the development of post-traumatic complications, frequently causing acute lung injury (ALI). Club cell protein (CC)16, a pulmonary anti-inflammatory protein, correlates with lung damage following TxT. Whether CC16 neutralization influences the inflammatory course during ALI remains unknown. Ninety-six male C57BL/6N mice underwent a double-hit model of TxT and cecal ligation and puncture (CLP, 24 h post-TxT). Sham animals underwent the surgical procedures only. CC16 was neutralized by intratracheal application of an anti-CC16 antibody, either after TxT (early) or after CLP (late). Euthanasia was performed at 6 or 24 h post-CLP. Systemic and pulmonary levels of IL-6, IL-1β, and CXCL5 were determined, neutrophils were quantified in the bronchoalveolar lavage fluid, and histomorphological lung damage was assessed. ALI induced a significant systemic IL-6 increase among all groups, while the local inflammatory response was most prominent after 24 h in the double-hit groups as compared to the shams. Significantly increased neutrophilic infiltration upon double hit was paralleled by enhanced lung damage in all groups as compared to the sham, after 6 and 24 h. Neutralization of CC16 did not change the systemic inflammation. However, early CC16 neutralization increased the neutrophilic infiltration and lung injury at 6 h post-CLP, while 24 h later the lung injury was reduced. Late CC16 neutralization increased neutrophilic infiltration 24 h post-CLP, concurrent with enhanced lung injury. The data confirm the anti-inflammatory potential of endogenous CC16 in the murine double-hit model of ALI.
In solid tumors, tumor-associated macrophages (TAMs) commonly accumulate within hypoxic areas. Adaptations to such environments evoke transcriptional changes by the hypoxia-inducible factors (HIFs). While HIF-1α is ubiquitously expressed, HIF-2α appears tissue-specific, with the consequences of HIF-2α expression in TAMs only being poorly characterized. An E0771 allograft breast tumor model revealed faster tumor growth in myeloid HIF-2α knockout (HIF-2αLysM−/−) compared with wildtype (wt) mice. In an RNA-sequencing approach of FACS-sorted wt and HIF-2αLysM−/− TAMs, serine protease inhibitor, Kunitz type-1 (Spint1) emerged as a promising candidate for HIF-2α-dependent regulation. We validated reduced Spint1 messenger RNA expression and concomitant Spint1 protein secretion under hypoxia in HIF-2α-deficient bone marrow-derived macrophages (BMDMs) compared with wt BMDMs. In line with the physiological function of Spint1 as an inhibitor of hepatocyte growth factor (HGF) activation, supernatants of hypoxic HIF-2α knockout BMDMs, not containing Spint1, were able to release the proliferative properties of inactive pro-HGF on breast tumor cells. In contrast, hypoxic wt BMDM supernatants containing abundant Spint1 amounts failed to do so. We propose that Spint1 contributes to the tumor-suppressive function of HIF-2α in TAMs in breast tumor development.
MicroRNAs (miRs) significantly contribute to the regulation of gene expression, by virtue of their ability to interact with a broad, yet specific set of target genes. MiRs are produced and released by almost every cell type and play an important role in horizontal gene regulation in the tumor microenvironment (TME). In the TME, both tumor and stroma cells cross-communicate via diverse factors including miRs, which are taking center stage as a therapeutic target of anti-tumor therapy. One of the immune escape strategies adopted by tumor cells is to release miRs as a Trojan horse to hijack circulating or tumor-localized monocytes/macrophages and tune them for pro-tumoral functions. On the other hand, macrophage-derived miRs exert anti-tumor functions. The transfer of miRs from host to recipient cells depends on the supramolecular structure and composition of miR carriers, which determine the distinct uptake mechanism by recipient cells. In this review, we provide a recent update on the miR-mediated crosstalk between tumor cells and macrophages and their mode of uptake in the TME.
Macrophage S1PR1 signaling alters angiogenesis and lymphangiogenesis during skin inflammation
(2019)
The bioactive lipid sphingosine-1-phosphate (S1P), along with its receptors, modulates lymphocyte trafficking and immune responses to regulate skin inflammation. Macrophages are important in the pathogenesis of psoriasiform skin inflammation and express various S1P receptors. How they respond to S1P in skin inflammation remains unknown. We show that myeloid-specific S1P receptor 1 (S1PR1) deletion enhances early inflammation in a mouse model of imiquimod-induced psoriasis, without altering the immune cell infiltrate. Mechanistically, myeloid S1PR1 deletion altered the formation of IL-1β, VEGF-A, and VEGF-C, and the expression of their receptors, in psoriatic skin, which subsequently led to reciprocal regulation of neoangiogenesis and neolymphangiogenesis. The experimental findings were corroborated in human clinical datasets and in knockout macrophages in vitro. Increased blood vessel but reduced lymph vessel density may explain the exacerbated inflammatory phenotype in conditional knockout mice. These findings assign a novel role to macrophage S1PR1 and provide a rationale for therapeutically targeting local S1P during skin inflammation.
Range variability in CMR feature tracking multilayer strain across different stages of heart failure
(2019)
Heart failure (HF) is associated with progressive ventricular remodeling and impaired contraction that affect distinct regions of the myocardium differently. Our study applied cardiac magnetic resonance (CMR) feature tracking (FT) to comparatively assess myocardial strain at 3 distinct levels: subendocardial (Endo-), mid (Myo-) and subepicardial (Epi-) myocardium across an extended spectrum of patients with HF. 59 patients with HF, divided into 3 subgroups: HF with preserved ejection fraction (HFpEF, N = 18), HF with mid-range ejection fraction (HFmrEF, N = 21) and HF with reduced ejection fraction (HFrEF, N = 20), together with a group of age- and gender-matched volunteers (N = 17), were included. Using CMR FT we assessed systolic longitudinal and circumferential strain and strain-rate at the Endo-, Myo- and Epi- levels. Strain values were highest in the Endo- layer and progressively lower in the Myo- and Epi- layers, respectively; this gradient was present in all the patient groups analyzed but decreased progressively in the HFmrEF and, further, in the HFrEF groups. GLS decreased with the severity of the disease in all 3 layers: Normal > HFpEF > HFmrEF > HFrEF (Endo-: −23.0 ± 3.5 > −20.0 ± 3.3 > −16.4 ± 2.2 > −11.0 ± 3.2, p < 0.001; Myo-: −20.7 ± 2.4 > −17.5 ± 2.6 > −14.5 ± 2.1 > −9.6 ± 2.7, p < 0.001; Epi-: −15.7 ± 1.9 > −12.2 ± 2.1 > −10.6 ± 2.3 > −7.7 ± 2.3, p < 0.001). In contrast, GCS did not differ between the Normal and HFpEF groups (Endo-: −34.5 ± 6.2 vs −33.9 ± 5.7, p = 0.51; Myo-: −21.9 ± 3.8 vs −21.3 ± 2.2, p = 0.39; Epi-: −11.4 ± 2.0 vs −10.9 ± 2.3, p = 0.54) but was likewise markedly lower in the systolic heart failure groups: Normal > HFmrEF > HFrEF (Endo-: −34.5 ± 6.2 > −20.0 ± 4.2 > −12.3 ± 4.2, p < 0.001; Myo-: −21.9 ± 3.8 > −13.0 ± 3.4 > −8.0 ± 2.7, p < 0.001; Epi-: −11.4 ± 2.0 > −7.9 ± 2.3 > −4.5 ± 1.9, p < 0.001). CMR feature tracking multilayer strain assessment identifies large differences between distinct myocardial regions.
Our data emphasize the importance of the sub-endocardial myocardium for cardiac contraction and thus its privileged role in imaging-based detection of functional impairment. CMR feature tracking offers a convenient, readily available platform to evaluate myocardial contraction with excellent spatial resolution, rendering further details about discrete areas of the myocardium. Using this technique across distinct groups of patients with heart failure (HF), we demonstrate that subendocardial regions of the myocardium exhibit much higher strain values than the mid-myocardium or subepicardium and are more sensitive for detecting contractile impairment. We also show comparatively higher values of circumferential strain than of longitudinal strain, together with a higher sensitivity for detecting contractile impairment. A newly characterized group of patients, HF with mid-range ejection fraction (EF), shows similar traits of decompensation but has relatively higher strain values than patients with HF with reduced EF.
Of the hepatic cell lines developed for in vitro studies of hepatic functions as alternatives to primary human hepatocytes, many have lost major liver-like functions, but not HepaRG cells. The increasing use of the latter worldwide raises the need for establishing the reference functional status of early biobanked HepaRG cells. Using deep proteome and secretome analyses, the levels of master regulators of the hepatic phenotype and of the structural elements ensuring biliary polarity were found to be close to those in primary hepatocytes. HepaRG cells proved to be highly differentiated, with functional mitochondria, hepatokine secretion abilities, and an adequate response to insulin. Among differences between primary human hepatocytes and HepaRG cells, the factors that possibly support HepaRG transdifferentiation properties are discussed. The HepaRG cell system thus appears as a robust surrogate for primary hepatocytes, which is versatile enough to study not only xenobiotic detoxification, but also the control of hepatic energy metabolism, secretory function and disease-related mechanisms.
The spectrum of alcoholic liver disease (ALD) is broad and includes alcoholic fatty liver, alcoholic steatohepatitis, alcoholic hepatitis, alcoholic fibrosis, alcoholic cirrhosis, and alcoholic hepatocellular carcinoma, best explained as a sequence of five injurious hits. ALD is not primarily the result of malnutrition, as was assumed for many decades, but of the ingested alcohol and its metabolic consequences, although malnutrition may marginally contribute to disease aggravation. Ethanol is metabolized in the liver to the highly reactive acetaldehyde via alcohol dehydrogenase (ADH) and the cytochrome P450 isoform 2E1 of the microsomal ethanol-oxidizing system (MEOS). The resulting disturbances modify not only the liver parenchymal cells but also non-parenchymal cells such as Kupffer cells (KCs), hepatic stellate cells (HSCs), and liver sinusoidal endothelial cells (LSECs). These are activated by acetaldehyde, reactive oxygen species (ROS), and endotoxins, which are produced by bacteria in the gut and reach the liver due to gut leakage. A variety of intrahepatic signaling pathways and innate or acquired immune reactions are under discussion as contributing to the pathogenesis of ALD via the five injurious hits responsible for disease aggravation. As some of the mechanistic steps are based on studies with in vitro cell systems or animal models, the respective proposals for humans should be considered tentative. However, sufficient evidence is provided for clinical risk factors, which include the amount of alcohol consumed daily for more than a decade, gender differences with a higher susceptibility of women, genetic predisposition, and preexisting liver disease. In essence, efforts within the last years have shed more light on the pathogenesis of ALD; much has been achieved, but it remains open to what extent results obtained from experimental studies can be transferred to humans.
One of the most difficult challenges in clinical hepatology is the diagnosis of a drug-induced liver injury (DILI). The timing of the events, the exclusion of alternative causes, and the clinical context should be systematically assessed and scored in a transparent manner. RUCAM (Roussel Uclaf Causality Assessment Method) is a well-established diagnostic algorithm and scale to assess causality in patients with suspected DILI. First published in 1993 and updated in 2016, RUCAM is now the most commonly used causality assessment method (CAM) for DILI worldwide. The following manuscript highlights the recent implementation of RUCAM around the world, by reviewing the literature for publications that utilized RUCAM, and provides a review of “best practices” for the use of RUCAM in cases of suspected DILI. The worldwide appreciation of RUCAM is substantiated by the current analysis of 46,266 DILI cases, all tested for causality using RUCAM. These cases derived from 31 reports published from 2014 to early 2019. Their first authors came from 10 countries, with China first, followed by the US and, in third place, Germany. Importantly, all RUCAM-based DILI reports were published in high-profile journals. Many other reports were published earlier, from 1993 up to 2013, in support of RUCAM. Although most of the studies were of high quality, the current case analysis revealed shortcomings in a few studies, not at the level of RUCAM itself but rather in the work of the users. To ensure better performance by the users in future DILI cases, a list of essential elements is proposed.
As an example, all suspected DILI cases should be evaluated 1) with the updated RUCAM to facilitate result comparisons, 2) according to a prospective study protocol to ensure complete data sets, 3) after exclusion of cases with herb-induced liver injury (HILI) from the DILI cohort to prevent confounding variables, and 4) by including DILI cases with RUCAM-based causality gradings of highly probable or probable, in order to increase the specificity of the results. In conclusion, RUCAM benefits from its high appreciation and performs well, provided the users adhere to published recommendations to prevent confounding variability.
The current Special Issue is devoted to the broad spectrum of hepatotoxicity, with its molecular mechanisms and pathophysiology, presented in eight publications. The contributing scientists came from various countries, including the US, Mexico, the Czech Republic, Germany, Portugal, China, and Japan. The contributions considered various types of experimental and human liver injury, elicited by a number of causal conditions and substances. ...
Liver injuries caused by the use of exogenous compounds such as drugs, herbs, and alcohol are commonly well diagnosed using laboratory tests, toxin analyses, or, where detectable, reactive intermediates generated during metabolic degradation of the respective chemical in the liver and subject to covalent binding to target proteins. Conditions are somewhat different for idiosyncratic drug-induced liver injury (DILI), for which metabolic intermediates are rarely available as diagnostic aids. Although the diagnosis of idiosyncratic DILI can be well established using the validated, liver-specific, structured, and quantitative RUCAM (Roussel Uclaf Causality Assessment Method), there is an ongoing search for new diagnostic biomarkers that could assist in, and also confirm, RUCAM-based DILI diagnoses. With respect to idiosyncratic DILI, and following previous regulatory letters of recommendation, selected biomarkers reached the clinical focus, including microRNA-122, microRNA-192, cytokeratin analogues, glutamate dehydrogenase, total HMGB-1 (High Mobility Group Box 1), and hyperacetylated HMGB-1 proteins. However, the new parameter total HMGB-1, and even more so acetylated HMGB-1, came under critical scientific fire after misconduct at one of the collaborating partner centers, leading the EMA to no longer recommend the exploratory hyperacetylated HMGB-1 isoform biomarkers in clinical studies. The overall promising nature of the recommended biomarkers was considered by the EMA to be highly dependent on the outstanding results of the now incriminated biomarker hyperacetylated HMGB-1. The EMA therefore correctly decided to officially retract its Letter of Support, affecting all biomarkers listed above. The new biomarkers are now under heavy scrutiny and will require re-evaluation prior to newly adapted recommendations.
With Integrin beta 3 (ITGB3), however, a new diagnostic biomarker may emerge; it is possibly drug-specific but has been tested in only 16 patients, and due to substantial remaining uncertainties, final recommendations would be premature. In conclusion, most of the currently recommended new biomarkers have lost regulatory support due to scientific misconduct and now require innovative approaches and re-evaluation before they can be assimilated into clinical practice.
Background: Neonatal manifestation of life-threatening hyperammonemic encephalopathy in urea cycle disorders (UCD) is often misdiagnosed as neonatal sepsis, resulting in significantly delayed start of specific treatment and poor outcome. The major aim of this study was to identify specific initial symptoms or signs to clinically distinguish hyperammonemic encephalopathy in neonates from neonatal sepsis in order to identify affected individuals with UCD and to start metabolic therapy without delay. Furthermore, we evaluated the impact of diagnostic delay, peak plasma ammonium (NH4+) concentration, mode of emergency treatment and transfer to a tertiary referral center on the outcome.
Methods: Detailed information on 17 patients (born between 1994 and 2012) with a confirmed diagnosis of UCD and neonatal hyperammonemic encephalopathy was collected from the original medical records.
Results: The initially suspected diagnosis was neonatal sepsis in all patients, but was not confirmed in any of them. Unlike in neonatal sepsis, and not previously reported, blood pressure increased above the 95th percentile in 13 (81%) of the UCD patients before emergency treatment was started. Respiratory alkalosis was found in 11 (65%) of the UCD patients, and in 14 (81%) plasma NH4+ concentrations further increased despite initiation of metabolic therapy.
Conclusion: Detection of high blood pressure could be a valuable parameter for distinguishing neonatal sepsis from neonatal manifestation of UCD. Since high blood pressure is not typical for neonatal sepsis, other reasons such as encephalopathy and especially hyperammonemic encephalopathy (caused by e.g. UCD) should be searched for immediately. However, our result that the majority of newborns with UCD initially present with high blood pressure has to be evaluated in larger patient cohorts.
Immunotherapy involving checkpoint blockades of inhibitory co-receptors is effective in combating cancer. Despite this, the full range of mediators that inhibit T-cell activation and influence anti-tumor immunity is unclear. Here, we identify the GTPase-activating protein (GAP) Rasal1 as a novel TCR-ZAP-70 binding protein that negatively regulates T-cell activation and tumor immunity. Rasal1 inhibits via two pathways: binding and inhibition of the kinase domain of ZAP-70, and GAP-mediated inhibition of the p21ras-ERK pathway. It is expressed in activated CD4+ and CD8+ T-cells, and inhibits CD4+ T-cell responses to antigenic peptides presented by dendritic cells as well as CD4+ T-cell responses to peptide antigens in vivo. Furthermore, siRNA reduction of Rasal1 expression in T-cells shrinks B16 melanoma and EL-4 lymphoma tumors, concurrent with an increase in CD8+ tumor-infiltrating T-cells expressing granzyme B and interferon γ-1. Our findings identify ZAP-70-associated Rasal1 as a new negative regulator of T-cell activation and tumor immunity.
Background: Persistent antiphospholipid antibodies (aPL) constitute the serological hallmark of the antiphospholipid syndrome (APS). Recently, various new assay technologies for the detection of aPL, better suited to multiplex reaction environments than ELISAs, have emerged. We evaluated the diagnostic performance of such a novel line immunoassay (LIA) for the simultaneous detection of 10 different aPL.
Methods: Fifty-three APS patients and 34 healthy controls were investigated for criteria (antibodies against cardiolipin [aCL], β2-glycoprotein I [aβ2-GPI]) and non-criteria aPL (antibodies against phosphatidic acid [aPA], phosphatidyl-choline [aPC], -ethanolamine [aPE], -glycerol [aPG], -inositol [aPI], -serine [aPS], annexin V [aAnnV], prothrombin [aPT]) IgG and IgM by LIA. Criteria aPL were additionally determined with the established Alegria (ALE), AcuStar (ACU), UniCap (UNI), and AESKULISA (AES) systems and non-criteria aPL with the AES system. Diagnostic performance was evaluated with a gold standard for criteria aPL derived from the results of the four established assays via latent class analysis and with the clinical diagnosis as gold standard for non-criteria aPL.
Results: Assay performance of the LIA for criteria aPL was comparable to that of ALE, ACU, UNI, and AES. For non-criteria aPL, sensitivities of the LIA for aPA-, aPI-, aPS-IgG and aPA-IgM were significantly higher and for aPC-, aPE-, aAnnV-IgG and aPC- and aPE-IgM significantly lower than AES. Specificities did not differ significantly.
Conclusions: The LIA constitutes a valuable diagnostic tool for aPL profiling. It offers increased sensitivity for the detection of aPL against anionic phospholipids. In contrast, ELISAs exhibit strengths for the sensitive detection of aPL against neutral phospholipids.
Secukinumab is a fully human monoclonal antibody that selectively neutralizes interleukin 17A, a key cytokine involved in the development of psoriasis. Secukinumab has shown long-lasting efficacy and safety in the complete spectrum of psoriatic disease, including disease localized to nails, scalp, palms and soles, and joints (peripheral and axial arthritis). Given the chronic and relapsing nature of psoriasis, long-term data might help to fully characterize the efficacy and safety profile of secukinumab as well as its impact on quality of life.
Introduction and Objectives: Surgical techniques such as preservation of the full functional length of the urethral sphincter (FFLU) have a positive impact on postoperative continence rates. However, data on very early continence rates after radical prostatectomy (RP) are scarce. The aim of the present study was to analyze very early continence rates in patients undergoing FFLU during RP.
Materials and Methods: Very early continence was assessed using the PAD-test within 24 h after removal of the transurethral catheter. The PAD-test is a validated test that measures the amount of involuntary urine loss while the patient performs predefined physical activities within 1 h (e.g., coughing, walking, climbing stairs). Full continence was defined as a urine loss below 1 g. Mild, moderate, and severe incontinence were defined as urine losses of 1–10 g, 11–50 g, and >50 g, respectively.
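The grading rule described above is simple enough to express as a small helper; the following is an illustrative sketch only (the function name and return labels are our own, not part of the study protocol):

```python
def grade_pad_test(urine_loss_g: float) -> str:
    """Grade continence from 1-h PAD-test urine loss (grams).

    Thresholds follow the study definitions: full continence below 1 g;
    mild incontinence 1-10 g; moderate 11-50 g; severe above 50 g.
    """
    if urine_loss_g < 1:
        return "continent"
    if urine_loss_g <= 10:
        return "mild incontinence"
    if urine_loss_g <= 50:
        return "moderate incontinence"
    return "severe incontinence"
```

By this rule, a measured loss of 0.8 g counts as full continence, while 12 g counts as moderate incontinence.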
Results: 90 patients were prospectively analyzed. Removal of the catheter was performed on the 6th postoperative day. The proportions of no, mild, moderate and severe incontinence were 18.9, 45.5, 20.0, and 15.6%, respectively. In logistic regression, younger age was associated with significantly better continence (HR 2.52, p = 0.04), while bilateral nerve-sparing (HR 2.56, p = 0.057) and organ-confined tumor (HR 2.22, p = 0.078) showed lower urine loss, although these effects were not statistically significant. In MVA, similar results were recorded.
Conclusion: Overall, 64.4% of patients were continent or suffered only from mild incontinence 24 h after catheter removal. In general, reduced urine loss was recorded in younger patients, in patients with organ-confined tumor, and in patients with bilateral nerve sparing. The severe incontinence rate was remarkably low at 15.6%.
Multisensory integration strongly depends on the temporal proximity between two inputs. In the audio-visual domain, stimulus pairs with delays up to a few hundred milliseconds can be perceived as simultaneous and integrated into a unified percept. Previous research has shown that the size of this temporal window of integration can be narrowed by feedback-guided training on an audio-visual simultaneity judgment task. Yet, it has remained uncertain how the neural network that processes audio-visual asynchronies is affected by the training. In the present study, participants were trained on a 2-interval forced choice audio-visual simultaneity judgment task. We recorded their neural activity with magnetoencephalography in response to three different stimulus onset asynchronies (0 ms, each participant’s individual binding window, 300 ms) before and one day after training. The Individual Window stimulus onset asynchrony condition was derived by assessing each participant’s point of subjective simultaneity. Training improved performance in both asynchronous stimulus onset conditions (300 ms, Individual Window). Furthermore, beta-band amplitude (12–30 Hz) increased from pre- to post-training sessions. This increase moved across central, parietal, and temporal sensors during the time window of 80–410 ms post-stimulus onset. Considering the putative role of beta oscillations in carrying feedback from higher to lower cortical areas, these findings suggest that enhanced top-down modulation of sensory processing is responsible for the improved temporal acuity after training. As beta oscillations can be assumed to also preferentially support neural communication over longer conduction delays, the widespread topography of our effect could indicate that training modulates not only processing within primary sensory cortex, but rather the communication within a large-scale network.
The current problem of increasing antibiotic resistance and the resurgence of numerous infections indicate the need for novel vaccination strategies more than ever. In vaccine development, the search for and selection of adequate vaccine antigens is the first important step. In recent years, bacterial outer membrane proteins have become of major interest, as they are the main proteins interacting with the extracellular environment. Trimeric autotransporter adhesins (TAAs) are important virulence factors in many Gram-negative bacteria, are localised on the bacterial surface, and mediate the first adherence to host cells in the course of infection. One example is the Neisseria adhesin A (NadA), which is currently used as a subunit in a licensed vaccine against Neisseria meningitidis. Other TAAs that appear to be promising vaccine candidates are the Acinetobacter trimeric autotransporter (Ata), the Haemophilus influenzae adhesin (Hia), and TAAs of the genus Bartonella. Here, we review the suitability of various TAAs as vaccine candidates.
Systematic protein localization and protein-protein interaction studies to characterize specific protein functions are most effectively performed using tag-based assays. Ideally, protein tags are introduced into a gene of interest by homologous recombination to ensure expression from endogenous control elements. However, inefficient homologous recombination makes this approach difficult in mammalian cells. Although gene targeting efficiency by homologous recombination increased dramatically with the development of designer endonuclease systems such as CRISPR/Cas9, capable of inducing DNA double-strand breaks with unprecedented accuracy, these strategies still require synthesis or cloning of homology templates for every single gene. Recent developments have shown that endogenous protein tagging can be achieved efficiently in a homology-independent manner. Hence, combinations of CRISPR/Cas9 and generic tag-donor plasmids have been used successfully for targeted gene modifications in mammalian cells. Here, we developed a toolkit comprising a CRISPR/Cas9 expression vector with several EGFP-encoding plasmids that should enable tagging of almost every protein expressed in mammalian cells. By performing protein-protein interaction and subcellular localization studies of mTORC1 signal transduction pathway-related proteins expressed in HEK293T cells, we show that tagged proteins faithfully reflect the behavior of their native counterparts under physiological conditions.
Background & Aims: Renal function assessed by creatinine is a key prognostic factor in cirrhotic patients. However, creatinine is influenced by several factors, rendering interpretation difficult in some situations. This is especially important in early stages of renal dysfunction where renal impairment might not be accompanied by an increase in creatinine. Other parameters, such as cystatin C (CysC) and beta‐trace protein (BTP), have been evaluated to fill this gap. However, none of these studies have considered the role of the patient's sex. The present study analysed CysC and BTP to evaluate their prognostic value and differentiate them according to sex.
Patients and methods: CysC and BTP were measured in 173 transjugular intrahepatic portosystemic shunt (TIPS) patients from the NEPTUN study (NCT03628807), and their relationship with mortality and sex was analysed. Propensity score matching for age, MELD, etiology and TIPS indication was used.
Results: Cystatin C and BTP showed excellent correlations with creatinine values at baseline and follow-up. CysC was an independent predictor of overall mortality (HR = 1.66 (1.33-2.06)) with an AUC of 0.75, and a cut-off of 1.55 mg/L was identified in the whole cohort. Interestingly, CysC was significantly lower in females, also after propensity score matching. In males, the only independent predictor was the creatinine level (HR = 1.54 (1.25-1.58)), while in females CysC levels independently predicted mortality (HR = 3.17 (1.34-7.52)).
Conclusion: This study demonstrates for the first time that in TIPS‐patients creatinine predicts mortality in males better than in females, whereas CysC is a better predictor of mortality in females. These results may influence future clinical decisions on therapeutic options for example, allocation for liver transplantation in TIPS‐patients.
Background: Patients with acutely decompensated cirrhosis (AD) may or may not develop acute-on-chronic liver failure (ACLF). ACLF is characterized by high-grade systemic inflammation, organ failures (OF) and high short-term mortality. Although patients with AD cirrhosis exhibit distinct clinical phenotypes at baseline, they have low short-term mortality, unless ACLF develops during follow-up. Because little is known about how the profile of systemic inflammation relates to the clinical phenotypes of patients with AD cirrhosis, we aimed to investigate a battery of markers of systemic inflammation in these patients.
Methods: Upon hospital admission, baseline plasma levels of 15 markers (cytokines, chemokines, and oxidized albumin) were measured in 40 healthy controls, 39 patients with compensated cirrhosis, 342 with AD cirrhosis, and 161 with ACLF. According to the EASL-CLIF criteria, AD cirrhosis was divided into three distinct clinical phenotypes (AD-1: creatinine < 1.5, no HE, no OF; AD-2: creatinine 1.5–2 and/or HE grade I/II, no OF; AD-3: creatinine < 1.5, no HE, non-renal OF).
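The three phenotype definitions above can be captured as a small decision rule; the following is an illustrative sketch only (the function name and the "unclassified" fallback are our own assumptions; HE = hepatic encephalopathy grade, OF = organ failure, creatinine in mg/dL):

```python
def classify_ad_phenotype(creatinine: float, he_grade: int, non_renal_of: bool) -> str:
    """Assign the AD clinical phenotype as defined in the study.

    AD-1: creatinine < 1.5, no HE, no organ failure.
    AD-2: creatinine 1.5-2 and/or HE grade I/II, no organ failure.
    AD-3: creatinine < 1.5, no HE, non-renal organ failure.
    Combinations outside these definitions return "unclassified".
    """
    if not non_renal_of:
        if creatinine < 1.5 and he_grade == 0:
            return "AD-1"
        if 1.5 <= creatinine <= 2 or he_grade in (1, 2):
            return "AD-2"
    elif creatinine < 1.5 and he_grade == 0:
        return "AD-3"
    return "unclassified"
```

For example, a patient with creatinine 1.7 mg/dL, no HE and no organ failure falls into AD-2, while the same creatinine with a non-renal organ failure lies outside the three definitions.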
Results: Most markers were slightly abnormal in compensated cirrhosis, but markedly increased in AD. Patients with ACLF exhibited the largest number of abnormal markers, indicating “full-blown” systemic inflammation (all markers). AD patients exhibited distinct systemic inflammation profiles across the three clinical phenotypes. In each phenotype, activation of systemic inflammation was only partial (30% of the markers). Mortality related to each clinical AD phenotype was significantly lower than the mortality associated with ACLF (p < 0.0001 by Gray's test). Among AD patients, baseline systemic inflammation (especially IL-8, IL-6, IL-1ra, and HNA2, which were independently associated) was more intense in those who had poor 28-day outcomes (ACLF, death) than in those who did not experience these outcomes.
Conclusions: Although AD patients exhibit distinct profiles of systemic inflammation depending on their clinical phenotype, all of these patients show only partial activation of systemic inflammation. However, those with the most extensive baseline systemic inflammation had the highest risk of ACLF development and death.
Background: The DIAMOND study of de novo liver transplant patients showed that prolonged-release tacrolimus exposure in the acute post-transplant period maintained renal function over 24 weeks of treatment. To assess these findings further, we performed a post-hoc analysis in patients according to baseline kidney function, Model for End-stage Liver Disease [MELD] scores, and donor age.
Material/Methods: Patients received prolonged-release tacrolimus (initial dose, Arm 1: 0.2 mg/kg/day, Arm 2: 0.15-0.175 mg/kg/day, Arm 3: 0.2 mg/kg/day delayed until Day 5), mycophenolate mofetil and 1 steroid bolus. Arms 2 and 3 also received basiliximab. The recommended tacrolimus target trough levels to Day 42 post-transplantation were 5-15 ng/mL in all arms. In this post-hoc analysis, the change in renal outcome, based on estimated glomerular filtration rate (eGFR; Modification of Diet in Renal Disease-4, MDRD4) values from baseline to Week 24 post-transplantation, was assessed according to baseline patient factors: eGFR (≥60 and ˂60 mL/min/1.73 m²), MELD score (˂25 and ≥25) and donor age (˂50 and ≥50 years).
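The MDRD4 equation itself is not reproduced in the abstract; for reference, the widely used IDMS-traceable four-variable form can be sketched as follows (coefficients are from the published general-purpose equation, not from the DIAMOND analysis, and the function name is our own):

```python
def egfr_mdrd4(scr_mg_dl: float, age_years: float, female: bool, black: bool) -> float:
    """Estimate GFR (mL/min/1.73 m^2) with the IDMS-traceable
    four-variable MDRD equation:

        eGFR = 175 * Scr^-1.154 * age^-0.203
               * 0.742 (if female) * 1.212 (if Black)
    """
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr
```

With this form, a serum creatinine of 1.0 mg/dL at age 50 (male, non-Black) yields an eGFR of roughly 79 mL/min/1.73 m², i.e. above the ≥60 stratification threshold used in the analysis.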
Results: Baseline characteristics were comparable (Arms 1-3: n=283, n=287, n=274, respectively). Patients with preserved baseline renal function (eGFR ≥60 mL/min/1.73 m²) experienced a decrease in eGFR in all tacrolimus treatment arms. In patients with lower baseline renal function (eGFR ˂60 mL/min/1.73 m²), an advantage for renal function was observed with both the early lower-dose and the delayed higher-dose tacrolimus regimens compared with the early introduction of higher-dose tacrolimus. At Week 24, renal function was higher in the early lower-dose tacrolimus arm with older donors, and in the delayed higher-dose tacrolimus arm with younger donors, both compared with early higher-dose tacrolimus.
Conclusions: Pre-transplantation factors, such as renal function and donor age, could guide the choice of prolonged-release tacrolimus regimen following liver transplantation.
Purpose: To review the role of radiotherapy (RT) in the treatment of renal cell cancer (RCC) in the curative and palliative setting.
Content: Details related to the clinical outcomes of primary, preoperative, postoperative and palliative RT are discussed, along with a presentation of the established role of surgery and systemic therapy. An overview of data derived from mono- and multi-institutional trials is provided.
Conclusion: Radiotherapy has been shown to provide good symptom palliation and local control in RCC, depending on the dose that can be delivered. Emerging data suggest that, with the use of high-precision RT methods, the indication spectrum of RT can be expanded to cover different clinical situations, particularly unresectable local recurrences and oligometastatic disease.
The ecological role of bacterial seed endophytes associated with wild cabbage in the United Kingdom
(2019)
Endophytic bacteria are known for their ability to promote plant growth and defense against biotic and abiotic stress. However, very little is known about the microbial endophytes living in the spermosphere. Here, we isolated bacteria from the seeds of five different populations of wild cabbage (Brassica oleracea L.) that grow within 15 km of each other along the Dorset coast in the UK. The seeds of each plant population contained a unique microbiome. Sequencing of the 16S rRNA genes revealed that these bacteria belong to three different phyla (Actinobacteria, Firmicutes, and Proteobacteria). The isolated endophytic bacteria were grown in monocultures or mixtures, and the effects of bacterial volatile organic compounds (VOCs) on the growth and development of B. oleracea and on resistance against an insect herbivore were evaluated. Our results reveal that the VOCs emitted by the endophytic bacteria had a profound effect on plant development but only a minor effect on resistance against an herbivore of B. oleracea. Plants exposed to bacterial VOCs showed faster seed germination and seedling development. Furthermore, the seed endophytic bacteria exhibited volatile-mediated activity against the fungal plant pathogen Fusarium culmorum. Hence, our results illustrate the ecological importance of the bacterial seed microbiome for host plant health and development.
Background: Pancreatic surgery demands complex multidisciplinary management. Clinical pathways (CPs) are a tool to facilitate this task, but evidence for their utility in pancreatic surgery is scarce. This study evaluated the effect of CPs on quality of care for pancreatoduodenectomy.
Methods: Data of all consecutive patients who underwent pancreatoduodenectomy before (n = 147) or after (n = 148) CP introduction were evaluated regarding catheter and drain management, postoperative mobilization, pancreatic enzyme substitution, resumption of diet and length of stay. Outcome quality was assessed using glycaemia management, morbidity, mortality, reoperation and readmission rates.
Results: Catheters and abdominal drainages were removed significantly earlier in patients treated with a CP (p < 0.0001). First intake of liquids, nutritional supplements and solids occurred significantly earlier in the CP group (p < 0.0001). Exocrine insufficiency was significantly less common after CP implementation (47.3% vs. 69.7%, p < 0.0001). The number of patients receiving intraoperative transfusion dropped significantly after CP implementation (p = 0.0005), and the transfusion rate was higher in the pre-CP group (p = 0.05). The median number of days with a maximum pain level >3 was significantly higher in the CP group (p < 0.0001). There was no significant difference in mortality, morbidity, reoperation and readmission rates.
Conclusions: Following implementation of a CP for pancreatoduodenectomy, several indicators of process and outcome quality improved, while others such as mortality and reoperation rates remained unchanged. CPs are a promising tool to improve quality of care in pancreatic surgery.
Background: By performing case management, general practitioners and health care assistants can provide additional benefits to their chronically ill patients. However, the economic effects of such case management interventions often remain unclear although how to manage the burden of chronic disease is a key question for policy-makers. This analysis aimed to compare the cost-effectiveness of 24 months of primary care case management for patients with a long-term indication for oral anticoagulation therapy with usual care.
Methods: This analysis is part of the cluster-randomized controlled Primary Care Management for Optimized Antithrombotic Treatment (PICANT) trial. A sample of 680 patients with German statutory health insurance was initially considered for the cost analysis (92% of all participants at baseline). Costs included all disease-related direct health care costs from the payer’s perspective (German statutory health insurers) plus case management costs for the intervention group. A Quality-Adjusted Life Year (QALY) measurement (EQ-5D-3L instrument) was used to evaluate utility, and the incremental cost-effectiveness ratio (ICER) was used to assess cost-effectiveness. Mean differences were calculated and displayed with 95% confidence intervals (CI) from non-parametric bootstrapping (1000 replicates).
Results: N = 505 patients (505/680, 74%) were included in the cost analysis (complete case analysis with follow-up after 12 and 24 months as well as information on costs and QALYs). After two years, the mean differences between the two groups in direct health care costs per patient (€115, 95% CI [− 201; 406]) and QALYs (0.03, 95% CI [− 0.04; 0.11]) were small and not significant. The costs of case management in the intervention group caused mean total costs per patient in this group to rise significantly (mean difference €503, 95% CI [188; 794]). The ICER was €16,767 per QALY. Regardless of the willingness of insurers to pay per QALY, the probability of the intervention being cost-effective never rose above 70%.
Conclusions: Primary care case management for patients with a long-term indication for oral anticoagulation therapy improved QALYs compared with usual care, but was more costly. However, the results may help professionals and policy-makers allocate scarce health care resources in such a way that the overall quality of care is improved at moderate cost, particularly for chronically ill patients.
Trial registration: Current Controlled Trials ISRCTN41847489.
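The ICER reported in the PICANT cost analysis follows directly from the mean between-group differences. A minimal sketch of that arithmetic (the function name is ours; the input values are the reported mean differences in total costs and QALYs):

```python
def icer(delta_cost_eur, delta_qaly):
    """Incremental cost-effectiveness ratio: additional cost per QALY gained."""
    return delta_cost_eur / delta_qaly

# Reported mean differences: total costs +503 EUR, QALYs +0.03
print(round(icer(503, 0.03)))  # 16767, matching the reported ~16,767 EUR per QALY
```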
Background: Biliary rhabdomyosarcoma (RMS) is the most common biliary tumor in children. The management of affected patients poses unique challenges because of the rarity of this tumor entity and its critical location at the porta hepatis, which can make achieving a radical resection very difficult.
Methods: In a retrospective chart analysis, we reviewed children suffering from biliary RMS who were registered in three different CWS trials and registries (CWS-96, CWS-2002P, and the SoTiSaR registry).
Results: Seventeen patients (12 female, 5 male) with a median age of 4.3 years were assessed. The median follow-up was 42.2 months (10.7–202.5). The 5-year overall survival (OS) and event-free survival (EFS) rates were 58% (45–71) and 47% (34–50), respectively. Patients > 10 years of age and those with alveolar histology had the worst prognosis (OS 0%). Patients with botryoid histology had an excellent survival (OS 100%) compared to those with non-botryoid histology (OS 38%, 22–54, p = 0.047). Microscopic complete tumor resection was achieved in almost all patients who received initial tumor biopsy followed by chemotherapy and delayed surgery.
Conclusion: Positive predictive factors for survival of children with biliary RMS are age ≤ 10 years and botryoid tumor histology. Primary surgery with intention of tumor resection should be avoided.
The capacity of pathogenic microorganisms to adhere to host cells and avoid clearance by the host immune system is the initial and most decisive step leading to infections. Bacteria have developed different strategies to attach to diverse host surface structures. One important strategy is the adhesion to extracellular matrix (ECM) proteins (e.g., collagen, fibronectin, laminin) that are highly abundant in connective tissue and basement membranes. Gram-negative bacteria express variable outer membrane proteins (adhesins) to attach to the host and to initiate the process of infection. Understanding the underlying molecular mechanisms of bacterial adhesion is a prerequisite for targeting this interaction by “anti-ligands” to prevent colonization or infection of the host. Future development of such “anti-ligands” (specifically interfering with bacteria-host matrix interactions) might result in the development of a new class of anti-infective drugs for the therapy of infections caused by multidrug-resistant Gram-negative bacteria. This review summarizes our current knowledge about the manifold interactions of adhesins expressed by Gram-negative bacteria with ECM proteins and the use of this information for the generation of novel therapeutic antivirulence strategies.
Prescribing practice of pregabalin/gabapentin in pain therapy : an evaluation of German claim data
(2019)
Objectives: To analyse the prevalence and incidence of pregabalin and gabapentin (P/G) prescriptions, typical therapeutic uses of P/G with special attention to pain-related diagnoses and discontinuation rates.
Design: Secondary data analysis.
Setting: Primary and secondary care in Germany.
Participants: Four million patients in the years 2009–2015 (anonymous health insurance data).
Intervention: None.
Primary and secondary outcome measures: P/G prescribing rates, P/G prescribing rates associated with pain therapy, analysis of pain-related diagnoses leading to new P/G prescriptions and the discontinuation rate of P/G.
Results: In 2015, 1.6% of insured persons received P/G prescriptions. Among the patients with pain first treated with P/G, only 25.7% were diagnosed with a typical neuropathic pain disorder. The remaining 74.3% had either not received a diagnosis of neuropathic pain or showed a neuropathic component that was pathophysiologically conceivable but did not support the prescription of P/G. High discontinuation rates were observed (85%). Of the patients who discontinued the drug, 61.1% did not receive follow-up prescriptions within 2 years.
Conclusion: The results show that P/G is widely prescribed in cases of chronic pain irrespective of neuropathic pain diagnoses. The high discontinuation rate indicates a lack of therapeutic benefits and/or the occurrence of adverse effects.
The thrombopoietin receptor agonist eltrombopag was successfully used against human cytomegalovirus (HCMV)-associated thrombocytopenia refractory to immunomodulatory and antiviral drugs. This activity was ascribed to the effects of eltrombopag on megakaryocytes. Here, we tested whether eltrombopag may also exert direct antiviral effects. Therapeutic eltrombopag concentrations inhibited HCMV replication in human fibroblasts and adult mesenchymal stem cells infected with six different virus strains and drug-resistant clinical isolates. Eltrombopag also synergistically increased the anti-HCMV activity of the mainstay drug ganciclovir. Time-of-addition experiments suggested that eltrombopag interfered with HCMV replication after virus entry. Eltrombopag was effective in thrombopoietin receptor-negative cells, and the addition of Fe3+ prevented the anti-HCMV effects, indicating that it inhibits HCMV replication via iron chelation. This may be of particular interest for the treatment of cytopenias after hematopoietic stem cell transplantation, as HCMV reactivation is a major reason for transplantation failure. Since therapeutic eltrombopag concentrations are effective against drug-resistant viruses and synergistically increase the effects of ganciclovir, eltrombopag is also a drug-repurposing candidate for the treatment of therapy-refractory HCMV disease.
Background: The effects of blood flow restriction (training) may serve as a model of peripheral artery disease. In both conditions, circulating microRNAs (miRNAs) are suggested to play a crucial role during exercise-induced arteriogenesis. We aimed to determine whether the profile of circulating miRNAs is altered after acute resistance training under blood flow restriction (BFR) as compared with unrestricted low- and high-volume training, and we hypothesized that miRNAs relevant for arteriogenesis are affected after resistance training.
Methods: Eighteen healthy volunteers (aged 25 ± 2 years) were enrolled in this three-arm, randomized-balanced crossover study. The arms were single bouts of leg flexion/extension resistance training at (1) 70% of the individual single-repetition maximum (1RM), (2) at 30% of the 1RM, and (3) at 30% of the 1RM with BFR (artificially applied by a cuff at 300 mm Hg). Before the first exercise intervention, the individual 1RM (N) and the blood flow velocity (m/s) used to validate the BFR application were determined. During each training intervention, load-associated outcomes (fatigue, heart rate, and exhaustion) were monitored. Acute effects (circulating miRNAs, lactate) were determined using pre- and post-intervention measurements.
Results: All training interventions increased lactate concentration and heart rate (p < 0.001). The high-intensity intervention (HI) resulted in a higher lactate concentration than both lower-intensity training protocols, with BFR (LI-BFR) and without (LI) (LI, p = 0.003; 30% LI-BFR, p = 0.008). The level of miR-143-3p was down-regulated by LI-BFR, and miR-139-5p, miR-143-3p, miR-195-5p, miR-197-3p, miR-30a-5p, and miR-10b-5p were up-regulated after HI. The lactate concentration and miR-143-3p expression showed a significant positive linear correlation (p = 0.009, r = 0.52). A partial correlation controlling for the intervention showed a systematic impact of the type of training (LI-BFR vs. HI) on this association (r = 0.35 remained after partializing out the training type).
Conclusions: The strong effects of LI-BFR and HI on the lactate- and arteriogenesis-associated miR-143-3p in young, healthy athletes are consistent with an important role of this particular miRNA in metabolic processes during (here artificial) blood flow restriction. BFR may mimic the occlusion of a larger artery, which leads to increased collateral flow, and may therefore serve as an external stimulus of arteriogenesis.
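The partial correlation used in this study removes the influence of a third variable (here, the training type) from a pairwise correlation. A generic sketch of the first-order formula (the function name is ours, and the example coefficients are illustrative, not the study data):

```python
import math

def partial_correlation(r_xy, r_xz, r_yz):
    """First-order partial correlation of x and y, controlling for z."""
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Sanity check: if z is uncorrelated with both variables, nothing is partialled out.
print(partial_correlation(0.52, 0.0, 0.0))  # 0.52
```

When z is correlated with both x and y, the partial coefficient shrinks relative to the raw correlation, which is the pattern reported above (r = 0.52 raw vs. r = 0.35 partial).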
Background: Hemorrhagic shock can lead to intestinal damage with subsequent hyperinflammation and multiple organ dysfunction syndrome (MODS). The intestinal fatty acid-binding protein (I-FABP) is solely expressed in the intestine and is released extracellularly after tissue damage. This study evaluates the validity of I-FABP as an early biomarker to detect hemorrhagic shock and abdominal injury.
Patients and methods: Severely injured patients with an Injury Severity Score (ISS) ≥ 16 points and an age ≥ 18 years, admitted from January 2010 to December 2016, were included. Overall, 26 patients who presented with hemorrhagic shock to the emergency room (ER) were retrospectively identified: 8 patients without abdominal injury ("HS noAbd") and 18 patients with abdominal injury ("HS Abd"). Furthermore, 16 severely injured patients without hemorrhagic shock and without abdominal injury ("noHS noAbd") were retrospectively selected as controls. Plasma I-FABP levels were measured at admission to the ER and up to 3 days after trauma (d1-d3).
Results: Median I-FABP levels were significantly higher in the "HS Abd" group compared with the "HS noAbd" group (28,637.0 pg/ml [IQR = 6372.4-55,550.0] vs. 7292.3 pg/ml [IQR = 1282.5-11,159.5], p < 0.05). Furthermore, I-FABP levels of both hemorrhagic shock groups were significantly higher compared with the "noHS noAbd" group (844.4 pg/ml [IQR = 530.0-1432.9], p < 0.05). The time course of I-FABP levels showed a peak on the day of admission with a subsequent decline in the post-traumatic course. Furthermore, significant correlations between I-FABP levels and clinical parameters of hemorrhagic shock, such as hemoglobin, lactate value, systolic blood pressure (SBP), and shock index, were found. The optimal cut-off level of I-FABP for detection of hemorrhagic shock was 1761.9 pg/ml with a sensitivity of 85% and a specificity of 81%.
Conclusion: This study confirmed our previous observation that I-FABP might be used as a suitable early biomarker for the detection of abdominal injuries in general. In addition, I-FABP may also be a useful and promising parameter in the diagnosis of hemorrhagic shock, because it reflects low intestinal perfusion.
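As a simple illustration, the reported cut-off can be applied as a binary screening rule. A sketch under stated assumptions: the constant and function name are ours, and the example inputs are the group medians reported above.

```python
IFABP_CUTOFF_PG_ML = 1761.9  # reported optimal cut-off (sensitivity 85%, specificity 81%)

def suggests_hemorrhagic_shock(i_fabp_pg_ml):
    """Flag an admission I-FABP level above the reported cut-off."""
    return i_fabp_pg_ml > IFABP_CUTOFF_PG_ML

print(suggests_hemorrhagic_shock(28637.0))  # median of the "HS Abd" group -> True
print(suggests_hemorrhagic_shock(844.4))    # median of the "noHS noAbd" group -> False
```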
Cerebral radiation necrosis is a common complication of the radiotherapy of brain tumours that can cause significant mortality. Corticosteroids are the standard of care, but their efficacy is limited and the consequences of long-term steroid therapy are problematic, including the risk of adrenal insufficiency (AI). Off-label treatment with the vascular endothelial growth factor A antibody bevacizumab is highly effective in steroid-resistant radiation necrosis. Both the preservation of neural tissue integrity and the cessation of steroid therapy are key goals of bevacizumab treatment. However, the withdrawal of steroids may be impossible in patients who develop AI. In order to elucidate the frequency of AI in patients with cerebral radiation necrosis after treatment with corticosteroids and bevacizumab, we performed a retrospective study at our institution’s brain tumour centre. We obtained data on the tumour histology, age, duration and maximum dose of dexamethasone, radiologic response to bevacizumab, serum cortisol, and the need for hydrocortisone substitution for AI. We identified 17 patients with cerebral radiation necrosis who had received treatment with bevacizumab and had at least one available cortisol analysis. Fifteen patients (88%) had a radiologic response to bevacizumab. Five of the 17 patients (29%) fulfilled criteria for AI and required hormone substitution. Age, duration of dexamethasone treatment, and time since radiation were not statistically associated with the development of AI. In summary, despite the highly effective treatment of cerebral radiation necrosis with bevacizumab, steroids could not be discontinued in roughly one-third of patients due to the development of AI. Vigilance to spot the clinical and laboratory signs of AI and appropriate testing and management are, therefore, mandated.
We investigate system-size effects on the rotational diffusion of membrane proteins and other membrane-embedded molecules in molecular dynamics simulations. We find that the rotational diffusion coefficient slows down relative to the infinite-system value by a factor of one minus the ratio of protein and box areas. This correction factor follows from the hydrodynamics of rotational flows under periodic boundary conditions and is rationalized in terms of Taylor-Couette flow. For membrane proteins like transporters, channels, or receptors in typical simulation setups, the protein-covered area tends to be relatively large, requiring a significant finite-size correction. Molecular dynamics simulations of the protein adenine nucleotide translocase (ANT1) and of a carbon nanotube porin in lipid membranes show that the hydrodynamic finite-size correction for rotational diffusion is accurate in standard-use cases. The dependence of the rotational diffusion on box size can be used to determine the membrane viscosity.
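The stated correction factor can be written as D_PBC = D_inf * (1 - A_protein / A_box), so the infinite-system value is recovered by dividing the simulated coefficient by that factor. A minimal sketch (the function and variable names are ours, not from the paper):

```python
def correct_rotational_diffusion(d_pbc, protein_area, box_area):
    """Recover the infinite-system rotational diffusion coefficient from a
    periodic-boundary simulation value, using D_pbc = D_inf * (1 - A_p / A_box)."""
    return d_pbc / (1.0 - protein_area / box_area)

# If the protein covers 25% of the box area, the simulated value is 25% too slow:
print(correct_rotational_diffusion(0.75, 25.0, 100.0))  # 1.0
```

The example illustrates why the correction matters for membrane proteins: the protein-covered fraction of a typical simulation box is not negligible, so the uncorrected coefficient systematically underestimates the true value.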