Background: Previous studies have demonstrated that CF (cystic fibrosis) prognosis depends on three major parameters: FEV1 (forced expiratory volume in one second), BMI (body mass index) and the need for intravenous antibiotic therapy. The CF centers of Frankfurt, Germany, and Moscow, Russia, care for cystic fibrosis patients. We investigated and compared both centers for the period from 1990 to 2015. No comparable study has been published so far.
Method: German patient data were collected from the national cystic fibrosis database “Muko.web”. Missing values were extracted from the hospital information system. Russian patient data were taken directly from the medical records in Moscow. The values were compared in a descriptive statistical analysis using BiAS and RStudio.
Result: A total of 428 patients from Moscow (217 male, 211 female; 348 (81.3%) P. aeruginosa positive) and 159 patients from Frankfurt (92 male, 67 female; 137 (86.2%) P. aeruginosa positive) were compared with regard to P. aeruginosa positivity, BMI, FEV1 and the need for intravenous antibiotic therapy. CF patients in Moscow, stratified by age groups, had a lower BMI than CF patients in Frankfurt (age 16–18: p=0.003; age 19–22: p=0.004; age 23–29: p<0.001; age 30–35: p<0.001; age 36–66: p=0.024). In a matched-pairs analysis including 100 patients from Frankfurt and 100 patients from Moscow for the year 2015, FEV1 was significantly lower in Moscow patients (p<0.001).
Conclusion: BMI, FEV1 and the need for intravenous therapy have a significant impact on survival and on quality of life of CF patients. A lower BMI and a lower FEV1 result in worse survival and determine the prognosis. This study showed a significant difference in prognostic parameters between Frankfurt and Moscow in the cross-sectional analysis for the year 2015. A further study should evaluate whether this difference persists over a longer period of time.
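Group comparisons of the kind reported above (BMI of two centers within one age stratum) are commonly done with a rank-based test. The sketch below uses an exact two-sided Mann–Whitney U test implemented from scratch; the BMI values and sample sizes are invented for illustration and are not the study data, and the abstract does not state which test the authors used.

```python
from itertools import combinations

def mann_whitney_u(a, b):
    """U statistic: count of pairs (x in a, y in b) with x < y; ties count 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            if x < y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

def exact_two_sided_p(a, b):
    """Exact two-sided p-value by enumerating all relabelings (feasible for small n)."""
    pooled = list(a) + list(b)
    n_a, n = len(a), len(a) + len(b)
    mu = n_a * len(b) / 2.0  # centre of the U distribution under the null
    u_obs = mann_whitney_u(a, b)
    hits, total = 0, 0
    for idx in combinations(range(n), n_a):
        sel = set(idx)
        ga = [pooled[i] for i in range(n) if i in sel]
        gb = [pooled[i] for i in range(n) if i not in sel]
        if abs(mann_whitney_u(ga, gb) - mu) >= abs(u_obs - mu) - 1e-12:
            hits += 1
        total += 1
    return hits / total

# Hypothetical BMI values for one age stratum -- not the study data.
moscow_bmi = [17.1, 18.0, 16.5, 17.8, 18.4, 16.9]
frankfurt_bmi = [19.2, 20.1, 18.8, 21.0, 19.5, 20.4]
p = exact_two_sided_p(moscow_bmi, frankfurt_bmi)
```

With complete separation of the two invented samples, the exact two-sided p-value is 2/924, well below 0.05.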
MicroRNA miR-181 - a rheostat for TCR signaling in thymic selection and peripheral T-Cell function
(2020)
The selection of T cells during intra-thymic development is crucial to obtain a functional and simultaneously not self-reactive peripheral T cell repertoire. However, selection is a complex process dependent on T cell receptor (TCR) thresholds that remain incompletely understood. In peripheral T cells, activation, clonal expansion, and contraction of the active T cell pool, as well as other processes, depend on TCR signal strength. Members of the microRNA (miRNA) miR-181 family have been shown to be dynamically regulated during T cell development as well as dependent on the activation stage of T cells. Indeed, it has been shown that expression of miR-181a leads to the downregulation of multiple phosphatases, implicating miR-181a as a “rheostat” of TCR signaling. Consistently, genetic models have revealed an essential role of miR-181a/b-1 for the generation of unconventional T cells as well as a function in tuning TCR sensitivity in peripheral T cells during aging. Here, we review these broad roles of miR-181 family members in T cell function via modulating TCR signal strength.
MicroRNAs (miRNAs) have emerged as critical posttranscriptional regulators of the immune system, including the development and function of regulatory T (Treg) cells. Although this critical role has been firmly demonstrated through genetic models, key mechanisms of miRNA function in vivo remain elusive. Here, we review the role of miRNAs in Treg cell development and function. In particular, we focus on the question of what the study of miRNAs in this context reveals about miRNA biology in general, including context-dependent function and the role of individual targets vs. complex co-targeting networks. In addition, we highlight potential technical pitfalls and state-of-the-art approaches to improve the mechanistic understanding of miRNA biology in a physiological context.
Background: Radiotherapy dose and target volume prescriptions for anal squamous cell carcinoma (ASCC) vary considerably in daily practice and across guidelines, including those from the NCCN, the UK, Australasia, and ESMO. We conducted a pattern-of-care survey to assess patient management in German-speaking countries.
Methods: We developed an anonymous questionnaire comprising 18 questions on diagnosis and treatment of ASCC. The survey was sent to 361 DEGRO-associated institutions, including 41 university hospitals, 118 non-university institutions, and 202 private practices.
Results: We received a total of 101 (28%) surveys, including 20 (19.8%) from university clinics, 36 (35.6%) from non-university clinics, and 45 (44.6%) from private practices. A total of 28 (27.8%) institutions reported treating more than 5 patients with early-stage ASCC per year, and 42 (41.6%) institutions treat more than 5 patients with locoregionally advanced ASCC per year. Biopsy of suspicious inguinal nodes was advocated in only 12 (11.8%) centers. Screening for human immunodeficiency virus (HIV) is done in 28 (27.7%) institutions. Intensity-modulated radiotherapy or similar techniques are used in 97%. The elective lymph node dose ranged from 30.6 Gy to 52.8 Gy, whereas 87% prescribed 50.4–55.8 Gy (range: 30.6 to 59.4 Gy) to the involved lymph nodes. The dose to gross disease of cT1 or cT2 ASCC ranged from 50 to ≥60 Gy. For cT3 or cT4 tumors the target dose ranged from 54 Gy to more than 60 Gy, with 76 (75.2%) institutions prescribing 59.4 Gy. The preferred concurrent chemotherapy regimen was 5-FU/Mitomycin C, whereas 6 (6%) prescribed Capecitabine/Mitomycin C. HIV-positive patients are treated with full-dose CRT in 87 (86.1%) institutions. First assessment for clinical response is reported to be performed at 4–6 weeks after completion of CRT in 2 (2%) institutions and at 6–8 weeks in 20 (19.8%), while 79 (78%) institutions wait up to 5 months.
Conclusions: We observed marked differences in radiotherapy doses and treatment techniques for patients with ASCC, as well as variable approaches for patients with HIV. These data underline the need for a consensus treatment guideline for ASCC.
Purpose: Perfusion-weighted MRI (PWI) and O-(2-[18F]fluoroethyl-)-l-tyrosine ([18F]FET) PET are both applied to discriminate tumor progression (TP) from treatment-related changes (TRC) in patients with suspected recurrent glioma. While the combination of both methods has been reported to improve the diagnostic accuracy, the performance of a sequential implementation has not been further investigated. Therefore, we retrospectively analyzed the diagnostic value of consecutive PWI and [18F]FET PET.
Methods: We evaluated 104 patients with WHO grade II–IV glioma and suspected TP on conventional MRI using PWI and dynamic [18F]FET PET. Leakage corrected maximum relative cerebral blood volumes (rCBVmax) were obtained from dynamic susceptibility contrast PWI. Furthermore, we calculated static (i.e., maximum tumor to brain ratios; TBRmax) and dynamic [18F]FET PET parameters (i.e., Slope). Definitive diagnoses were based on histopathology (n = 42) or clinico-radiological follow-up (n = 62). The diagnostic performance of PWI and [18F]FET PET parameters to differentiate TP from TRC was evaluated by analyzing receiver operating characteristic and area under the curve (AUC).
Results: Across all patients, the differentiation of TP from TRC using rCBVmax or [18F]FET PET parameters was moderate (AUC = 0.69–0.75; p < 0.01). A rCBVmax cutoff > 2.85 had a positive predictive value for TP of 100%, enabling a correct TP diagnosis in 44 patients. In the remaining 60 patients, combined static and dynamic [18F]FET PET parameters (TBRmax, Slope) correctly discriminated TP and TRC in 78% of patients, increasing the overall accuracy to 87%. A subgroup analysis of isocitrate dehydrogenase (IDH)-mutant tumors indicated a superior performance of PWI over [18F]FET PET (AUC = 0.8 vs. < 0.62; p < 0.01 vs. ≥ 0.3).
Conclusion: While marked hyperperfusion on PWI indicated TP, [18F]FET PET proved beneficial to discriminate TP from TRC when PWI remained inconclusive. Thus, our results highlight the clinical value of sequential use of PWI and [18F]FET PET, allowing an economical use of diagnostic methods. The impact of an IDH mutation needs further investigation.
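The sequential workflow above rests on two quantities: the AUC of a score and its positive predictive value (PPV) at a cutoff. The pure-Python sketch below computes both; the rCBVmax-like scores are invented for illustration, and only the 2.85 cutoff is taken from the abstract.

```python
def auc_rank(scores_pos, scores_neg):
    """AUC as the probability that a positive case scores above a negative one."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

def ppv_at_cutoff(scores_pos, scores_neg, cutoff):
    """Positive predictive value when 'score > cutoff' is called positive."""
    tp = sum(1 for s in scores_pos if s > cutoff)
    fp = sum(1 for s in scores_neg if s > cutoff)
    return tp / (tp + fp) if (tp + fp) else float("nan")

# Invented rCBVmax-like values: TP = tumor progression, TRC = treatment-related changes.
tp_scores = [3.4, 2.1, 4.0, 2.6, 3.1]
trc_scores = [1.2, 2.3, 1.8, 2.7, 1.5]
auc = auc_rank(tp_scores, trc_scores)
ppv = ppv_at_cutoff(tp_scores, trc_scores, 2.85)
```

In this toy data, every case above the 2.85 cutoff is a true progression, so the PPV at that threshold is 1.0 even though the overall AUC is below 1; this is the same logic by which a high-specificity cutoff can triage a subset of patients while the rest need a second test.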
Bipolar disorder (BD) and major depressive disorder (MDD) are severe mood disorders that belong to the most debilitating diseases worldwide. Differentiating both mood disorders often poses a major clinical challenge, leading to frequent misdiagnoses. Objective biomarkers able to differentiate individuals with BD and MDD therefore represent a psychiatric research field of utmost importance. Recent studies have applied resting-state fMRI paradigms and found promising results differentiating both disorders based on the acquired data. However, most of these studies have focused their efforts on acutely depressed patients. Thus, it remains unclear whether the aberrations remain in a symptomless disease state.
The present study addresses these issues by evaluating the ability to differentiate both disorders from one another through a between-group comparison of functional brain network connectivity (FNC) obtained from resting-state fMRI data. Data were collected from 20 BD patients, 15 MDD patients, and 30 age- and gender-matched healthy controls (HC). Graph theoretical analyses were applied to detect differences in functional network organization between the groups at the global and regional network level.
Network analysis detected frontal, temporal and subcortical nodes in emotion regulation areas such as the limbic system and associated regions exhibiting significant differences in network integration and segregation in BD compared to MDD patients and HC. Participants with MDD and HC only differed in frontal and insular network centrality.
These results indicate that a significantly altered brain network topology in the limbic system might be a trait marker specific to BD. Brain network analysis in these regions may therefore be used to differentiate euthymic BD not only from HC but also from patients with MDD.
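As a sketch of the graph-theoretical quantities such analyses rely on, the snippet below computes degree centrality (a regional centrality measure) and global efficiency (a common measure of network integration) for a hypothetical 5-node binary network. It is illustrative only and unrelated to the study's connectivity matrices.

```python
from collections import deque

def degree_centrality(adj):
    """Fraction of possible neighbours each node is connected to."""
    n = len(adj)
    return [sum(row) / (n - 1) for row in adj]

def global_efficiency(adj):
    """Mean inverse shortest-path length over all ordered node pairs (BFS)."""
    n = len(adj)
    total = 0.0
    for s in range(n):
        dist = [None] * n
        dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if adj[u][v] and dist[v] is None:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t in range(n):
            if t != s and dist[t]:
                total += 1.0 / dist[t]
    return total / (n * (n - 1))

# Hypothetical binary adjacency matrix (symmetric, no self-loops).
adj = [
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
]
dc = degree_centrality(adj)
ge = global_efficiency(adj)
```

Node 2 bridges the two halves of this toy graph and accordingly has the highest degree centrality; in FNC studies, group differences in exactly such node-level and network-level metrics are what is compared statistically.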
Current research on medical biomaterials has shown that the physical and chemical characteristics of biomaterials determine the body's inflammatory cellular reaction after their implantation. The aim of this study was to evaluate the individual effects of the physical characteristics on the initial biomaterial-cell interaction and the inflammatory cellular reaction. For this purpose, an equine-derived collagen hemostatic sponge (E-CHS) was modified by pressing and evaluated using ex vivo, in vitro and in vivo methods.
The E-CHS was pressed by applying constant pressure (6.47 ± 0.85 N) for 2 min using a sterile stainless-steel cylinder and cut into segments of 1 cm². Subsequently, E-CHS and the pressed equine-derived collagen hemostatic sponge (P-E-CHS) were studied as two independent biomaterials and compared to a control group (CG).
Platelet-rich fibrin (PRF), a blood concentrate containing inflammatory cells, was used to mimic the initial biomaterial-cell interaction and to measure the biomaterials' absorption coefficient for liquid PRF (iPAC). Additionally, the biomaterials were cultivated together with PRF for 3 and 6 days to measure the induction of pro-inflammatory cytokines (TNF-α and IL-8). The results were obtained through enzyme-linked immunosorbent assay (ELISA) and histological methods. PRF cultivated without biomaterials served as the CG. Additionally, the biomaterials were evaluated in vivo using a subcutaneous model in Wistar rats and compared to sham-operated animals (CG) representing physiologic wound healing. After 3, 15 and 30 days, the explanted samples were evaluated using histochemical and immunohistochemical (IHC) staining with the following markers: CD68 (pan-macrophages), CCR7 (pro-inflammatory macrophages, M1), CD206 (pro-wound-healing macrophages, M2) and α-smooth muscle actin (α-SMA; vessel identification).
After mixing liquid PRF with both biomaterials for 15 minutes, the ex vivo results showed that E-CHS was penetrated by cells, whereas P-E-CHS was cell-occlusive. Additionally, P-E-CHS induced a higher release of pro-inflammatory cytokines compared to liquid PRF alone (CG) and E-CHS after 3 days (P < 0.05). Despite the pressing, the iPAC values of the two biomaterials did not differ statistically. In vivo, the CG induced a higher inflammatory response at day 3 compared to the experimental groups (EGs) (P < 0.05). The intergroup comparison showed that P-E-CHS induced a higher presence of macrophages (CD68+/CCR7+) compared to E-CHS at day 3 (P < 0.05). Only CD68+/CCR7+ mononuclear cells (MNCs) were observed, without multinucleated giant cells (MNGCs). After 15 days, the presence of macrophages (CD68+: P < 0.01; CCR7+: P < 0.001; CD206+: P < 0.05) was considerably reduced in the CG. On the contrary, the inflammatory response increased in the EGs (CD68+/CCR7+). The intergroup comparison showed that this increase was statistically significant when comparing E-CHS and P-E-CHS to the CG at day 15 (P < 0.01 and P < 0.05, respectively). At this time point, a reduced number of MNGCs was observed in the EGs; in the CG, no MNGCs were observed. Furthermore, E-CHS showed a faster degradation rate and was fully invaded by cells, with vessels formed in its interior region. On the other hand, P-E-CHS remained occlusive to cell penetration, and vessels were formed only in the periphery. After 30 days, the cellular reaction shifted to a higher number of M2 macrophages (CD206+) in all groups and a reduced presence of CD68+ and CCR7+ MNCs. Both biomaterials degraded, and only small fragments were found in the implantation bed, surrounded by MNGCs (CCR7+).
These results are of high clinical relevance and show that changes in biomaterial properties have a significant impact on their interaction with the body. They also provide insight into the possibility of developing versatile biomaterials for different applications. For example, E-CHS can be applied to support hemostasis in a bleeding alveolar socket, whereas P-E-CHS, being cell-occlusive and having a delayed degradation rate, can be applied for guided bone and tissue regeneration.
Objective: To evaluate the incidence and risk factors of generalized convulsive seizure (GCS)-related fractures and injuries during video-EEG monitoring.
Methods: We analyzed all GCSs in patients undergoing video-EEG-monitoring between 2007 and 2019 at epilepsy centers in Frankfurt and Marburg in relation to injuries, falls and accidents associated with GCSs. Data were gathered using video material, EEG material, and a standardized reporting form.
Results: A total of 626 GCSs from 411 patients (mean age: 33.6 years; range 3–74 years; 45.0% female) were analyzed. Severe adverse events (SAEs) such as fractures, joint luxation, corneal erosion, and teeth loosening were observed in 13 patients resulting in a risk of 2.1% per GCS (95% CI 1.2–3.4%) and 3.2% per patient (95% CI 1.8–5.2%). Except for a nasal fracture due to a fall onto the face, no SAEs were caused by falls, and all occurred in patients lying in bed without evidence of external trauma. In seven patients, vertebral body compression fractures were confirmed by imaging. This resulted in a risk of 1.1% per GCS (95% CI 0.5–2.2%) and 1.7% per patient (95% CI 0.8–3.3%). These fractures occurred within the tonic phase of a GCS and were accompanied by a characteristic cracking noise. All affected patients reported back pain spontaneously, and an increase in pain on percussion of the affected spine section.
Conclusions: GCSs are associated with a substantial risk of fractures and shoulder dislocations that are not associated with falls. GCSs accompanied by audible cracking, and resulting in back pain, should prompt clinical and imaging evaluations.
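The per-seizure risk quoted above (13 SAEs in 626 GCSs, 2.1%, 95% CI 1.2–3.4%) is a binomial proportion with a confidence interval. The abstract does not state which interval method the authors used, so the Wilson score interval below is shown only as one common choice; its bounds land close to, but not necessarily exactly on, the published values.

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# 13 severe adverse events in 626 generalized convulsive seizures.
lo, hi = wilson_ci(13, 626)
```

For 13/626 this yields roughly 1.2% to 3.5%, in line with the interval reported in the abstract.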
miR-142-3p expression is predictive for severe traumatic brain injury (TBI) in trauma patients
(2020)
Background: Predictive biomarkers in biofluids are the most commonly used diagnostic method, but established markers in trauma diagnostics lack accuracy. This study investigates promising microRNAs (miRNA) released from affected tissue after severe trauma that have predictive values for the effects of the injury.
Methods: A retrospective analysis of prospectively collected data and blood samples of n = 33 trauma patients (ISS ≥ 16) is provided. Levels of miR-9-5p, -124-3p, -142-3p, -219a-5p, -338-3p and -423-3p in severely injured patients (PT) without traumatic brain injury (TBI) or with severe TBI (PT + TBI) and patients with isolated TBI (isTBI) were measured within 6 h after trauma.
Results: The highest miR-423-3p expression was detected in patients with severe isTBI, followed by patients with PT + TBI, and lowest levels were found in PT patients without TBI (2^−ΔΔCt, p = 0.009). A positive correlation between miR-423-3p level and increasing AIS head (p = 0.001) and risk of mortality (RISC II, p = 0.062) in trauma patients (n = 33) was found. ROC analysis of miR-423-3p levels revealed them as statistically significant to predict the severity of brain injury in trauma patients (p = 0.006). miR-124-3p was only found in patients with severe TBI, miR-338-3p was shown in all trauma groups. miR-9-5p, miR-142-3p and miR-219a-5p could not be detected in any of the four groups. Conclusion: miR-423-3p expression is significantly elevated after isolated traumatic brain injury and predictive for severe TBI in the first hours after trauma. miR-423-3p could represent a promising new biomarker to identify severe isolated TBI.
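The relative-expression notation in the results refers to the 2^−ΔΔCt (Livak) method for quantifying miRNA levels from qPCR cycle-threshold (Ct) values. A minimal helper follows; the Ct values in the example are purely illustrative, not study data.

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt (Livak) method:
    ddCt = (Ct_target - Ct_ref)_sample - (Ct_target - Ct_ref)_control."""
    dd_ct = (ct_target_sample - ct_ref_sample) - (ct_target_control - ct_ref_control)
    return 2.0 ** (-dd_ct)

# Illustrative Ct values only: ddCt = (24 - 18) - (27 - 18) = -3, i.e. 8-fold up.
fc = fold_change(24.0, 18.0, 27.0, 18.0)
```

Lower Ct means earlier detection and hence more template, which is why a negative ΔΔCt corresponds to upregulation.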
Objective: Phenotypic (Sensititre Myco, pDST) and genotypic drug susceptibility testing (GenoType NTM DR, gDST) in M. avium complex (MAC) have become available as standardized assays, but comparable data is needed. This study aimed to investigate the phenotypic and genotypic drug susceptibility patterns in MAC clinical isolates.
Methods: Overall, 98 isolates from 85 patients were included. pDST and gDST were performed on all isolates and results compared regarding specificity and sensitivity using pDST as a reference method. The impact of drug instability on pDST results was studied using a biological assay over 14 days. In addition, the evolution of antimicrobial resistance was investigated in sequential isolates of 13 patients.
Results: Macrolide resistance was rare, found in 1.2% (95% CI 0.7–7.3) of isolates in the base cohort. No aminoglycoside resistance was found, but 14.1% of the studied isolates (95% CI 7.8–23.8) showed intermediate susceptibility. The GenoType NTM DR identified two out of four macrolide-resistant isolates. Antibiotic stability was demonstrated to be poor for rifampicin, rifabutin, and doxycycline.
Conclusions: pDST results in NTM for unstable antibiotics must be interpreted with care. A combination of pDST and gDST will be useful for the guidance of antimicrobial therapy in MAC disease.
Although the therapeutic armamentarium for bladder cancer has considerably widened in the last few years, severe side effects and the development of resistance hamper long-term treatment success. Thus, patients turn to natural plant products as alternative or complementary therapeutic options. One of these is curcumin, the principal component of Curcuma longa that has shown chemopreventive effects in experimental cancer models. Clinical and preclinical studies point to its role as a chemosensitizer, and it has been shown to protect organs from toxicity induced by chemotherapy. These properties indicate that curcumin could hold promise as a candidate for additive cancer treatment. This review evaluates the relevance of curcumin as an integral part of therapy for bladder cancer.
Aims: This post hoc analysis of ELIMINATE-AF evaluated requirements of unfractionated heparin (UFH) and procedure-related bleeding in atrial fibrillation (AF) patients undergoing ablation with uninterrupted edoxaban or vitamin K antagonist (VKA) therapy.
Methods and results: Patients were randomized 2:1 to once-daily edoxaban 60 mg (or dose-reduced 30 mg) or dose-adjusted VKA (target international normalized ratio: 2.0–3.0). Uninterrupted anticoagulation was mandated for 21–28 days pre-ablation and 90 days post-ablation. During ablation, UFH administration targeted an activated clotting time (ACT) of 300–400 s. Periprocedural bleeding was differentiated between procedure-related (bleeding at the puncture site, cardiac tamponade) and unrelated events. Of 614 randomized patients, 553 received the study drug and underwent catheter ablation (edoxaban n = 375; VKA n = 178). The median (Q1–Q3) time from last dose to ablation procedure was 14.8 (13.3–16.5) vs. 16.5 (14.8–19.5) h (edoxaban vs. VKA group, respectively). A mean ACT ≥ 300 s was observed in 52% of edoxaban- vs. 76% of VKA-treated patients, despite a higher mean (SD) UFH dose in the edoxaban vs. VKA group [14,261 (6397) IU vs. 11,473 (4300) IU; exploratory P-value < 0.0001]. In the edoxaban group, 13 patients (3.5%) had procedure-related bleeds, of whom 9 had received an UFH dose above the median (13,000 IU). In the VKA arm, 7 patients (3.9%) had procedure-related bleeds, of whom 3 had received an UFH dose above the median (10,225 IU).
Conclusion: The rate of procedure-related major/clinically relevant non-major bleeding did not differ between the treatment arms despite higher doses of UFH used with edoxaban vs. VKA to achieve a target ACT during AF ablation.
Circulating monocytes contribute to inflammatory processes. Here we validate abnormal expression of inflammation-related genes in monocytes of a large and well-characterised group of MDD patients, and relate the outcomes to pertinent clinical characteristics. Thirty-two genes of a previously established inflammation-related gene signature were assessed in 197 patients with MDD and 151 controls collected during the EU-MOODINFLAME project. Monocyte gene-expression data were related to age, sex, BMI, depression severity, childhood adversity (CA) and suicide risk (SR). Three distinct gene profiles were identified within the MDD group (downregulated, mixed upregulated and strongly upregulated genes). Patients in the merged upregulated groups had a significantly higher prevalence of CA and high SR. Using hierarchical clustering of the genes, we found a cluster of mainly cytokine (production)-related genes; patients with SR had a significantly higher expression of this cluster than patients without SR (particularly for IL-6, IL1A and IL1B). No such difference emerged for patients with and without CA. A downregulated gene profile was found for patients not exposed to CA and without SR (particularly for the glucocorticoid-signalling genes NR3C1a and HSPA1A/B). No inflammatory changes were observed for healthy controls exposed to CA. Our data show that inflammatory activation in MDD is not uniform and that immunologically discernible phenotypes of depression can be linked to CA and high SR. The absence of monocyte inflammatory activation in healthy controls exposed to CA suggests an inflammatory involvement in MDD-prone individuals exposed to early stressors, but not in healthy controls.
Fasting during Ramadan is known to influence patients' medication adherence. Data on patients' behavior regarding oral anticoagulant (OAC) drug intake during Ramadan are missing. We aimed to determine patient-guided modifications of the OAC medication regimen during Ramadan and to evaluate their consequences. This was a multicenter cross-sectional study conducted in Saudi Arabia. Data were collected shortly after Ramadan 2019. Participants were patients who fasted during Ramadan and who were on long-term anticoagulation. Patient-guided medication changes during Ramadan, in comparison to the regular intake schedule before Ramadan, were recorded. Modification behavior was compared between twice-daily (BID) and once-daily (QD) treatment regimens. Rates of hospital admission during Ramadan were determined. We included 808 patients. During Ramadan, 53.1% modified their intake schedule (31.1% adjusted intake time, 13.2% skipped intakes, 2.2% took double doses). A higher frequency of patient-guided modification was observed in patients on a BID regimen compared to a QD regimen. During Ramadan, 11.3% of patients were admitted to hospital. Patient-guided modification was a strong predictor of hospital admission. Patient-guided modification of OAC intake during Ramadan is common, particularly in patients on a BID regimen, and it increases the risk of hospital admission during Ramadan. Planning of OAC intake during Ramadan and patient education on the risks of low adherence are advisable.
Minimal residual disease (MRD) is the strongest predictor of relapse in B-cell precursor acute lymphoblastic leukemia (BCP-ALL). In the BLAST study (NCT01207388), adults with BCP-ALL in remission with MRD after chemotherapy received blinatumomab, a CD19 BiTE® immuno-oncotherapy, at 15 µg/m²/day for up to four 6-week cycles (4 weeks continuous infusion, 2 weeks off). Survival was evaluated for 110 patients, including 74 who received HSCT in continuous complete remission. With a median follow-up of 59.8 months, median survival was 36.5 months (95% CI: 22.0–not reached [NR]). Median survival was NR (29.5–NR) for complete MRD responders (n = 84) and 14.4 months (3.8–32.3) for MRD non-responders (n = 23; p = 0.002); after blinatumomab and HSCT, median survival was NR (25.7–NR) (n = 61) and 16.5 months (1.1–NR) (n = 10; p = 0.065), respectively. This final analysis suggests that complete MRD response during blinatumomab treatment is curative. Post hoc analysis of study data suggests that, while post-blinatumomab HSCT may be beneficial in appropriate patients, long-term survival without HSCT is also possible.
Background and Objectives: Patient blood (more accurately: haemoglobin, Hb) management (PBM) aims to optimize endogenous Hb production and to minimize iatrogenic Hb loss while maintaining patient safety and optimal effectiveness of medical interventions. PBM was adopted as policy for patients by the World Health Organization (WHO) and, all the more, should be applied to healthy donors. Materials and Methods: Observational data from 489 bone marrow (BM) donors were retrospectively analysed, and principles of patient blood management were applied to healthy volunteer BM donations. Results and Conclusion: We managed to render BM aspiration safe for donors, notably completely avoiding the collection of autologous blood units and blood transfusions, through iron management, establishment of a high-yield aspiration technique, limitation of the collection volume to 1.5% of donor body weight, and development of volume-prediction algorithms for the requested cell dose.
Purpose: Despite the high number of patients with phalangeal fractures, evidence-based recommendations for the treatment of specific phalangeal fractures could not be derived from the literature. The purpose of the present study was to assess current epidemiological data, classification of the fracture type, and mode of treatment.
Methods: This study presents a retrospective review of 261 patients aged ≥ 18 years with 283 phalangeal fractures who were treated in our level I trauma centre between 2017 and 2018. The data were obtained by analysis of the institution's database and radiological examinations.
Results: The average age of the patients was 40.4 years (range 18–98). The ratio of male to female patients was 2.7:1. The two most common injury mechanisms were crush injuries (33%) and falls (23%). Most phalangeal fractures occurred in the distal phalanx (P3, 43%). The 4th ray (D4, 29%) was most frequently affected. P3 tuft fractures and middle phalanx (P2) base fractures each accounted for 25% of fracture types. A total of 74% of fractures were treated conservatively, and 26% required surgery, with Kirschner wire(s) (37%) as the preferred surgical treatment. The decision for surgical treatment correlated with the degree of angular and/or rotational deformity, intraarticular step, and sub-/luxation of specific phalangeal fractures, but not with age or gender.
Conclusions: Our findings demonstrated the popularity of conservative treatment of phalangeal fractures, while surgery was only required in properly selected cases. The correct definition of precise fracture pattern in addition to topography is essential to facilitate treatment decision-making.
Objective: Deep brain stimulation (DBS) of the ventral intermediate nucleus (VIM) is a mainstay treatment for severe and drug-refractory essential tremor (ET). Although stimulation-induced dysarthria has been extensively described, possible impairment of swallowing has not yet been systematically investigated. Methods: Twelve patients with ET and bilateral VIM-DBS who self-reported dysphagia after VIM-DBS were included. Swallowing function was assessed clinically and by flexible endoscopic evaluation of swallowing in the stim-ON and stim-OFF conditions. Presence, severity, and improvement of dysphagia were recorded. Results: During stim-ON, dysphagia was objectified in all patients, with 42% showing mild, 42% moderate, and 16% severe dysphagia. During stim-OFF, all patients experienced a statistically significant improvement of swallowing function. Interpretation: VIM-DBS may have an impact on swallowing physiology in ET patients. Further studies to elucidate the prevalence and underlying pathophysiological mechanisms are warranted.
Ischemic lesion location based on the ASPECT score for risk assessment of neurogenic dysphagia
(2020)
Dysphagia is common in patients with middle cerebral artery (MCA) infarctions and is associated with malnutrition, pneumonia, and mortality. Besides bedside screening tools, brain imaging findings may help to timely identify patients with swallowing disorders. We investigated whether the Alberta stroke program early CT score (ASPECTS) allows distinct ischemic lesion patterns to be correlated with dysphagia. We prospectively examined 113 consecutive patients with acute MCA infarctions. Fiberoptic endoscopic evaluation of swallowing (FEES) was performed within 24 h after admission for validation of dysphagia. Brain imaging (CT or MRI) was rated for ischemic changes according to the ASPECT score. 62 patients (54.9%) had FEES-proven dysphagia. In left hemispheric strokes, the strongest associations between the ASPECTS sectors and dysphagia were found for the lentiform nucleus (odds ratio 0.113 [CI 0.028–0.433]; p = 0.001), the insula (0.275 [CI 0.102–0.742]; p = 0.011), and the frontal operculum (0.280 [CI 0.094–0.834]; p = 0.022). Involvement of two or even all three of these sectors together increased the relative dysphagia frequency up to 100%. For right hemispheric strokes, only non-significant associations were found, which were strongest for the insula region. The distribution of early ischemic changes in the MCA territory according to ASPECTS may be used as a risk indicator of neurogenic dysphagia in MCA infarction, particularly when the left hemisphere is affected. However, due to the exploratory nature of this research, external validation studies of these findings are warranted.
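The odds ratios with confidence intervals reported above summarize 2×2 associations between lesion-sector involvement and dysphagia. A minimal sketch of a Wald-type odds-ratio confidence interval follows; the cell counts are hypothetical and do not reproduce the study's values.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases, c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts only (not the study data).
or_, lo, hi = odds_ratio_ci(10, 20, 5, 40)
```

An interval that excludes 1 corresponds to a statistically significant association at the chosen level, which is how the bracketed CIs in the abstract are read.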
Background: Aedes aegypti is a potential vector for several arboviruses including dengue and Zika viruses. The species seems to be restricted to subtropical/tropical habitats and has difficulties in establishing permanent populations in southern Europe, probably due to constraints during the winter season. The aim of this study was to systematically analyze the cold tolerance (CT) of Ae. aegypti in its most cold-resistant life stage, the eggs.
Methods: The CT of Ae. aegypti eggs was compared with that of Ae. albopictus which is well established in large parts of Europe. By systematically studying the literature (meta-analysis), we found that the CT of Ae. aegypti eggs has rarely been tested, although eggs can survive zero and sub-zero temperatures for certain exposure periods. To overcome potential bias from experimental differences between studies, we then conducted species comparisons using a harmonized high-resolution CT measuring method. From subtropical populations of the same origin, the survival (hatching in %) and emergence of adults of both species were measured after zero and sub-zero temperature exposures for up to 9 days (3 °C, 0 °C and − 2 °C: ≤ 9 days; − 6 °C: ≤ 2 days).
Results: Our data show that Ae. aegypti eggs can survive low and sub-zero temperatures for a short time period similar to or even better than those of Ae. albopictus. Moreover, after short sub-zero exposures of eggs of both species, individuals still developed into viable adults (Ae. aegypti: 3 adults emerged after 6 days at − 2 °C, Ae. albopictus: 1 adult emerged after 1 day at − 6 °C).
Conclusions: Thus, both the literature and the present experimental data indicate that a cold winter may not be the preventing factor for the re-establishment of the dengue vector Ae. aegypti in southern Europe.
The risk of increasing dengue (DEN) and chikungunya (CHIK) epidemics impacts 240 million people, health systems, and the economy in the Hindu Kush Himalayan (HKH) region. The aim of this systematic review is to monitor trends in the distribution and spread of DEN/CHIK over time and geographically for future reliable vector and disease control in the HKH region. We conducted a systematic review of the literature on the spatiotemporal distribution of DEN/CHIK in HKH published up to 23 January 2020, following Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) guidelines. In total, we found 61 articles that focused on the spatial and temporal distribution of 72,715 DEN and 2334 CHIK cases in the HKH region from 1951 to 2020. DEN occurs in seven of the eight HKH countries (India, Nepal, Bhutan, Pakistan, Bangladesh, Afghanistan, and Myanmar), and CHIK in four (India, Nepal, Bhutan, and Myanmar). DEN is highly seasonal and starts with the onset of the monsoon (July in India and June in Nepal) or with the onset of spring (May in Bhutan) and peaks in the post-monsoon season (September to November). The current trend of increasing case numbers of both diseases in many countries of the HKH region requires coordination of response efforts to prevent and control the future expansion of these vector-borne diseases to nonendemic areas, across national borders.
Rationale: Dysregulation of dopaminergic neurotransmission, specifically altered reward processing assessed via the reward anticipation in the MID task, plays a central role in the etiopathogenesis of neuropsychiatric disorders. Objectives: We hypothesized to find a difference in the activity level of the reward system (measured by the proxy reward anticipation) under drug administration versus placebo, in that amisulpride reduces, and L-DOPA enhances, its activity. Methods: We studied the influence of dopamine agonist L-DOPA and the antagonist amisulpride on the reward system using functional magnetic resonance imaging (fMRI) during a monetary incentive delay (MID) task in n = 45 healthy volunteers in a randomized, blinded, cross-over study. Results: The MID paradigm elicits strong activation in reward-dependent structures (such as ventral striatum, putamen, caudate, anterior insula) during reward anticipation. The placebo effect demonstrated the expected significant blood oxygen level–dependent activity in reward-dependent brain regions. Neither amisulpride nor L-DOPA led to significant changes in comparison with the placebo condition. This was true for whole-brain analysis as well as analysis of a pre-defined nucleus accumbens region-of-interest mask. Conclusion: The present results cast doubt on the sensitivity of reward anticipation contrast in the MID task for assessing dopamine-specific changes in healthy volunteers by pharmaco-fMRI. While our task was not well-suited for detailed analysis of the outcome phase, we provide reasonable arguments that the lack of effect in the anticipation phase is not due to an inefficient task but points to unexpected behavior of the reward system during pharmacological challenge. Group differences of reward anticipation should therefore not be seen as simple representatives of dopaminergic states.
The mammalian target of rapamycin and the integrated stress response are central cellular hubs regulating translation upon stress. The precise protein targets and the pathway specificity of translational control by these pathways have remained largely unclear. We recently described a new method for quantitative translation proteomics and found that both pathways control translation of the same sets of proteins.
Molecular and cellular research modalities for the study of liver pathologies have been tremendously improved over the recent decades. Advanced technologies offer novel opportunities to establish cell isolation techniques with excellent purity, paving the path for 2D and 3D microscopy and high-throughput assays (e.g., bulk or single-cell RNA sequencing). The use of stem cell and organoid research will help to decipher the pathophysiology of liver diseases and the interaction between various parenchymal and non-parenchymal liver cells. Furthermore, sophisticated animal models of liver disease allow for the in vivo assessment of fibrogenesis, portal hypertension and hepatocellular carcinoma (HCC) and for the preclinical testing of therapeutic strategies. The purpose of this review is to portray in detail novel in vitro and in vivo methods for the study of liver cell biology that had been presented at the workshop of the 8th meeting of the European Club for Liver Cell Biology (ECLCB-8) in October of 2018 in Bonn, Germany.
The identification of unknown bodies is the fulfilment of a moral obligation towards the deceased, serves to maintain legal security within a society, and gives families the certainty they need to mourn. Taking into account respective local conditions, the aim should always be to achieve a secure and quick identification. To achieve this goal, a functioning cooperation between investigating authorities and forensic sciences is essential. The main objective of this study was to clarify the potential role of tattoos in the identification process of unknown deceased persons in the state of Jalisco, Mexico. Post-mortem data of 2045 bodies from the Instituto Jaliscience de Ciencias Forenses in Guadalajara were evaluated. Of the deceased 46% were tattooed (male: 47%, female: 39%), with 29% of all bodies (male: 29%, female: 26%) showing tattoos at body locations usually visible in everyday life (i.e. head and neck, forearms and hands). The male bodies were most frequently tattooed on the shoulders and upper arms, followed by the forearms and hands and the torso. Female bodies mostly showed tattoos on the forearms and hands, followed by the torso and legs. Taking local tattooing habits into account, the authors developed a classification for tattoo motifs. With decreasing frequency, the following keywords could be assigned to the motifs: letters and/or numbers, human, symbol (other), plant, symbol (religious), animal, object, tribal/ornament/geometry, fantasy/demon/comic, other. Results of the study indicate the great importance of tattoos as a possible means of identification in Jalisco, Mexico – either as a stand-alone identification method, as a complementary tool or for planning and prioritizing subsequent investigations.
Hepatic inflammasome activation as origin of Interleukin-1α and Interleukin-1β in liver cirrhosis
(2020)
Objective: To assess the influence of biphasic calcium phosphate materials with different surface topographies on bone formation and osseointegration of titanium implants in standardized alveolar ridge defects.
Materials and methods: Standardized alveolar ridge defects (6 × 6 mm) were created in the mandible of 8 minipigs and filled with three biphasic calcium phosphate materials (BCP1–3, 90% tricalcium phosphate/10% hydroxyapatite) with different surface properties (micro- and macroporosities) as well as a bovine-derived natural bone mineral (NBM) as a control. At 12 weeks, implants were placed into the augmented defects. After further 8 weeks of healing, dissected blocks were processed for histological analysis (e.g., mineralized (MT), residual bone graft material (BS), bone-to-implant contact (BIC)).
Results: All four biomaterials showed well-integrated graft particles and new bone formation within the defect area. MT values were comparable in all groups. BS values were highest in the NBM group (21.25 ± 13.52%) and markedly reduced in the different BCP groups, reaching statistical significance at BCP1-treated sites (9.2 ± 3.28%). All test and control groups showed comparable BIC values without statistically significant differences, ranging from 73.38 ± 20.5% (BCP2) to 84.11 ± 7.84% (BCP1).
Conclusion: All bone graft materials facilitated new bone formation and osseointegration after 12 + 8 weeks of healing.
Objective: Many patients with localized prostate cancer (PCa) do not immediately undergo radical prostatectomy (RP) after biopsy confirmation. The aim of this study was to investigate the influence of “time-from-biopsy-to-prostatectomy” on adverse pathological outcomes.
Materials and Methods: Between January 2014 and December 2019, 437 patients with intermediate- and high-risk PCa who underwent RP were retrospectively identified within our prospective institutional database. For the aim of our study, we focused on patients with intermediate- (n = 285) and high-risk (n = 151) PCa using D'Amico risk stratification. Endpoints were adverse pathological outcomes and the proportion of nerve-sparing procedures after RP, stratified by “time-from-biopsy-to-prostatectomy”: ≤3 months vs. >3 and <6 months. Medians and interquartile ranges (IQR) were reported for continuously coded variables. The chi-square test examined the statistical significance of the differences in proportions while the Kruskal-Wallis test was used to examine differences in medians. Multivariable (ordered) logistic regressions, analyzing the impact of time between diagnosis and prostatectomy, were separately run for all relevant outcome variables (ISUP specimen, margin status, pathological stage, pathological nodal status, LVI, perineural invasion, nerve-sparing).
Results: We observed no difference between patients undergoing RP ≤3 months vs. >3 and <6 months after diagnosis for the following oncological endpoints: pT-stage, ISUP grading, probability of a positive surgical margin, probability of lymph node invasion (LNI), lymphovascular invasion (LVI), and perineural invasion (pn) in patients with intermediate- and high-risk PCa. Likewise, the rates of nerve-sparing procedures were 84.3 vs. 87.4% (p = 0.778) and 61.0% vs. 78.8% (p = 0.211), for intermediate- and high-risk PCa patients undergoing surgery after ≤3 months vs. >3 and <6 months, respectively. In multivariable adjusted analyses, a time to surgery >3 months did not significantly worsen any of the outcome variables in patients with intermediate- or high-risk PCa (all p > 0.05).
Conclusion: A “time-from-biopsy-to-prostatectomy” of >3 and <6 months is neither associated with adverse pathological outcomes nor with poorer chances of nerve-sparing RP in intermediate- and high-risk PCa patients.
Purpose: Trauma is the leading cause of death in children. In adults, blood transfusion and fluid resuscitation protocols have changed over the past two decades, resulting in a decrease of morbidity and mortality. Here, transfusion and fluid resuscitation practices were analysed in severely injured children in Germany.
Methods: Severely injured children (maximum Abbreviated Injury Scale (AIS) ≥ 3) admitted to a certified trauma-centre (TraumaZentrum DGU®) between 2002 and 2017 and registered at the TraumaRegister DGU® were included and assessed regarding blood transfusion rates and fluid therapy.
Results: 5,118 children (aged 1–15 years) with a mean ISS of 22 were analysed. Blood transfusion rates administered until ICU admission decreased from 18% (2002–2005) to 7% (2014–2017). Children who were transfused were increasingly severely injured: the mean ISS of transfused children aged 1–15 years increased from 27.7 (2002–2005) to 34.4 (2014–2017), while the mean ISS of non-transfused children decreased from 19.6 to 17.6 over the same period. Mean prehospital fluid administration decreased from 980 to 549 ml without increasing hemodynamic instability.
Conclusion: Blood transfusion rates and the amount of fluid resuscitation decreased in severely injured children over a 16-year period in Germany. Restrictive blood transfusion and fluid management has become common practice in severely injured children. A prehospital restrictive fluid management strategy in severely injured children is not associated with a worsened hemodynamic state, abnormal coagulation or base excess but leads to higher hemoglobin levels.
Psoriasis (PsO) is one of the common chronic inflammatory skin diseases. Approximately 3% of the European Caucasian population is affected. Psoriatic arthritis (PsA) is a chronic immune-mediated disease associated with PsO characterized by distinct musculoskeletal inflammation. Due to its heterogeneous clinical manifestations (e.g., oligo- or polyarthritis, enthesitis, dactylitis, and axial inflammation), early diagnosis of PsA is often difficult and delayed. Approximately 30% of PsO patients will develop PsA. The triggers responsible for the transition from PsO only to PsA are currently unclear, and the impacts of different factors (e.g., genetic, environmental) on disease development are under discussion. There is a high and currently unmet medical need to detect, at an early stage, those patients with an increased risk of developing clinically evident PsA, in order to initiate sufficient treatment to inhibit disease progression, avoid structural damage and loss of function, or even intercept disease development. Increased neoangiogenesis and enthesial inflammation are hypothesized to be early pathological findings in PsO patients with PsA development. Different disease states describe the transition from PsO to PsA. Two of those phases are of value for early detection of PsA at-risk patients to prevent later development of PsA as changes in biomarker profiles are detectable: the subclinical phase (soluble and imaging biomarkers detectable, no clinical symptoms) and the prodromal phase (imaging biomarkers detectable, unspecific musculoskeletal symptoms such as arthralgia and fatigue). To target the unmet need for early detection of this at-risk population and to identify the subgroup of patients who will transition from PsO to PsA, imaging plays an important role in characterizing patients precisely.
Imaging techniques such as ultrasound (US), magnetic resonance imaging (MRI), and computerized tomography (CT) are advanced techniques for the sensitive detection of inflammatory changes or changes in bone structure. With the use of these techniques, anatomic structures involved in inflammatory processes can be identified. These techniques are complemented by fluorescence optical imaging as a sensitive method for the detection of changes in vascularization, especially in longitudinal measures. Moreover, high-resolution peripheral quantitative CT (HR-pQCT) and dynamic contrast-enhanced MRI (DCE-MRI) may offer the advantage of identifying PsA-related early characteristics in PsO patients reflecting transition phases of the disease.
SARS-CoV-2 is the causative agent of COVID-19. Severe COVID-19 disease has been associated with disseminated intravascular coagulation and thrombosis, but the mechanisms underlying COVID-19-related coagulopathy remain unknown. The risk of severe COVID-19 disease is higher in males than in females and increases with age. To identify gene products that may contribute to COVID-19-related coagulopathy, we analyzed the expression of genes associated with the Gene Ontology (GO) term “blood coagulation” in the Genotype-Tissue Expression (GTEx) database and identified four procoagulants, whose expression is higher in males and increases with age (ADAMTS13, F11, HGFAC, KLKB1), and two anticoagulants, whose expression is higher in females and decreases with age (C1QTNF1, SERPINA5). However, the expression of none of these genes was regulated in a proteomics dataset of SARS-CoV-2-infected cells, and none of the proteins have been identified as a binding partner of SARS-CoV-2 proteins. Hence, they may rather generally predispose individuals to thrombosis without directly contributing to COVID-19-related coagulopathy. In contrast, the expression of the procoagulant transferrin (not associated with the GO term “blood coagulation”) was higher in males, increased with age, and was upregulated upon SARS-CoV-2 infection. Hence, transferrin warrants further examination in ongoing clinico-pathological investigations.
Introduction: Quinolone prophylaxis is recommended for patients with advanced cirrhosis at high risk of spontaneous bacterial peritonitis (SBP) or with prior SBP. Yet, the impact of long-term antibiotic prophylaxis on the microbiome of these patients is poorly characterized.
Methods: Patients with liver cirrhosis receiving long-term quinolone prophylaxis to prevent SBP were prospectively included and sputum and stool samples were obtained at baseline, 1, 4 and 12 weeks thereafter. Both bacterial DNA and RNA were assessed with 16S rRNA sequencing. Relative abundance, alpha and beta diversity were calculated and correlated with clinical outcome.
Results: Overall, 35 stool and 19 sputum samples were obtained from 11 patients. Two patients died (on days 9 and 12); all others were followed for 180 days. The reduction of Shannon diversity and bacterial richness after initiation of quinolone prophylaxis was not significant (p > 0.05). Gut microbiota were significantly different between patients (p < 0.001) but not significantly altered between the different time points before and after initiation of antibiotic prophylaxis (p > 0.05). A high relative abundance of Enterobacteriaceae > 20% during quinolone prophylaxis was found in three patients. Specific clinical scenarios (development of secondary infections during antibiotic prophylaxis or the detection of multidrug-resistant Enterobacteriaceae) characterized these patients. Sputum microbiota were not significantly altered in individuals during prophylaxis.
Conclusion: The present exploratory study with small sample size showed that inter-individual differences in diversity of gut microbiota were high at baseline, yet quinolone prophylaxis had only a moderate impact. High relative abundances of Enterobacteriaceae during follow-up might indicate failure of or non-adherence to quinolone prophylaxis. However, our results may not be clinically significant given the limitations of the study and therefore future studies are needed to further investigate this phenomenon.
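Shannon diversity and bacterial richness, as reported for the 16S data above, are simple functions of the per-taxon counts. A minimal sketch, assuming plain count vectors rather than the authors' actual sequencing pipeline:

```python
import math

def shannon_diversity(counts):
    """Shannon index H = -sum(p_i * ln(p_i)) over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def richness(counts):
    """Observed richness: number of taxa with at least one read."""
    return sum(1 for c in counts if c > 0)

# A perfectly even community of 4 taxa has H = ln(4)
even = [25, 25, 25, 25]
print(shannon_diversity(even), richness(even))
```

A drop in either value after starting prophylaxis would reflect a loss of taxa or increasing dominance of a few taxa, which is what the alpha-diversity comparison above tests for.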
Background: Breast cancer is the leading cause of cancer-related deaths in women, demanding new treatment options. With the advent of immune checkpoint blockade, immunotherapy emerged as a treatment option. In addition to lymphocytes, tumor-associated macrophages exert a significant, albeit controversial, impact on tumor development. Pro-inflammatory macrophages are thought to hinder, whereas anti-inflammatory macrophages promote tumor growth. However, molecular markers to identify prognostic macrophage populations remain elusive. Methods: We isolated two macrophage subsets from 48 primary human breast tumors, distinguished by the expression of CD206. Their transcriptomes were analyzed via RNA-Seq, and potential prognostic macrophage markers were validated by PhenOptics in tissue microarrays of patients with invasive breast cancer. Results: Normal human breast tissue contained mainly CD206+ macrophages, while increased relative amounts of CD206− macrophages were observed in tumors. The presence of CD206+ macrophages correlated with a pronounced lymphocyte infiltrate, and subsets of CD206+ macrophages, expressing SERPINH1 and collagen 1, or MORC4, were unexpectedly associated with improved survival of breast cancer patients. In contrast, MHCIIhi CD206− macrophages were linked with a poor survival prognosis. Conclusion: Our data highlight the heterogeneity of tumor-infiltrating macrophages and suggest the use of multiple phenotypic markers to predict the impact of macrophage subpopulations on cancer prognosis. We identified novel macrophage markers that correlate with the survival of patients with invasive mammary carcinoma.
Purpose: The management of patients with suspected appendicitis remains a challenge in daily clinical practice, and the optimal management algorithm is still being debated. Negative appendectomy rates (NAR) continue to range between 10 and 15%. This prospective study evaluated the accuracy of a diagnostic pathway in acute appendicitis using clinical risk stratification (Alvarado score), routine ultrasonography, gynecology consult for females, and selected CT after clinical reassessment.
Methods: Patients aged 18 years and above presenting with suspected appendicitis between November 2015 and September 2017 were included. Decision-making followed a clear management pathway. Patients were followed up for 6 months after discharge. The hypothesis was that the algorithm can reduce the NAR to below 10%.
Results: A total of 183 patients were included. In 65 of 69 appendectomies, acute appendicitis was confirmed by histopathology, corresponding to a NAR of 5.8%. Notably, all 4 NAR appendectomies had other pathologies of the appendix. The perforation rate was 24.6%. Only 36 patients (19.7%) received a CT scan. The follow-up rate after 30 days was 69%, with no cases of missed appendicitis. The sensitivity and specificity of the diagnostic pathway were 100% and 96.6%, respectively. The potential saving in costs can be as much as 19.8 million €/100,000 cases presenting with the suspicion of appendicitis.
Conclusion: The risk-stratified diagnostic algorithm yields a high diagnostic accuracy for patients with suspicion of appendicitis. Its implementation can safely reduce the NAR, simultaneously minimizing the use of CT scans and optimizing healthcare-related costs in the treatment of acute appendicitis.
Background: Cerebral O2 saturation (ScO2) reflects cerebral perfusion and can be measured noninvasively by near-infrared spectroscopy (NIRS). Objectives: In this pilot study, we describe the dynamics of ScO2 during TAVI in nonventilated patients and its impact on procedural outcome. Methods and Results: We measured ScO2 of both frontal lobes continuously by NIRS in 50 consecutive analgo-sedated patients undergoing transfemoral TAVI (female 58%, mean age 80.8 years). Compared to baseline, ScO2 dropped significantly during rapid ventricular pacing (RVP) (59.3% vs. 53.9%, p < .01). Five minutes after RVP, ScO2 values normalized (post RVP 62.6% vs. 53.9% during RVP, p < .01; pre 61.6% vs. post RVP 62.6%, p = .53). Patients with an intraprocedural pathological ScO2 decline of >20% (n = 13) had a higher EuroSCORE II (3.42% vs. 5.7%, p = .020) and experienced delirium (24% vs. 62%, p = .015) and stroke (0% vs. 23%, p < .01) more often after TAVI. Multivariable logistic regression revealed higher age and large ScO2 drops as independent risk factors for delirium. Conclusions: During RVP, ScO2 declined significantly compared to baseline. A ScO2 decline of >20% is associated with a higher incidence of delirium and stroke and is a valid cut-off value to screen for these complications. NIRS measurement during the TAVI procedure may be an easy-to-implement diagnostic tool to detect patients at high risk for cerebrovascular complications and delirium.
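The >20% cut-off above refers to a relative decline from the patient's individual baseline. A hypothetical sketch of how such flagging might be computed from a NIRS trace (function names and threshold handling are illustrative, not the authors' software):

```python
def max_relative_decline(baseline, trace):
    """Largest ScO2 drop during the recording, as a fraction of baseline."""
    return max((baseline - v) / baseline for v in trace)

def pathological_decline(baseline, trace, threshold=0.20):
    """Flag a trace whose relative ScO2 decline exceeds the cut-off."""
    return max_relative_decline(baseline, trace) > threshold

# A baseline of 60% with a nadir of 45% is a 25% relative decline
print(pathological_decline(60.0, [59.0, 53.0, 45.0, 58.0]))
```

Defining the decline relative to baseline, rather than as an absolute drop in percentage points, normalizes for the considerable inter-patient variability in resting ScO2.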
Since 2010, an intensified ambulatory cardiology care programme has been implemented in southern Germany. To improve patient management, the structure of cardiac disease management was improved, guideline-recommended care was supported, new ambulatory medical services and a morbidity-adapted reimbursement system were set up. Our aim was to determine the effects of this programme on the mortality and hospitalisation of enrolled patients with cardiac disorders. We conducted a comparative observational study in 2015 and 2016, based on insurance claims data. Overall, 13,404 enrolled patients with chronic heart failure (CHF) and 19,537 with coronary artery disease (CAD) were compared, respectively, to 8,776 and 16,696 patients that were receiving usual ambulatory cardiology care. Compared to the control group, patients enrolled in the programme had lower mortality (Hazard Ratio: 0.84; 95% CI: 0.77–0.91) and fewer all-cause hospitalisations (Rate Ratio: 0.94; 95% CI: 0.90–0.97). CHF-related hospitalisations in patients with CHF were also reduced (Rate Ratio: 0.76; 95% CI: 0.69–0.84). CAD patients showed a similar reduction in mortality rates (Hazard Ratio: 0.81; 95% CI: 0.76–0.88) and all-cause hospitalisation (Rate Ratio: 0.94; 95% CI: 0.91–0.97), but there was no effect on CAD-related hospitalisation. We conclude that intensified ambulatory care reduced mortality and hospitalisation in cardiology patients.
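The rate ratios above compare hospitalisation counts per person-time between enrolled and control patients. A minimal sketch of an incidence rate ratio with a log-scale 95% CI, using hypothetical numbers rather than the study's claims data:

```python
import math

def rate_ratio_ci(events_a, time_a, events_b, time_b, z=1.96):
    """Incidence rate ratio (group A vs. B) with a Wald 95% CI on the log scale."""
    rr = (events_a / time_a) / (events_b / time_b)
    se = math.sqrt(1 / events_a + 1 / events_b)  # SE of log(rate ratio)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)

# Hypothetical: 470 admissions over 5000 person-years vs. 500 over 5000
print(rate_ratio_ci(470, 5000, 500, 5000))
```

A rate ratio of 0.94 with a CI excluding 1, as reported for all-cause hospitalisation above, corresponds to a 6% lower event rate in the programme group.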
In the current dismal situation of the COVID-19 pandemic, effective management of patients with pneumonia and acute respiratory distress syndrome is of vital importance. Due to the current lack of effective pharmacological concepts, this situation has caused interest in (re)considering historical reports on the treatment of patients with low-dose radiation therapy for pneumonia. Although these historical reports are of low-level evidence per se, hampering recommendations for decision-making in the clinical setting, they indicate effectiveness in the dose range between 0.3 and 1 Gy, similar to more recent dose concepts in the treatment of acute and chronic inflammatory/degenerative benign diseases with, e.g., a single dose per fraction of 0.5 Gy. This concise review aims to critically review the evidence for low-dose radiation treatment of COVID-19 pneumopathy and discuss whether it is worth investigating in the present clinical situation.
Background: The purpose of this pilot study was to create a valid and reliable set of assessment questions for examining Evidence-based Dentistry (EbD) knowledge. For this reason, we adapted and validated for dental students the Berlin Questionnaire (BQ), which assesses Evidence-based Medicine (EbM) abilities.
Methods: The Berlin Questionnaire was validated with medical residents. We adapted it for use in a dentistry setting. An expert panel reviewed the adapted BQ for content validity. A cross-sectional cohort representing four training levels (EbD-novice dental students, EbD-trained dental students, dentists, and EbM−/EbD-expert faculty) completed the questionnaire. A total of 140 participants comprised the validation set. Internal reliability, item difficulty and item discrimination were assessed. Construct validity was assessed by comparing the mean total scores of students to faculty and comparing proportions of students and faculty who passed each item.
Results: Among the 133 participants (52 EbD-novice dental students, 53 EbD-trained dental students, 12 dentists, and 16 EbM-/EbD-expert faculty), the total score differed significantly (p < 0.001) across training levels. The total score reliability and psychometric properties of items modified for discipline-specific content were acceptable. Cronbach’s alpha was 0.648.
Conclusion: The adapted Berlin Questionnaire is a reliable and valid instrument to assess competence in Evidence-based Dentistry in dental students. Future research will focus on refining the instrument further.
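Cronbach's alpha, reported above as 0.648, measures internal consistency from the item and total-score variances. A minimal self-contained sketch (the data below are toy values, not the questionnaire's responses):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
    items: k equal-length lists of scores, one list per questionnaire item."""
    k = len(items)

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    total_scores = [sum(col) for col in zip(*items)]
    return k / (k - 1) * (1 - sum(pvar(it) for it in items) / pvar(total_scores))

# Two perfectly correlated items yield alpha = 1.0
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))
```

Values around 0.65, as in the study, are commonly read as borderline-acceptable internal consistency for a newly adapted instrument.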
Aims: In primary central nervous system tumours, epithelial-to-mesenchymal transition (EMT) gene expression is associated with increased malignancy. However, it has also been shown that EMT factors in gliomas are almost exclusively expressed by glioma vessel-associated pericytes (GA-Peris). In this study, we aimed to identify the mechanism of EMT in GA-Peris and its impact on angiogenic processes.
Methods: In glioma patients, vascular density and the expression of the pericytic markers platelet-derived growth factor receptor (PDGFR)-β and smooth muscle actin (αSMA) were examined in relation to the expression of the EMT transcription factor SLUG and were correlated with the survival of patients with glioblastoma (GBM). Functional mechanisms of SLUG regulation and the effects on primary human brain vascular pericytes (HBVP) were studied in vitro by measuring proliferation, cell motility and growth characteristics.
Results: The number of PDGFR-β- and αSMA-positive pericytes neither changed with increased malignancy nor showed an association with the survival of GBM patients. However, SLUG-expressing pericytes displayed considerable morphological changes in GBM-associated vessels, and TGF-β-induced SLUG upregulation led to enhanced proliferation, motility and altered growth patterns in HBVP. Downregulation of SLUG or addition of a TGF-β-antagonising antibody abolished these effects.
Conclusions: We provide evidence that in GA-Peris, elevated SLUG expression is mediated by TGF-β, a cytokine secreted by most glioma cells, indicating that the latter actively modulate neovascularisation not only by modulating endothelial cells, but also by influencing pericytes. This process might be responsible for the formation of an unstructured tumour vasculature as well as for the breakdown of the blood–brain barrier in GBM.
Reduced external knee adduction moments in the second half of stance after total hip replacement have been reported in hip osteoarthritis patients. This reduction is thought to shift the load from the medial to the lateral knee compartment and as such increase the risk for knee osteoarthritis. The knee adduction moment is a surrogate for the load distribution between the medial and lateral compartments of the knee and not a valid measure for the tibiofemoral contact forces which are the result of externally applied forces and muscle forces. The purpose of this study was to investigate whether the distribution of the tibiofemoral contact forces over the knee compartments in unilateral hip osteoarthritis patients 1 year after receiving a primary total hip replacement differs from healthy controls. Musculoskeletal modeling on gait was performed in OpenSim using the detailed knee model of Lerner et al. (2015) for 19 patients as well as for 15 healthy controls of similar age. Knee adduction moments were calculated by the inverse dynamics analysis, medial and lateral tibiofemoral contact forces with the joint reaction force analysis. Moments and contact forces of patients and controls were compared using Statistical Parametric Mapping two-sample t-tests. Knee adduction moments and medial tibiofemoral contact forces of both the ipsi- and contralateral leg were not significantly different compared to healthy controls. The contralateral leg showed 14% higher medial tibiofemoral contact forces compared to the ipsilateral (operated) leg during the second half of stance. During the first half of stance, the lateral tibiofemoral contact force of the contralateral leg was 39% lower and the ratio 32% lower compared to healthy controls. In contrast, during the second half of stance the forces were significantly higher (39 and 26%, respectively) compared to healthy controls. 
The higher ratio indicates a changed distribution, whereas the increased lateral tibiofemoral contact forces indicate higher lateral knee joint loading in the contralateral leg of OA patients after total hip replacement (THR). Musculoskeletal modeling using a detailed knee model can be useful for detecting differences in the load distribution between the medial and lateral knee compartments that cannot be captured by the knee adduction moment.
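As a back-of-the-envelope illustration of the ratio measure discussed above, the compartment load distribution can be expressed as the quotient of the modeled medial and lateral contact forces at a given instant of stance. The sketch below uses hypothetical force values in units of body weight (BW); it is not the authors' OpenSim pipeline, only a minimal sketch of the quantity being compared.

```python
# Illustrative sketch (hypothetical values, not the authors' code):
# the medial-to-lateral tibiofemoral contact force ratio, which complements
# the external knee adduction moment as a measure of load distribution.

def compartment_ratio(medial_force_bw: float, lateral_force_bw: float) -> float:
    """Return the medial/lateral contact force ratio for one instant of stance.

    Both forces are expressed in units of body weight (BW).
    """
    if lateral_force_bw == 0:
        raise ValueError("lateral force must be non-zero")
    return medial_force_bw / lateral_force_bw

# Hypothetical second-half-of-stance peaks for one leg:
medial = 2.4   # BW (assumed value for illustration)
lateral = 0.8  # BW (assumed value for illustration)
print(round(compartment_ratio(medial, lateral), 3))
```

A time series of such ratios over the stance phase is what a curve-level test (e.g., Statistical Parametric Mapping) would compare between legs or groups.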
Alcoholism is one of the leading and increasingly prevalent causes of liver-associated morbidity and mortality worldwide. Alcoholic hepatitis (AH) is a severe disease with currently no satisfactory treatment options. Lipoxin A4 (LXA4), a 15-lipoxygenase (ALOX15)-dependent lipid mediator involved in the resolution of inflammation, has shown promising pre-clinical results in the therapy of several inflammatory diseases. Since inflammation is a main driver of disease progression in alcoholic hepatitis, we investigated the impact of endogenous ALOX15-dependent lipid mediators and exogenously applied LXA4 on AH development. A mouse model of alcoholic steatohepatitis (NIAAA model) was tested in Alox12/15+/+ and Alox12/15−/− mice, with or without supplementation of LXA4. Absence of Alox12/15 aggravated parameters of liver disease, increased hepatic immune cell infiltration in AH, and elevated systemic neutrophils as a marker of systemic inflammation. Interestingly, i.p. injections of LXA4 significantly lowered transaminase levels only in Alox12/15−/− mice and reduced hepatic immune cell infiltration as well as systemic inflammatory cytokine expression in both genotypes, even though steatosis progressed. Thus, while LXA4 injection attenuated selected parameters of disease progression in Alox12/15−/− mice, its beneficial impact on immunity was also apparent in Alox12/15+/+ mice. In conclusion, pro-resolving lipid mediators may be beneficial in reducing inflammation in alcoholic hepatitis.
Background: While swallowing disorders are a frequent sequela of posterior fossa tumor (PFT) surgery in children, data on dysphagia frequency, severity, and outcome in adults are lacking. The aim of this study was to investigate dysphagia before and after surgical removal of PFT. Additionally, we sought to identify clinical predictors of postsurgical swallowing disorders. Furthermore, this study explored the three-month outcome of dysphagic patients.
Methods: In a cohort of patients undergoing PFT surgery, dysphagia was prospectively assessed pre- and postoperatively using fiberoptic endoscopic evaluation of swallowing. Patients with severe dysphagia at discharge were re-evaluated after three months. Additionally, clinical and imaging data were collected to identify predictors of post-surgical dysphagia. Results: We included 26 patients, of whom 15 had pre-operative swallowing disorders. After surgery, worsening of pre-existing dysphagia was noted in 7 patients, whereas improvement was observed in 2 and full recovery in 3 subjects. New-onset dysphagia after surgery occurred in only 3 cases. Postoperatively, 47% of dysphagic patients required nasogastric tube feeding. Re-evaluation after three months of follow-up revealed that all dysphagic patients had returned to full oral intake.
Conclusion: Dysphagia is a frequent finding in patients with PFT even before surgery. Surgical intervention can induce a deterioration of impaired swallowing function, placing affected patients at temporary risk of aspiration. Conversely, surgery can also yield beneficial results, including both improvement and full recovery. Overall, our findings show the need for early dysphagia assessment to define the safest feeding route for the patient.
Objective: Spinal epidural abscess (SEA) is a severe and life-threatening disease. Although surgery for SEA is commonly performed, the effect of its timing on patient outcome is still unclear. With this study, we aim to provide evidence for early surgical treatment in patients with SEA.
Methods: Patients treated for SEA in the authors' department between 2007 and 2016 were included and retrospectively analyzed for basic clinical parameters and outcome. Pre- and postoperative neurological status were assessed using the American Spinal Injury Association Impairment Scale (AIS). The self-reported quality of life (QOL) based on the Short-Form Health Survey 36 (SF-36) was assessed prospectively. Surgery was defined as "early" when performed within 12 hours after admission and as "late" when performed thereafter. Conservative therapy was preferred and recommended in patients without neurological deficits and in patients declining surgical intervention.
Results: One hundred and twenty-three patients were included in this study. Forty-nine patients (39.8%) underwent early surgery, 47 (38.2%) delayed surgery, and 27 (21.9%) conservative therapy. No significant differences were observed regarding mean age, sex, diabetes, prior history of spinal infection, and bony destruction. Patients undergoing early surgery showed a significantly better clinical outcome at discharge than patients undergoing late surgery (p=0.001) or conservative therapy. QOL based on the SF-36 was significantly better in the early surgery cohort in two of four physical items (physical functioning and bodily pain) and in one of four psychological items (role limitation) after a mean follow-up period of 58 months. Hospital readmission and failure of conservative therapy were observed more often in patients undergoing conservative therapy.
Conclusion: Our data on both clinical outcome and QOL provide evidence for early surgery within 12 hours after admission in patients with SEA.
Background: Essential Tremor (ET) is a progressive neurological disorder characterized by postural and kinetic tremor most commonly affecting the hands and arms. Medically intractable ET can be treated by deep brain stimulation (DBS) of the ventral intermediate nucleus of the thalamus (VIM). We investigated whether the location of the effective contact (most tremor suppression with the least side effects) in VIM-DBS for ET changes over time, which would indicate a distinct mechanism of loss of efficacy that goes beyond progression of tremor severity or a mere reduction of DBS efficacy.
Methods: We performed programming sessions in 10 patients who underwent bilateral VIM-DBS surgery between 2009 and 2017 at our department. In addition to the intraoperative (T1) and first clinical programming session (T2), a third programming session (T3) was performed to assess the effect and side-effect thresholds (the minimum voltage at which tremor suppression or side effects occurred). Additionally, we compared the choice of the effective contact between T1 and T2, which might be affected by a surgically induced "brain shift."
Discussion: Over a time span of about 4 years, VIM-DBS in ET showed continuous efficacy in tremor suppression during stim-ON compared to stim-OFF. Compared to the immediate postoperative programming sessions in ET patients with DBS, long-term evaluation showed no relevant change in the choice of contact with respect to side effects and efficacy. In the majority of cases, the active contact at T2 did not correspond to the most effective intraoperative stimulation site T1, which might be explained by a brain shift due to cerebrospinal fluid loss after the neurosurgical procedure.
Background and purpose: Superficial siderosis of the central nervous system is a sporadic finding in magnetic resonance imaging, resulting from recurrent bleedings into the subarachnoid space. This study aimed to determine the frequency of spinal dural cerebrospinal fluid (CSF) leaks amongst patients with a symmetric infratentorial siderosis pattern. Methods: In all, 97,733 magnetic resonance images performed between 2007 and 2018 in our neurocenter were screened by a keyword search for “hemosiderosis” and “superficial siderosis.” Siderosis patterns on brain imaging were classified according to a previously published algorithm. Potential causative intracranial bleeding events were also assessed. Patients with a symmetric infratentorial siderosis pattern but without causative intracranial bleeding events in history were prospectively evaluated for spinal pathologies. Results: Forty-two patients with isolated supratentorial siderosis, 30 with symmetric infratentorial siderosis and 21 with limited (non-symmetric) infratentorial siderosis were identified. Amyloid angiopathy and subarachnoid hemorrhage were causes for isolated supratentorial siderosis. In all four patients with a symmetric infratentorial siderosis pattern but without a causative intracranial bleeding event in history, spinal dural abnormalities were detected. Dural leaks were searched for in patients with symmetric infratentorial siderosis and a history of intracranial bleeding event without known bleeding etiology, considering that spinal dural CSF leaks themselves may also cause intracranial hemorrhage, for example by inducing venous thrombosis due to low CSF pressure. Thereby, one additional spinal dural leak was detected. Conclusions: Persisting spinal dural CSF leaks can frequently be identified in patients with a symmetric infratentorial siderosis pattern. Diagnostic workup in these cases should include magnetic resonance imaging of the whole spine.
Background: Autism spectrum disorder (“autism”) is a highly heterogeneous neurodevelopmental condition with few effective treatments for core and associated features. To make progress we need to both identify and validate neural markers that help to parse heterogeneity to tailor therapies to specific neurobiological profiles. Atypical hemispheric lateralization is a stable feature across studies in autism, but its potential as a neural stratification marker has not been widely examined. Methods: In order to dissect heterogeneity in lateralization in autism, we used the large EU-AIMS (European Autism Interventions—A Multicentre Study for Developing New Medications) Longitudinal European Autism Project dataset comprising 352 individuals with autism and 233 neurotypical control subjects as well as a replication dataset from ABIDE (Autism Brain Imaging Data Exchange) (513 individuals with autism, 691 neurotypical subjects) using a promising approach that moves beyond mean group comparisons. We derived gray matter voxelwise laterality values for each subject and modeled individual deviations from the normative pattern of brain laterality across age using normative modeling. Results: Individuals with autism had highly individualized patterns of both extreme right- and leftward deviations, particularly in language, motor, and visuospatial regions, associated with symptom severity. Language delay explained most variance in extreme rightward patterns, whereas core autism symptom severity explained most variance in extreme leftward patterns. Follow-up analyses showed that a stepwise pattern emerged, with individuals with autism with language delay showing more pronounced rightward deviations than individuals with autism without language delay. 
Conclusions: Our analyses corroborate the need for novel (dimensional) approaches to delineate the heterogeneous neuroanatomy in autism and indicate that atypical lateralization may constitute a neurophenotype for clinically meaningful stratification in autism.
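The normative-modeling approach described above can be illustrated with a minimal sketch: an individual's laterality value is compared against an age-conditional normative mean and standard deviation, and deviations beyond a z-threshold are flagged as extreme. All numbers, the sign convention (positive = leftward), and the ±2 threshold below are illustrative assumptions, not the LEAP analysis pipeline.

```python
# Illustrative sketch (hypothetical data and conventions, not the study's
# pipeline): expressing an individual's deviation from a normative model of
# brain laterality as a z-score and flagging extreme deviations.

def deviation_z(value: float, norm_mean: float, norm_sd: float) -> float:
    """z-score of an individual's laterality value against the normative
    model's age-conditional mean and SD (both assumed to be given)."""
    return (value - norm_mean) / norm_sd

def classify(z: float, threshold: float = 2.0) -> str:
    """Label deviations beyond +/- threshold as extreme.
    Sign convention (positive = leftward) is an assumption for illustration."""
    if z > threshold:
        return "extreme leftward"
    if z < -threshold:
        return "extreme rightward"
    return "within normative range"

# Hypothetical subject: laterality 0.35 vs. normative mean 0.10, SD 0.08
print(classify(deviation_z(0.35, 0.10, 0.08)))  # z is about 3.1
```

In practice such z-maps are computed voxelwise, so each subject receives a spatial pattern of deviations rather than a single score.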
Background: Approximately one in three patients suffers from preoperative anaemia. Even though haemoglobin is measured before surgery, anaemia management is not implemented in every hospital. Objective: Here, we describe the implementation of an anaemia walk-in clinic at an orthopaedic university hospital. To improve the diagnosis of iron deficiency (ID), we examined whether reticulocyte haemoglobin (Ret-He) could be a useful additional parameter. Material and Methods: In August 2019, an anaemia walk-in clinic was established. Between September and December 2019, major orthopaedic surgical patients were screened for preoperative anaemia. The primary endpoint was the incidence of preoperative anaemia. Secondary endpoints included Ret-He level, red blood cell (RBC) transfusion rate, in-hospital length of stay and anaemia at hospital discharge. Results: A total of 104 patients were screened for anaemia. The preoperative anaemia rate was 20.6%. Intravenous iron was supplemented in 23 patients. Transfusion of RBC units per patient (1.7 ± 1.2 vs. 0.2 ± 0.9; p = 0.004) and hospital length of stay (13.1 ± 4.8 days vs. 10.6 ± 5.1 days; p = 0.068) were increased in anaemic compared to non-anaemic patients. Ret-He values were significantly lower in patients with ID anaemia (33.3 pg [28.6–40.2 pg]) compared to patients with ID (35.3 pg [28.9–38.6 pg]; p = 0.015) or patients without anaemia (35.4 pg [30.2–39.4 pg]; p = 0.001). Conclusion: Preoperative anaemia is common in orthopaedic patients. Our results demonstrate the feasibility of an anaemia walk-in clinic for managing preoperative anaemia. Furthermore, our analysis supports the use of Ret-He as an additional parameter for the diagnosis of ID in surgical patients.
Manufacturing processes of custom implant abutments may contaminate their surfaces with micro wear deposits and generic pollutants. Such particulate debris, if not removed, might be detrimental and provoke inflammatory reactions in peri-implant tissues. Although regulatory guidelines for adequate cleaning, disinfection, or sterilization exist, they do not appear to be applied consistently, and data on the amount and extent of such contaminants are lacking. The aim of the present in vitro study was to evaluate the quality and quantity of processing-related surface contamination of computer-aided design/computer-aided manufacturing (CAD/CAM) abutments in the state of delivery and after ultrasonic cleaning. A total of 28 CAD/CAM monotype and hybrid abutments were cleaned and disinfected applying a three-stage ultrasonic protocol (Finevo protocol). Before and after cleaning, the chemical composition and the contamination of the abutments were assessed using scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDX), and computer-aided planimetric measurement (CAPM). In the delivery condition, monotype abutments showed a significantly higher amount of debris compared to hybrid abutments (4.86 ± 6.10% vs. 0.03 ± 0.03%, p < 0.001). The polishing process applied in the laboratory after bonding the hybrid abutment components reduced the surface roughness and thus contributed substantially to their purity. The extent of contamination caused by computer-aided manufacturing of custom abutments can be substantially minimized using a three-stage ultrasonic protocol.
Treatment of large bone defects is one of the great challenges in contemporary orthopedic and trauma surgery. Grafts are necessary to support bone healing. A well-established allograft is demineralized bone matrix (DBM) prepared from donated human bone tissue. In this study, a fibrous demineralized bone matrix (f-DBM) with a high surface-to-volume ratio was analyzed for toxicity and immunogenicity. f-DBM was transplanted into a 5-mm, plate-stabilized, femoral critical-size bone defect in Sprague-Dawley (SD) rats. Healthy animals were used as controls. After two months, histology, hematological analyses, immunogenicity testing, and serum biochemistry were performed. Evaluation of free radical release and hematological and biochemical analyses showed no significant differences between the control group and recipients of f-DBM. Histologically, there was no evidence of damage to liver and kidney, and good bone healing was observed in the f-DBM group. Reactivity against human HLA class I and class II antigens was detected with mostly low fluorescence values in the serum of both untreated and treated animals, reflecting rather a background reaction. Taken together, these results provide evidence for the absence of systemic toxicity and a first proof of no basic immunogenic reaction to the bone allograft and no sensitization of the recipient.