Medizin
Although epidemiological data are an essential consideration when deciding on rule changes, injury prevention strategies, and athlete development models, little such data exist on injuries in U18 field hockey players, a gap explicitly referred to in the 2015 International Olympic Committee's Consensus Statement on Youth Athlete Development. The aim of this study was to quantify the incidence and characteristics of injuries in elite youth field hockey players during a major international tournament. Standardized reporting forms detailing the time, location on the pitch, mechanism and anatomical location of injury were completed for new musculoskeletal conditions that resulted in a time stoppage by the umpire, and for cases where a player was noticeably affected by an injury for up to 20 s regardless of time stoppage. Injury incidence was 1.35 and 2.20 injuries/match, or 53 and 86 injuries per 1000 player match hours, for boys (B) and girls (G) respectively; girls were over three times more likely to sustain a minor injury. Most injuries were contusions caused by being hit by the ball or stick (B: 12, G: 27), with high numbers of injuries to the torso (B: 8) and head/face (G: 7). Injuries during the penalty corner (B: 3, G: 4) were to the lower limb and hand, and boys were less likely to wear facial protection (B: 65.9%, G: 86.4%). These results form an essential initial dataset of injuries in U18 field hockey players. Current reporting protocols under-report injuries, which must be addressed by the international governing body. The high number of head/face injuries, particularly in girls, requires further investigation.
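The two incidence measures quoted above are simple exposure-normalized rates. The following minimal sketch shows how they are derived; the match count, players on the pitch, and match duration are illustrative assumptions inferred for the example, not values taken from the study, although they happen to approximately reproduce the boys' reported figures.

```python
# Sketch: converting raw injury counts into the two incidence measures
# reported above. All inputs are illustrative assumptions, not study data.

def injuries_per_match(injuries: int, matches: int) -> float:
    """Injuries per match played in the tournament."""
    return injuries / matches

def injuries_per_1000_player_hours(injuries: int, matches: int,
                                   players_per_match: int,
                                   match_hours: float) -> float:
    """Injuries per 1000 player match hours of exposure."""
    exposure_hours = matches * players_per_match * match_hours
    return injuries / exposure_hours * 1000

# Hypothetical example: 27 injuries over 20 matches, 22 players on the
# pitch, 70-minute matches.
print(injuries_per_match(27, 20))                          # 1.35
print(injuries_per_1000_player_hours(27, 20, 22, 70 / 60)) # ~52.6
```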
Introduction: The clinical management of breech presentation at term remains controversial among clinicians. Clear predictive criteria for planned vaginal breech deliveries are urgently needed to prevent adverse fetal and maternal outcomes and to reduce elective cesarean section rates. The Green-top Guideline considers an estimated birth weight of 3.8 kg or more an indication to plan a cesarean section, despite the lack of supporting evidence.
Objective: To compare the maternal and neonatal outcomes of intended vaginal breech deliveries of newborns with a birth weight of 2.5–3.79 kg versus newborns with a birth weight of 3.8 kg or more.
Design: Prospective cohort study.
Sample: All intended vaginal deliveries from a breech position of newborns weighing between 2.5 kg and 4.5 kg at the Department of Obstetrics, Goethe University Hospital Frankfurt, from January 2004 until December 2016.
Methods: Neonatal and maternal outcomes of a light weight group (LWG, < 3.8 kg) were compared with those of a high weight group (HWG, ≥ 3.8 kg) using Pearson's chi-squared test and Fisher's exact test. A logistic regression analysis was performed to detect associations between cesarean section rates, fetal outcome and birth weight.
Results: No difference in neonatal morbidity was detected between the HWG (1.8%, n = 166) and the LWG (2.6%, n = 888). The cesarean section rate was significantly higher in the HWG (45.2%) than in the LWG (28.8%), with an odds ratio of 1.57 (95% CI 1.29–1.91, p < 0.0001). In vaginal deliveries, a high birth weight was not associated with an increased risk of maternal birth injuries (LWG: 74.3%, HWG: 73.6%; p = 0.887; OR = 1.9 (95% CI 0.9–1.1)).
Conclusion: A fetal weight above 3.79 kg does not predict increased maternal or infant morbidity after delivery from a breech presentation at term. Neither the literature nor our analyses provide evidence for a threshold of estimated birth weight that is associated with maternal and/or infant morbidity. However, patients should be informed about the increased likelihood of a cesarean section during labor when attempting vaginal birth from a breech position at term, in order to reach an informed, shared decision concerning the birth strategy. Further investigations in multicenter settings are needed to advance international guidelines on vaginal breech deliveries with respect to estimated birth weight and its impact on perinatal outcome.
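For readers unfamiliar with the statistic, the sketch below shows how a crude odds ratio and its 95% Wald confidence interval are computed from a 2x2 table. The group sizes follow the abstract (HWG n = 166, LWG n = 888), but the event counts are back-calculated from the reported percentages and rounded, and this crude calculation will not necessarily reproduce the model-based OR of 1.57 reported above.

```python
# Sketch: crude odds ratio for cesarean section (HWG vs. LWG) from a
# 2x2 table, with a 95% Wald confidence interval.
import math

def odds_ratio(events_a: int, n_a: int, events_b: int, n_b: int):
    """Crude OR of group A vs. group B with a 95% Wald CI."""
    a, b = events_a, n_a - events_a          # group A: events / non-events
    c, d = events_b, n_b - events_b          # group B: events / non-events
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Back-calculated, rounded event counts: 45.2% of 166 and 28.8% of 888.
print(odds_ratio(75, 166, 256, 888))
```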
Cancer metabolism is characterized by extensive glucose consumption through aerobic glycolysis. No effective therapy exploiting this cancer trait has emerged so far, in part due to the substantial side effects of the investigated drugs. In this study, we examined the side effects of a combination of an isocaloric ketogenic diet (KD) with the glycolysis inhibitor 2-deoxyglucose (2-DG). Two groups of eight athymic nude mice were fed either a standard diet (SD) or a calorically unrestricted KD with a ratio of 4 g fat to 1 g protein/carbohydrate. 2-DG was investigated at commonly employed doses of 0.5 to 4 g/kg and up to 8 g/kg. Ketosis was achieved under the KD (ketone bodies: SD 0.5 ± 0.14 mmol/L, KD 1.38 ± 0.28 mmol/L, p < 0.01). The intraperitoneal application of 4 g/kg of 2-DG caused a significant increase in blood glucose, which was not prevented by the KD. Sedation after 2-DG treatment was observed, and a behavioral test of spontaneous motion showed that the KD reduced the sedation caused by 2-DG (p < 0.001). A 2-DG dose escalation to 8 g/kg was lethal for 50% of the mice in the SD group and for 0% of the mice in the KD group (p < 0.01). A long-term combination of the KD with oral 2-DG at 1 or 2 g/kg was well tolerated. In conclusion, the KD reduces the sedative effects of 2-DG and dramatically increases the maximum tolerated dose of 2-DG. A continued combination of KD and anti-glycolytic therapy is feasible. This is, to our knowledge, the first demonstration of increased tolerance to glycolysis inhibition under a KD.
Background: Ever since it was discovered that zoophilic vectors can transmit malaria, zooprophylaxis has been used to prevent the disease. However, zoopotentiation has also been observed. The presence of livestock has thus been widely accepted as an important variable for the prevalence and risk of malaria, but the effectiveness of zooprophylaxis has remained subject to debate. This study aims to critically analyse the effects of the presence of livestock on malaria prevalence using a large dataset from Indonesia.
Methods: This study is based on data from the Indonesia Basic Health Research ("Riskesdas") cross-sectional survey of 2007, organized by the National Institute of Health Research and Development of Indonesia's Ministry of Health. The subset of data used in the present study included 259,885 research participants residing in the rural areas of 176 regencies throughout the 15 provinces of Indonesia where the prevalence of malaria is higher than the national average. The variable "existence of livestock" and other independent demographic, social and behavioural variables were tested as potential determinants of malaria prevalence by multivariate logistic regression.
Results: Raising medium-sized animals in the house was a significant predictor of malaria prevalence (OR = 2.980; 95% CI 2.348–3.782, P < 0.001) when compared to keeping such animals outside of the house (OR = 1.713; 95% CI 1.515–1.937, P < 0.001). After adjusting for gender, age, access to community health facility, sewage canal condition, use of mosquito nets and insecticide-treated bed nets, the participants who raised medium-sized animals inside their homes were 2.8 times more likely to contract malaria than respondents who did not (adjusted odds ratio = 2.809; 95% CI 2.207–3.575; P < 0.001).
Conclusions: The results of this study highlight the importance of livestock for malaria transmission, suggesting that keeping livestock in the house contributes to malaria risk rather than prophylaxis in Indonesia. Livestock-based interventions should therefore play a significant role in the implementation of malaria control programmes, and focus on households with a high proportion of medium-sized animals in rural areas. The implementation of a "One Health" strategy to eliminate malaria in Indonesia by 2030 is strongly recommended.
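The adjusted odds ratio quoted in the Results is obtained by exponentiating the coefficients of a multivariate logistic regression. A minimal sketch of that workflow follows; the dataset, variable names and effect sizes are synthetic placeholders, not Riskesdas data.

```python
# Sketch: deriving adjusted odds ratios by exponentiating logistic
# regression coefficients. All data below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "livestock_in_house": rng.integers(0, 2, n),   # exposure of interest
    "bed_net_use":        rng.integers(0, 2, n),   # example covariate
    "age":                rng.integers(1, 80, n),  # example covariate
})
# Synthetic outcome: higher log-odds of malaria with in-house livestock.
log_odds = -3 + 1.0 * df["livestock_in_house"] - 0.4 * df["bed_net_use"]
df["malaria"] = (rng.random(n) < 1 / (1 + np.exp(-log_odds))).astype(int)

X = sm.add_constant(df[["livestock_in_house", "bed_net_use", "age"]])
fit = sm.Logit(df["malaria"], X).fit(disp=False)

# Adjusted ORs with 95% CIs: exponentiate coefficients and CI bounds.
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))
```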
This guideline of the German Dermatology Society primarily focuses on the diagnosis and treatment of cutaneous manifestations of Lyme borreliosis. It has received consensus from 22 German medical societies and 2 German patient organisations. It is the first part of an AWMF (Arbeitsgemeinschaft der Wissenschaftlichen Medizinischen Fachgesellschaften e.V.) interdisciplinary guideline: "Lyme Borreliosis – Diagnosis and Treatment, development stage S3".
The guideline is directed at physicians in private practices and clinics who treat Lyme borreliosis. Objectives of this guideline are recommendations for confirming a clinical diagnosis, recommendations for a stage-related laboratory diagnosis (serological detection of IgM and IgG Borrelia antibodies using the 2-tiered ELISA/immunoblot process, sensible use of molecular diagnostic and culture procedures) and recommendations for the treatment of the localised, early-stage infection (erythema migrans, erythema chronicum migrans, and borrelial lymphocytoma), the disseminated early-stage infection (multiple erythemata migrantia, flu-like symptoms) and treatment of the late-stage infection (acrodermatitis chronica atrophicans with and without neurological manifestations). In addition, an information sheet for patients containing recommendations for the prevention of Lyme borreliosis is attached to the guideline.
Objective: Amyloid β (Aβ) depositions in plaques and cerebral amyloid angiopathy (CAA) represent common features of Alzheimer's disease (AD). Sequential deposition of post‐translationally modified Aβ in plaques characterizes distinct biochemical stages of Aβ maturation. However, the molecular composition of vascular Aβ deposits in CAA and its relation to plaques remain enigmatic.
Methods: Vascular and parenchymal deposits were immunohistochemically analyzed for pyroglutaminated and phosphorylated Aβ in the medial temporal and occipital lobe of 24 controls, 27 pathologically‐defined preclinical AD, and 20 symptomatic AD cases.
Results: Sequential deposition of Aβ in CAA resembled Aβ maturation in plaques and enabled the distinction of three biochemical stages of CAA. B‐CAA stage 1 was characterized by deposition of Aβ in the absence of pyroglutaminated AβN3pE and phosphorylated AβpS8. B‐CAA stage 2 showed additional AβN3pE and B‐CAA stage 3 additional AβpS8. Based on the Aβ maturation staging in CAA and plaques, three case groups for Aβ pathology could be distinguished: group 1 with advanced Aβ maturation in CAA; group 2 with equal Aβ maturation in CAA and plaques; group 3 with advanced Aβ maturation in plaques. All symptomatic AD cases presented with end‐stage plaque maturation, whereas CAA could exhibit immature Aβ deposits. Notably, Aβ pathology group 1 was associated with arterial hypertension, and group 2 with the development of dementia.
Interpretation: The balance of Aβ maturation in CAA and plaques defines distinct pathological subgroups of β‐amyloidosis. The association of CAA‐related Aβ maturation with cognitive decline, the individual contributions of CAA and plaque pathology to the development of dementia within the defined Aβ pathology subgroups, and the subgroup‐related association with arterial hypertension should be considered for differential diagnosis and therapeutic intervention.
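The three biochemical CAA stages described in the Results follow a simple presence/absence rule for the two modified Aβ species. As a paraphrase of that rule (not study code), it can be written as follows; marker flags indicate immunohistochemical detection.

```python
# Sketch: the B-CAA staging rule as described in the abstract.
# Stage 1: Abeta without AbetaN3pE and AbetapS8; stage 2: additional
# AbetaN3pE; stage 3: additional AbetapS8 (the most mature deposits).
def b_caa_stage(has_abeta: bool, has_n3pe: bool, has_ps8: bool) -> int | None:
    """Return the B-CAA stage (1-3), or None if no vascular Abeta deposits."""
    if not has_abeta:
        return None
    if has_ps8:
        return 3   # AbetapS8 detected (appears after AbetaN3pE)
    if has_n3pe:
        return 2   # AbetaN3pE detected, AbetapS8 absent
    return 1       # unmodified Abeta only

print(b_caa_stage(True, True, False))  # -> 2
```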
Background: Sputum induction is an important noninvasive method for analyzing bronchial inflammation in patients with asthma and other respiratory diseases. Most frequently, ultrasonic nebulizers are used for sputum induction, but breath-controlled nebulizers may target the small airways more efficiently. This approach may produce a cell distribution similar to that of bronchoalveolar lavage (fewer neutrophils and more macrophages) and provide deeper insights into the underlying lung pathology. The goal of the study was to compare both types of nebulizer device and their efficacy in inducing sputum for measuring bronchial inflammation, i.e., cell composition and cytokines, in patients with mild allergic asthma and healthy controls.
Methods: The study population consisted of 20 healthy control subjects (median age 17 years, range 8–25 years) and 20 patients with mild, controlled allergic asthma (median age 12 years, range 8–24 years) who were not receiving inhaled steroid treatment. We induced sputum in every individual using both devices on two separate days. Sputum weight and cell composition were determined, and cytokine levels were analyzed using a cytometric bead array (CBA) and real-time quantitative PCR (qRT-PCR).
Results: We did not observe significant differences in the weight, cell distribution or cytokine levels of the sputum samples induced by the two devices. In addition, Bland-Altman analysis revealed good concordance of the cell distributions. As expected, eosinophil counts and IL-5 levels were significantly elevated in patients with asthma.
Conclusions: The hypothesis that sputum induction with a breath-controlled "smart" nebulizer is more efficient than, and yields different results from, an ultrasonic nebulizer was not confirmed. Bland-Altman analysis showed good concordance between the two devices.
Trial registration: NCT01543516. Retrospectively registered on March 5, 2012.
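The Bland-Altman analysis mentioned in the Results quantifies agreement between paired measurements as a bias (mean difference) and 95% limits of agreement. A minimal sketch follows; the paired values are synthetic, not study data.

```python
# Sketch: Bland-Altman bias and 95% limits of agreement for paired
# measurements from the two nebulizer devices. Values are synthetic.
import numpy as np

ultrasonic = np.array([12.0, 35.0, 8.0, 20.0, 15.0, 28.0])  # e.g. % eosinophils
breath_ctl = np.array([14.0, 31.0, 9.0, 22.0, 13.0, 30.0])

diff = ultrasonic - breath_ctl
bias = diff.mean()                           # systematic difference
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement

print(f"bias = {bias:.2f}, limits of agreement = {loa[0]:.2f} to {loa[1]:.2f}")
```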
Background: In intensive care units (ICUs), octogenarians have become a routine patient group in whom therapeutic and diagnostic decision-making is particularly difficult. Because of increased mortality and reduced quality of life in this high-risk population, medical decision-making requires, a fortiori, optimal risk stratification. Recently, the VIP-1 trial prospectively observed that the clinical frailty scale (CFS) performed well in ICU patients in predicting overall survival and short-term outcome. However, the healthcare systems of the 21 countries contributing to the VIP-1 trial are known to differ. Hence, our main focus was to investigate whether the CFS is usable for risk stratification in octogenarians admitted to diversified and high-tech German ICUs.
Methods: This multicentre prospective cohort study analyses very old patients admitted to 20 German ICUs as a sub-analysis of the VIP-1 trial. Three hundred and eight patients aged 80 years or older were admitted consecutively to the participating ICUs. The CFS, cause of admission, APACHE II, SAPS II and SOFA scores, use of ICU resources, and ICU and 30-day mortality were recorded. Multivariate logistic regression analysis was used to identify factors associated with 30-day mortality.
Results: Patients had a median age of 84 years [IQR 82–87] and a mean CFS of 4.75 (± 1.6 standard deviation) points. More than half of the patients (53.6%) were classified as frail (CFS ≥ 5). ICU mortality was 17.3% and 30-day mortality was 31.2%. The cause of admission (planned vs. unplanned; OR 5.74) and the CFS (OR 1.44 per point increase) were independent predictors of 30-day survival.
Conclusions: The CFS is an easily determined, valuable tool for predicting 30-day survival in octogenarians and may thus facilitate decision-making for intensive care givers in Germany.
Trial registration: The VIP-1 study was retrospectively registered on ClinicalTrials.gov (ID: NCT03134807) on May 1, 2017.
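To illustrate how the two reported odds ratios act on the log-odds scale, the sketch below converts them into predicted 30-day risks. Only the two ORs come from the abstract; the intercept (baseline log-odds) is a purely hypothetical assumption, so the absolute risks shown are illustrative only.

```python
# Sketch: combining the reported odds ratios on the log-odds scale to
# predict a 30-day outcome probability. The intercept is an illustrative
# assumption; only the two ORs come from the abstract.
import math

OR_CFS = 1.44        # per one-point increase in CFS
OR_UNPLANNED = 5.74  # unplanned vs. planned admission
INTERCEPT = -4.0     # hypothetical baseline log-odds

def predicted_risk(cfs: int, unplanned: bool) -> float:
    log_odds = (INTERCEPT
                + math.log(OR_CFS) * cfs
                + math.log(OR_UNPLANNED) * int(unplanned))
    return 1 / (1 + math.exp(-log_odds))

for cfs in (3, 5, 7):
    print(cfs, round(predicted_risk(cfs, unplanned=True), 3))
```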
Background: Antidepressant medication is commonly used to treat depression. However, many patients do not respond to the first medication prescribed and improvements in symptoms are generally only detectable by clinicians 4–6 weeks after the medication has been initiated. As a result, there is often a long delay between the decision to initiate an antidepressant medication and the identification of an effective treatment regimen.
Previous work has demonstrated that antidepressant medications alter subtle measures of affective cognition in depressed patients, such as the appraisal of facial expression. Furthermore, these cognitive effects of antidepressants are apparent early in the course of treatment and can also predict later clinical response. This trial will assess whether an electronic test of affective cognition and symptoms (the Predicting Response to Depression Treatment Test; PReDicT Test) can be used to guide antidepressant treatment in depressed patients and, therefore, hasten treatment response compared to a control group of patients treated as usual.
Methods/design: The study is a randomised, two-arm, multi-centre, open-label, clinical investigation of a medical device, the PReDicT Test. It will be conducted in five European countries (UK, France, Spain, Germany and the Netherlands) in depressed patients who are commencing antidepressant medication. Patients will be randomised to treatment guided by the PReDicT Test (PReDicT arm) or to Treatment as Usual (TaU arm). Patients in the TaU arm will be treated as per current standard guidelines in their particular country. Patients in the PReDicT arm will complete the PReDicT Test after 1 (and if necessary, 2) weeks of treatment. If the test indicates non-response to the treatment, physicians will be advised to immediately alter the patient’s antidepressant therapy by dose escalation or switching to another compound. The primary outcome of the study is the proportion of patients showing a clinical response (defined as 50% or greater decrease in baseline scores of depression measured using the Quick Inventory of Depressive Symptoms – Self-Rated questionnaire) at week 8. Health economic and acceptability data will also be collected and analysed.
Discussion: This trial will test the clinical efficacy, cost-effectiveness and acceptability of using the novel PReDicT Test to guide antidepressant treatment selection in depressed patients.
Trial registration: ClinicalTrials.gov, ID: NCT02790970. Registered on 30 March 2016.
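The primary outcome defined in the Methods (a decrease of 50% or more from the baseline QIDS-SR score at week 8) is a simple threshold rule, sketched below with illustrative scores.

```python
# Sketch: the trial's primary-outcome response criterion, expressed as a
# simple check. The scores below are illustrative.
def is_responder(baseline_score: float, week8_score: float) -> bool:
    """True if the week-8 QIDS-SR score dropped >= 50% from baseline."""
    return week8_score <= 0.5 * baseline_score

print(is_responder(18, 8))   # -> True  (55.6% decrease)
print(is_responder(18, 10))  # -> False (44.4% decrease)
```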
Background: Reducing examination time and contrast agent dose are important goals for cost-efficient cardiovascular magnetic resonance (CMR) imaging. Limited information is available regarding the feasibility of evaluating left ventricular (LV) function after gadobutrol injection and regarding the lowest dose for high-quality scar imaging. We sought to evaluate both aspects separately and systematically in order to provide an optimized protocol for contrast-enhanced CMR (CE-CMR) using gadobutrol.
Methods: This is a prospective, randomized, single-blind, cross-over study performed in two different populations. The first population consisted of 30 patients with general indications for a rest CE-CMR who underwent cine-imaging before and immediately after intravenous administration of 0.1 mmol/kg body weight of gadobutrol. Quantitative assessment of LV volumes and function was performed by the same reader in a randomized and blinded fashion. The second population was composed of 30 patients with an indication for late gadolinium enhancement (LGE) imaging, which was performed twice at different gadobutrol doses (0.1 mmol/kg vs. 0.2 mmol/kg) and at different time delays (5 and 10 min vs. 5, 10, 15 and 20 min), within a maximal interval of 21 days. LGE images were analysed qualitatively (contrast-to-noise ratio) and quantitatively (LGE% of mass).
Results: Excellent correlation between pre- and post-contrast cine-imaging was found, with no difference in LV stroke volume and ejection fraction (p = 0.538 and p = 0.095, respectively). End-diastolic and end-systolic volumes were significantly larger after contrast injection (p = 0.008 and p = 0.001, respectively), with mean differences of 3.7 ml and 2.9 ml, respectively. LGE imaging yielded optimal contrast-to-noise ratios 10 min post-injection for a gadobutrol dose of 0.1 mmol/kg body weight and 20 min post-injection for a dose of 0.2 mmol/kg body weight. At these time points, LGE quantification did not differ significantly (0.1 mmol/kg: 11% (16.4); 0.2 mmol/kg: 12% (14.5); p = 0.059) and showed excellent correlation (ICC = 0.957; p < 0.001).
Conclusion: A standardized CE-CMR rest protocol administering 0.1 mmol/kg of gadobutrol before cine-imaging and performing LGE imaging 10 min after injection represents a fast, low-dose protocol without significant loss of information compared with a longer protocol using cine-imaging before contrast injection and a higher gadobutrol dose. This approach reduces examination time and costs and minimizes contrast agent exposure.
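The abstract does not state which contrast-to-noise ratio (CNR) formula was used; a common definition for LGE images divides the scar-to-myocardium signal difference by the noise standard deviation. The sketch below uses that assumed definition with synthetic region-of-interest values.

```python
# Sketch: one common CNR definition for LGE images: signal difference
# between enhanced scar and remote myocardium, divided by the noise
# standard deviation from a background (air) region. Values are synthetic.
import numpy as np

scar_roi = np.array([410.0, 395.0, 422.0, 405.0])        # signal in scar ROI
myocardium_roi = np.array([120.0, 131.0, 118.0, 125.0])  # remote myocardium ROI
air_roi = np.array([9.0, 12.0, 10.0, 11.0, 8.0])         # background for noise

cnr = (scar_roi.mean() - myocardium_roi.mean()) / air_roi.std(ddof=1)
print(f"CNR = {cnr:.1f}")
```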