Medizin
As the current SARS-CoV-2 pandemic continues, serological assays are urgently needed for rapid diagnosis, contact tracing and epidemiological studies. So far, there are few data on how commercially available tests perform with real patient samples and whether detected IgG antibodies provide protective immunity. Focusing on IgG antibodies, we demonstrate the performance of two ELISA assays (Euroimmun SARS-CoV-2 IgG and Vircell COVID-19 ELISA IgG) in comparison to one lateral flow assay (LFA; FaStep COVID-19 IgG/IgM Rapid Test Device) and two in-house developed assays (an immunofluorescence assay (IFA) and a plaque reduction neutralization test (PRNT)). We tested follow-up serum/plasma samples of individuals PCR-diagnosed with COVID-19. Most of the SARS-CoV-2 samples were from individuals with a moderate to severe clinical course who required an in-patient hospital stay.
For all examined assays, the sensitivity ranged from 58.8 to 76.5% for the early phase of infection (days 5-9) and from 93.8 to 100% for the later period (days 10-18) after PCR diagnosis of COVID-19. With the exception of one sample, all samples that tested positive in the analysed cohort using the commercially available assays examined (including the in-house developed IFA) demonstrated neutralizing (protective) properties in the PRNT, indicating a potential protective immunity to SARS-CoV-2. Regarding specificity, there was evidence that samples from individuals infected with endemic coronaviruses (HCoV-OC43, HCoV-229E) and Epstein-Barr virus (EBV) cross-reacted in the ELISA assays and IFA, in one case generating a false positive result (possibly giving a false sense of security). This needs to be further investigated.
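Sensitivity figures like those above follow from the standard TP/(TP + FN) definition; a minimal sketch with illustrative counts (not the cohort's actual data):

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of PCR-confirmed cases that the assay flags positive."""
    return true_pos / (true_pos + false_neg)

# Illustrative counts only: 10 of 17 early-phase samples positive,
# 15 of 16 late-phase samples positive.
early = sensitivity(10, 7)
late = sensitivity(15, 1)
print(f"early: {early:.1%}, late: {late:.1%}")
```

With these invented counts the sketch reproduces the lower bounds of the reported ranges (58.8% early, 93.8% late).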
The incidence of FIX inhibitors in severe hemophilia B (SHB) is not well defined. Frequencies of 3-5% have been reported, but most studies to date were small, included patients with different severities, and lacked prospective follow-up for inhibitor incidence. The study objective was to investigate inhibitor incidence in patients with SHB followed up to 500 exposure days (ED), the frequency of allergic reactions, and the relationship with genotypes. Consecutive previously untreated patients (PUPs) with SHB enrolled into the PedNet cohort were included. Detailed data were collected for the first 50 ED, followed by annual collection of inhibitor status and allergic reactions. Presence of inhibitors was defined by at least two consecutive positive samples. Additionally, data on factor IX gene mutations were collected. 154 PUPs with SHB were included; 75% were followed until 75 ED and 43% until 500 ED. Inhibitors developed in 14 patients (7 high-titre). The median number of ED at inhibitor manifestation was 11 (IQR 6.5-36.5). Cumulative inhibitor incidence was 9.3% (95% CI 4.4-14.1) at 75 ED and 10.2% (5.1-15.3) at 500 ED. Allergic reactions occurred in 4 (28.6%) inhibitor patients. Missense mutations were most frequent overall (46.8%) but not associated with inhibitors. Nonsense mutations and deletions with large structural changes comprised all mutations among inhibitor patients and were associated with inhibitor risks of 26.9% and 33.3%, respectively. In an unselected, well-defined cohort of PUPs with SHB, the cumulative inhibitor incidence was 10.2% at 500 ED. Nonsense mutations and large deletions were strongly associated with the risk of inhibitor development. The PedNet Registry is registered at clinicaltrials.gov (identifier: NCT02979119).
Introduction: In recent years, resource-saving handling of allogeneic blood products and a reduction of transfusion rates in adults has been observed. However, comparable published national data for transfusion practices in pediatric patients are currently not available. In this study, the transfusion rates for children and adolescents were analyzed based on data from the Federal Statistical Office of Germany during the past 2 decades. Methods: Data were queried via the database of the Federal Statistical Office (Destasis). The period covered was from 2005 to 2018, and those in the sample group were children and adolescents aged 0–17 years receiving inpatient care. Operation and procedure codes (OPS) for transfusions, procedures, or interventions with increased transfusion risk were queried and evaluated in detail. Results: In Germany, 0.9% of the children and adolescents treated in hospital received a transfusion in 2018. A reduction in transfusion rates from 1.02% (2005) to 0.9% (2018) was observed for the total collective of children and adolescents receiving inpatient care. Increases in transfusion rates were recorded for 1- to 4- (1.41–1.45%) and 5- to 10-year-olds (1.24–1.33%). Children under 1 year of age were most frequently transfused (in 2018, 40.2% of the children were cared for in hospital). Transfusion-associated procedures such as chemotherapy or machine ventilation and respiratory support for newborns and infants are on the rise. Conclusion: Transfusion rates are declining in children and adolescents, but the reasons for increases in transfusion rates in other groups are unclear. Prospective studies to evaluate transfusion rates and triggers in children are urgently needed.
Background: Plasma transfusions are most commonly used therapeutically for bleeding or prophylactically in non-bleeding patients prior to invasive procedures or surgery. Although plasma transfusions generally seem to decline, plasma usage for indications that lack evidence of efficacy prevail. Summary: There is wide international, interinstitutional, and interindividual variance regarding the compliance with guidelines based on published references, supported by appropriate testing. There is furthermore a profound lack of evidence from randomized controlled trials comparing the effect of plasma transfusion with that of other therapeutic interventions for most indications, including massive bleeding. The expected benefit of a plasma transfusion needs to be balanced carefully against the associated risk of adverse events. In light of the heterogeneous nature of bleeding conditions and their rapid evolvement over time, fibrinogen and factor concentrate therapy, directed at specific phases of coagulation identified by alternative laboratory assays, may offer advantages over conventional blood product ratio-driven resuscitation. However, their outcome benefit has not been demonstrated in well-powered prospective trials. This systematic review will detail the current evidence base for plasma transfusion in adult surgical patients.
Management of decompensated cirrhosis is currently geared towards the treatment of complications once they occur. To date there is no established disease-modifying therapy aimed at halting progression of the disease and preventing the development of complications in patients with decompensated cirrhosis. The design of clinical trials to investigate new therapies for patients with decompensated cirrhosis is complex. The population of patients with decompensated cirrhosis is heterogeneous (i.e., different etiologies, comorbidities and disease severity), leading to the inclusion of diverse populations in clinical trials. In addition, primary endpoints selected for trials that include patients with decompensated cirrhosis are not homogeneous and at times may not be appropriate. This leads to difficulties in comparing results obtained from different trials. Against this background, the LiverHope Consortium organized a meeting of experts, the goal of which was to develop recommendations for the design of clinical trials and to define appropriate endpoints, both for trials aimed at modifying the natural history and preventing progression of decompensated cirrhosis, as well as for trials aimed at managing the individual complications of cirrhosis.
Objectives: An increasing number of treatment-determining biomarkers has been identified in non-small cell lung cancer (NSCLC) and molecular testing is recommended to enable optimal individualized treatment. However, data on implementation of these recommendations in the “real-world” setting are scarce. This study presents comprehensive details on the frequency, methodology and results of biomarker testing of advanced NSCLC in Germany.
Patients and methods: This analysis included 3,717 patients with advanced NSCLC (2,921 non-squamous; 796 squamous), recruited into the CRISP registry at start of systemic therapy by 150 German sites between December 2015 and June 2019. Evaluated were the molecular biomarkers EGFR, ALK, ROS1, BRAF, KRAS, MET, TP53, RET, HER2, as well as expression of PD-L1.
Results: In total, 90.5% of the patients were tested for biomarkers. Testing rates were 92.2% (non-squamous) and 70.7% (squamous) and increased from 83.2% in 2015/16 to 94.2% in 2019. Overall testing rates for EGFR, ALK, ROS1, and BRAF were 72.5%, 74.5%, 66.1%, and 53.0%, respectively (non-squamous). Testing rates for PD-L1 expression were 64.5% (non-squamous) and 58.5% (squamous). The most common testing methods were immunohistochemistry (68.5% non-squamous, 58.3% squamous) and next-generation sequencing (38.7% non-squamous, 14.4% squamous). Reasons for not testing were insufficient tumor material or lack of guideline recommendations (squamous). No alteration was found in 37.8% (non-squamous) and 57.9% (squamous), respectively. Most common alterations in non-squamous tumors (all patients/all patients tested for the respective biomarker): KRAS (17.3%/39.2%), TP53 (14.1%/51.4%), and EGFR (11.0%/15.1%); in squamous tumors: TP53 (7.0%/69.1%), MET (1.5%/11.1%), and EGFR (1.1%/4.4%). Median PFS (non-squamous) was 8.7 months (95% CI 7.4–10.4) with druggable EGFR mutations and 8.0 months (95% CI 3.9–9.2) with druggable ALK alterations.
Conclusion: Testing rates in Germany are high nationwide and acceptable in international comparison, but still leave out a significant portion of patients, who could potentially benefit. Thus, specific measures are needed to increase implementation.
Human lymph nodes play a central part in immune defense against infectious agents and tumor cells. Lymphoid follicles are spherical compartments of the lymph node, mainly filled with B cells. B cells are cellular components of the adaptive immune system. In the course of a specific immune response, lymphoid follicles pass through different morphological differentiation stages. The morphology and spatial distribution of lymphoid follicles can sometimes be associated with a particular causative agent and the developmental stage of a disease. We report a new approach for the automatic detection of follicular regions in histological whole slide images of tissue sections immunostained for actin. The method is divided into two phases: (1) shock filter-based detection of transition points and (2) segmentation of follicular regions. Follicular regions in 10 whole slide images were manually annotated by visual inspection, and sample surveys were conducted by an expert pathologist. The results of our method were validated by comparison with the manual annotation. On average, we achieved a Zijdenbos similarity index of 0.71, with a standard deviation of 0.07.
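The reported similarity index (commonly spelled Zijdenbos; equivalent to the Dice coefficient) compares two binary segmentation masks; a minimal sketch with toy masks (not the study's slides):

```python
import numpy as np

def zijdenbos_index(a: np.ndarray, b: np.ndarray) -> float:
    """Zijdenbos similarity index for two binary masks:
    2 * |A & B| / (|A| + |B|), i.e. the Dice coefficient."""
    a, b = a.astype(bool), b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

# Toy 4x4 masks standing in for manual vs. automatic follicle segmentations
manual = np.array([[0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
auto   = np.array([[0, 1, 0, 0], [0, 1, 1, 0], [0, 0, 1, 0], [0, 0, 0, 0]])
print(zijdenbos_index(manual, auto))  # 0.75
```

An index of 1.0 would mean perfect overlap with the manual annotation; the study's average of 0.71 indicates substantial but imperfect agreement.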
Mental imagery provides an essential simulation tool for remembering the past and planning the future, with its strength affecting both cognition and mental health. Research suggests that neural activity spanning prefrontal, parietal, temporal, and visual areas supports the generation of mental images. Exactly how this network controls the strength of visual imagery remains unknown. Here, brain imaging and transcranial magnetic phosphene data show that lower resting activity and excitability levels in early visual cortex (V1-V3) predict stronger sensory imagery. Further, electrically decreasing visual cortex excitability using tDCS increases imagery strength, demonstrating a causative role of visual cortex excitability in controlling visual imagery. Together, these data suggest a neurophysiological mechanism of cortical excitability involved in controlling the strength of mental images.
Background: Berotralstat (BCX7353) is an oral, once-daily inhibitor of plasma kallikrein in development for the prophylaxis of hereditary angioedema (HAE) attacks.
Objective: Our aim was to determine the efficacy, safety, and tolerability of berotralstat in patients with HAE over a 24-week treatment period (the phase 3 APeX-2 trial).
Methods: APeX-2 was a double-blind, parallel-group study that randomized patients at 40 sites in 11 countries 1:1:1 to receive once-daily berotralstat in a dose of 110 mg or 150 mg or placebo (Clinicaltrials.gov identifier NCT03485911). Patients aged 12 years or older with HAE due to C1 inhibitor deficiency and at least 2 investigator-confirmed HAE attacks in the first 56 days of a prospective run-in period were eligible. The primary efficacy end point was the rate of investigator-confirmed HAE attacks during the 24-week treatment period.
Results: A total of 121 patients were randomized; 120 of them received at least 1 dose of the study drug (n = 41, 40, and 39 in the 110-mg berotralstat, 150-mg berotralstat, and placebo groups, respectively). Berotralstat demonstrated a significant reduction in attack rate at both 110 mg (1.65 attacks per month; P = .024) and 150 mg (1.31 attacks per month; P < .001) relative to placebo (2.35 attacks per month). The most frequent treatment-emergent adverse events that occurred more with berotralstat than with placebo were abdominal pain, vomiting, diarrhea, and back pain. No drug-related serious treatment-emergent adverse events occurred.
Conclusion: Both the 110-mg and 150-mg doses of berotralstat reduced HAE attack rates compared with placebo and were safe and generally well tolerated. The most favorable benefit-to-risk profile was observed at a dose of 150 mg per day.
Acute kidney injury (AKI) is still associated with high morbidity and mortality incidence rates, and also bears an elevated risk of subsequent chronic kidney disease. Although the kidney has a remarkable capacity for regeneration after injury and may recover completely depending on the type of renal lesions, the options for clinical intervention are restricted to fluid management and extracorporeal kidney support. The development of novel therapies to prevent AKI, to improve renal regeneration capacity after AKI, and to preserve renal function is urgently needed. The Special Issue covers research articles that investigated the molecular mechanisms of inflammation and injury during different renal pathologies, renal regeneration, diagnostics using new biomarkers, and the effects of different stimuli like medication or bacterial components on isolated renal cells or in vivo models. The Special Issue contains important reviews that consider the current knowledge of cell death and regeneration, inflammation, and the molecular mechanisms of kidney diseases. In addition, the potential of cell-based therapy approaches that use mesenchymal stromal/stem cells or their derivates is summarized. This edition is complemented by reviews that deal with the current data situation on other specific topics like diabetes and diabetic nephropathy or new therapeutic targets.
The in vivo firing patterns of ventral midbrain dopamine neurons are controlled by afferent and intrinsic activity to generate sensory cue and prediction error signals that are essential for reward-based learning. Given the absence of in vivo intracellular recordings during the last three decades, the subthreshold membrane potential events that cause changes in dopamine neuron firing patterns remain unknown. To address this, we established in vivo whole-cell recordings and obtained over 100 spontaneously active, immunocytochemically-defined midbrain dopamine neurons in isoflurane-anaesthetized adult mice. We identified a repertoire of subthreshold membrane potential signatures associated with distinct in vivo firing patterns. Dopamine neuron activity in vivo deviated from single-spike pacemaking by phasic increases in firing rate via two qualitatively distinct biophysical mechanisms: 1) a prolonged hyperpolarization preceding rebound bursts, accompanied by a hyperpolarizing shift in action potential threshold; and 2) a transient depolarization leading to high-frequency plateau bursts, associated with a depolarizing shift in action potential threshold. Our findings define a mechanistic framework for the biophysical implementation of dopamine neuron firing patterns in the intact brain.
Purpose of Review: To review the latest developments and the current role of the cardiac magnetic resonance (CMR) in pericardial diseases and their complications.
Recent Findings: Cardiac magnetic resonance (CMR) has the ability to incorporate anatomy, physiology, and “virtual histology” strategies to achieve the most accurate diagnosis for even the most demanding pericardial diseases.
Summary: Acute, chronic, recurrent, and constrictive pericarditis, as well as pericarditis-related complications, pericardial masses and congenital pericardial defects, are commonly encountered in clinical practice with relatively significant morbidity and mortality. Owing to the challenging diagnosis, CMR imaging is often employed to confirm the diagnosis and elucidate the underlying pathophysiology. In this review, we outline the common CMR techniques and their expected diagnostic outcomes.
Determination of ACE in serum or heparin plasma is an essential component of the diagnosis, follow-up and therapy monitoring of benign lung diseases. ACE is a marker that provides valuable diagnostic information in sarcoidosis, where it is distinguished by high sensitivity and specificity.
Purpose: In this study, we examined distress levels and quality of life (QoL) of patients with hematologic malignancies under treatment in an acute setting. We used external- and self-assessment instruments for distress. Additionally, we investigated the relation between distress and QoL as well as whether highly distressed patients differed from less distressed patients concerning their QoL.
Methods: A cross-sectional study with patients of the Medical Clinic II of the University Hospital Frankfurt was conducted. One hundred and nine patients were assessed with an expert rating scale and completed self-report questionnaires. Data were exploratively analyzed and group comparisons between patients who scored above the cut-off of the respective screening instruments and those who did not were conducted.
Results: Patients with hematologic malignancies experience high levels of distress and low QoL. Especially, role and social functioning are affected. Patients suffer most from fatigue, appetite loss, and insomnia. Using established cut-offs, all screening instruments were able to differentiate between patients regarding distress and QoL. Patients scoring above the cut-off were significantly more distressed and had a lower QoL. There was a medium-to-strong correlation between distress and QoL indicators.
Conclusion: Cancer-specific screening instruments seem to be able to identify treatment needs more specifically. They also allowed a better differentiation concerning QoL. The close link between distress and QoL needs to be recognized to enable a holistic approach to treatment and thereby optimize the quality of treatment.
Purpose: The coronavirus disease 2019 (COVID-19) poses major challenges to health-care systems worldwide. This pandemic demonstrates the importance of timely access to intensive care and, therefore, this study aims to explore the accessibility of intensive care beds in 14 European countries and its impact on the COVID-19 case fatality ratio (CFR).
Methods: We examined access to intensive care beds by deriving (1) a regional ratio of intensive care beds to 100,000 population capita (accessibility index, AI) and (2) the distance to the closest intensive care unit. The cross-sectional analysis was performed at a 5-by-5 km spatial resolution and results were summarized nationally for 14 European countries. The relationship between AI and CFR was analyzed at the regional level.
Results: We found national-level differences in the levels of access to intensive care beds. The AI was highest in Germany (AI = 35.3), followed by Estonia (AI = 33.5) and Austria (AI = 26.4), and lowest in Sweden (AI = 5) and Denmark (AI = 6.4). The average travel distance to the closest hospital was highest in Croatia (25.3 min by car) and lowest in Luxembourg (9.1 min). Subnational results illustrate that capacity was associated with population density and national-level inventories. The correlation analysis revealed a negative correlation of ICU accessibility and COVID-19 CFR (r = − 0.57; p < 0.001).
Conclusion: Geographical access to intensive care beds varies significantly across European countries and low ICU accessibility was associated with a higher proportion of COVID-19 deaths to cases (CFR). Important differences in access are due to the sizes of national resource inventories and the distribution of health-care facilities relative to the human population. Our findings provide a resource for officials planning public health responses beyond the current COVID-19 pandemic, such as identifying potential locations suitable for temporary facilities or establishing logistical plans for moving severely ill patients to facilities with available beds.
Improving spatial accessibility to hospitals is a major task for health care systems, which can be facilitated using recent methodological improvements of spatial accessibility measures. We used the integrated floating catchment area (iFCA) method to analyze the spatial accessibility of general inpatient care (internal medicine, surgery and neurology) at the national level in Germany, determining an accessibility index (AI) by integrating distances, hospital beds and morbidity data. The analysis of 358 million distances between hospitals and population locations revealed clusters of lower accessibility indices in areas in north-east Germany. There was a correlation of urbanity and accessibility of up to r = 0.31 (p < 0.001). Furthermore, 10% of the population lived in areas with significant clusters of low spatial accessibility for internal medicine and surgery (neurology: 20%). The analysis revealed the highest accessibility for heart failure (AI = 7.33) and the lowest for stroke (AI = 0.69). The method applied proved to reveal important aspects of spatial accessibility, i.e., geographic variations that need to be addressed. However, for the majority of the German population, accessibility of general inpatient care was either high or at least not significantly low, which suggests a rather adequate allocation of hospital resources for most parts of Germany.
This focus issue of the European Journal of Trauma and Emergency Surgery compiles a collection of outstanding clinical research using the immense dataset of the German TraumaRegister DGU® (TR-DGU). The TR-DGU of the German Trauma Society (Deutsche Gesellschaft für Unfallchirurgie, DGU) was founded in 1993. Currently, approximately 40,000 cases from more than 600 hospitals are entered into the database every year. The selected articles of this focus issue highlight the immense value the TR-DGU constitutes for current as well as future trauma research.
Objective: The 2019 data of the “Abdominelles Aortenaneurysma” (AAA) registry of the Deutsches Institut für Gefäßmedizinische Gesundheitsforschung (DIGG) of the Deutsche Gesellschaft für Gefäßchirurgie und Gefäßmedizin are presented.
Methods: In 2019, a total of 109 hospitals participated in the registry. For open repair (OR) of intact AAA (iAAA), 78 hospitals (71.6%) entered data; for endovascular repair (EVAR) of iAAA, 102 hospitals (93.6%). For ruptured AAA (rAAA), patients were reported by 36 hospitals (33.0%) for EVAR and 50 hospitals (45.9%) for OR. Data from 1,967 inpatients were analyzed. Of the 1,793 iAAA in total, 1,501 (83.7%) were located infrarenally and 292 (16.3%) juxtarenally.
Results: 1,429 iAAA (79.7%) were treated endovascularly and 364 (20.3%) by open repair. Among endovascularly treated patients with iAAA, the procedure was free of complications in 86.3% of cases; a total of 15 patients (1.0%) died before discharge. Among patients treated by open repair, 67.0% had no complications; a total of 20 patients (5.5%) died. With EVAR, in-hospital mortality was significantly higher for repair of juxtarenal AAA (3.7%) than for infrarenal AAA (0.6%; p = 0.002), whereas with OR no significant differences in in-hospital mortality were found (juxtarenal 4.8%, infrarenal 5.8%; p = 0.470). Of the 174 patients with rAAA, 80 (46.0%) were treated endovascularly and 94 (54.0%) by open repair. With EVAR, 20.0% of patients died during the hospital stay; with OR, 36.2%.
Conclusion: The 2019 results on in-hospital mortality and morbidity for endovascular and open repair of iAAA largely confirm the published results for 2013 to 2018. For rAAA, by contrast, the results of the individual annual reports are contradictory; the small annual case numbers reported only permit conclusions over longer periods.
Introduction: The induced membrane technique for the treatment of large bone defects is a two-step procedure. In the first operation, a foreign-body membrane is induced around a spacer; in the second step, several weeks or months later, the spacer is removed and the membrane pocket is filled with autologous bone material. The induction of a functional biological membrane might be avoided by using a biological membrane from the outset. In this study, the effect of a human acellular dermis (hADM, Epiflex, DIZG gGmbH) was evaluated for the treatment of a large (5 mm), plate-stabilised femoral bone defect.
Material and Methods: In an established rat model, hADM was compared to the two-stage induced membrane technique and a bone defect without membrane cover. Syngeneous spongiosa from donor animals was used for defect filling in all groups. The group size in each case was n = 5, the induction time of the membrane was 3–4 weeks and the healing time after filling of the defect was 8 weeks.
Results: The ultimate loads were increased to levels comparable with native bone in both membrane groups (hADM: 63.2% ± 29.6% of the reference bone, p < 0.05 vs. no membrane; induced membrane: 52.1% ± 25.8% of the reference bone, p < 0.05 vs. no membrane) and were significantly higher than in the control group without membrane (21.5%). The membrane groups were radiologically and histologically almost completely bridged by new bone formation, in contrast to the control group, where no closed osseous bridging could be observed.
Conclusion: The use of the human acellular dermis leads to equivalent healing results in comparison to the two-stage induced membrane technique. This could lead to a shortened therapy duration of large bone defects.
Background: The prevalence of multimorbidity has been increasing in recent years, and patients with multimorbidity often have a decreased quality of life and require more health care. The aim of this study was to explore the evolution of multimorbidity, taking the sequence of diseases into consideration.
Methods: We used a Belgian database collected by extracting coded parameters and more than 100 chronic conditions from the Electronic Health Records of general practitioners to study patients older than 40 years with multiple diagnoses between 1991 and 2015 (N = 65 939). We applied Markov chains to estimate the probability of developing another condition in the next state after a diagnosis. The results of Weighted Association Rule Mining (WARM) allow us to show strong associations among multiple conditions.
Results: About 66.9% of the selected patients had multimorbidity. Conditions with high prevalence, such as hypertension and depressive disorder, were likely to occur after the diagnosis of most conditions. Patterns in several disease groups were apparent based on the results of both Markov chain and WARM, such as musculoskeletal diseases and psychological diseases. Psychological diseases were frequently followed by irritable bowel syndrome.
Conclusions: Our study used Markov chains and WARM for the first time to provide a comprehensive view of the relations among 103 chronic conditions, taking sequential chronology into consideration. Some strong associations among specific conditions were detected, and the results were consistent with current knowledge in the literature, suggesting that the approaches are valid for use on larger data sets, such as those of national health care systems or private insurers.
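A first-order Markov chain of this kind can be estimated by counting diagnosis-to-diagnosis transitions; a minimal sketch with invented toy sequences (not the Belgian registry data):

```python
from collections import Counter, defaultdict

def transition_probs(sequences):
    """Estimate first-order Markov transition probabilities
    P(next condition | current condition) from per-patient,
    chronologically ordered diagnosis sequences."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            counts[current][nxt] += 1
    return {cond: {n: k / sum(nxts.values()) for n, k in nxts.items()}
            for cond, nxts in counts.items()}

# Toy patient histories (hypothetical condition names for illustration)
patients = [
    ["hypertension", "depressive disorder", "IBS"],
    ["hypertension", "diabetes"],
    ["depressive disorder", "IBS"],
]
probs = transition_probs(patients)
print(probs["hypertension"])         # {'depressive disorder': 0.5, 'diabetes': 0.5}
print(probs["depressive disorder"])  # {'IBS': 1.0}
```

Each row of the estimated matrix sums to 1; in the full study, rows correspond to the 103 chronic conditions rather than these toy labels.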
Introduction: Dravet syndrome (DS), a prototypic developmental and genetic epileptic encephalopathy (DEE), is characterized by an early onset of treatment-refractory seizures, together with impairments in motor control, behavior, and cognition. Even with multiple conventional anti-epileptic drugs, seizures remain poorly controlled, and there has been a considerable unmet need for effective and tolerable treatments. Areas covered: This targeted literature review aims to highlight recent changes to the therapeutic landscape for DS by summarizing the most up-to-date, evidence-based research, including pivotal data from the clinical development of stiripentol, cannabidiol, and fenfluramine, which are important milestones for DS treatment, together with the latest findings of other pharmacotherapies in development. In phase III, double-blind, placebo-controlled randomized controlled trials stiripentol, cannabidiol, and fenfluramine have shown clinically relevant reductions in convulsive seizure frequency, and are generally well tolerated. Stiripentol was associated with responder rates (greater than 50% reduction in convulsive seizure frequency) of 67%-71%, when added to valproic acid and clobazam; cannabidiol was associated with responder rates of 43%-49% (48%-63% in conjunction with clobazam), and fenfluramine of 54%-68% across studies. Therapies in development include soticlestat, ataluren, verapamil, and clemizole, with strategies to treat the underlying cause of DS, including gene therapy and antisense oligonucleotides beginning to emerge from preclinical studies. Expert opinion: Despite the challenges of drug development in rare diseases, this is an exciting time for the treatment of DS, with the promise of new efficacious and well-tolerated therapies, which may pave the way for treatment advances in other DEEs.
BH3 mimetics are novel anticancer therapeutics that induce apoptosis by targeting anti‐apoptotic BCL‐2 proteins. Highly specific inhibitors of the main anti-apoptotic proteins BCL-2, BCL‐XL and MCL‐1 promise new opportunities for the treatment of AML. However, it is currently unclear which of these anti-apoptotic BCL-2 proteins represents the most promising target in AML. Therefore, we investigated the effect of BH3 mimetics targeting either BCL-2 (ABT-199, S55746), BCL-XL (A-1331852) or MCL-1 (S63845) on eleven AML cell lines. Drug sensitivity screening revealed heterogeneous sensitivity towards the different BH3 mimetics, with the best responses observed upon targeting of MCL-1. Selected cell lines that displayed sensitivity towards the specific BH3 mimetics underwent intrinsic apoptosis, which was characterized by loss of mitochondrial membrane potential, exposure of phosphatidylserine and activation of caspases. Furthermore, S63845 turned out to displace BIMS and NOXA from MCL-1 to induce apoptotic cell death. Importantly, the translational relevance of this study was demonstrated by experiments in primary AML blasts, which displayed similar sensitivity towards BH3 mimetics as the cell lines did. Additionally, experiments with nonmalignant cells confirmed the clinical relevance of the MCL-1 inhibitor: we could show that S63845 does not cause cytotoxicity in HPCs at efficacious doses.
In conclusion, our findings reveal that the inhibition of BCL-2 proteins, especially MCL-1, by BH3 mimetics can be a promising strategy in AML treatment.
Study design: Systematic review. Background and objectives: Preoperative neuromuscular function is predictive of knee function and return to sports (RTS) after reconstruction of the anterior cruciate ligament (ACL). The aim of this review was to examine the potential benefits of prehabilitation on pre-/postoperative objective, self-reported and RTS-specific outcomes. Methods: A systematic search was conducted within three databases. From the 1,071 studies screened, two randomized controlled trials (RCTs), two controlled trials (CTs) and two cohort studies (CS) met the inclusion criteria. Methodological quality was rated using the PEDro scale (RCTs, CTs) or the Newcastle-Ottawa Scale (CS). Results and conclusions: Methodological quality of the included studies was moderate (PEDro score: 6.5 ± 1.7; range 4 to 9). Two studies reported greater increases in maximal quadriceps torque from baseline to pre-reconstruction: one in the limb symmetry index (LSI), and one in both legs of the prehabilitation group compared to the controls. At 12 weeks post-reconstruction, one study (of two) indicated that the prehabilitation group had a smaller, clinically meaningful post-operative decline in the single-leg hop for distance LSI. Similar findings were reported for the quadriceps strength LSI (one study). At both pre-reconstruction (three studies) and two years post-surgery (two studies), the prehabilitation groups reached significantly higher, clinically meaningful self-reported knee function than the controls. RTS tended to be faster (one study), and at two years post-surgery, RTS rates (one study) were higher in the prehabilitation groups. The results provide evidence for the relevance of prehabilitation prior to ACL reconstruction to improve neuromuscular and self-reported knee function as well as RTS. More high-quality confirmatory RCTs are warranted.
Class I and II histone deacetylases (HDAC) are considered important regulators of immunity and inflammation. Modulation of HDAC expression and activity is associated with altered inflammatory responses, but reports are controversial and the specific impact of single HDACs is not clear. We examined class I and II HDACs in TLR-4 signaling pathways in murine macrophages, with a focus on IκB kinase epsilon (IKKε), which has not been investigated in this context before. To this end, we applied the pan-HDAC inhibitors (HDACi) trichostatin A (TSA) and suberoylanilide hydroxamic acid (SAHA) as well as HDAC-specific siRNA. Administration of HDACi reduced HDAC activity and decreased expression of IKKε, although its acetylation was increased. Other pro-inflammatory genes (IL-1β, iNOS, TNFα) also decreased, while COX-2 expression increased. HDAC 2, 3 and 4, respectively, might be involved in IKKε and iNOS downregulation, with potential participation of NF-κB transcription factor inhibition. Suppression of HDAC 1–3, activation of NF-κB and RNA stabilization mechanisms might contribute to increased COX-2 expression. In conclusion, our results indicate that TSA and SAHA exert a number of histone- and HDAC-independent functions. Furthermore, the data show that different HDAC enzymes fulfill different functions in macrophages and might lead to both pro- and anti-inflammatory effects, which have to be considered in therapeutic approaches.
Objectives: The ongoing coronavirus pandemic is challenging, especially in severely affected patients who require intubation and sedation. Although the potential benefits of sedation with volatile anesthetics in coronavirus disease 2019 patients are currently being discussed, the use of isoflurane in patients with coronavirus disease 2019–induced acute respiratory distress syndrome has not yet been reported. Design: We performed a retrospective analysis of critically ill patients with hypoxemic respiratory failure requiring mechanical ventilation. Setting: The study was conducted with patients admitted between April 4 and May 15, 2020 to our ICU. Patients: We included five patients who were previously diagnosed with severe acute respiratory syndrome coronavirus 2 infection. Intervention: Even with high doses of several IV sedatives, the targeted level of sedation could not be achieved. Therefore, the sedation regimen was switched to inhalational isoflurane. Clinical data were recorded using a patient data management system. We recorded demographical data, laboratory results, ventilation variables, sedative dosages, sedation level, prone positioning, duration of volatile sedation and outcomes. Measurements & Main Results: Mean age (four men, one woman) was 53.0 (± 12.7) years. The mean duration of isoflurane sedation was 103.2 (± 66.2) hours. Our data demonstrate a substantial improvement in the oxygenation ratio when using isoflurane sedation. Deep sedation as assessed by the Richmond Agitation and Sedation Scale was rapidly and closely controlled in all patients, and the subsequent discontinuation of IV sedation was possible within the first 30 minutes. No adverse events were detected. Conclusions: Our findings demonstrate the feasibility of isoflurane sedation in five patients suffering from severe coronavirus disease 2019 infection. Volatile isoflurane was able to achieve the required deep sedation and reduced the need for IV sedation.
Background: Non-clear cell renal cell cancers (nccRCC) are rare entities, and the optimal therapy for metastatic disease has yet to be defined. Methods: In this small, prospectively randomized phase IIa multicenter trial, we investigated temsirolimus (TEM) versus sunitinib (SUN) as first-line therapy in patients with metastatic nccRCC. Patients were randomized 1:1 to either TEM at a dose of 25 mg i.v. once a week or SUN at 50 mg p.o. daily for 4 weeks on and 2 weeks off. The primary endpoint was progression-free survival (PFS). In total, 22 patients were included, predominantly with papillary RCC (16/22), followed by chromophobe RCC and others. Results: The male to female ratio was 16:6. The tumor control rate (CR + PR + SD) was 58% for TEM-treated and 90% for SUN-treated patients. There was also a trend towards improved PFS, with 9.3 versus 13.2 months (HR 1.64; 95% CI 0.65–4.18), in favor of SUN. There was no trend for overall survival. Conclusions: Although this trial had to be terminated early due to low recruitment, the results match the other studies published so far with the mTOR inhibitor everolimus and SUN, which show a trend in favor of SUN for ORR and PFS.
The pathophysiology of intervertebral disc degeneration (IVDD) and its molecular mechanisms remain largely unknown. Its causes and risk factors are manifold and include, among others, age, sex, environmental influences and mechanical loading.
For cartilage, a tissue closely related to the intervertebral disc, recent studies have described an influence of the sympathetic nervous system and its neurotransmitter norepinephrine (NE), acting via adrenergic receptors (ARs), on cell proliferation, on the expression of extracellular matrix molecules and thus also on degeneration. The presence of sympathetic nerve endings in intervertebral discs has already been demonstrated, but the expression of adrenoceptors in this tissue had never been investigated. The aim of the present work was therefore to analyze ARs in intervertebral disc tissue and to evaluate their correlation with disc degeneration.
The tissue required for the analysis was obtained from patients undergoing spinal fusion (spondylodesis), during which the disc tissue of the affected segment is removed. The degree of degeneration of the anonymized samples was determined pre- and intraoperatively, and the expression of all known ARs was examined in the excised tissue and in isolated cells by reverse transcription polymerase chain reaction (RT-PCR). To detect ARs at the protein level, selected human samples were also analyzed immunohistochemically. Furthermore, using wild-type mice and so-called SM/J mice, which develop spontaneous IVDD, the protein expression of ARs and of extracellular matrix (ECM) molecules was compared between healthy and degenerated discs on histological sections. Finally, a stimulation experiment with norepinephrine was performed on isolated and cultured human cells to test whether activation of ARs triggers intracellular signal transduction.
In native human disc tissue, messenger ribonucleic acid (mRNA) of the α1a, α1b, α2a, α2b, α2c, β1 and β2 ARs was detected. After seven days of monolayer cell culture, the gene expression pattern deviated only slightly. At the protein level, the β2-AR signal was detectable only in the annulus fibrosus (AF) but not in the nucleus pulposus (NP). The same was observed in murine sections: in wild-type (WT) mice, β2-positive cells were found mainly in the inner AF, whereas in SM/J mice the signal extended further toward the outer AF and the NP. In contrast, α2a-AR and α2c-AR were not detectable at the protein level. Immunohistochemical analysis of relevant ECM molecules showed that the distribution of collagen II, collagen XII, cartilage oligomeric matrix protein (COMP) and decorin (DCN) correlated with that of the β2-AR signal. The stimulation experiment in human cell culture revealed activation of the AR-related protein kinase A (PKA) and extracellular signal-regulated kinase (ERK1/2) signaling pathways.
This work demonstrates for the first time the existence and functionality of adrenoceptors in intervertebral disc tissue. Differences in AR expression, combined with changes in ECM composition, may point to an influence of the sympathetic nervous system in IVDD. Given current demographic trends and the resulting health-economic burden, elucidating the molecular mechanisms of IVDD and developing innovative treatments on that basis are cardinal questions of modern basic orthopedic research.
For patients with life-threatening cardiac disease, extracorporeal life support (ECLS) is a valuable therapeutic option. It offers patients in cardiogenic shock a window of time in which to achieve myocardial recovery. Depending on the clinical picture, the additional use of an intra-aortic balloon pump (IABP) may improve the chances of recovery. In this retrospective study, 118 patients were analyzed who received ECLS therapy at the Department of Thoracic and Cardiovascular Surgery of the University Hospital Frankfurt am Main between December 2001 and the end of 2013. In 59 patients, ECLS support was combined with an IABP. The two patient cohorts (with and without IABP) were comparable with regard to their risk factors. In line with the aim of this work, we analyzed whether the concurrent use of an IABP during ECLS therapy is advantageous or even disadvantageous. For this purpose, Kaplan-Meier survival analyses were performed for both treatment groups. Survival rates and weaning success were compared statistically using the log-rank test.
The analysis of the collected data showed no significant difference in the 30-day survival rate or in weaning success between the two patient cohorts with and without additional IABP use. Risk factors such as advanced age or intubation prior to admission further reduce the chances of survival after ECLS therapy. An advanced NYHA stage could not be identified as a negative predictive factor.
Regarding mortality under ECLS therapy alone versus the additional use of an IABP, the literature reports conflicting results. Some studies suggest that ECLS and IABP are complementary methods that may act synergistically on treatment success, and that mortality is significantly lower with additional IABP use. However, further prospective studies with comparable patient cohorts investigating the outcomes of the different treatment approaches are required before a meaningful conclusion can be drawn.
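The Kaplan-Meier estimator and the two-group log-rank test used in such survival analyses can be sketched in plain Python. This is a minimal illustration only: the follow-up times and event indicators below are invented, not patient data from this study.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve as a list of (event time, survival) steps.
    times: follow-up times; events: 1 = event observed, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = leaving = 0
        # group all subjects with the same follow-up time
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            leaving += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= leaving
    return curve

def log_rank(times_a, events_a, times_b, events_b):
    """Two-group log-rank chi-square statistic (1 degree of freedom).
    Assumes at least one event time where both groups are still at risk."""
    event_times = sorted({t for t, e in zip(times_a + times_b,
                                            events_a + events_b) if e})
    observed_a = expected_a = variance = 0.0
    for t in event_times:
        n_a = sum(1 for x in times_a if x >= t)   # at risk in group A
        n_b = sum(1 for x in times_b if x >= t)   # at risk in group B
        d_a = sum(1 for x, e in zip(times_a, events_a) if x == t and e)
        d_b = sum(1 for x, e in zip(times_b, events_b) if x == t and e)
        n, d = n_a + n_b, d_a + d_b
        if n < 2:
            continue  # variance undefined with a single subject at risk
        observed_a += d_a
        expected_a += d * n_a / n
        variance += d * (n_a / n) * (n_b / n) * (n - d) / (n - 1)
    return (observed_a - expected_a) ** 2 / variance
```

For identical groups the statistic is 0; in practice the chi-square statistic would then be compared against the 1-df chi-square distribution to obtain the p-value reported in studies like this one.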
As part of the "Biomarkers" series published in the Zentralblatt für Arbeitsmedizin und Ergonomie, angiotensin-converting enzyme (ACE) is covered as a common marker in the diagnosis of pulmonary and extrapulmonary diseases, of which its determination is an essential component. The influence of tobacco use, medication and zinc chelators on ACE concentration is discussed. ACE has proven to be a marker with high sensitivity and specificity in lung diseases.
Background and aim of the study: Rising expectations of cataract patients are driving continuous advances in lens surgery. The desire for perfect vision and spectacle independence is a frequent concern, particularly among younger patients. The introduction of multifocal lenses has revolutionized cataract therapy: with them, patients can see sharply at near, intermediate and far distances. Various formulas are available for the preoperative calculation of lens power. To date, no study has been published that examines the precision of these formulas for calculating the power of trifocal or quadrifocal lenses. The aim of this study was to evaluate nine modern formulas for calculating the lens power of the quadrifocal AcrySof IQ PanOptix TFNT00.
Methods:
This study is a retrospective consecutive case series conducted at the Department of Ophthalmology of the Johann Wolfgang Goethe University in Frankfurt am Main. All patients who underwent femtosecond laser-assisted surgery with implantation of the quadrifocal intraocular lens were included. Preoperative biometry and keratometry were measured with the IOLMaster 500. The Pentacam Scheimpflug system was used to determine central corneal thickness. Subjective refraction was assessed three months after surgery.
The following formulas were examined: Holladay 1, Sanders-Retzlaff-Kraff/theoretical (SRK/T), Hoffer Q, T2, Holladay 2, Haigis, Barrett Universal II, Olsen and Hill radial basis function (RBF). For each eye, the postoperative refraction was calculated from the preoperatively measured parameters and the implanted lens power. The prediction error (PE) is defined as the difference between the actually achieved and the predicted postoperative spherical equivalent. Lens constants were optimized to reduce systematic errors and thus the mean PE. Primary endpoints were the differences in mean absolute prediction error (MAE) between the formulas. The standard deviation as well as the median and maximum absolute PE were also examined, as were the percentages of eyes whose PE fell within ±0.25 diopters (D), ±0.5 D, ±1.0 D and ±2.0 D.
Results: 75 eyes of 38 patients were included in this study. There were significant differences between the formulas with regard to the MAE (p < 0.001). Ranked by MAE, the formulas performed as follows: Barrett Universal II (0.294 D), Hill-RBF (0.332 D), Olsen (0.339 D), T2 (0.351 D), Holladay 1 (0.381 D), Haigis (0.382 D), SRK/T (0.393 D), Holladay 2 (0.399 D) and Hoffer Q (0.410 D). The Barrett Universal II achieved the lowest maximum absolute PE. With the Haigis, Barrett Universal II, Olsen and Hill-RBF, 80% of eyes had a PE within ±0.5 D. In the subgroup of short eyes, the lowest PE was achieved with the Hill-RBF; in the subgroup of long eyes, with the Barrett Universal II and the Olsen. The differences between the formulas in the subgroup analyses were, however, not significant.
Conclusion: The most accurate predictions of postoperative refraction were achieved with the Barrett Universal II. This formula should therefore be used in the future for calculating the lens power of the quadrifocal PanOptix. The T2, Olsen and Hill-RBF also yielded low PEs. Further studies should be conducted, in particular to investigate the new formulas in combination with different multifocal lenses. Studies with larger case numbers are needed to assess the predictive accuracy of the formulas in eyes with extreme axial lengths.
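The outcome metrics used in such formula comparisons (the prediction error PE as achieved minus predicted postoperative spherical equivalent, the mean absolute error, and the share of eyes within a tolerance band) reduce to a few lines of Python. The refraction values in the usage example are invented for illustration, not data from this study.

```python
def prediction_errors(achieved_se, predicted_se):
    """Per-eye prediction error (PE) in diopters:
    achieved minus predicted postoperative spherical equivalent."""
    return [a - p for a, p in zip(achieved_se, predicted_se)]

def mean_absolute_error(errors):
    """Mean absolute PE (MAE) across all eyes."""
    return sum(abs(e) for e in errors) / len(errors)

def pct_within(errors, limit):
    """Percentage of eyes whose PE lies within +/- `limit` diopters."""
    return 100.0 * sum(1 for e in errors if abs(e) <= limit) / len(errors)

# Invented example: four eyes, achieved vs. formula-predicted SE (D)
pe = prediction_errors([-0.25, 0.0, 0.5, -0.75], [0.0, 0.25, 0.25, -0.25])
```

Lens-constant optimization in such studies amounts to shifting the predictions so that the mean PE of each formula is zero before the MAEs are compared.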
Introduction: Gastroesophageal reflux disease (GERD) is associated with accelerated decline in lung health in children with cystic fibrosis (CF). Thus, antireflux surgery (ARS) is offered to a selected CF cohort with refractory GERD, but outcomes remain poorly investigated. This study aimed to determine the incidence of GERD in children with CF and to evaluate complications and outcomes of ARS. Materials and Methods: A systematic literature-based search was conducted using various online databases according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. The number of GERD cases in pediatric CF cohorts who underwent diagnostic investigation(s) was recorded. Data on postoperative complications and outcomes (including symptoms, lung function, and nutritional status) following ARS were analyzed. Results: Ten articles (n = 289 patients) met the defined inclusion criteria (51% male; age range, 0.5 month–36 years). The overall incidence of GERD was 46% (range, 19–81%), derived from seven studies (n = 212 patients). Four publications (n = 82 patients) reported on ARS due to uncontrolled GERD. All ARS procedures were Nissen fundoplications (the majority with gastrostomy placement). Major postoperative complications occurred in 15 (18%) patients, two of whom required redo ARS. Median follow-up time was 2 years (range, 3 months–6 years); 59% showed symptom improvement, and pulmonary exacerbations and decline in lung function were reduced. Nutritional status mainly improved in milder CF cases. There were no deaths related to ARS. Conclusion: Approximately half of pediatric CF patients have GERD. Published data for children with CF are limited and heterogeneous in terms of GERD diagnosis and outcomes following ARS. However, ARS has been shown to slow the deterioration of lung function in CF.
Dual antiplatelet therapy (DAPT) increases the risk of hemorrhagic transformation (HT) of ischemic strokes after thrombolysis with tissue plasminogen activator (tPA). However, previous clinical studies have not been entirely conclusive as to whether this increased bleeding probability actually leads to worse outcomes for patients. Many consider the initial clinical deterioration associated with a potential HT to be outweighed by the benefit of restored recanalization of occluded vessels. For this reason, tPA should also be used in patients who suffer a stroke while on DAPT. To date, the pathomechanism of HT and the mediators involved are not understood. However, reducing tPA-associated HT could contribute to a safer use of thrombolysis and further increase its overall benefit. The aim of this study was therefore to establish a stroke model of tPA-associated HT in mice under DAPT, in order to enable first evaluations of therapeutic approaches for limiting HT.
A crucial preliminary step was to determine platelet function in the treated mice in order to measure the efficacy of DAPT. This was particularly important given that the efficacy of DAPT varies between patients: a certain proportion of patients appears to be resistant to aspirin and/or other antiplatelet agents such as clopidogrel. It was therefore necessary to control for this phenomenon in our model and to identify and, if necessary, exclude any non-responders. This is challenging with conventional aggregometry (the standard method for measuring platelet function and monitoring antiplatelet therapy), because commercially available aggregometers require blood volumes that would be lethal for a mouse. Tail bleeding tests also frequently fail when performed after experimental stroke surgery. We therefore modified a flow cytometry-based approach for measuring in vitro platelet function that requires only small blood volumes and was used by us for the first time in an experimental stroke protocol. It indicated significantly reduced platelet function after DAPT with aspirin and clopidogrel (ASA+CPG). The method correlated well with the results of additionally performed tail bleeding tests and will facilitate future preclinical studies of DAPT in mice. Although there was some variability in platelet function among the treated mice, we ultimately identified no non-responders.
Next, we successfully demonstrated that DAPT with ASA+CPG contributes to increased HT in experimental stroke in mice. When DAPT was combined with tPA thrombolysis, the HT rate increased significantly compared to untreated mice with and without tPA thrombolysis. Our model can now be used to further investigate the mechanisms of HT. More importantly, establishing such a model enables researchers to test possible strategies for mitigating the bleeding risk in patients on DAPT.
To reduce HT, we chose two different pharmacological strategies. First, we investigated reducing the tPA dose, which, however, did not successfully protect against hemorrhagic complications. We then focused on the role of 12/15-lipoxygenase (12/15-LOX) in our model. Previous work had shown that 12/15-LOX contributes to the loss of endothelial cells in the ischemic brain and therefore likely plays a causal, or at least contributory, role in the development of HT. We thus repeated our experiments on tPA-associated HT under DAPT in LOX-knockout mice and inhibited 12/15-LOX pharmacologically with ML351. We successfully showed that inhibition of 12/15-LOX in wild-type mice significantly reduced the bleeding rate, thereby identifying 12/15-LOX as a suitable candidate for further studies aimed at limiting secondary damage after ischemic stroke. In addition to therapeutic administration, prophylactic administration of 12/15-LOX inhibitors in high-risk patients as an adjunct to thrombolysis would also be conceivable. Such bleeding prophylaxis could contribute to extending the indication for lysis therapy and improve patients' long-term functional outcome.
Background: Current literature is inconsistent regarding the risk of severe side effects using accelerated induction protocols in Hymenoptera venom immunotherapy (VIT). In addition, several data indicate an influence of the purity grade of the venom preparation on tolerability. We evaluated the safety and tolerability of ultra-rush and rush build-up protocols using purified and non-purified venom preparations. Methods: Retrospective single-center study of 581 VIT inductions (325 ultra-rush and 256 rush protocols) from 2005 to 2018 in 559 patients with bee and vespid venom allergy, using aqueous purified (ALK SQ®) venom preparations for the ultra-rush protocol and aqueous non-purified (ALK Reless®) venom preparations for the rush protocol. Results: Urticaria (8% vs. 3.1%, p = 0.013) and dose reductions (4.3% vs. 1.2%, p = 0.026) were significantly more frequent in the ultra-rush group. The overall rate of moderate-to-severe side effects (anaphylaxis ≥ grade 2 according to Ring and Messmer) was low and did not differ significantly between protocols (p = 0.105). Severe events (grade 4 anaphylaxis) were not reported. The discontinuation rate was very low in both cohorts (0.6% vs. 1.2%). The higher purity grade of venom preparations in the ultra-rush cohort did not improve tolerability. The bee venom group showed a non-significant trend towards a higher incidence of mild reactions (urticaria), resulting in more frequent dose reductions and antiallergic therapy. Conclusion: Rush and ultra-rush protocols show an excellent safety profile with only infrequent and mild anaphylactic reactions in bee and vespid venom allergy. Ultra-rush immunotherapy reduces the duration of the inpatient build-up phase and is thus viewed by the authors as the preferred treatment in Hymenoptera venom allergic patients.
Background: Dental professionals are subject to higher risks of musculoskeletal disorders (MSDs) than other professional groups, especially in the hand region. This study aims to investigate the prevalence of hand complaints among dentists (Ds) and dental assistants (DAs) and examines the therapies applied. Methods: For this purpose, an online questionnaire was used to analyse 389 Ds (240 female/149 male) and 406 DAs (401 female/5 male) working in Germany. The self-reported data of the two occupational groups were compared with regard to the topics examined. The questionnaire was based on the Nordic Questionnaire (self-reported lifetime, 12-month and 7-day MSD prevalence of the hand, the therapy conducted and its success), with additional occupational and sociodemographic questions as well as questions about specific medical conditions. Results: 30.8% of Ds reported MSDs in the hand at some time in their lives, 20.3% in the last twelve months and 9.5% in the last seven days. Among DAs, 42.6% reported MSDs in the hand at some time in their lives, 31.8% in the last 12 months and 15.3% in the last seven days. 37.5% of Ds and 28.3% of DAs stated that they had received treatment. For both Ds and DAs, physiotherapy was the most frequently chosen form of therapy. 89.7% of Ds and 63.3% of DAs who received therapy reported an improvement of their MSDs. Conclusion: Although the prevalence of MSDs of the hand is higher among DAs than among Ds, the use of therapeutic options and the success of therapy are lower for DAs compared to Ds.
This study deals with a 3D laser investigation of the border between the human lymph node T-zone and the germinal centre. Only a few T-cells specific for antigen-selected B-cells are allowed to enter germinal centres. This selection process is guided by sinus structures, chemokine gradients and the inherent motility of the lymphoid cells. We measured gaps and wall-like structures manually using IMARIS, a 3D image software for the analysis and interpretation of microscopy datasets. In this paper, we describe alpha-actin-positive and semipermeable walls and wall-like structures that may hinder T-cells and other cell types from entering germinal centres. Some clearly defined holes or gaps probably regulate lymphoid traffic between T- and B-cell areas. In lymphadenitis, the morphology of this border structure is clearly defined. In malignant lymphoma, however, the wall-like structure is disrupted, as demonstrated exemplarily for angioimmunoblastic T-cell lymphoma. We revealed significant differences in the lengths of the wall-like structures in angioimmunoblastic T-cell lymphoma compared with the wall-like structures in reactive tissue slices. These morphological alterations lead to abnormal and less controlled T- and B-cell distributions, probably impairing the immune defence against tumour cells and infectious agents by dysregulating immune homeostasis.
Background: Intraoperative blood loss is estimated daily in the operating room, mainly by visual techniques. Owing to local standards, surgical sponge colours vary (e.g. white in the US, green in Germany). The influence of sponge colour on the accuracy of estimation has not yet been a focus of research. Material and methods: A blood loss simulation study containing four “bleeding” scenarios per sponge colour was created using expired whole blood donation samples. The blood donations were applied to white and green surgical sponges after dilution with full electrolyte solution. Study participants had to estimate the absorbed blood loss in the sponges in all scenarios. The difference to the reference blood loss was analysed. Multivariate linear regression analysis was performed to investigate further influencing factors such as staff experience and sponge colour. Results: A total of 53 anaesthetists participated in the study. Visual estimation correlated moderately with reference blood loss in white (Spearman's rho: 0.521; p = 3.748*10−16) and green sponges (Spearman's rho: 0.452; p = 4.683*10−12). The median visually estimated blood loss was higher in white sponges (250 ml, IQR 150–412.5 ml) than in green sponges (150 ml, IQR 100–300 ml), compared to the reference blood loss (103 ml, IQR 86–162.8 ml). For both colour types of sponges, major under- and overestimation was observed. The multivariate statistics demonstrate that fabric colour has a significant influence on estimation (p = 3.04*10−10), as do the clinician's qualification level (p = 2.20*10−10, p = 1.54*10−08) and the amount of reference blood loss (RBL) to be estimated (p < 2*10−16). Conclusion: The deviation from the correct blood loss estimate was smaller with white surgical sponges than with green sponges. In general, deviations were so severe for both types of sponges that it appears advisable to refrain from visually estimating blood loss whenever possible and to use other techniques instead, such as colorimetric estimation.
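The Spearman rank correlation reported in such analyses is simply the Pearson correlation of the ranks, with ties receiving their average rank. A minimal sketch without external libraries follows; the values used in the tests are invented for illustration, not the study's measurements.

```python
def _ranks(values):
    """Average ranks (1-based); tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j to cover a run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A rho near 0.5, as in the abstract, indicates only a moderate monotonic association between estimated and reference blood loss.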
Objectives: In this study, localization accuracy and sensitivity to acoustic interaural time differences (ITDs) in subjects using cochlear implants with combined electric-acoustic stimulation (EAS) were assessed and compared with the results of a normal-hearing control group. Methods: Eight CI users with EAS (2 bilaterally implanted, 6 unilaterally implanted) and symmetric binaural acoustic hearing and 24 normal-hearing subjects participated in the study. The first experiment determined the mean localization error (MLE) for different angles of sound incidence between ±60° (frontal and dorsal presentation). The stimuli were low-pass, high-pass or broadband noise bursts. In a second experiment, just noticeable differences (JNDs) of ITDs were measured for pure tones of 125 Hz, 250 Hz and 500 Hz (headphone presentation). Results: Experiment 1: The MLE of EAS subjects was 8.5°, 14.3° and 14.7° (low-pass, high-pass and broadband stimuli, respectively). In the control group, the MLE was 1.8° (broadband stimuli). In differentiating between sound incidence from the front and the back, EAS subjects performed at chance level. Experiment 2: The JND-ITDs were 88.7 μs for 125 Hz, 48.8 μs for 250 Hz and 52.9 μs for 500 Hz (EAS subjects). Compared to the control group, the JND-ITD for 125 Hz was at the same level of performance. No statistically significant correlation was found between MLE and JND-ITD in the EAS cohort. Conclusions: Near-normal ITD sensitivity in the lower-frequency acoustic hearing was demonstrated in a cohort of EAS users. However, in an acoustic localization task, the majority of the subjects did not reach the accuracy of normal hearing. Presumably, differences in signal-processing delay between the devices used on the two sides deteriorate the transfer of precise binaural timing cues.
Background: Anemia is the most important complication during major surgery, and transfusion of red blood cells (RBC) is the mainstay of compensating for life-threatening blood loss. Therefore, accurate measurement of hemoglobin (Hb) concentration should be available in real time. Blood gas analysis (BGA) provides rapid point-of-care assessment using smaller sampling tubes than central laboratory (CL) services. Objective: This study aimed to investigate the accuracy of BGA hemoglobin testing compared to CL services. Methods: Data from the ongoing LIBERAL trial (Liberal transfusion strategy to prevent mortality and anemia-associated ischemic events in elderly non-cardiac surgical patients, LIBERAL) were used to assess the bias of Hb levels measured by BGA devices (ABL800 Flex analyzer®, GEM series® and RapidPoint 500®) against CL as the reference method. For this, we analyzed pairs of Hb levels measured by CL and BGA within two hours of each other. Furthermore, the impact of various confounding factors, including age, gender, BMI, smoking status, RBC transfusion, intraoperative hemodilution, and co-medication, was examined. To ensure adequate statistical analysis, only data from participating centers providing more than 200 Hb pairs were used. Results: In total, three centers including 963 patients with 1,814 pairs of Hb measurements were analyzed. Mean bias was comparable between the ABL800 Flex analyzer® and the GEM series® (-0.38 ± 0.15 g/dl), whereas the RapidPoint 500® showed a smaller bias (-0.09 g/dl) but a greater median absolute deviation (± 0.45 g/dl). To avoid interference from the different standard deviations of the different analytic devices, we focused on two centers using the same BGA technique (309 patients and 1,570 Hb pairs). A Bland-Altman analysis and LOWESS curve showed that the bias decreased with smaller Hb values in absolute terms but increased in relative terms.
Smoking status showed the greatest reduction in bias (0.1 g/dl, p < 0.001), whereas BMI (0.07 g/dl, p = 0.0178), RBC transfusion (0.06 g/dl, p < 0.001), statins (0.04 g/dl, p < 0.05) and beta blockers (0.03 g/dl, p = 0.02) had a slight effect on bias. Intraoperative volume substitution and other co-medications did not influence the bias significantly. Conclusion: Many interventions, such as the substitution of fluids, coagulation factors or RBC units, rely on the accuracy of laboratory measurement devices. Although BGA Hb testing showed a consistently stable difference to CL, our data confirm that different BGA devices are associated with different biases. Therefore, we suggest that hospitals assess their individual bias before implementing BGA as a valid and stable supplement to CL. However, given that the bias decreased with smaller Hb values, which in turn are used for transfusion decisions, we expect no unnecessary or delayed RBC transfusions and no major impact on the performance of the LIBERAL trial.
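The Bland-Altman bias reported above is the mean of the paired differences between the two methods, with limits of agreement conventionally set at ±1.96 standard deviations. A minimal sketch with hypothetical Hb pairs (invented for illustration, not trial data):

```python
def bland_altman(method_a, method_b):
    # Bias = mean paired difference; limits of agreement = bias +/- 1.96 * SD
    # of the differences (sample SD, n - 1 denominator).
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired BGA vs. central-laboratory Hb values in g/dl:
bga = [9.1, 10.4, 8.2, 11.0, 12.3]
cl = [9.5, 10.8, 8.5, 11.3, 12.8]
bias, lower, upper = bland_altman(bga, cl)
print(round(bias, 2))  # mean bias of BGA relative to CL in g/dl
```

Plotting the differences against the pairwise means (the classic Bland-Altman plot) then reveals whether the bias varies with the magnitude of the measurement, as the LOWESS analysis in the study showed.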
Objectives: To systematically review the past 10 years of research into the healthcare experiences (HCX) of patients with chronic heart failure (CHF) in Germany, in order to identify research foci and gaps and make recommendations for future research. Design: In this scoping review, six databases and grey literature sources were systematically searched for articles reporting HCX of patients with CHF in Germany published between 2008 and 2018. Extracted results were summarised using quantitative and qualitative descriptive analysis. Results: Of the 18 studies (100%) that met the inclusion criteria, most were observational studies (60%) that evaluated findings quantitatively (60%). HCX were often concerned with patient information, global satisfaction, and relationships and communication between patients and providers, and generally covered ambulatory care, hospital care and rehabilitation services. Overall, the considerable heterogeneity of the included studies' outcomes permitted only a limited degree of synthesis. Conclusion: In Germany, research on the HCX of patients with CHF is characterised by missing, inadequate and insufficient information. Future research would benefit from qualitative analyses, evidence syntheses, longitudinal analyses that investigate HCX throughout the disease trajectory, and better reporting of sociodemographic data. Furthermore, future studies should draw on digital data, report experiences from under-investigated yet patient-relevant healthcare settings, and include more female subjects.
Background/Objective: The Studentische Poliklinik Frankfurt (SP) is the first so-called student-run free clinic in Germany. There, medical students treat patients without health insurance under physician supervision. Before working in the SP, students must complete an intensive preparatory programme. Since summer 2013, this programme has been embedded in the curriculum as an elective course at the Medical Faculty of Goethe University Frankfurt. In the winter semester 2016/2017, a web-based virtual patient learning course was introduced in addition to the existing peer-assisted learning course.
The aim of this study was to compare the effectiveness of peer-assisted learning with virtual patient learning in the acquisition of basic knowledge and skills in general medicine. Different levels of competence acquisition were considered: the study focused on theoretical knowledge, practical skills and self-assessment.
Methods: 51 fifth-semester medical students were randomised into a peer-assisted learning group (PAL group; n = 20), a virtual patient learning group (VPL group; n = 20) and a control group (CG; n = 11). All groups completed the curricular teaching of the first clinical semester. In addition, the PAL group completed the SP elective in the peer-assisted learning format. The VPL group completed the SP elective in a web-based format with so-called virtual patients on the e-learning platform Lernbar of Goethe University Frankfurt.
Knowledge acquisition was measured with a theoretical pre-test and post-test (long-term test) of 24 single-choice questions each, and with short-term theoretical tests of five single-choice questions after each case seminar. Practical competence acquisition was assessed after the intervention by a curricular Objective Structured Clinical Examination (OSCE) and an OSCE belonging to the elective. In addition, the participants rated their own knowledge and competence acquisition before and after taking part in the SP elective using a questionnaire, answering 34 questions on a six-point Likert scale (1 = very confident; 6 = not at all confident).
After each case seminar, the students evaluated the respective case with five questions on a six-point Likert scale (1 = fully agree; 6 = do not agree at all).
The significance level was set at 0.05.
Results: In the overall theoretical post-test, all groups (PAL, VPL and CG) showed a significant gain in knowledge compared with the theoretical pre-test (PAL p < 0.0001; VPL p < 0.0001; CG p = 0.0156). Across all short-term theoretical tests, the VPL group scored significantly better than the PAL group (mean PAL = 85.75%; mean VPL = 90.57%; p = 0.0047).
In the elective OSCE, there was no significant difference between the PAL and VPL groups (p = 0.5395). In the curricular OSCE, there was no significant difference between the two intervention groups and the CG (p = 0.4263).
In the post-intervention self-assessment, the PAL group rated itself significantly better than before in 31 of 34 items. The VPL group rated itself significantly better in 25 items and the CG in 16 of the 34 items.
The case seminars were rated similarly by the PAL and VPL groups; the medians for the individual cases were 1 or 2.
Basic knowledge and skills in general medicine can be taught as effectively with VPL as with PAL. Given its cost-efficiency, high reproducibility and flexibility regarding place and time of study, VPL should be used more widely in general-practice teaching within student-run free clinics. Ultimately, this may lead to improved quality of care and patient satisfaction.
Nevertheless, the VPL seminars should be developed further and designed to be more modular and individualised, particularly with regard to feedback to students.
Functional roles of COMP and TSP-4 in articular cartilage and their relevance in osteoarthritis
(2020)
Osteoarthritis (OA) is a slowly progressing disease, resulting in the degradation of cartilage and the loss of joint functionality. The cartilage extracellular matrix (ECM) is degraded and undergoes remodelling as OA progresses. Chondrocytes start to express degrading proteases but are also reactivated and synthesise ECM proteins. The spectrum of these newly synthesised proteins and their involvement in OA-specific processes and cartilage repair has hardly been investigated.
Human articular cartilage obtained from OA patients undergoing knee replacement surgery was evaluated according to the OARSI histopathology grading system. Healthy, non-OA cartilage samples were used as controls. The expression and distribution of thrombospondin-4 (TSP-4) and the closely related COMP were analysed at the gene level by PCR and at the protein level by immunohistology and immunoblot assays. The potential of TSP-4 as a diagnostic marker was evaluated by immunoblot assays using serum samples from OA patients and healthy individuals. The functional role of both proteins was further investigated in in vitro studies using chondrocytes isolated from femoral condyles of healthy pigs. The effect of COMP and TSP-4 on chondrocyte migration and attachment was investigated via transwell and attachment assays, respectively. Moreover, the potential of COMP and TSP-4 to modulate the chondrocyte phenotype by inducing gene expression, ECM protein synthesis and matrix formation was investigated by immunofluorescence staining and qPCR. The activation of cartilage-relevant signalling pathways was investigated by immunoblot assays.
These results showed for the first time the presence of TSP-4 in articular cartilage. Its amount was dramatically increased in OA compared to healthy cartilage and correlated positively with OA severity. In healthy cartilage, TSP-4 was found primarily in the superficial zone, while it was more widely distributed across the middle and deeper zones of OA cartilage. The amount of specific TSP-4 fragments was increased in sera of OA patients compared to healthy controls, indicating its potential to serve as an OA biomarker. COMP was ubiquitously expressed in healthy cartilage but degraded in early-stage and re-expressed in late-stage OA; the overall protein levels were comparable between OA severity grades. In contrast to TSP-4, COMP was localised primarily in the upper zone of OA cartilage, particularly in areas with severe damage. COMP attracted chondrocytes and facilitated their attachment, while TSP-4 did not affect these processes. COMP and TSP-4 were generally weak inducers of gene expression, although both induced COL2A1, and TSP-4 additionally induced COL12A1 and ACAN after 6 h. Consistent data were obtained at the protein level: COMP and TSP-4 promoted the synthesis and matrix formation of collagen II, collagen IX, collagen XII and proteoglycans. In parallel, both proteins suppressed chondrocyte hypertrophy and dedifferentiation by reducing collagen X and collagen I. Regarding intracellular signalling, both proteins induced Erk1/2 phosphorylation, and TSP-4 further promoted Smad2/3 signalling induced by TGF-β1. Neither protein had a direct or modulatory effect on Smad1/5/9-dependent signalling.
In summary, COMP and TSP-4 contribute to ECM maintenance and repair by inducing the expression of essential ECM proteins and suppressing chondrocyte dedifferentiation. These effects might be mediated by Erk1/2 phosphorylation. The presented data demonstrate an important functional role of COMP and TSP-4 in both healthy and OA cartilage and provide a basis for further studies on their potential in clinical applications for OA diagnosis and treatment.
Bone marrow-derived mesenchymal stem cells (BMSCs) are highly proliferative, multipotent progenitor cells with a high regenerative potential. They can migrate from the bone marrow into damaged cartilage areas and differentiate into chondrocytes there, and can thus contribute to the repair of cartilage damage caused by trauma or osteoarthritis. In addition, sympathetic nerve fibres and the sympathetic neurotransmitter norepinephrine (NE) have been detected in various regions of the joint. NE inhibits the chondrogenic differentiation capacity of BMSCs and may thereby contribute to the pathogenesis of osteoarthritis (OA). It is currently unknown to what extent NE influences the proliferation of human BMSCs. The aim of our study was to investigate the influence of NE on the proliferative capacity of human BMSCs and to identify the intracellular signalling pathways involved.
For this purpose, BMSCs from patients after joint trauma (trauma BMSCs) and from patients with diagnosed OA (OA BMSCs) were examined. First, the gene expression pattern of the various adrenergic receptors (ARs) was analysed. Subsequently, both trauma and OA BMSCs were treated with NE at different concentrations, as well as with NE in combination with various AR antagonists (doxazosin (α1), yohimbine (α2) or propranolol (β2)). The activation of AR-coupled signalling pathways was examined by western blot, based on the phosphorylation of the two main signalling pathways, extracellular signal-regulated kinases 1/2 (ERK1/2) and protein kinase A (PKA).
Gene expression of several AR subtypes was detected in trauma (α2B-, α2C- and β2-AR) and OA BMSCs (α2A-, α2B- and β2-AR). Treatment with NE at high concentrations led to a statistically significant inhibition of the proliferation of trauma and OA BMSCs, whereas treatment with NE at low concentrations had no influence on their proliferation. Both ERK1/2 and PKA were activated in trauma and OA BMSCs after treatment with NE. Only the β2 antagonist propranolol abolished the effects on proliferation as well as on the activation of ERK1/2 and PKA. Doxazosin and yohimbine, in contrast, had no significant influence on proliferation or on ERK1/2 and PKA phosphorylation.
Our results show that NE inhibits the proliferation of trauma and OA BMSCs in a concentration-dependent manner. This effect is mediated primarily via β2-AR-coupled ERK1/2 and PKA activation. Through this mechanism, NE can reduce the regenerative potential of human BMSCs and thus contribute to the pathogenesis of OA. Targeted modulation of the β2 signalling pathway may open up new therapeutic options for the treatment of cartilage damage caused by osteoarthritis or trauma.
The intestinal epithelium acts as a selective barrier for the absorption of water, nutrients and orally administered drugs. To evaluate the gastrointestinal permeability of a candidate molecule, scientists and drug developers have a multitude of cell culture models at their disposal. Static transwell cultures constitute the most extensively characterized intestinal in vitro system and can accurately categorize molecules into low, intermediate and high permeability compounds. However, they lack key aspects of intestinal physiology, including the cellular complexity of the intestinal epithelium, flow, mechanical strain, or interactions with intestinal mucus and microbes. To emulate these features, a variety of different culture paradigms, including microfluidic chips, organoids and intestinal slice cultures have been developed. Here, we provide an updated overview of intestinal in vitro cell culture systems and critically review their suitability for drug absorption studies. The available data show that these advanced culture models offer impressive possibilities for emulating intestinal complexity. However, there is a paucity of systematic absorption studies and benchmarking data and it remains unclear whether the increase in model complexity and costs translates into improved drug permeability predictions. In the absence of such data, conventional static transwell cultures remain the current gold-standard paradigm for drug absorption studies.
Background: Dentists are at a higher risk of suffering from musculoskeletal disorders (MSD) than the general population. However, the latest study investigating MSD in the dental profession in Germany was published about 20 years ago. Therefore, the aim of this study was to reveal the current prevalence of MSD among dentists and dental students in Germany. Methods: The final study sample comprised 450 subjects (287 female/163 male) from different areas of specialization. The age of the participants ranged from 23 to 75 years. The questionnaire consisted of a modified version of the Nordic Questionnaire, work-related questions from the latest survey of German dentists, typical medical conditions and self-developed questions. Results: The overall prevalence showed that dentists frequently suffered from MSD (seven days: 65.6%, twelve months: 92%, lifetime: 95.8%). The most affected body regions were the neck (42.7%–70.9%–78.4%), shoulders (29.8%–55.6%–66.2%) and lower back (22.9%–45.8%–58.7%). Overall, female participants reported pain significantly more frequently, especially in the neck, shoulders and upper back. Conclusion: The prevalence of MSD among dentists, especially in the neck, shoulder and back area, was significantly higher than in the general population. In addition, women suffered from MSD more frequently than men in almost all body regions.
Large spines are stable and important for memory trace formation. The majority of large spines also contains synaptopodin (SP), an actin-modulating and plasticity-related protein. Since SP stabilizes F-actin, we speculated that the presence of SP within large spines could explain their long lifetime. Indeed, using 2-photon time-lapse imaging of SP-transgenic granule cells in mouse organotypic tissue cultures we found that spines containing SP survived considerably longer than spines of equal size without SP. Of note, SP-positive (SP+) spines that underwent pruning first lost SP before disappearing. Whereas the survival time courses of SP+ spines followed conditional two-stage decay functions, SP-negative (SP-) spines and all spines of SP-deficient animals showed single-phase exponential decays. This was also the case following afferent denervation. These results implicate SP as a major regulator of long-term spine stability: SP clusters stabilize spines, and the presence of SP indicates spines of high stability.
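The single-phase exponential decay reported for SP-negative spines, S(t) = exp(-k t), can be fitted by a simple log-linear least-squares regression through the origin. The time points and decay rate below are synthetic, chosen only to show the method, not values from the study:

```python
import math

def fit_exponential_decay(times, surviving_fraction):
    # Least-squares fit of ln(S) = -k * t with S(0) = 1 fixed:
    # minimizing sum((ln S_i + k * t_i)^2) gives a closed-form slope.
    num = sum(t * (-math.log(s)) for t, s in zip(times, surviving_fraction))
    den = sum(t * t for t in times)
    return num / den  # decay rate k (per unit time)

# Hypothetical spine-survival fractions at imaging time points (days),
# generated from a synthetic single-phase decay with k = 0.2:
times = [1, 2, 3, 4]
frac = [math.exp(-0.2 * t) for t in times]
k = fit_exponential_decay(times, frac)
print(round(k, 3))
```

A conditional two-stage decay, as described for SP-positive spines, would instead require two rate constants (one per stage) and would show up as a systematic deviation of the data from this single straight line in log-space.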
Prostate cancer patients whose tumors develop resistance to conventional treatment often turn to natural, plant-derived products, one of which is sulforaphane (SFN). This study was designed to determine whether anti-tumor properties of SFN, identified in other tumor entities, are also evident in cultivated DU145 and PC3 prostate cancer cells. The cells were incubated with SFN (1–20 µM) and tumor cell growth and proliferative activity were evaluated. Having found a considerable anti-growth, anti-proliferative, and anti-clonogenic influence of SFN on both prostate cancer cell lines, further investigation into possible mechanisms of action were performed by evaluating the cell cycle phases and cell-cycle-regulating proteins. SFN induced a cell cycle arrest at the S- and G2/M-phase in both DU145 and PC3 cells. Elevation of histone H3 and H4 acetylation was also evident in both cell lines following SFN exposure. However, alterations occurring in the Cdk-cyclin axis, modification of the p19 and p27 proteins and changes in CD44v4, v5, and v7 expression because of SFN exposure differed in the two cell lines. SFN, therefore, does exert anti-tumor properties on these two prostate cancer cell lines by histone acetylation and altering the intracellular signaling cascade, but not through the same molecular mechanisms.
Human placentation is a highly invasive process. Deficiency in the invasiveness of trophoblasts is associated with a spectrum of gestational diseases, such as preeclampsia (PE). The oncogene B-cell lymphoma 6 (BCL6) is involved in the migration and invasion of various malignant cells. Intriguingly, its expression is deregulated in preeclamptic placentas. We have reported that BCL6 is required for the proliferation, survival, fusion, and syncytialization of trophoblasts. In the present work, we show that the inhibition of BCL6, either by its gene silencing or by using specific small molecule inhibitors, impairs the migration and invasion of trophoblastic cells, by reducing cell adhesion and compromising the dynamics of the actin cytoskeleton. Moreover, the suppression of BCL6 weakens the signals of the phosphorylated focal adhesion kinase, Akt/protein kinase B, and extracellular regulated kinase 1/2, accompanied by more stationary, but less migratory, cells. Interestingly, transcriptomic analyses reveal that a small interfering RNA-induced reduction of BCL6 decreases the levels of numerous genes, such as p21 activated kinase 1, myosin light chain kinase, and gamma actin related to cell adhesion, actin dynamics, and cell migration. These data suggest BCL6 as a crucial player in the migration and invasion of trophoblasts in the early stages of placental development through the regulation of various genes associated with the migratory machinery.
Background: MitraClip® (MC) is an established procedure for severe mitral regurgitation (MR) in patients deemed unsuitable for surgery. Right ventricular dysfunction (RVD) is associated with a higher mortality risk. The prognostic accuracy of heart failure risk scores such as the Seattle Heart Failure Model (SHFM) and the Meta-Analysis Global Group in Chronic Heart Failure (MAGGIC) score in patients undergoing MC with or without RVD has not been investigated so far. Methods: SHFM and MAGGIC scores were calculated retrospectively. RVD was defined as a tricuspid annular plane systolic excursion (TAPSE) ≤ 15 mm. Areas under the receiver operating characteristic curve (AUROC) of SHFM and MAGGIC were computed for one-year all-cause mortality after MC. Results: N = 103 patients with MR III° (73 ± 11 years, LVEF 37 ± 17%) underwent MC with a reduction of at least one MR grade. One-year mortality was 28.2%. In Kaplan-Meier analysis, one-year mortality was significantly higher in patients with RVD (34.8% vs 2.8%, p = 0.009). AUROCs were comparable for both scores (SHFM: 0.704, MAGGIC: 0.692). In patients without RVD, SHFM displayed a higher AUROC and therefore better diagnostic accuracy (SHFM: 0.776; MAGGIC: 0.551, p < 0.05). In patients with RVD, MAGGIC and SHFM displayed comparable AUROCs. Conclusion: RVD is an important prognostic marker in patients undergoing MC. SHFM and MAGGIC displayed adequate overall prognostic power in these patients. Accuracy differed between patients with and without RVD, indicating a higher predictive power of the SHFM score in patients without RVD.
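The AUROC values compared above have a convenient rank-based interpretation: the probability that a randomly chosen positive case (here, a patient who died within one year) receives a higher risk score than a randomly chosen negative case, with ties counting one half. A small sketch with invented scores (not study data):

```python
def auroc(scores_pos, scores_neg):
    # AUROC via the Mann-Whitney U statistic: count pairwise "wins" of
    # positive over negative scores (ties = 0.5) and normalize.
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk-score values for deceased vs. surviving patients:
died = [0.8, 0.6, 0.9, 0.7]
survived = [0.3, 0.5, 0.6, 0.2, 0.4]
print(auroc(died, survived))
```

An AUROC of 0.5 corresponds to a score with no discriminatory power, so the reported values around 0.7 indicate moderate, and 0.55 essentially poor, discrimination.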
During the course of sepsis in critically ill patients, kidney dysfunction and damage are among the first events in a complex scenario leading to multi-organ failure and patient death. Acute kidney injury triggers the release of lipocalin-2 (Lcn-2), which is involved in both renal injury and recovery. Since Lcn-2 binds and transports iron with high affinity, we aimed to clarify whether Lcn-2 fulfills different biological functions depending on its iron-loading status and its cellular source during sepsis-induced kidney failure. We assessed Lcn-2 levels both in serum and in the supernatant of short-term cultured renal macrophages (MΦ) as well as renal tubular epithelial cells (TEC) isolated from either Sham-operated or cecal ligation and puncture (CLP)-treated septic mice. Total kidney iron content was analyzed by Perls' staining, while Lcn-2-bound iron in the supernatants of short-term cultured cells was determined by atomic absorption spectroscopy. Lcn-2 protein in serum was rapidly up-regulated at 6 h after sepsis induction and subsequently increased up to 48 h. Lcn-2 levels in the supernatant of TEC peaked at 24 h and were low at 48 h, with no change in iron-loading. In contrast, in renal MΦ, Lcn-2 was low at 24 h but increased at 48 h, where it mainly appeared in its iron-bound form. Whereas TEC-secreted, iron-free Lcn-2 was associated with renal injury, increased MΦ-released iron-bound Lcn-2 was linked to renal recovery. We therefore hypothesized that both the cellular source of Lcn-2 and its iron load crucially add to its biological function during sepsis-induced renal injury.