Memory Concerns, Memory Performance and Risk of Dementia in Patients with Mild Cognitive Impairment
(2014)
Background: Concerns about worsening memory (“memory concerns”; MC) and impairment in memory performance are both predictors of Alzheimer's dementia (AD). The relationship of both in dementia prediction at the pre-dementia disease stage, however, is not well explored. Refined understanding of the contribution of both MC and memory performance in dementia prediction is crucial for defining at-risk populations. We examined the risk of incident AD by MC and memory performance in patients with mild cognitive impairment (MCI).
Methods: We analyzed data from 417 MCI patients in a longitudinal multicenter observational study. Patients were classified based on presence (n = 305) vs. absence (n = 112) of MC. Risk of incident AD was estimated with Cox proportional hazards regression models.
Results: Risk of incident AD was increased by MC (HR = 2.55, 95%CI: 1.33–4.89), lower memory performance (HR = 0.63, 95%CI: 0.56–0.71) and ApoE4-genotype (HR = 1.89, 95%CI: 1.18–3.02). An interaction effect between MC and memory performance was observed. The predictive power of MC was greatest for patients with very mild memory impairment and decreased with increasing memory impairment.
Conclusions: Our data suggest that the power of MC as a predictor of future dementia at the MCI stage varies with the patients' level of cognitive impairment. While MC are predictive at early stage MCI, their predictive value at more advanced stages of MCI is reduced. This suggests that loss of insight related to AD may occur at the late stage of MCI.
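The hazard ratios above are exponentiated Cox regression coefficients, and their confidence limits follow from the coefficient's standard error. As an illustrative sketch (not the study's analysis code; the coefficient and standard error below are back-calculated assumptions consistent with the reported MC hazard ratio):

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Convert a Cox model coefficient and its standard error
    into a hazard ratio with a 95% confidence interval."""
    hr = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return hr, lower, upper

# Assumed values: beta ~0.936 and SE ~0.332 reproduce roughly
# HR = 2.55 (95% CI 1.33-4.89), as reported for MC above.
hr, lo, hi = hazard_ratio_ci(0.936, 0.332)
```

Because the confidence interval is symmetric on the log scale, it is asymmetric around the hazard ratio itself, which is why the reported intervals above are skewed.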
Background: Clinical practice guidelines for patients with primary biliary cholangitis (PBC) have recently been revised and now implement well-established response criteria for standard first-line ursodeoxycholic acid (UDCA) therapy at 12 months after treatment initiation, allowing the early identification of high-risk patients with inadequate treatment responses who may require treatment modification. However, there are only very limited data concerning the real-world clinical management of patients with PBC in Germany. Objective: The aim of this retrospective multicenter study was to evaluate response rates to standard first-line UDCA therapy and subsequent second-line treatment regimens in a large cohort of well-characterized patients with PBC from 10 independent hepatological referral centers in Germany prior to the introduction of obeticholic acid as a licensed second-line treatment option. Methods: Diagnostic confirmation of PBC, standard first-line UDCA treatment regimens, and response rates at 12 months according to Paris-I, Paris-II, and Barcelona criteria, the follow-up cutoff alkaline phosphatase (ALP) ≤ 1.67 × upper limit of normal (ULN), and the normalization of bilirubin (bilirubin ≤ 1 × ULN) were retrospectively examined between June 1986 and March 2017. The management and hitherto applied second-line treatment regimens in patients with an inadequate response to UDCA, and the subsequent response rates at 12 months, were also evaluated. Results: Overall, 480 PBC patients were included in this study. The median UDCA dosage was 13.2 mg UDCA/kg bodyweight (BW)/d. Adequate UDCA treatment response rates according to Paris-I, Paris-II, and Barcelona criteria were observed in 91, 71.3, and 61.3% of patients, respectively. In 83.8% of patients, ALP ≤ 1.67 × ULN was achieved. A total of 116 patients (24.2%) showed an inadequate response to UDCA according to at least one criterion.
The diverse second-line treatment regimens applied led to significantly higher response rates according to Paris-II (35 vs. 60%, p = 0.005), Barcelona (13 vs. 34%, p = 0.0005), and ALP ≤ 1.67 × ULN and bilirubin ≤ 1 × ULN (52.1 vs. 75%, p = 0.002). The addition of bezafibrate appeared to induce the strongest beneficial effect in this cohort (Paris II: 24 vs. 74%, p = 0.004; Barcelona: 50 vs. 84%, p = 0.046; ALP < 1.67 × ULN and bilirubin ≤ 1 × ULN: 33 vs. 86%, p = 0.001). Conclusion: Our large retrospective multicenter study confirms high response rates following first-line standard UDCA treatment in patients with PBC and highlights the need for close monitoring and early treatment modification in high-risk patients with an insufficient response to UDCA, since early treatment modification significantly increases subsequent response rates in these patients.
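The follow-up criterion used above (ALP ≤ 1.67 × ULN together with normal bilirubin) is straightforward to operationalize. A minimal sketch, assuming ALP and bilirubin are given in the same units as their respective laboratory ULNs (the function and argument names are illustrative, not from the study):

```python
def udca_response(alp, bilirubin, alp_uln, bili_uln):
    """Check the biochemical response criterion described above:
    ALP <= 1.67 x ULN and bilirubin <= 1 x ULN at 12 months.
    All values must share the units of their respective ULNs."""
    alp_ok = alp <= 1.67 * alp_uln
    bili_ok = bilirubin <= 1.0 * bili_uln
    return alp_ok and bili_ok

# Hypothetical patient: ALP 200 U/L (ULN 130 U/L),
# bilirubin 0.8 mg/dL (ULN 1.2 mg/dL) -> adequate response.
adequate = udca_response(200.0, 0.8, 130.0, 1.2)
```

The Paris-I, Paris-II, and Barcelona criteria combine further laboratory thresholds and are not reproduced here.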
The nuclear factor kappa B (NFκB) signaling pathway plays an important role in liver homeostasis and cancer development. Tax1-binding protein 1 (Tax1BP1) is a regulator of the NFκB signaling pathway, but its role in the liver and in hepatocellular carcinoma (HCC) is presently unknown. Here we investigated the role of Tax1BP1 in liver cells and in murine models of HCC and liver fibrosis. We applied the diethylnitrosamine (DEN) model of experimental hepatocarcinogenesis in Tax1BP1+/+ and Tax1BP1−/− mice. The amount and subsets of non-parenchymal liver cells in Tax1BP1+/+ and Tax1BP1−/− mice were determined, and activation of NFκB and stress-induced signaling pathways was assessed. Differential expression of mRNA and miRNA was determined. Tax1BP1−/− mice showed increased numbers of inflammatory cells in the liver. Furthermore, sustained activation of the NFκB signaling pathway was found in hepatocytes, as well as increased transcription of proinflammatory cytokines in isolated Kupffer cells from Tax1BP1−/− mice. Several differentially expressed mRNAs and miRNAs were found in livers of Tax1BP1−/− mice, which are regulators of inflammation or are involved in cancer development or progression. Furthermore, Tax1BP1−/− mice developed more HCCs than their Tax1BP1+/+ littermates. We conclude that Tax1BP1 protects from liver cancer development by limiting proinflammatory signaling.
Introduction: In the emergency department, the majority of pediatric trauma patients present with minor injuries. The aim of this study was to evaluate temporal changes in age-related injury pattern, trauma mechanism, and surgeries in pediatric patients. Methods: This retrospective study included patients < 18 years of age treated for trauma from 01/2009 to 12/2018 at a level I trauma center. They were divided into two groups: group A (01/2009 to 12/2013) and group B (01/2014 to 12/2018). Injury mechanism, injury pattern, and surgeries were analyzed. Fractures, dislocations, and organ injuries were defined as major injuries; contusions and superficial wounds as minor injuries. Results: 23,582 patients were included (58% male, median age 8.2 years). There was a slight increase in patients from A (n = 11,557) to B (n = 12,025), with no difference in demographic characteristics. Significantly more patients were admitted to the resuscitation room (A: 1.9%; B: 2.4%), although the number of multiply injured patients did not differ significantly. Major injuries occurred significantly less frequently in A (25.5%) than in B (27.0%); minor injuries occurred equally often. Extremity fractures were significantly more frequent in B (21.5%) than in A (20.2%), peaking at 8–12 years. Most trauma mechanisms remained constant in both groups, with a rise in sports injuries at 8–12 years. Conclusion: Although the number of patients increased only slightly over a decade, there was a clear increase in major injuries, particularly extremity fractures, peaking at 8–12 years. At this age, sports accidents also increased significantly. Finally, admissions to the resuscitation room rose, but without an increase in multiply injured patients.
Background: Polytraumatized patients undergo strong immunological stress upon insult. Phagocytes (granulocytes and monocytes) play a substantial role in the immunological defense against bacteria, fungi, and yeast, and in the clearance of cellular debris after tissue injury. We have previously reported reduced monocyte phagocytic activity early after porcine polytrauma. However, it is unknown whether both phagocyte types undergo these functional alterations, and whether there is pathogen-specific phagocytic behavior. We characterized the phagocytic activity and capacity of granulocytes and monocytes after polytrauma.
Methods: Eight pigs (Sus scrofa) underwent polytrauma consisting of lung contusion, liver laceration, tibial fracture, and hemorrhagic shock with fluid resuscitation and fracture fixation with an external fixator. Intensive care treatment including mechanical ventilation for 72 h followed. Phagocytic activity and capacity were investigated using an ex vivo whole-blood phagocytosis assay before trauma, after surgery, and 24, 48, and 72 h after trauma. Blood samples were stimulated with phorbol-12-myristate-13-acetate and incubated with FITC-labeled E. coli, S. aureus, or S. cerevisiae for phagocytosis assessment by flow cytometry.
Results: The early polytrauma-induced significant increase in granulocyte and monocyte counts declined to baseline values within 24 h. The percentage of E. coli-phagocytizing granulocytes significantly decreased after polytrauma and during further intensive care treatment, while their capacity significantly increased. Interestingly, both granulocytic phagocytic activity and capacity for S. aureus significantly decreased after trauma, although a recovery was observed after 24 h, which was followed by another decrease. The percentage of S. cerevisiae-phagocytizing granulocytes significantly increased after 24 h, while their capacity was impaired after surgery and 72 h later. The monocytic E. coli-phagocytizing percentage did not change, while the capacity increased after 24–72 h. After a significant decrease in S. aureus-phagocytizing monocytes after surgery, a significant increase after 24 and 48 h was observed without alterations in capacity. No significant changes in S. cerevisiae-phagocytizing monocytes occurred, but their capacity dropped at 48 and 72 h.
Conclusion: Phagocytic activity and capacity of granulocytes and monocytes follow different patterns and change significantly within 72 h after polytrauma. Both phagocytic activity and capacity show significantly different alterations depending on the pathogen, thus potentially pointing to certain, possibly more relevant, causes of infection after polytrauma.
In Bone Tissue Engineering (BTE), autologous bone-regenerative cells are combined with a scaffold for large bone defect treatment (LBDT). Microporous polylactic acid (PLA) scaffolds showed good healing results in small animals. However, transfer to large animal models is not easily achieved simply by upscaling the design. Increasing diffusion distances have a negative impact on cell survival and nutrient supply, leading to cell death and ultimately implant failure. Here, a novel scaffold architecture was designed to meet all requirements for an advanced bone substitute. This approach is characterized by biofunctional, porous subunits in a load-bearing, compression-resistant frame structure. An open, macro- and microporous internal architecture (100 µm–2 mm pores) optimizes conditions for oxygen and nutrient supply to the implant's inner areas by diffusion. A prototype was 3D-printed from PLA by fused filament fabrication. After incubation with Saos-2 (sarcoma osteogenic) cells for 14 days, cell morphology, cell distribution, cell survival (fluorescence microscopy and LDH-based cytotoxicity assay), metabolic activity (MTT test), and osteogenic gene expression were determined. The adherent cells showed colonization properties, proliferation potential, and osteogenic differentiation. The innovative design, with its porous structure, is a promising matrix for cell settlement and proliferation. The modular design allows easy upscaling and offers a solution for LBDT.
Since the 2006 summer semester, the Faculty of Medicine of J. W. Goethe University has been continuously analyzing the relationship between upper secondary school and Abitur examination grades and academic success among Frankfurt medical students. The framework for this is the student selection project for identifying and validating suitable predictors of academic success for university-based admissions (60 percent of the study places in ZVS-allocated subjects are assigned directly by the universities). In this communication we present the results of a retrospective survey among students in the clinical phase of their studies on which courses they had chosen to obtain their university entrance qualification (n = 700). The background to this approach is the assumption that there is a direct relationship between course choice and academic success. Academic performance was identified on the basis of the faculty's own examinations (preclinical and clinical assessments) and the results of the state examinations. Furthermore, we sought to determine how many advanced-course combinations exist, given the requirements of the German upper secondary school system. Initial analyses indicate that the correlation between the points achieved in individual school subjects (mathematics, biology, chemistry, German, and English) and the results in Part 1 of the medical licensing examination is clearly subject-dependent. The same applies to the choice of advanced courses: the large number of different advanced-course combinations (more than 80 among 700 students) shows highly variable correlations with performance in Part 1 of the medical licensing examination. According to our current analysis, the advanced-course combination of mathematics + English is the best predictor of success in medical school. These results could serve in the near future as a basis for the university admissions procedure for medical students.
The evaluation of student teaching – a basis for performance-based funding allocation (LOM)?
(2008)
The evaluation of medical education has been carried out systematically at the Faculty of Medicine of J. W. Goethe University Frankfurt since 1998. This implementation thus clearly predates the binding provisions of the Medical Licensing Regulations (Ärztliche Approbationsordnung, in force since October 1, 2003). The evaluation of student teaching covers all compulsory classes (courses, seminars, practical trainings) through a standardized questionnaire that is distributed at the end of each compulsory class (every semester) and collected again after the students have completed it.
In this communication we demonstrate, using selected examples (from the 2003/2004 winter semester to the 2005/2006 winter semester), that the negative student evaluations of preclinical subjects often reported elsewhere do not apply at J. W. Goethe University (e.g., Anatomy Course I, macroscopic part, WS 2005/2006: M = 1.8, SD = 0.86). The rating of didactic quality ("the material was presented in a clearly understandable way") is satisfactory for most compulsory preclinical classes (e.g., Anatomy Course I, macroscopic part, WS 2005/2006: M = 2.06, SD = 0.94). From these results we infer a positive effect of the curricular and didactic restructuring of medical studies at Goethe University.
The publication of the results of the student evaluation ("summary assessment") must take into account that subjects remote from clinical practice are difficult to convey to many students. A ranking is therefore not published. Based on these results, part of the funding is allocated on a performance-oriented basis (in an annual cycle). This performance-based funding allocation (LOM) (45 percent of which is based on the student evaluation) amounts to 4 percent of the respective basic budget for research and teaching (state funding). A positive teaching evaluation can therefore mean a considerably larger amount for a clinic or institute. The procedure is accepted within the faculty.
Background: Conversion from calcineurin inhibitor (CNI) therapy to a mammalian target of rapamycin (mTOR) inhibitor following kidney transplantation may help to preserve graft function. Data are sparse, however, concerning the impact of conversion on posttransplant diabetes mellitus (PTDM) or the progression of pre-existing diabetes.
Methods: PTDM and other diabetes-related parameters were assessed post hoc in two large open-label multicenter trials. Kidney transplant recipients were randomized (i) at month 4.5 to switch to everolimus or remain on a standard cyclosporine (CsA)-based regimen (ZEUS, n = 300), or (ii) at month 3 to switch to everolimus, remain on standard CNI therapy or convert to everolimus with reduced-exposure CsA (HERAKLES, n = 497).
Results: There were no significant differences in the incidence of PTDM between treatment groups (log rank p = 0.97 [ZEUS], p = 0.90 [HERAKLES]). The mean change in random blood glucose from randomization to month 12 was also similar between treatment groups in both trials for patients with or without PTDM, and with or without pre-existing diabetes. The change in eGFR from randomization to month 12 showed a benefit for everolimus versus comparator groups in all subpopulations, but only reached significance in larger subgroups (no PTDM or no pre-existing diabetes).
Conclusions: Within the restrictions of this post hoc analysis, including non-standardized diagnostic criteria and limited glycemia laboratory parameters, these data do not indicate any difference in the incidence or severity of PTDM with early conversion from a CsA-based regimen to everolimus, or in the progression of pre-existing diabetes.
Trial registration: clinicaltrials.gov, NCT00154310 (registered September 2005) and NCT00514514 (registered August 2007); EudraCT (2006-007021-32 and 2004-004346-40).
Delayed wound repair in sepsis is associated with reduced local pro-inflammatory cytokine expression
(2013)
Sepsis is one of the main causes of morbidity and mortality in hospitalized patients. Moreover, sepsis-associated complications involving impaired wound healing are common. Septic patients often require surgical interventions that in turn may lead to further complications caused by impaired wound healing. We established a mouse model to study delayed wound healing during sepsis distant to the septic focus. For this purpose, cecal ligation and puncture (CLP) was combined with the creation of a superficial wound on the mouse ear. Control animals received the same procedure without CLP. Epithelialization was measured every second day by direct microscopic visualization up to complete closure of the wound. As the interplay of TNF-α, TGF-β, matrix metalloproteinases (MMP), and tissue inhibitors of metalloproteinases (TIMP) is important in wound healing in general, TNF-α, TGF-β, MMP7, and TIMP1 were assessed immunohistochemically in samples of wounded ears harvested on days 2, 6, 10, and 16 after wounding. After induction of sepsis, animals showed a significant delay in wound epithelialization from day 2 to 12 compared to control animals. Complete wound healing was attained after a mean of 12.2 ± 3.0 (SD) days in septic animals compared to 8.7 ± 1.7 (SD) days in the control group. Septic animals showed a significantly reduced local pro-inflammatory cytokine level of TNF-α on days 2 and 6, as well as reduced expression of TGF-β in wounds on day 2. Significantly lower expression of MMP7 as well as TIMP1 was also observed on day 2 after wounding. The induction of sepsis impairs wound healing distant to the septic focus. We could demonstrate that the expression of cytokines important for wound repair is deregulated after induction of sepsis. Thus, restoring the normal local cytokine response in wounds could be a good strategy to enhance wound repair in sepsis.
Epoxyeicosatrienoic acids (EET) facilitate regeneration in different tissues, and their benefit in dermal wound healing has been proven under normal conditions. In this study, we investigated the effect of 11,12 EET on dermal wound healing in diabetes. We induced diabetes by i.p. injection of streptozotocin 2 weeks prior to wound creation on the dorsal side of the mouse ear. 11,12 EET was applied to the wound every second day, whereas the control groups received only solvent. Epithelialization was monitored intravitally every second day up to wound closure. Wounds were stained for VEGF, CD31, TGF-β, TNF-α, SDF-1α, NF-κB, and Ki-67, and fibroblasts were counted after hematoxylin-eosin staining on days 3, 6, 9, and 16 after wounding. After induction of diabetes, wounds closed on day 13.00 ± 2.20 (SD). Local 11,12 EET application improved wound closure significantly, to day 8.40 ± 1.39 (SD). EET treatment enhanced VEGF and CD31 expression in wounds on day 3. It also seemed to raise TNF-α levels on all days investigated, as well as TGF-β levels on days 3 and 6. A decrease in NF-κB could be observed on days 9 and 16 after EET application. The latter findings were not significant. SDF-1α expression was not influenced by EET application, and Ki-67 was significantly lower in the EET group on day 9 after EET application. The number of fibroblasts was significantly increased on day 9 after 11,12 EET application. 11,12 EET improves deteriorated wound healing in diabetes by enhancing neoangiogenesis, especially in the early phase of wound healing. Furthermore, it contributes to the dissolution of the initial inflammatory reaction, allowing the crucial transition from the inflammatory to the proliferative phase of wound healing.
Introduction: Stem cell transplantation is one of the most promising strategies to improve healing in chronic wounds, as systemic administration of endothelial progenitor cells (EPC) enhances healing by promoting neovascularization and homing, although a large number of cells is needed. In the following study, we analysed whether local application can reduce the number of EPC needed to achieve the same beneficial effect on wound healing.
Material and Methods: Wound healing after local or systemic treatment with EPC was monitored in vivo by creating standardized wounds on the dorsum of hairless mice and measuring wound closure every second day. The systemic group received 2 × 10⁶ EPC i.v. and the locally treated group 2 × 10⁵ EPC, injected locally. As a control, PBS was injected in the same way. Expression of CD31, VEGF, CD90, and SDF-1α was analysed immunohistochemically to evaluate neovascularisation and amelioration of homing.
Results: Local (7.1 ± 0.45 SD) as well as systemic (6.1 ± 0.23 SD) EPC transplantation led to a significant acceleration of wound closure compared to controls (PBS local: 9.7 ± 0.5 SD, PBS systemic: 10.9 ± 0.38 SD). Systemic application enhanced CD31 expression on day 6 after wounding, and local EPC on days 6 and 9, in comparison to control. VEGF expression was not significantly affected. Systemic and local EPC treatment resulted in significantly enhanced SDF-1α and CD90 expression on all days investigated.
Conclusion: Local as well as systemic EPC treatment enhances wound healing. Moreover, the beneficial effects are obtained with a tenfold lower number of EPC when applied locally. Thus, local EPC treatment might be a more convenient way to enhance wound healing, as the number of available progenitor cells is limited.
Introduction: Epoxyeicosatrienoic acids (EETs) are able to enhance angiogenesis and regulate inflammation, which is especially important in wound healing under ischemic conditions. Thus, we evaluated the effect of local EET application on ischemic wounds in mice.
Methods: Ischemia was induced by cauterization of two of the three supplying vessels of the mouse ear. Wounding was performed on the ear three days later. Wounds were treated with either 11,12 or 14,15 EET and compared to untreated control and normal wounds. Epithelialization was measured every second day. VEGF, TNF-α, TGF-β, matrix metalloproteinases (MMP), tissue inhibitors of metalloproteinases (TIMP), Ki67, and SDF-1α were evaluated immunohistochemically in wounds on days 3, 6, and 9.
Results: Ischemia delayed wound closure (12.8 days ± 1.9 standard deviation (SD) for ischemia and 8.0 days ± 0.94 SD for control). 11,12 and 14,15 EET application ameliorated the deteriorated wound healing on ischemic ears (7.6 ± 1.3 SD for 11,12 EET and 9.2 ± 1.4 SD for 14,15 EET). Ischemia did not change VEGF, TNF-α, TGF-β, SDF-1α, TIMP, MMP7, or MMP9 levels significantly compared to control. Local application of 11,12 as well as 14,15 EET induced a significant elevation of VEGF, TGF-β, and SDF-1α expression, as well as proliferation, during the whole phase of wound healing compared to control and ischemia alone.
Conclusion: In summary, EETs improve impaired wound healing caused by ischemia, as they enhance neovascularization and alter the inflammatory response in wounds. Thus, elevating the levels of lipid mediators such as 11,12 and 14,15 EET in wounds might be a successful strategy for the amelioration of deranged wound healing under ischemia.
The emerging disciplines of lipidomics and metabolomics show great potential for the discovery of diagnostic biomarkers, but appropriate pre-analytical sample-handling procedures are critical because several analytes are prone to ex vivo distortions during sample collection. To test how the intermediate storage temperature and storage period of plasma samples from K3EDTA whole-blood collection tubes affect analyte concentrations, we assessed samples from non-fasting healthy volunteers (n = 9) for a broad spectrum of metabolites, including lipids and lipid mediators, using a well-established LC-MS-based platform. We used a fold change-based approach as a relative measure of analyte stability to evaluate 489 analytes, employing a combination of targeted LC-MS/MS and LC-HRMS screening. The concentrations of many analytes were found to be reliable, often justifying less strict sample handling; however, certain analytes were unstable, supporting the need for meticulous processing. We make four data-driven recommendations for sample-handling protocols with varying degrees of stringency, based on the maximum number of analytes and the feasibility of routine clinical implementation. These protocols also enable the simple evaluation of biomarker candidates based on their analyte-specific vulnerability to ex vivo distortions. In summary, pre-analytical sample handling has a major effect on the suitability of certain metabolites as biomarkers, including several lipids and lipid mediators. Our sample-handling recommendations will increase the reliability and quality of samples when such metabolites are necessary for routine clinical diagnosis.
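The fold change-based stability assessment described above compares analyte concentrations under a test storage condition against reference handling. A minimal sketch of the idea; the 1.2-fold threshold and the analyte names are illustrative assumptions, not values from the study:

```python
def flag_unstable(reference, stored, threshold=1.2):
    """Flag analytes whose mean concentration after intermediate
    storage deviates from the reference handling by more than the
    given fold change, in either direction.

    reference/stored: dict mapping analyte name -> list of
    measured concentrations across volunteers (same units)."""
    flagged = {}
    for analyte, ref_vals in reference.items():
        ref_mean = sum(ref_vals) / len(ref_vals)
        sto_mean = sum(stored[analyte]) / len(stored[analyte])
        fold_change = sto_mean / ref_mean
        if fold_change > threshold or fold_change < 1 / threshold:
            flagged[analyte] = fold_change
    return flagged

# Hypothetical data: analyte_a doubles ex vivo, analyte_b is stable.
ref = {"analyte_a": [1.0, 1.0], "analyte_b": [5.0, 5.0]}
sto = {"analyte_a": [2.0, 2.0], "analyte_b": [5.1, 4.9]}
unstable = flag_unstable(ref, sto)
```

Analytes that pass such a screen tolerate relaxed sample handling, while flagged analytes would require the stricter protocols discussed above.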
Multiple-choice (MC) examinations have become the standard form of assessment in German medical education, despite largely absent data on the validity of this examination format. Moreover, it is unclear to what extent students — including those with good examination results — actually master the material being tested. At the Faculty of Medicine of Johann Wolfgang Goethe University Frankfurt, an MC-based final examination in microbiology was administered to students of the second clinical semester at the end of the 2003 summer semester. Owing to curricular changes, students of the first clinical semester had received identical instruction. Their course concluded with an examination that was largely identical in content but different in format, containing both open-ended questions and questions in which the students had to judge the correctness of each statement individually. A comparison of the results for questions with identical content shows that students achieve a high rate of correct answers in the MC format, but that this rate drops sharply when the question format is changed. Only 20–30% of students achieved a completely correct result when each statement had to be judged individually, whereas the same question in MC format yielded 80–90% correct results. In open-ended questions, only 30–40% of students could actively write down the correct answer, while 90–99% of students passively recognized the correct solution. We interpret these results as indicating that the forced choice in MC-based questions strongly influences the rate of correct answers, and that examination results are thus substantially shaped by the format, while the knowledge itself is not mastered.
The results of this study suggest that care should be taken in choosing the examination format, and that considerably more attention than before should be paid to how the examination format steers student learning behavior.
German medical faculties are expected to select their first-year students according to their own criteria. Since up to 40,000 applications per year can be expected, a preselection is necessary before more labor-intensive selection mechanisms can be applied. We designed a questionnaire in an attempt to capture, in addition to school performance, further applicant characteristics such as pre-existing medically relevant knowledge and musical, social, athletic, and occupational activities. All first-year students of the 2005/2006 winter semester (860 students) at the medical faculties of Johann Wolfgang Goethe University Frankfurt (FFM) and the Medical University of Innsbruck (MUI) were asked to complete this questionnaire. For the 2005/2006 winter semester, admission in FFM was based exclusively on the Abitur grade, whereas admission at the MUI was based on the order in which applications were received, without consideration of school grades. Both groups (FFM 431 students, MUI 429 students) reported comparable non-school activities with almost identical frequency, with the exception of completion of a hospital internship. A nursing internship is required by the German Medical Licensing Regulations but can be completed before the start of studies, so that German first-year students (both those admitted in FFM — 53% — and German first-year students at the MUI — 67%) had mostly completed an internship, whereas Austrian first-year students had done so far less often (14%). At present, the non-school achievements recorded here should be used as an admission criterion for medical studies only after prior validation of their suitability.
Background & Aims: In ACLF patients, an adequate risk stratification is essential, especially for liver transplant allocation, since ACLF is associated with high short-term mortality. The CLIF-C ACLF score is the best prognostic model to predict outcome in ACLF patients. While lung failure is generally regarded as signum malum in ICU care, this study aims to evaluate and quantify the role of pulmonary impairment on outcome in ACLF patients.
Methods: In this retrospective study, 498 patients with liver cirrhosis and admission to IMC/ICU were included. ACLF was defined according to EASL-CLIF criteria. Pulmonary impairment was classified into three groups: unimpaired ventilation, need for mechanical ventilation and defined pulmonary failure. These factors were analysed in different cohorts, including a propensity score-matched ACLF cohort.
Results: Mechanical ventilation and pulmonary failure were identified as independent risk factors for increased short-term mortality. In matched ACLF patients, the presence of pulmonary failure showed the highest 28-day mortality (83.7%), whereas mortality rates in ACLF with mechanical ventilation (67.3%) and ACLF without pulmonary impairment (38.8%) were considerably lower (p < .001). Especially in patients with pulmonary impairment, the CLIF-C ACLF score showed poor predictive accuracy. Adjusting the CLIF-C ACLF score for the grade of pulmonary impairment improved the prediction significantly.
Conclusions: This study highlights that not only pulmonary failure but also mechanical ventilation is associated with worse prognosis in ACLF patients. The grade of pulmonary impairment should be considered in the risk assessment in ACLF patients. The new score may be useful in the selection of patients for liver transplantation.
Purpose: The primary treatment goals for advanced-stage thumb carpometacarpal (CMC) joint osteoarthritis are complete pain relief and restoration of thumb strength. The purpose of the present study was to introduce a variation of the abductor pollicis longus (APL) suspension arthroplasty using a single looping of a radial slip from the APL tendon around the flexor carpi radialis (FCR) tendon combined with RegJoint™ interposition and to determine its efficacy in the treatment of thumb CMC joint osteoarthritis.
Methods: Between 2015 and 2017, 21 patients were included. The average age was 60.8 years (range 48–79). The mean follow-up was 27.7 months (range 8–50). Evaluation included pain, radial and palmar abduction, tip pinch and grip strength, and Disabilities of the Arm, Shoulder, and Hand (DASH) score.
Results: Pain averaged 0.3 (range 0–4) at rest and 1.4 (range 0–4) on exertion. Radial and palmar abduction were 97% and 99% of the contralateral side, respectively. Tip pinch and grip strength were 4.1 kg (range 3–6.5) and 22 kg (range 13.3–40), respectively. The DASH score averaged 18.5 (range 0.8–41.7).
Conclusion: The modified APL suspension interposition arthroplasty was an efficient and simplified option for the treatment of thumb CMC joint osteoarthritis, with results comparable to or better than those of other published procedures. The APL suspension technique was easy to perform, avoiding difficult bone tunneling and incision of the FCR tendon. The RegJoint™ interposition as a spacer prevented impingement of the first metacarpal base on the second metacarpal base or the trapezoid bone.
Background: The combination of intermediate-dose cytarabine plus mitoxantrone (IMA) can induce high complete remission rates with acceptable toxicity in elderly patients with acute myeloid leukemia (AML). We present the final results of a randomized-controlled trial comparing IMA with the standard 7 + 3 induction regimen consisting of continuous infusion cytarabine plus daunorubicin (DA).
Patients and methods: Patients with newly diagnosed AML >60 years were randomized to receive either intermediate-dose cytarabine (1000 mg/m2 twice daily on days 1, 3, 5, 7) plus mitoxantrone (10 mg/m2 days 1–3) (IMA) or standard induction therapy with cytarabine (100 mg/m2 continuously days 1–7) plus daunorubicin (45 mg/m2 days 3–5) (DA). Patients in complete remission after DA received intermediate-dose cytarabine plus amsacrine as consolidation treatment, whereas patients after IMA were consolidated with standard-dose cytarabine plus mitoxantrone.
Results: Between February 2005 and October 2009, 485 patients were randomized; 241 for treatment arm DA and 244 for IMA; 76% of patients were >65 years. The complete response rate after DA was 39% [95% confidence interval (95% CI): 33–45] versus 55% (95% CI: 49–61) after IMA (odds ratio 1.89, P = 0.001). The 6-week early-death rate was 14% in both arms. Relapse-free survival curves were superimposable in the first year, but separated afterwards, resulting in 3-year relapse-free survival rates of 29% versus 14% in the DA versus IMA arms, respectively (P = 0.042). The median overall survival was 10 months in both arms (P = 0.513).
Conclusion: The dose escalation of cytarabine in induction therapy led to improved remission rates in elderly AML patients. This did not translate into a survival advantage, most likely due to differences in consolidation treatment. Thus, effective consolidation strategies need to be further explored. In combination with an effective consolidation strategy, the use of intermediate-dose cytarabine in induction may improve curative treatment for elderly AML patients.