Blunt thoracic trauma (TxT) deteriorates clinical post-injury outcomes. Ongoing inflammatory changes promote the development of post-traumatic complications, frequently causing Acute Lung Injury (ALI). Club Cell Protein (CC)16, a pulmonary anti-inflammatory protein, correlates with lung damage following TxT. Whether CC16 neutralization influences the inflammatory course during ALI remains elusive. Ninety-six male C57BL/6N mice underwent a double-hit model of TxT and cecal ligation and puncture (CLP, 24 h post-TxT). Shams underwent the surgical procedures only. CC16 was neutralized by intratracheal application of an anti-CC16 antibody, either after TxT (early) or following CLP (late). Euthanasia was performed at 6 or 24 h post-CLP. Systemic and pulmonary levels of IL-6, IL-1β, and CXCL5 were determined, neutrophils were quantified in the bronchoalveolar lavage fluid, and histomorphological lung damage was assessed. ALI induced a significant systemic IL-6 increase in all groups, while the local inflammatory response was most prominent after 24 h in the double-hit groups compared to the shams. Significantly increased neutrophilic infiltration upon double hit was paralleled by enhanced lung damage in all groups compared to the shams, after 6 and 24 h. Neutralization of CC16 did not change the systemic inflammation. However, early CC16 neutralization increased neutrophilic infiltration and lung injury at 6 h post-CLP, while 24 h later the lung injury was reduced. Late CC16 neutralization increased neutrophilic infiltration 24 h post-CLP, concurrent with enhanced lung injury. The data confirm the anti-inflammatory potential of endogenous CC16 in the murine double-hit model of ALI.
Introduction: Dravet syndrome (DS) is a rare developmental and epileptic encephalopathy. This study estimated costs, cost-driving factors and quality of life (QoL) in patients with DS and their caregivers in a prospective, multicenter study in Germany.
Methods: A validated 3–12-month retrospective questionnaire and a prospective 3-month diary assessing clinical characteristics, QoL, and direct, indirect and out-of-pocket (OOP) costs were administered to caregivers of patients with DS throughout Germany.
Results: Caregivers of 93 patients (mean age 10.1 ± 7.1 years, range 15 months–33.7 years) submitted questionnaires, and 77 completed prospective diaries. The majority of patients (95%) experienced at least one seizure during the previous 12 months, and 77% had experienced a status epilepticus (SE) at least once in their lives. Over 70% of patients had behavioural problems and delayed speech development, and over 80% had attention deficit symptoms and disturbance of motor skills and movement coordination. Patient QoL was lower than in the general population, and 45% of caregivers had some form of depressive symptoms. Direct health care costs were a mean of €6,043 ± €5,825 per patient per three months (median €4,054, CI €4,935–€7,350). Inpatient costs formed the single most important cost category (28%, €1,702 ± €4,315), followed by care grade benefits (19%, €1,130 ± €805), anti-epileptic drug (AED) costs (15%, €892 ± €1,017) and ancillary treatments (9%, €559 ± €503). Total indirect costs were €4,399 ± €4,989 (median €0, CI €3,466–€5,551) in mothers and €391 ± €1,352 (median €0, CI €195–€841) in fathers. In univariate analysis, seizure frequency, experience of SE, nursing care level and severe additional symptoms were found to be associated with total direct healthcare costs. The presence of severe additional symptoms was the single independently significant explanatory factor in a multivariate analysis.
Conclusions: This study, covering a period of up to 15 months, revealed substantial direct and indirect healthcare costs of DS in Germany and highlights the relatively low patient and caregiver QoL compared with the general population.
Platelet function (PF) plays a pivotal role in both hemostasis and thrombosis, and manual light transmission aggregometry (LTA) is considered the standard of care for platelet function testing but is an error-prone and time-consuming procedure. We aimed to test the agreement regarding maximum aggregation (MA), velocity (VEL), and lag-phase (LagP) of platelet aggregation of the automated Sysmex CS-2100i analyzer (Siemens, Germany) against the APACT 4004 (Elitech, France) in samples derived from healthy participants and patients with hemostaseologic disorders. In total, 123 patient-derived samples were investigated, including 42 patients with acetylsalicylic acid and/or clopidogrel intake and 20 patients with other hemostaseologic disorders. Both MA and VEL showed good or excellent intermethod correlation. Agreement between the testing methods was only partially achieved, and values were indicative for a systematic bias to lower measurements below a threshold of 50% MA with the CS-2100i compared to the APACT 4004. All patients with impaired PF in the APACT 4004 were successfully identified with the CS-2100i, and reference values for automated LTA are provided. Conclusively, automated LTA with the CS-2100i is a highly standardized and reliable PF testing method and represents a decisive step in the simplification of platelet function testing in clinical routine.
Discovery of key whole-brain transitions and dynamics during human wakefulness and non-REM sleep
(2019)
The modern understanding of sleep is based on the classification of sleep into stages defined by their electroencephalography (EEG) signatures, but the underlying brain dynamics remain unclear. Here we aimed to move significantly beyond the current state-of-the-art description of sleep, and in particular to characterise the spatiotemporal complexity of whole-brain networks and state transitions during sleep. In order to obtain the most unbiased estimate of how whole-brain network states evolve through the human sleep cycle, we used a Markovian data-driven analysis of continuous neuroimaging data from 57 healthy participants falling asleep during simultaneous functional magnetic resonance imaging (fMRI) and EEG. This Hidden Markov Model (HMM) facilitated discovery of the dynamic choreography between different whole-brain networks across the wake-non-REM sleep cycle. Notably, our results reveal key trajectories to switch within and between EEG-based sleep stages, while highlighting the heterogeneities of stage N1 sleep and wakefulness before and after sleep.
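The Markovian analysis itself cannot be reproduced from an abstract, but its core idea of estimating state-transition structure from a time series can be sketched in a few lines. The stage labels, state count, and sequence below are hypothetical and purely illustrative:

```python
import numpy as np

def transition_matrix(seq, n_states):
    """Estimate a first-order Markov transition matrix from a state-label sequence."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1  # count each observed transition a -> b
    row_sums = counts.sum(axis=1, keepdims=True)
    # Normalize rows to probabilities; rows with no outgoing transitions stay zero.
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Hypothetical stage sequence (0 = wake, 1 = N1, 2 = N2, 3 = N3).
stages = [0, 0, 1, 1, 2, 2, 2, 3, 3, 2, 1, 0]
P = transition_matrix(stages, 4)  # P[i, j]: probability of switching from stage i to j
```

Note that an HMM goes further than this sketch: it infers the latent whole-brain states themselves from the fMRI signal rather than taking labels as given.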
Circadian oscillations in circulating leukocyte subsets, including immature hematopoietic cells, have long been appreciated, but the origin and nature of these alterations remain elusive. Our analysis of wild-type C57BL/6 mice under constant darkness confirmed circadian fluctuations of circulating leukocytes and clonogenic cells in blood and spleen but not bone marrow. Clock-gene-deficient Bmal1-/- mice lacked this regulation. Cell cycle analyses in the different hematopoietic compartments excluded circadian changes in total cell numbers, rather favoring shifting hematopoietic cell redistribution as the underlying mechanism. Transplant chimeras demonstrated that circadian rhythms within the stroma mediate the oscillations independently of hematopoietic-intrinsic cues. We provide evidence of circadian CXCL12 regulation via clock genes in vitro and confirmed CXCL12 oscillation in bone marrow and blood in vivo. Our studies further implicate cortisol as the conveyor of circadian input to bone marrow stroma and mediator of the circadian leukocyte oscillation. In summary, we establish hematopoietic-extrinsic cues as causal for the circadian redistribution of circulating mature and immature blood cells.
Background: Transcatheter aortic valve replacement (TAVR) is a therapeutic option for patients with aortic valve stenosis at increased surgical risk. Telomeres are an established marker for cellular senescence and have served to evaluate cardiovascular diseases including severe aortic valve stenosis. In our study, we hypothesized that telomere length may be a predictor for outcome and associated with comorbidities in patients with TAVR.
Methods and results: We analyzed leucocyte telomere length from 155 patients who underwent TAVR and correlated the results with 1-year mortality and severe comorbidities. The cohort was subdivided into 3 groups according to telomere length. Although a trend for a positive correlation of telomere length with a lower EuroSCORE could be found, telomere length was not associated with survival, aortic valve opening area or cardiovascular comorbidities (peripheral, coronary or cerebrovascular disease). Interestingly, long telomeres were significantly correlated to a reduced left ventricular ejection fraction (LVEF).
Conclusion: In elderly patients with severe aortic valve stenosis, leucocyte telomere length did not predict post-procedural survival. The correlation between long telomere length and reduced LVEF in these patients deserves further attention.
During erythropoiesis, haematopoietic stem cells (HSCs) differentiate in successive steps of commitment and specification to mature erythrocytes. This differentiation process is controlled by transcription factors that establish stage- and cell type-specific gene expression. In this study, we demonstrate that FUSE binding protein 1 (FUBP1), a transcriptional regulator important for HSC self-renewal and survival, is regulated by T-cell acute lymphocytic leukaemia 1 (TAL1) in erythroid progenitor cells. TAL1 directly activates the FUBP1 promoter, leading to increased FUBP1 expression during erythroid differentiation. The binding of TAL1 to the FUBP1 promoter is highly dependent on an intact GATA sequence in a combined E-box/GATA motif. We found that FUBP1 expression is required for efficient erythropoiesis, as FUBP1-deficient progenitor cells were limited in their potential of erythroid differentiation. Thus, the finding of an interconnection between GATA1/TAL1 and FUBP1 reveals a molecular mechanism that is part of the switch from progenitor- to erythrocyte-specific gene expression. In summary, we identified a TAL1/FUBP1 transcriptional relationship, whose physiological function in haematopoiesis is connected to proper erythropoiesis.
Previous research indicates that anxiety disorders are characterized by an overgeneralization of conditioned fear as compared with healthy participants. Therefore, fear generalization is considered a key mechanism for the development of anxiety disorders. However, systematic investigations on the variance in fear generalization are lacking. Therefore, the current study aims at identifying distinctive phenotypes of fear generalization among healthy participants. To this end, 1175 participants completed a differential fear conditioning phase followed by a generalization test. To identify patterns of fear generalization, we used a k-means clustering algorithm based on individual arousal generalization gradients. Subsequently, we examined the reliability and validity of the clusters and phenotypical differences between subgroups on the basis of psychometric data and markers of fear expression. Cluster analysis reliably revealed five clusters that systematically differed in mean responses, differentiation between conditioned threat and safety, and linearity of the generalization gradients, though mean response levels accounted for most variance. Remarkably, the patterns of mean responses were already evident during fear acquisition and corresponded most closely to psychometric measures of anxiety traits. The identified clusters reliably described subgroups of healthy individuals with distinct response characteristics in a fear generalization test. Following a dimensional view of psychopathology, these clusters likely delineate risk factors for anxiety disorders. As crucial group characteristics were already evident during fear acquisition, our results emphasize the importance of average fear responses and differentiation between conditioned threat and safety as risk factors for anxiety disorders.
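As a hedged illustration of the clustering step, the sketch below runs a minimal k-means on simulated arousal gradients. The simulated data, the number of clusters, and the farthest-point initialization are assumptions for this example, not the study's actual pipeline:

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    """Minimal Lloyd-style k-means: returns cluster labels and centroids."""
    X = np.asarray(X, dtype=float)
    # Deterministic farthest-point initialization (sufficient for a sketch).
    centroids = [X[0]]
    for _ in range(k - 1):
        dists = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[dists.argmax()])
    centroids = np.array(centroids)
    for _ in range(n_iter):
        # Assign each gradient to its nearest centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster empties.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Simulated 7-point generalization gradients: flat low responders versus a
# group whose arousal declines linearly from CS+ to CS- (hypothetical data).
rng = np.random.default_rng(0)
flat = rng.normal(0.2, 0.05, size=(20, 7))
decline = np.linspace(1.0, 0.2, 7) + rng.normal(0.0, 0.05, size=(20, 7))
X = np.vstack([flat, decline])
labels, _ = kmeans(X, k=2)
```

In this toy case the two response phenotypes are well separated, so mean response level alone drives the cluster assignment, mirroring the study's observation that mean responses accounted for most variance.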
Background: Previous trials of PCSK9 (proprotein convertase subtilisin-kexin type 9) inhibitors demonstrated reductions in major adverse cardiovascular events, but not death. We assessed the effects of alirocumab on death after index acute coronary syndrome.
Methods: ODYSSEY OUTCOMES (Evaluation of Cardiovascular Outcomes After an Acute Coronary Syndrome During Treatment With Alirocumab) was a double-blind, randomized comparison of alirocumab or placebo in 18 924 patients who had an ACS 1 to 12 months previously and elevated atherogenic lipoproteins despite intensive statin therapy. Alirocumab dose was blindly titrated to target achieved low-density lipoprotein cholesterol (LDL-C) between 25 and 50 mg/dL. We examined the effects of treatment on all-cause death and its components, cardiovascular and noncardiovascular death, with log-rank testing. Joint semiparametric models tested associations between nonfatal cardiovascular events and cardiovascular or noncardiovascular death.
Results: Median follow-up was 2.8 years. Death occurred in 334 (3.5%) and 392 (4.1%) patients, respectively, in the alirocumab and placebo groups (hazard ratio [HR], 0.85; 95% CI, 0.73 to 0.98; P=0.03, nominal P value). This resulted from nonsignificantly fewer cardiovascular (240 [2.5%] vs 271 [2.9%]; HR, 0.88; 95% CI, 0.74 to 1.05; P=0.15) and noncardiovascular (94 [1.0%] vs 121 [1.3%]; HR, 0.77; 95% CI, 0.59 to 1.01; P=0.06) deaths with alirocumab. In a prespecified analysis of 8242 patients eligible for ≥3 years follow-up, alirocumab reduced death (HR, 0.78; 95% CI, 0.65 to 0.94; P=0.01). Patients with nonfatal cardiovascular events were at increased risk for cardiovascular and noncardiovascular deaths (P<0.0001 for the associations). Alirocumab reduced total nonfatal cardiovascular events (P<0.001) and thereby may have attenuated the number of cardiovascular and noncardiovascular deaths. A post hoc analysis found that, compared to patients with lower LDL-C, patients with baseline LDL-C ≥100 mg/dL (2.59 mmol/L) had a greater absolute risk of death and a larger mortality benefit from alirocumab (HR, 0.71; 95% CI, 0.56 to 0.90; Pinteraction=0.007). In the alirocumab group, all-cause death declined with achieved LDL-C at 4 months of treatment, to a level of approximately 30 mg/dL (adjusted P=0.017 for linear trend).
Conclusions: Alirocumab added to intensive statin therapy has the potential to reduce death after acute coronary syndrome, particularly if treatment is maintained for ≥3 years, if baseline LDL-C is ≥100 mg/dL, or if achieved LDL-C is low.
Clinical Trial Registration: URL: https://www.clinicaltrials.gov. Unique identifier: NCT01663402.
Background: Computer-assisted implant planning has become an important diagnostic and therapeutic tool in modern dentistry. This case report illustrates the possibilities of modern implantology by combining virtual implant planning, guided surgery with tooth- and implant-supported templates, and immediate implant placement and loading.
Case presentation: A straightforward approach was followed for a mandible presenting with hopeless lower incisors. Diagnosis, decision making and treatment approach were based on clinical findings and detailed virtual three-dimensional implant planning. Extraction of the hopeless mandibular incisors, immediate and guided placement of six standard implants, and immediate loading with a provisional fixed dental prosthesis (FDP) were performed, fulfilling the patient's functional and esthetic demands. The final computer-assisted design/computer-assisted manufacturing (CAD/CAM) FDP with a titanium framework and composite veneering was delivered after 6 months. At the 1-year recall the FDP was free of technical complications, and stable bony conditions and a healthy peri-implant mucosa were observed.
Conclusions: Computer-assisted implantology, including three-dimensional virtual implant planning, guided surgery, and CAD/CAM fabrication of provisional and final reconstructions, allowed for a concise treatment workflow with predictable esthetic and functional outcomes in this mandibular full-arch case. The combination of immediate implant placement and immediate loading was considerably more complex and required a high level of organization between implantologist, technician and patient. After the use of a first, tooth-supported surgical template, with subsequent extraction of the supporting teeth, a second surgical template stabilized on the previously inserted implants helped to transfer the planned implant positions into the extraction sites with a guided approach.
Background: Iron deficiency anemia is common in pregnancy with a prevalence of approximately 16% in Austria; however, international guideline recommendations on screening and subsequent treatment with iron preparations are inconsistent. The aim of this study was to find out how often pregnant women take iron-containing supplements, and who recommended them. As hemoglobin data were available for a sub-group of women, hemoglobin status during pregnancy and associated consumption of iron-containing medications were also recorded.
Methods: This cross-sectional study was conducted at the Mother-Child-Booklet service center of the Styrian Health Insurance Fund in Graz, Austria. A questionnaire containing seven questions was developed. Absolute and relative numbers were determined, and corresponding 95% confidence intervals calculated using bootstrapping techniques.
Results: A total of 325 women completed the questionnaire; 11% had been diagnosed with anemia before becoming pregnant and 67% reported taking iron-containing compounds. The women reported taking 45 different products, but 61% took one of just three supplements. Overall, 185 (57%) women had not been diagnosed with anemia before becoming pregnant but reported taking an iron-containing supplement, and 89% of the women took supplements on the recommendation of their physician. Of the 202 women whose hemoglobin status was assessed, 92% were found not to be anemic.
Conclusion: Overall, 67% of pregnant women took iron-containing compounds, irrespective of whether they were deficient in iron, and physicians were generally responsible for advising them to do so. No standardized procedure is available, even in guidelines, on which to base the decision whether to take iron during pregnancy. As most guidelines only recommend iron supplements in cases of anemia, the high percentage of women taking them in Austria is difficult to justify.
Purpose: Artificial intelligence (AI) has accelerated novel discoveries across multiple disciplines including medicine. Clinical medicine suffers from a lack of AI-based applications, potentially due to lack of awareness of AI methodology. Future collaboration between computer scientists and clinicians is critical to maximize the benefits of transformative technology in this field for patients. To illustrate, we describe AI-based advances in the diagnosis and management of gliomas, the most common primary central nervous system (CNS) malignancy.
Methods: We present a succinct description of foundational AI concepts and their relevance to clinical medicine, geared toward clinicians without computer science backgrounds. We also review novel AI approaches in the diagnosis and management of glioma.
Results: Novel AI approaches in gliomas have been developed to predict the grading and genomics from imaging, automate the diagnosis from histopathology, and provide insight into prognosis.
Conclusion: Novel AI approaches offer acceptable performance in gliomas. Further investigation is necessary to improve the methodology and determine the full clinical utility of these novel approaches.
This randomized trial (ATHENA study) in de novo kidney transplant patients compared everolimus versus mycophenolic acid (MPA), each with similar tacrolimus exposure, or everolimus with concomitant cyclosporine (CsA), in an unselected population. In this 12-month, multicenter, open-label study, de novo kidney transplant recipients were randomized to everolimus with tacrolimus (EVR/TAC), everolimus with CsA (EVR/CsA) or MPA with tacrolimus (MPA/TAC), with similar tacrolimus exposure in the EVR/TAC and MPA/TAC groups. Non-inferiority of the primary end point (estimated glomerular filtration rate [eGFR] at month 12), assessed in the per-protocol population of 338 patients, was not shown for EVR/TAC or EVR/CsA versus MPA/TAC. In 123 patients with tacrolimus levels within the protocol-specified range, eGFR outcomes were comparable between groups. The mean increase in eGFR during months 1 to 12 post-transplant, analyzed post hoc, was similar with EVR/TAC or EVR/CsA versus MPA/TAC. The difference in the incidence of treatment failure (biopsy-proven acute rejection, graft loss or death) was not significant for EVR/TAC but was significant for EVR/CsA versus MPA/TAC. Most biopsy-proven acute rejection events in this study were graded mild (Banff IA). There were no differences in proteinuria between groups. Cytomegalovirus and BK virus infections were significantly more frequent with MPA/TAC. Thus, everolimus with tacrolimus or CsA showed comparable efficacy to MPA/TAC in de novo kidney transplant patients. Non-inferiority of the pre-specified renal function endpoint was not shown, but the mean increase in eGFR from months 1 to 12 was comparable to MPA/TAC.
Introduction: Epoxyeicosatrienoic acids (EETs) are able to enhance angiogenesis and regulate inflammation that is especially important in wound healing under ischemic conditions. Thus, we evaluated the effect of local EET application on ischemic wounds in mice.
Methods: Ischemia was induced by cauterization of two of the three vessels supplying the mouse ear. Wounding of the ear was performed three days later. Wounds were treated with either 11,12-EET or 14,15-EET and compared to untreated control and normal wounds. Epithelialization was measured every second day. VEGF, TNF-α, TGF-β, matrix metalloproteinases (MMP), tissue inhibitors of metalloproteinases (TIMP), Ki67, and SDF-1α were evaluated immunohistochemically in wounds on days 3, 6, and 9.
Results: Ischemia delayed wound closure (12.8 ± 1.9 days (standard deviation, SD) for ischemia and 8.0 ± 0.94 days for control). Application of 11,12- and 14,15-EET ameliorated the deteriorated wound healing on ischemic ears (7.6 ± 1.3 days for 11,12-EET and 9.2 ± 1.4 days for 14,15-EET). Ischemia did not change VEGF, TNF-α, TGF-β, SDF-1α, TIMP, MMP7 or MMP9 levels significantly compared to control. Local application of 11,12- as well as 14,15-EET induced a significant elevation of VEGF, TGF-β, and SDF-1α expression, as well as proliferation, during the whole phase of wound healing compared to control and ischemia alone.
Conclusion: In summary, EETs improve the impaired wound healing caused by ischemia, as they enhance neovascularization and alter the inflammatory response in wounds. Elevating levels of lipid mediators such as 11,12- and 14,15-EET in wounds might therefore be a successful strategy for ameliorating deranged wound healing under ischemia.
Lipoxygenases (LOXs) catalyze the stereo-specific peroxidation of polyunsaturated fatty acids (PUFAs) to their corresponding hydroperoxy derivatives. Human macrophages express two arachidonic acid (AA) 15-lipoxygenating enzymes classified as ALOX15 and ALOX15B. ALOX15, which was first described in 1975, has been extensively characterized and its biological functions have been investigated in a number of cellular systems and animal models. In macrophages, ALOX15 functions to generate specific phospholipid (PL) oxidation products crucial for orchestrating the nonimmunogenic removal of apoptotic cells (ACs) as well as synthesizing precursor lipids required for production of specialized pro-resolving mediators (SPMs) that facilitate inflammation resolution. The discovery of ALOX15B in 1997 was followed by comprehensive analyses of its structural properties and reaction specificities with PUFA substrates. Although its enzymatic properties are well described, the biological functions of ALOX15B are not fully understood. In contrast to ALOX15 whose expression in human monocyte-derived macrophages is strictly dependent on Th2 cytokines IL-4 and IL-13, ALOX15B is constitutively expressed. This review aims to summarize the current knowledge on the regulation and functions of ALOX15 and ALOX15B in human macrophages.
Purpose: To examine whether applying case management in general practices reduces thromboembolic events requiring hospitalization and major bleeding events (combined primary outcome). Secondary endpoints were mortality, frequency and duration of hospitalization, severe treatment interactions, adverse events, quality of anticoagulation, health-related quality of life and intervention costs, patients’ assessment of chronic illness care, self-reported adherence to medication, GP and HCA knowledge, patient knowledge and satisfaction with shared decision-making.
Methods: Cluster-randomized controlled trial undertaken at 52 general practices in Germany with adult patients with a long-term indication for oral anticoagulation. The complex intervention included training for healthcare assistants, information and quality circles for general practitioners and 24 months of case management for patients. Assessment was after 12 and 24 months. The intention-to-treat population included all randomized practices and patients, while the per-protocol analysis included only those that received treatment without major protocol violations.
Results: The mean (SD) age of the 736 patients was 73.5 (9.4) years and 597 (81.1%) had atrial fibrillation. After 24 months, the primary endpoint had occurred in 40 (11.0%) intervention and 48 (12.9%) control patients (hazard ratio 0.83, 95% CI 0.55 to 1.25; P = .37). Patients' perceived quality of care, their knowledge, and HCAs' knowledge had improved significantly at 24 months. The other secondary endpoints did not differ between groups. In the intervention group, hospital admissions were significantly reduced in patients who received treatment without major protocol deviations.
Conclusions: Even though the main outcomes did not differ significantly, the intervention appears to have positively influenced several process parameters under "real-world conditions".
Diagnosing and treating acute severe and recurrent antivenom-related anaphylaxis (ARA) is challenging, and reported experience is limited. Herein, we describe our experience of severe ARA in patients with neurotoxic snakebite envenoming in Nepal. Patients were enrolled in a randomised, double-blind trial of high- vs. low-dose antivenom, given by intravenous (IV) push, followed by infusion. Training in ARA management emphasised stopping antivenom and giving intramuscular (IM) adrenaline, IV hydrocortisone, and IV chlorphenamine at the first sign/s of ARA. Later, IV adrenaline infusion (IVAI) was introduced for patients with antecedent ARA requiring additional antivenom infusions. Preantivenom subcutaneous adrenaline (SCAd) was introduced in the second study year (2012). Of 155 envenomed patients who received ≥ 1 antivenom dose, 13 (8.4%), three children (aged 5−11 years) and 10 adults (18−52 years), developed clinical features consistent with severe ARA, including six with overlapping signs of severe envenoming. Four and nine patients received low- and high-dose antivenom, respectively, and six had received SCAd. Principal signs of severe ARA were dyspnoea alone (n=5 patients), dyspnoea with wheezing (n=3), hypotension (n=3), shock (n=3), restlessness (n=3), respiratory/cardiorespiratory arrest (n=7), and early (n=1) and late laryngeal oedema (n=1); rash was associated with severe ARA in 10 patients. Four patients were given IVAI. Of the 8 (5.1%) deaths, three occurred in transit to hospital. Severe ARA was common and recurrent and had overlapping signs with severe neurotoxic envenoming. Optimising the management of ARA at different health system levels needs more research. This trial is registered with NCT01284855.
Background: Computerized virtual patients (VP) have spread into many areas of healthcare delivery and medical education. They provide various advantages, such as flexibility in the pace and space of learning, a high degree of teaching reproducibility and cost effectiveness. However, the educational benefit of VP as an additive or alternative to traditional teaching formats remains unclear. Moreover, there are no randomized controlled studies that have investigated the use of VP in a dental curriculum. Therefore, this study investigates VP as an alternative to lecturer-led small-group teaching in a curricular, randomized and controlled setting.
Methods: Randomized and controlled cohort study. Four VP cases were created according to previously published design principles and compared with lecturer-led small group teaching (SGT) within the Oral and Maxillofacial Surgery clerkship for dental students at the Department for Cranio-, Oral and Maxillofacial Plastic Surgery, Goethe University, Frankfurt, Germany. Clinical competence was measured prior (T0), directly (T1) and 6 weeks (T2) after the intervention using theoretical tests and a self-assessment questionnaire. Furthermore, VP design was evaluated using a validated toolkit.
Results: Fifty-seven students (VP = 32; SGT = 25) agreed to participate in the study. No competence differences were found at T0 (p = 0.56). The VP group outperformed (p < .0001) the SGT group at T1. At T2 there was no difference between both groups (p = 0.55). Both interventions led to a significant growth in self-assessed competence. The VP group felt better prepared to diagnose and treat real patients and regarded VP cases as a rewarding learning experience.
Conclusions: VP cases are an effective alternative to lecturer-led SGT in terms of learning efficacy in the short and long term, as well as self-assessed competence growth and student satisfaction. Furthermore, integrating VP cases within a curricular Oral and Maxillofacial Surgery clerkship is feasible and leads to substantial growth of clinical competence in undergraduate dental students.
Purpose: To determine whether machine learning assisted-texture analysis of multi-energy virtual monochromatic image (VMI) datasets from dual-energy CT (DECT) can be used to differentiate metastatic head and neck squamous cell carcinoma (HNSCC) lymph nodes from lymphoma, inflammatory, or normal lymph nodes.
Materials and methods: A retrospective evaluation of 412 cervical nodes from 5 different patient groups (50 patients in total) having undergone DECT of the neck between 2013 and 2015 was performed: (1) HNSCC with pathology proven metastatic adenopathy, (2) HNSCC with pathology proven benign nodes (controls for (1)), (3) lymphoma, (4) inflammatory, and (5) normal nodes (controls for (3) and (4)). Texture analysis was performed with TexRAD® software using two independent sets of contours to assess the impact of inter-rater variation. Two machine learning algorithms (Random Forests (RF) and Gradient Boosting Machine (GBM)) were used with independent training and testing sets and determination of accuracy, sensitivity, specificity, PPV, NPV, and AUC.
Results: In the independent testing (prediction) sets, the accuracy for distinguishing different groups of pathologic nodes or normal nodes ranged between 80 and 95%. The models generated using texture data extracted from the independent contour sets had substantial to almost perfect agreement. The accuracy, sensitivity, specificity, PPV, and NPV for correctly classifying a lymph node as malignant (i.e. metastatic HNSCC or lymphoma) versus benign were 92%, 91%, 93%, 95%, 87%, respectively.
Conclusion: Machine learning assisted-DECT texture analysis can help distinguish different nodal pathology and normal nodes with a high accuracy.
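The performance measures quoted above follow directly from a 2×2 confusion matrix. As a small illustration, the helper below computes them from true/false positive and negative counts; the counts are invented for this example, not the study's data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),  # true-positive rate (recall)
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for a malignant-vs-benign node classifier.
m = diagnostic_metrics(tp=91, fp=7, tn=93, fn=9)
```

Because PPV and NPV depend on the class mix of the test set, they can differ markedly from sensitivity and specificity when one class is rare, which is why studies report both families of metrics.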
Objectives: Current knowledge on infections caused by Scedosporium spp. and Lomentospora prolificans in children is scarce. We therefore aimed to provide an overview of risk groups, clinical manifestations and treatment strategies for these infections.
Methods: Pediatric patients (age ≤18 years) with proven/probable Scedosporium spp. or L. prolificans infection were identified in PubMed and the FungiScope® registry. Data on diagnosis, treatment and outcome were collected.
Results: Fifty-five children (median age 9 years [IQR: 5–14]) with invasive Scedosporium spp. (n = 33) or L. prolificans (n = 22) infection were identified between 1990 and 2019. Malignancy, trauma and near drowning were the most common risk factors. Infections were frequently disseminated. Most patients received systemic antifungal therapy, mainly voriconazole and amphotericin B, plus surgical treatment.
Overall, day-42 mortality was 31%, higher for L. prolificans (50%) than for Scedosporium spp. (18%). L. prolificans infection was associated with a shorter median survival time compared to Scedosporium spp. (6 days [IQR: 3–28] versus 61 days [IQR: 16–148]). Treatment for malignancy and severe disseminated infection were associated with particularly poor outcomes (HR 8.33 [95% CI 1.35–51.40] and HR 6.12 [95% CI 1.52–24.66], respectively). Voriconazole use at any time and surgical intervention in addition to antifungal treatment were associated with improved clinical outcome (HR 0.33 [95% CI 0.11–0.99] and HR 0.09 [95% CI 0.02–0.40], respectively).
Conclusions: Scedosporium spp. and L. prolificans infections in children are associated with high mortality despite comprehensive antifungal therapy. Voriconazole usage and surgical intervention are associated with successful outcome.