Background: Intensive care resources are heavily utilized during the COVID-19 pandemic. However, risk stratification and prediction of SARS-CoV-2 patient clinical outcomes upon ICU admission remain inadequate. This study aimed to develop a machine learning model, based on retrospective and prospective clinical data, to stratify patient risk and predict ICU survival and outcomes. Methods: A Germany-wide electronic registry was established to pseudonymously collect admission, therapeutic and discharge information of SARS-CoV-2 ICU patients retrospectively and prospectively. Machine learning approaches were evaluated for the accuracy and interpretability of predictions. The Explainable Boosting Machine approach was selected as the most suitable method. Individual, non-linear shape functions for predictive parameters and parameter interactions are reported. Results: 1039 patients were included in the Explainable Boosting Machine model, 596 collected retrospectively and 443 prospectively. The model for prediction of general ICU outcome was more reliable in predicting “survival”. Age, inflammatory and thrombotic activity, and severity of ARDS at ICU admission were predictive of ICU survival. Patients’ age, pulmonary dysfunction and transfer from an external institution were predictors for ECMO therapy. The interaction of patient age with D-dimer levels on admission, and of creatinine levels with SOFA score without GCS, were predictors for renal replacement therapy. Conclusions: Using Explainable Boosting Machine analysis, we confirmed and weighed previously reported predictors and identified novel predictors for outcome in critically ill COVID-19 patients. Using this strategy, predictive modeling of COVID-19 ICU patient outcomes can be performed while overcoming the limitations of linear regression models. Trial registration: ClinicalTrials.gov, NCT04455451.
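An Explainable Boosting Machine is a generalized additive model whose prediction is a sum of learned, non-linear shape functions, one per input parameter. The NumPy sketch below is purely illustrative of that idea (it is not the study's pipeline, and real EBM implementations use bagged, gradient-boosted trees per feature plus pairwise interaction terms); it learns histogram-binned shape functions by cyclic boosting on the residuals:

```python
import numpy as np

def fit_additive_model(X, y, n_bins=8, n_rounds=20, lr=0.5):
    """Toy EBM-style additive model: one binned shape function per
    feature, trained by cyclic boosting on the residuals."""
    n, d = X.shape
    # quantile bin edges per feature, and each sample's bin index
    edges = [np.quantile(X[:, j], np.linspace(0, 1, n_bins + 1)) for j in range(d)]
    bins = [np.clip(np.searchsorted(edges[j][1:-1], X[:, j]), 0, n_bins - 1)
            for j in range(d)]
    shapes = [np.zeros(n_bins) for _ in range(d)]
    intercept = float(y.mean())
    pred = np.full(n, intercept)
    for _ in range(n_rounds):
        for j in range(d):            # cycle over features
            resid = y - pred
            for b in range(n_bins):   # nudge each bin toward its mean residual
                mask = bins[j] == b
                if mask.any():
                    step = lr * resid[mask].mean()
                    shapes[j][b] += step
                    pred[mask] += step
    return intercept, edges, shapes

def predict(model, X):
    """Sum the intercept and each feature's shape-function value."""
    intercept, edges, shapes = model
    out = np.full(len(X), intercept)
    for j, (e, s) in enumerate(zip(edges, shapes)):
        b = np.clip(np.searchsorted(e[1:-1], X[:, j]), 0, len(s) - 1)
        out += s[b]
    return out
```

Plotting each entry of `shapes` against its bin edges yields the kind of interpretable, per-parameter risk curves the abstract refers to.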
Background: The alpha7 nicotinic acetylcholine receptor (Chrna7) plays an essential anti-inflammatory role in immune homeostasis and was recently found on mast cells (MC). Psychosocial stress can trigger MC hyperactivation and increases pro-inflammatory cytokines in target tissues such as the skin. Whether the cholinergic system (CS) and Chrna7 ligands play a role in these cascades is largely unknown. Objective: To elucidate the role of the CS in the response to psychosocial stress using a mouse model for stress-triggered cutaneous inflammatory circuits. Methods: Key CS markers (ACh, Ch, SLURP-1, SLURP-2, Lynx1, Chrm3, Chrna7, Chrna9, ChAT, VAChT, Oct3, AChE, and BChE) in skin and its MC (sMC), MC activation, immune parameters (TNFα, IL1β, IL10, TGFβ, HIF1α, and STAT3) and oxidative stress were analyzed in skin from 24 h noise-stressed mice and in cultured MC (cMC) from C57BL/6 or Chrna7-knockout mice. Results: First, Chrna7 and SLURP-1 mRNA were exclusively upregulated in stressed skin. Second, histomorphometry located Chrna7 and SLURP-1 in nerves and sMC and demonstrated upregulated contacts and increased Chrna7+ sMC in stressed skin, while 5 ng/mL SLURP-1 degranulated cMC. Third, IL1β+ sMC were high in stressed skin, and while SLURP-1 alone had no significant effect on cMC cytokines, it upregulated IL1β in cMC from Chrna7-KO and in IL1β-treated wildtype cMC. In addition, HIF1α+ sMC were high in stressed skin, and the Chrna7-agonist AR-R 17779 induced ROS in cMC, while SLURP-1 upregulated TNFα and IL1β in cMC when HIF1α was blocked. Conclusions: These data suggest that the CS plays a role in the regulation of stress-sensitive inflammatory responses but may have a surprising pro-inflammatory effect in healthy skin, driving IL1β expression if SLURP-1 is involved.
Background: Polytraumatized patients undergo strong immunological stress upon insult. Phagocytes (granulocytes and monocytes) play a substantial role in the immunological defense against bacteria, fungi and yeast, and in the clearance of cellular debris after tissue injury. We have previously reported reduced monocyte phagocytic activity early after porcine polytrauma. However, it is unknown whether both phagocyte types undergo these functional alterations, and whether there is pathogen-specific phagocytic behavior. We characterized the phagocytic activity and capacity of granulocytes and monocytes after polytrauma.
Methods: Eight pigs (Sus scrofa) underwent polytrauma consisting of lung contusion, liver laceration, tibial fracture and hemorrhagic shock with fluid resuscitation and fracture fixation with an external fixator. Intensive care treatment including mechanical ventilation for 72 h followed. Phagocytic activity and capacity were investigated using ex vivo whole blood stimulation phagocytosis assays before trauma, after surgery, and 24, 48, and 72 h after trauma. Blood samples were stimulated with phorbol-12-myristate-13-acetate and incubated with FITC-labeled E. coli, S. aureus or S. cerevisiae for phagocytosis assessment by flow cytometry.
Results: The early polytrauma-induced significant increase in granulocyte and monocyte counts declined to baseline values within 24 h. The percentage of E. coli-phagocytizing granulocytes significantly decreased after polytrauma and during further intensive care treatment, while their capacity significantly increased. Interestingly, both granulocytic phagocytic activity and capacity for S. aureus significantly decreased after trauma; a recovery was observed after 24 h, followed by another decrease. The percentage of S. cerevisiae-phagocytizing granulocytes significantly increased after 24 h, while their capacity was impaired after surgery and at 72 h. The monocytic E. coli-phagocytizing percentage did not change, while the capacity increased after 24–72 h. After a significant decrease in S. aureus-phagocytizing monocytes after surgery, a significant increase after 24 and 48 h was observed without alterations in capacity. No significant changes in S. cerevisiae-phagocytizing monocytes occurred, but their capacity dropped at 48 and 72 h.
Conclusion: The phagocytic activity and capacity of granulocytes and monocytes follow different patterns and change significantly within 72 h after polytrauma. Both phagocytic activity and capacity show significantly different alterations depending on the pathogen, potentially pointing to certain, possibly more relevant, causes of infection after polytrauma.
Introduction: In an emergency department, the majority of pediatric trauma patients present because of minor injuries. The aim of this study was to evaluate temporal changes in age-related injury pattern, trauma mechanism, and surgeries in pediatric patients. Methods: This retrospective study included patients < 18 years of age following trauma from 01/2009 to 12/2018 at a level I trauma center. They were divided into two groups: group A (01/2009 to 12/2013) and group B (01/2014 to 12/2018). Injury mechanism, injury pattern, and surgeries were analyzed. Fractures, dislocations, and organ injuries were defined as major injuries; contusions and superficial wounds as minor injuries. Results: 23,582 patients were included (58% male, median age 8.2 years). There was a slight increase in patients comparing A (n = 11,557) and B (n = 12,025), with no difference in demographic characteristics. Significantly more patients were admitted to the resuscitation room (A: 1.9%; B: 2.4%), though the number of multiply injured patients was not significantly different. Major injuries occurred significantly less frequently in A (25.5%) than in B (27.0%); minor injuries occurred equally often. Extremity fractures were significantly more frequent in B (21.5%) than in A (20.2%), peaking at 8–12 years. Most trauma mechanisms were constant across both groups, with a rise in sport injuries at 8–12 years. Conclusion: Although the number of patients increased only slightly over a decade, there was a clear increase in major injuries, particularly extremity fractures, peaking at 8–12 years. At this age, sport accidents also increased significantly. Finally, admissions to the resuscitation room rose, but without an increase in multiply injured patients.
Purpose: While more advanced COVID-19 necessitates medical interventions and hospitalization, patients with mild COVID-19 do not require these. Identifying patients at risk of progressing to advanced COVID-19 might guide treatment decisions, particularly for better prioritizing patients in need of hospitalization.
Methods: We developed a machine learning-based predictor for deriving a clinical score identifying patients with asymptomatic/mild COVID-19 at risk of progressing to advanced COVID-19. Clinical data from SARS-CoV-2 positive patients from the multicenter Lean European Open Survey on SARS-CoV-2 Infected Patients (LEOSS) were used for discovery (2020-03-16 to 2020-07-14) and validation (data from 2020-07-15 to 2021-02-16).
Results: The LEOSS dataset contains 473 baseline patient parameters measured at the first patient contact. After training the predictor model on a training dataset comprising 1233 patients, 20 of the 473 parameters were selected for the predictor model. From the predictor model, we delineated a composite predictive score (SACOV-19, Score for the prediction of an Advanced stage of COVID-19) with eleven variables. In the validation cohort (n = 2264 patients), we observed good prediction performance with an area under the curve (AUC) of 0.73 ± 0.01. Besides temperature, age, body mass index and smoking habit, variables indicating pulmonary involvement (respiration rate, oxygen saturation, dyspnea), inflammation (CRP, LDH, lymphocyte counts), and acute kidney injury at diagnosis were identified. For better interpretability, the predictor was translated into a web interface.
Conclusion: We present a machine learning-based predictor model and a clinical score for identifying patients at risk of developing advanced COVID-19.
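The reported AUC of 0.73 can be read as the probability that a randomly chosen patient who progressed to advanced COVID-19 received a higher score than a randomly chosen patient who did not. A minimal, self-contained computation of this rank-based (Mann-Whitney) formulation of the AUC — illustrative only, not the LEOSS analysis code:

```python
import numpy as np

def roc_auc(y_true, scores):
    """AUC via the Mann-Whitney formulation: the probability that a
    random positive outranks a random negative, counting ties as 1/2."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, dtype=float)
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    # explicit pairwise comparison; O(n_pos * n_neg), fine for small n
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# e.g. roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]) -> 0.75
```

An AUC of 0.5 corresponds to a score with no discriminative value, 1.0 to perfect separation of progressing and non-progressing patients.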
Large-scale molecular profiling studies in recent years have shown that central nervous system (CNS) tumors display a much greater heterogeneity in terms of molecularly distinct entities, cellular origins and genetic drivers than anticipated from histological assessment. DNA methylation profiling has emerged as a useful tool for robust tumor classification, providing new insights into these heterogeneous molecular classes. This is particularly true for rare CNS tumors with a broad morphological spectrum, which cannot be assigned to separate entities based on histological similarity alone. Here, we describe a molecularly distinct subset of predominantly pediatric CNS neoplasms (n = 60) that harbor PATZ1 fusions. The original histological diagnoses of these tumors covered a wide spectrum of tumor types and malignancy grades. While the single most common diagnosis was glioblastoma (GBM), clinical data of the PATZ1-fused tumors showed a better prognosis than typical GBM, despite frequent relapses. RNA sequencing revealed recurrent MN1:PATZ1 or EWSR1:PATZ1 fusions related to (often extensive) copy number variations on chromosome 22, where PATZ1 and the two fusion partners are located. These fusions have individually been reported in a number of glial/glioneuronal tumors, as well as extracranial sarcomas. We show here that they are more common than previously acknowledged, and together define a biologically distinct CNS tumor type with high expression of neural development markers such as PAX2, GATA2 and IGF2. Drug screening performed on the MN1:PATZ1 fusion-bearing KS-1 brain tumor cell line revealed preliminary candidates for further study. In summary, PATZ1 fusions define a molecular class of histologically polyphenotypic neuroepithelial tumors, which show an intermediate prognosis under current treatment regimens.
Background: Liver cirrhosis is a relevant comorbidity with increasing prevalence. Postoperative decompensation and the development of complications in patients with cirrhosis remain a frequent clinical problem. Surgery has been discussed as a precipitating event for decompensation and complications of cirrhosis, but the underlying pathomechanisms are still obscure. The aim of this study was to analyze the role of abdominal extrahepatic surgery in cirrhosis on portal pressure and fibrosis in a preclinical model. Methods: Compensated liver cirrhosis was induced using carbon tetrachloride (CCl4) inhalation and bile duct ligation (BDL) models in rats, and non-cirrhotic portal hypertension by partial portal vein ligation (PPVL). Intestinal manipulation (IM) was performed as a model of extrahepatic abdominal surgery. Portal pressure was measured in vivo 2 and 7 days after IM. Hydroxyproline measurements, Sirius red staining and qPCR measurements of the liver were performed to evaluate fibrosis development and hepatic inflammation. Laboratory parameters of liver function in serum were analyzed. Results: Portal pressure was significantly elevated 2 and 7 days after IM in both models of cirrhosis. In the non-cirrhotic model the trend was the same, while not statistically significant. In both cirrhotic models, IM induced strong signs of decompensation, with significant weight loss, elevation of liver enzymes and hypoalbuminemia. 7 days after IM in the BDL group, Sirius red staining and hydroxyproline levels showed significant progression of fibrosis, and mRNA levels of hepatic inflammation were significantly elevated compared to the respective control group. A progression of fibrosis was not observed in the CCl4 model. Conclusion: In animal models of cirrhosis with continuous liver injury (BDL), IM increases portal pressure and promotes the development of fibrosis.
Perioperative portal pressure and hence inflammation processes may be therapeutic targets to prevent post-operative decompensation in cirrhosis.
Epoxyeicosatrienoic acids (EET) facilitate regeneration in different tissues, and their benefit in dermal wound healing has been proven under normal conditions. In this study, we investigated the effect of 11,12 EET on dermal wound healing in diabetes. We induced diabetes by i.p. injection of streptozotocin 2 weeks prior to wound creation on the dorsal side of the mouse ear. 11,12 EET was applied every second day on the wound, whereas the control groups received only solvent. Epithelialization was monitored intravitally every second day up to wound closure. Wounds were stained for VEGF, CD31, TGF-β, TNF-α, SDF-1α, NF-κB, and Ki-67, and fibroblasts were counted after hematoxylin-eosin staining on days 3, 6, 9, and 16 after wounding. After induction of diabetes, wounds closed on day 13.00 ± 2.20 standard deviation (SD). Local 11,12 EET application improved wound closure significantly, to day 8.40 ± 1.39 SD. EET treatment enhanced VEGF and CD31 expression in wounds on day 3. It also seemed to raise TNF-α levels on all days investigated, as well as TGF-β levels on days 3 and 6. A decrease in NF-κB could be observed on days 9 and 16 after EET application. The latter findings were not significant. SDF-1α expression was not influenced by EET application, and Ki-67 staining was significantly lower in the EET group on day 9 after EET application. The number of fibroblasts was significantly increased on day 9 after 11,12 EET application. 11,12 EET improves impaired wound healing in diabetes by enhancing neoangiogenesis, especially in the early phase of wound healing. Furthermore, it contributes to the resolution of the initial inflammatory reaction, allowing the crucial transition from the inflammatory to the proliferative phase of wound healing.
Aims: Patients with cardiovascular comorbidities have a significantly increased risk for a critical course of COVID-19. As the SARS-CoV-2 virus enters cells via angiotensin-converting enzyme 2 (ACE2), drugs which interact with the renin-angiotensin-aldosterone system (RAAS) were suspected to influence disease severity.
Methods and results: We analyzed 1946 consecutive patients with cardiovascular comorbidities or hypertension enrolled in one of the largest European COVID-19 registries, the Lean European Open Survey on SARS-CoV-2 (LEOSS) registry. Here, we show that angiotensin II receptor blocker intake is associated with decreased mortality in patients with COVID-19 [OR 0.75 (95% CI 0.59–0.96; p = 0.013)]. This effect was mainly driven by patients who presented in an early phase of COVID-19 at baseline [OR 0.64 (95% CI 0.43–0.96; p = 0.029)]. Kaplan-Meier analysis revealed a significantly lower incidence of death in patients on an angiotensin receptor blocker (ARB) (n = 33/318; 10.4%) compared to patients using an angiotensin-converting enzyme inhibitor (ACEi) (n = 60/348; 17.2%) or patients who received neither an ACE inhibitor nor an ARB at baseline in the uncomplicated phase (n = 90/466; 19.3%; p < 0.034). Patients taking an ARB significantly less frequently reached the mortality-predicting thresholds for leukocytes (p < 0.001), neutrophils (p = 0.002) and the inflammatory markers CRP (p = 0.021), procalcitonin (p = 0.001) and IL-6 (p = 0.049). ACE2 expression levels in human lung samples were not altered in patients taking RAAS modulators.
Conclusion: These data suggest a beneficial effect of ARBs on disease severity in patients with cardiovascular comorbidities and COVID-19, which is linked to dampened systemic inflammatory activity.
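From the raw counts reported above (33/318 deaths under ARB vs. 90/466 under neither ACEi nor ARB), a crude, unadjusted odds ratio with a Wald confidence interval can be reproduced; note that this crude estimate differs from the adjusted OR of 0.75 obtained from the study's multivariable model. A minimal stdlib sketch:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI for a 2x2 table:
    a/b = deaths/survivors in group 1, c/d = deaths/survivors in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Counts reported above: ARB, 33 deaths among 318 patients;
# neither ACEi nor ARB, 90 deaths among 466 patients.
or_, lo, hi = odds_ratio_ci(33, 318 - 33, 90, 466 - 90)
# crude OR ~ 0.48 (95% CI ~ 0.32-0.74); the adjusted OR reported
# from the multivariable model in the text is 0.75.
```

The gap between the crude and adjusted estimates reflects the covariates (age, comorbidities, disease phase) the multivariable model controls for.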
Background and purpose: During acute coronavirus disease 2019 (COVID-19) infection, neurological signs, symptoms and complications occur. We aimed to assess their clinical relevance by evaluating real-world data from a multinational registry. Methods: We analyzed COVID-19 patients from 127 centers, diagnosed between January 2020 and February 2021, and registered in the European multinational LEOSS (Lean European Open Survey on SARS-CoV-2 Infected Patients) registry. The effects of prior neurological diseases and the effect of neurological symptoms on outcome were studied using multivariate logistic regression. Results: A total of 6537 COVID-19 patients (97.7% PCR-confirmed) were analyzed, of whom 92.1% were hospitalized and 14.7% died. Commonly reported were excessive tiredness (28.0%), headache (18.5%), nausea/emesis (16.6%), muscular weakness (17.0%), impaired sense of smell (9.0%) and taste (12.8%), and delirium (6.7%). In patients with a complicated or critical disease course (53%), the most frequent neurological complications were ischemic stroke (1.0%) and intracerebral bleeding (ICB; 2.2%). ICB peaked in the critical disease phase (5%) and was associated with the administration of anticoagulation and extracorporeal membrane oxygenation (ECMO). Excessive tiredness (odds ratio [OR] 1.42, 95% confidence interval [CI] 1.20–1.68) and prior neurodegenerative diseases (OR 1.32, 95% CI 1.07–1.63) were associated with an increased risk of an unfavorable outcome. Prior cerebrovascular and neuroimmunological diseases were not associated with an unfavorable short-term outcome of COVID-19. Conclusion: Our data on mostly hospitalized COVID-19 patients show that excessive tiredness or prior neurodegenerative disease at first presentation increase the risk of an unfavorable short-term outcome.
ICB in critical COVID-19 was associated with therapeutic interventions, such as anticoagulation and ECMO, and thus may be an indirect complication of a life-threatening systemic viral infection.
Simple Summary: Acute myeloid leukemia (AML) is a genetically heterogeneous disease. Clinical phenotypes of frequent mutations and their impact on patient outcome are well established. However, the role of rare mutations often remains elusive. We retrospectively analyzed 1529 newly diagnosed and intensively treated AML patients for mutations of BCOR and BCORL1. We report a distinct co-mutational pattern that suggests a role in disease progression rather than initiation, especially affecting mechanisms of DNA-methylation. Further, we found loss-of-function mutations of BCOR to be independent markers of poor outcomes in multivariable analysis. Therefore, loss-of-function mutations of BCOR need to be considered for AML management, as they may influence risk stratification and subsequent treatment allocation.
Abstract: Acute myeloid leukemia (AML) is characterized by recurrent genetic events. The BCL6 corepressor (BCOR) and its homolog, the BCL6 corepressor-like 1 (BCORL1), have been reported to carry rare but recurrent mutations in AML. Previously, smaller studies have reported conflicting results regarding impacts on outcomes. Here, we retrospectively analyzed a large cohort of 1529 patients with newly diagnosed and intensively treated AML. BCOR and BCORL1 mutations were found in 71 (4.6%) and 53 patients (3.5%), respectively. Frequently co-mutated genes were DNMT3A, TET2 and RUNX1. Mutated BCORL1 and loss-of-function mutations of BCOR were significantly more common in the ELN2017 intermediate-risk group. Patients harboring loss-of-function mutations of BCOR had a significantly reduced median event-free survival (HR = 1.464 (95% confidence interval (CI): 1.005–2.134), p = 0.047) and relapse-free survival (HR = 1.904 (95% CI: 1.163–3.117), p = 0.01), and a trend toward reduced overall survival (HR = 1.495 (95% CI: 0.990–2.258), p = 0.056) in multivariable analysis. Our study establishes a novel role for loss-of-function mutations of BCOR in risk stratification in AML, which may influence treatment allocation.
Introduction: Stem cell transplantation is one of the most promising strategies to improve healing in chronic wounds, as systemic administration of endothelial progenitor cells (EPC) enhances healing by promoting neovascularization and homing, though a high number of cells is needed. In the following study, we analysed whether local application can reduce the number of EPC needed to achieve the same beneficial effect on wound healing.
Material and Methods: Wound healing after local or systemic treatment with EPC was monitored in vivo by creating standardized wounds on the dorsum of hairless mice and measuring wound closure every second day. The systemic group received 2 × 10⁶ EPC i.v. and the locally treated group 2 × 10⁵ EPC, injected locally. As control, PBS was injected in the same way. Expression of CD31, VEGF, CD90 and SDF-1α was analysed immunohistochemically to evaluate neovascularisation and amelioration of homing.
Results: Local (7.1 ± 0.45 SD) as well as systemic (6.1 ± 0.23 SD) EPC transplantation led to a significant acceleration of wound closure compared to controls (PBS local: 9.7 ± 0.5 SD; PBS systemic: 10.9 ± 0.38 SD). Systemic application enhanced CD31 expression on day 6 after wounding, and local EPC treatment on days 6 and 9, in comparison to control. VEGF expression was not significantly affected. Systemic and local EPC treatment resulted in significantly enhanced SDF-1α and CD90 expression on all days investigated.
Conclusion: Local as well as systemic EPC treatment enhances wound healing. Moreover, beneficial effects are obtained with a tenfold lower number of EPC when applied locally. Thus, local EPC treatment might be the more convenient way to enhance wound healing, as the number of available progenitor cells is limited.
Background: The objective of the STREAM Trial was to evaluate the effect of simulation training on process times in acute stroke care.
Methods: The multicenter prospective interventional STREAM Trial was conducted between 10/2017 and 04/2019 at seven tertiary care neurocenters in Germany with a pre- and post-interventional observation phase. We recorded patient characteristics, acute stroke care process times, stroke team composition and simulation experience for consecutive direct-to-center patients receiving intravenous thrombolysis (IVT) and/or endovascular therapy (EVT). The intervention consisted of a composite intervention centered around stroke-specific in situ simulation training. Primary outcome measure was the ‘door-to-needle’ time (DTN) for IVT. Secondary outcome measures included process times of EVT and measures taken to streamline the pre-existing treatment algorithm.
Results: The effect of the STREAM intervention on the process times of all acute stroke operations was neutral. However, secondary analyses showed a DTN reduction of 5 min from 38 min pre-intervention (interquartile range [IQR] 25–43 min) to 33 min (IQR 23–39 min, p = 0.03) post-intervention achieved by simulation-experienced stroke teams. Concerning EVT, we found significantly shorter door-to-groin times in patients who were treated by teams with simulation experience as compared to simulation-naive teams in the post-interventional phase (−21 min, simulation-naive: 95 min, IQR 69–111 vs. simulation-experienced: 74 min, IQR 51–92, p = 0.04).
Conclusion: An intervention combining workflow refinement and simulation-based stroke team training has the potential to improve process times in acute stroke care.
Introduction: Current classifications of complete knee dislocations do not capture the extent of the complex concomitant ligamentous and bony injuries, which may have an impact on future outcomes. The purpose of this retrospective study was to evaluate the epidemiology of complete knee dislocations and to present an updated classification system based on the authors’ experience at a Level-I trauma center.
Materials and methods: Only patients aged ≥ 18 years with complete loss of contact of the articulating bones who were admitted to our level-I trauma center between 2002 and 2019 were included. Patients were identified using a retrospective systematic query in the Hospital Information System (HIS) using International Statistical Classification of Diseases and Related Health Problems, Version 10 (ICD-10) codes of the German Diagnosis Related Groups (G-DRG).
Results: Final data included 80 patients, with the majority being male (n = 64; 80.0%). Mean age was 34.9 years (range: 18–70 years). External protective fixation was applied in 32 patients (40.0%). Reconstruction of the posterior cruciate ligament and the anterior cruciate ligament was performed in 56.3% (n = 45) and 55.0% (n = 44) of cases, respectively. The lateral collateral ligament complex was surgically addressed in 47.5% (n = 38), while the medial collateral ligament complex was reconstructed in 40% (n = 32). Surgery of the lateral meniscus and the medial meniscus was needed in 31.1% (n = 25) and 30.0% (n = 24), respectively. Neurovascular surgery was required in 13.8% (n = 11). Based on the characteristic injury patterns, the authors of this study present a new classification system that ranks the injuries from Grade A to Grade D according to their severity.
Conclusion: This retrospective study demonstrates that the historically used classification systems for dislocations of the knee are insufficient for these severe injuries. Concomitant ligamentous, neurovascular, bony, and meniscal injuries were frequent, and required several staged procedures. Consequently, an updated classification system is proposed.
Evaluation of 2‑methoxyestradiol serum levels as a potential prognostic marker in malignant melanoma
(2021)
Experimental findings indicated that 2‑methoxyestradiol (2‑ME), an endogenous metabolite of 17β‑estradiol, may exhibit anti‑tumorigenic properties in various types of tumour, such as melanoma and endometrial carcinoma. In patients with endometrial cancer, the serum levels of 2‑ME are decreased compared with those in healthy controls, and this finding has been associated with a poor outcome. The aim of the present study was to examine whether the serum levels of 2‑ME are decreased in patients with melanoma, and whether this decrease may be correlated with disease stage and, therefore, serve as a prognostic indicator. ELISA was used to detect serum levels of 2‑ME in patients with stage I‑IV malignant melanoma (MM). A cohort of 78 patients with MM was analysed, along with 25 healthy controls, among whom 15 were women in the second trimester of pregnancy (positive control). As expected, significantly elevated levels of serum 2‑ME were observed in pregnant control patients compared with those in patients with MM and healthy controls. There was no observed correlation between 2‑ME serum levels in patients with MM and disease stage, tumour thickness, lactate dehydrogenase or S100 calcium‑binding protein B levels. In addition, the 2‑ME levels of patients with MM did not differ significantly from those of normal healthy controls. Overall, the findings of the present study indicated that the 2‑ME serum levels in patients with MM were not decreased, and there was no correlation with early‑ or advanced‑stage disease. Therefore, in contrast to published results on endometrial cancer, endogenous serum 2‑ME levels in MM were not found to be correlated with tumour stage and did not appear to be a suitable prognostic factor in MM.
Introduction Massive haemoptysis is a life-threatening event in advanced cystic fibrosis (CF) lung disease with bronchial artery embolisation (BAE) as standard of care treatment. The aim of our study was to scrutinise short-term and long-term outcomes of patients with CF and haemoptysis after BAE using coils.
Methods We carried out a retrospective cohort study of 34 adult patients treated for massive haemoptysis with super-selective bronchial artery coil embolisation (ssBACE) between January 2008 and February 2015. The embolisation protocol was restricted to the culprit vessel(s) and a maximum of three lobes. Demographic data, forced expiratory volume in 1 s in % predicted (FEV1% pred.) and body mass index before and after ssBACE, sputum colonisation, procedural data, time to transplant and time to death were documented.
Results Patients treated with ssBACE showed significant improvement of FEV1% pred. after embolisation (p=0.004), with 72.8% alive 5 years post-ssBACE. Mean age of the patients was 29.9 years (±7.7). Mean FEV1% pred. was 45.7% (±20.1). Median survival to follow-up was 75 months (0–125). The severe complication rate was 0%, the recanalisation rate 8.8% and the 5-year reintervention rate 58.8%. Chronic infection with Pseudomonas aeruginosa was found in 79.4%, Staphylococcus aureus in 50% and Aspergillus fumigatus in 47.1%.
Discussion ssBACE is a safe and effective treatment for massive haemoptysis in patients with CF, with good results for controlling haemostasis and excellent short-term and long-term survival, especially in severely affected patients with FEV1 < 40% pred. We think the data of our study support the use of coils and a protocol of careful and prudent embolisation.
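Survival figures such as "72.8% alive 5 years post-ssBACE" are typically derived with the Kaplan-Meier (product-limit) estimator, which accounts for patients censored before an event (here, alive at last contact or transplanted). A minimal NumPy sketch with made-up follow-up times, not the study data:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.  times: follow-up duration;
    events: 1 = death observed, 0 = censored.  Returns the distinct
    event times and S(t) evaluated just after each of them."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events)
    event_times = np.unique(times[events == 1])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = int((times >= t).sum())             # still under observation at t
        deaths = int(((times == t) & (events == 1)).sum())
        s *= 1 - deaths / at_risk                     # product-limit step
        surv.append(s)
    return event_times, np.array(surv)
```

Censored patients contribute to the at-risk denominator up to their last follow-up but never trigger a drop in the curve, which is what lets long-term survival be estimated from incompletely observed cohorts.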
As some cognitive functions decline in old age, the ability to decide about important life events such as medical treatment is endangered. Environmental support to improve the comprehension of health-related information is therefore necessary. With a small-scale explorative approach, the present survey study aimed at investigating person-environment fit (PE-fit) of support provided during medical consultations. This fit was calculated by assessing the match between aids provided by five medical practitioners during medical consultations and aids most appreciated by the geriatric patients (N = 88). The results showed that the largest discrepancies of used and appreciated aids could be found concerning the opportunity to discuss decisions with relatives, the possibility to take notes, the use of objects, pictures and a keyword list. Female patients indicated a lower PE-fit. These findings highlight discrepancies between the use of specific aids and the wishes of patients and call for thoughtful use of aids during consultations with geriatric patients.
Background: Clinical practice guidelines for patients with primary biliary cholangitis (PBC) have recently been revised and implement well-established criteria for response to standard first-line ursodeoxycholic acid (UDCA) therapy at 12 months after treatment initiation, for the early identification of high-risk patients with inadequate treatment response who may require treatment modification. However, there are only very limited data on the real-world clinical management of patients with PBC in Germany. Objective: The aim of this retrospective multicenter study was to evaluate response rates to standard first-line UDCA therapy and subsequent second-line treatment regimens in a large cohort of well-characterized patients with PBC from 10 independent hepatological referral centers in Germany, prior to the introduction of obeticholic acid as a licensed second-line treatment option. Methods: Diagnostic confirmation of PBC, standard first-line UDCA treatment regimens and response rates at 12 months according to Paris-I, Paris-II, and Barcelona criteria, the follow-up cut-off alkaline phosphatase (ALP) ≤ 1.67 × upper limit of normal (ULN), and normalization of bilirubin (bilirubin ≤ 1 × ULN) were retrospectively examined between June 1986 and March 2017. The management and hitherto applied second-line treatment regimens in patients with an inadequate response to UDCA, and subsequent response rates at 12 months, were also evaluated. Results: Overall, 480 PBC patients were included in this study. The median UDCA dosage was 13.2 mg UDCA/kg body weight/day. Adequate UDCA treatment response rates according to Paris-I, Paris-II, and Barcelona criteria were observed in 91, 71.3, and 61.3% of patients, respectively. ALP ≤ 1.67 × ULN was achieved in 83.8% of patients. A total of 116 patients (24.2%) showed an inadequate response to UDCA according to at least one criterion.
The diverse second-line treatment regimens applied led to significantly higher response rates according to the Paris-II (35 vs. 60%, p = 0.005) and Barcelona (13 vs. 34%, p = 0.0005) criteria, as well as for ALP ≤ 1.67 × ULN and bilirubin ≤ 1 × ULN (52.1 vs. 75%, p = 0.002). The addition of bezafibrate appeared to induce the strongest beneficial effect in this cohort (Paris-II: 24 vs. 74%, p = 0.004; Barcelona: 50 vs. 84%, p = 0.046; ALP < 1.67 × ULN and bilirubin ≤ 1 × ULN: 33 vs. 86%, p = 0.001). Conclusion: Our large retrospective multicenter study confirms high response rates following first-line standard UDCA treatment in patients with PBC and highlights the need for close monitoring and early treatment modification in high-risk patients with an insufficient response to UDCA, since early treatment modification significantly increases the subsequent response rates of these patients.