Acute cholecystitis – a cohort study in a real-world clinical setting (REWO study, NCT02796443)
(2018)
Background: For decades, the optimal timing of surgery for acute cholecystitis has been controversial. Recent meta-analyses and population-based studies favor early surgery. One recent large randomized trial demonstrated that a delayed approach increases morbidity and cost compared to early surgery within 24 hours of hospital admission. Since cases of severe cholecystitis were excluded from this trial, we argue that these results do not reflect real-world clinical situations. Because these results contrasted with the clinical experience with our own patients, we decided to critically analyze all our patients under the null hypothesis that patients treated with delayed cholecystectomy after acute cholecystitis have a similar or even better outcome than those treated with an early operative approach.
Patients and methods: We retrospectively analyzed clinical data from all patients with cholecystectomies in the period between January 2006 and September 2015. A total of 1,723 patients were categorized into four groups: early (n=138): urgent surgery of patients with acute cholecystitis within the first 72 hours of the onset of symptoms; intermediate (n=297): surgery of patients with acute cholecystitis within an average of 10 days after the onset of symptoms; delayed (n=427): initial non-surgical treatment of acute cholecystitis with surgery performed within 6–12 weeks of the onset of symptoms; and elective (n=868): cholecystectomy within a symptom-free interval of choice in patients with symptomatic cholecystolithiasis without signs of acute cholecystitis.
Results: In a real-world scenario, early/intermediate cholecystectomy in acute cholecystitis was associated with a significant increase in morbidity and mortality (Clavien–Dindo score) compared to a delayed approach with surgery performed 6–12 weeks after the onset of symptoms. The adjusted linear rank statistics showed a decrease in the complication score, with values of 2.29 in the early group, 0.48 in the intermediate group, –0.26 in the delayed group and –2.12 in the elective group. These results translate into a continuous decrease in the complication score from the early through the intermediate and delayed groups to the elective group.
Conclusion: These results demonstrate that delayed cholecystectomy can be performed safely. In cases with severe cholecystitis, early and/or intermediate approaches still have a relatively high risk of morbidity and mortality.
Background: Subdural hematoma (SDH) is a common disease associated with high morbidity, which is becoming more prominent due to its increasing incidence. The decision for surgical evacuation is made depending on the clinical presentation and the volume of the SDH, so a simple 'bedside' method to measure and compare the volume of SDHs is important.
Objective: The aim of the study was to verify the accuracy of the simplified ABC/2 volumetric formula to determine a valuable tool for the clinical practice.
Methods: Preoperative CT-scans of 83 patients with SDHs were used for the computer-assisted volumetric measurement via BrainLab® as well as the ABC/2 volumetric measurement. A = largest length (anterior to posterior) of the SDH; B = maximum width (lateral to midline) 90° to A; C = maximum height (coronal plane or multiplication of slices) of the hematoma. These measurements were performed by two independent clinicians in a blinded fashion. Both volumes were compared by linear regression analysis of Pearson and Bland-Altman regression analysis.
Results: Among 100 SDHs, 53% were under and 47% were over 100 cm³, showing an even distribution of hematoma sizes. There was an excellent correlation between computer-assisted volumetric measurement and ABC/2 (R² = 0.947, p < 0.0001), and no undesirable deviation or trend was detected (p = 0.101; p = 0.777). The 95% tolerance region of the ratios of both methods was [0.805–1.201].
Conclusion: The ABC/2 method is a simple and fast bedside formula for timely measurement of SDH volume that, with simple adaptation, is not limited in access; it may replace computer-assisted volumetric measurement in clinical practice and research. The reason for the good accuracy appears to be the spherical form of the SDH, which resembles a half ellipsoid.
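The ABC/2 formula defined in the Methods above is simple enough to sketch directly. The following minimal Python function shows the calculation and its half-ellipsoid rationale; the example measurements are hypothetical illustrations, not values from the study data:

```python
def abc2_volume(a_cm: float, b_cm: float, c_cm: float) -> float:
    """Estimate subdural hematoma volume (cm^3) with the ABC/2 formula.

    a_cm: largest anteroposterior length of the SDH
    b_cm: maximum width (lateral to midline), measured at 90 degrees to A
    c_cm: maximum height (coronal plane, or slice count x slice thickness)

    The formula approximates the hematoma as half an ellipsoid, which is
    why it tracks computer-assisted volumetry so closely for SDHs.
    """
    return (a_cm * b_cm * c_cm) / 2.0

# Hypothetical measurements (not from the study):
print(abc2_volume(10.0, 4.0, 6.0))  # -> 120.0 cm^3
```

A hematoma of 10 × 4 × 6 cm thus yields an estimated 120 cm³, placing it in the "over 100 cm³" half of the distribution reported in the Results.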
The free radical theory of aging suggests reactive oxygen species (ROS) as a main cause of the accumulation of damage events that eventually leads to aging. Nox4, a member of the family of NADPH oxidases, constitutively produces ROS and therefore has the potential to be a main driver of aging. Here we analyzed the life span of Nox4-deficient mice and found no difference compared to their wildtype littermates. Accordingly, neither Tert expression nor telomere length differed in cells isolated from those animals. In fact, Nox4 mRNA expression in the lungs of wildtype mice dropped with age. We conclude that Nox4 has no influence on the lifespan of healthy mice.
The hepatitis C virus (HCV) RNA replication cycle is a dynamic intracellular process occurring in three-dimensional space (3D), which is difficult both to capture experimentally and to visualize conceptually. HCV-generated replication factories are housed within virus-induced intracellular structures termed membranous webs (MW), which are derived from the endoplasmic reticulum (ER). Recently, we published spatiotemporally resolved 3D diffusion–reaction models of the HCV RNA replication cycle by means of surface partial differential equation (sPDE) descriptions. We distinguished between the basic components of the HCV RNA replication cycle, namely HCV RNA, non-structural viral proteins (NSPs), and a host factor. In particular, we evaluated the sPDE models on realistically reconstructed intracellular compartments (ER/MW). In this paper, we propose a significant extension of the model based upon two additional features: different aggregate states of HCV RNA and NSPs, and population-dynamics-inspired diffusion and reaction coefficients instead of multilinear ones. The combination of both aspects enables realistic modeling of viral replication at all scales. Specifically, we describe a replication complex state consisting of HCV RNA together with a defined amount of NSPs. As a result of the combination of spatial resolution and different aggregate states, the new model mimics a cis requirement for HCV RNA replication. We used heuristic parameters for our simulations, which were run only on a subsection of the ER. Nevertheless, this was sufficient to allow the fitting of core aspects of virus reproduction, at least qualitatively. Our findings should help stimulate new model approaches and experimental directions for virology.
Background: Physical activity is an important part of life, and hence exercise-induced bronchoconstriction (EIB) can reduce quality of life. A standardized test is needed to diagnose EIB. The American Thoracic Society (ATS) guidelines recommend an exercise challenge in combination with dry air. We investigated the feasibility of a new exercise challenge in a cold chamber (ECC), conforming to the ATS guidelines, to detect EIB. The aim of this study was to investigate the reaction to methacholine, ECC, and exercise challenge at ambient temperature as surrogate markers for the prediction of a positive reaction, and to re-evaluate the reproducibility of the response to an ECC.
Methods: Seventy-eight subjects aged 6 to 40 years with suspected EIB were recruited for the study. The subjects performed one methacholine challenge, two ECCs, and one exercise challenge at an ambient temperature. To define the sensitivity and specificity of the predictor, a receiver-operating characteristic curve was plotted. The repeatability was evaluated using the method described by Bland and Altman (95% Limits of agreement).
Results: The following cut-off values showed the best combination of sensitivity and specificity: the provocation dose causing a 20% decrease in the forced expiratory volume in 1 s (PD20FEV1) of methacholine: 1.36 mg (AUC 0.69, p < 0.05), the maximal decrease in FEV1 during the ECC: 8.5% (AUC 0.78, p < 0.001) and exercise challenges at ambient temperatures: FEV1 5.2% (AUC 0.64, p = 0.13). The median decline in FEV1 was 14.5% (0.0–64.2) during the first ECC and 10.7% (0.0–52.5) during the second ECC. In the comparison of both ECCs, the Spearman rank correlation of the FEV1 decrease was r = 0.58 (p < 0.001). The 95% limits of agreement (95% LOAs) for the FEV1 decrease were − 17.7 to 26.4%.
Conclusions: The surrogate markers PD20FEV1 of methacholine and maximal decrease in FEV1 during ECC can predict a positive reaction in another ECC, whereas the maximal FEV1 decrease in an exercise challenge at an ambient temperature was not predictive. Compared with previous studies, we can achieve a similar reproducibility with an ECC.
Clinical trial registration: NCT02026492 (retrospectively registered 03/Jan/2014).
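The repeatability analysis above uses the Bland–Altman 95% limits of agreement on the paired FEV1 decreases from the two ECCs. A minimal sketch of that computation follows; the FEV1 values are hypothetical illustrations, not study data:

```python
import statistics

def limits_of_agreement(first, second):
    """Bland-Altman 95% limits of agreement for paired measurements.

    Returns (mean_difference, lower_loa, upper_loa), where the limits
    are mean difference +/- 1.96 * SD of the pairwise differences.
    """
    diffs = [a - b for a, b in zip(first, second)]
    mean_diff = statistics.mean(diffs)
    sd_diff = statistics.stdev(diffs)  # sample standard deviation
    return mean_diff, mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

# Hypothetical FEV1 decreases (%) from two repeated challenges:
ecc1 = [14.5, 8.0, 22.0, 5.0, 30.0]
ecc2 = [10.7, 9.5, 18.0, 7.0, 25.0]
mean_d, lower, upper = limits_of_agreement(ecc1, ecc2)
```

If most pairwise differences fall within the limits of agreement, the two challenges can be considered reproducible, which is the sense in which the study reports 95% LOAs of −17.7 to 26.4%.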
Background: Precise knowledge of the circumstances of every fatal occupational accident is a prerequisite for identifying accident hotspots and enables effective prevention work. This forensic-medicine study of occupational accidents is intended to contribute to reducing the number of fatal occupational accidents in Germany.
Material and methods: We examined the fatal occupational accidents that occurred in the catchment area of the Institute of Legal Medicine in Frankfurt am Main in the years 2005 to 2016. Autopsy reports and the public prosecutors' investigation files made available to the institute were evaluated.
Results: There were 87 fatal occupational accidents in this twelve-year period. The age range extended from adolescence to retirement age. Most of those affected were male workers (96.6%, p < 0.0001), with a comparatively high proportion of foreign nationals (34.5%). Most accidents occurred in the second half of the year (58.6%), on Mondays (26.4%), and shortly before and after the lunch break. In 3 cases, the blood alcohol concentration was above 0.5‰. The construction industry (55.2%) was the most accident-prone sector. Falls from height (28.7%) were the most frequent accident mechanism, and polytrauma (39.1%) together with traumatic brain injury (24.1%), classified according to the ISS, were the most frequent causes of death.
Discussion: According to the results of this study, the age of workers as well as the time of day, week, and year should be taken into account when high-risk work is carried out in the construction industry. Employers should pay particular attention to checking safety precautions for work at height and to enforcing helmet requirements, especially for foreign workers.
Background: Chronic hepatitis C virus (HCV) infections are causally linked with metabolic comorbidities such as insulin resistance, hepatic steatosis, and dyslipidemia. However, the clinical impact of HCV eradication achieved by direct-acting antivirals (DAAs) on glucose and lipid homeostasis is still controversial. The study aimed to prospectively investigate whether antiviral therapy of HCV with DAAs alters glucose and lipid parameters. Methods: 50 patients with chronic HCV who were treated with DAAs were screened, and 49 were enrolled in the study. Biochemical and virological data, as well as noninvasive liver fibrosis parameters, were prospectively collected at baseline, at the end of treatment (EOT), and 12 and 24 weeks post-treatment. Results: 45 of 46 patients achieved sustained virologic response (SVR). The prevalence of insulin resistance (HOMA-IR) after HCV clearance was significantly lower compared to baseline (5.3 ± 6.1 to 2.5 ± 1.9, p < 0.001), which is primarily attributable to a significant decrease of fasting insulin levels (18.9 ± 17.3 to 11.7 ± 8.7; p = 0.002). In contrast, HCV eradication resulted in a significant increase in cholesterol levels (total cholesterol, low-density lipoprotein cholesterol (LDL-C), and high-density lipoprotein cholesterol (HDL-C) levels) and the controlled attenuation parameter (CAP), although BMI did not significantly change over time (p = 0.95). Moreover, HOMA-IR correlated significantly with noninvasive liver fibrosis measurements at baseline and during follow-up (TE: r = 0.45; p = 0.003, pSWE: r = 0.35; p = 0.02, APRI: r = 0.44; p = 0.003, FIB-4: r = 0.41; p < 0.001). Conclusion: Viral eradication following DAA therapy may have beneficial effects on glucose homeostasis, whereas the lipid profile seems to be worsened.
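HOMA-IR, the insulin-resistance index reported above, is conventionally computed from fasting glucose and fasting insulin. The sketch below assumes the standard formula (glucose in mmol/L × insulin in µU/mL ÷ 22.5); whether the study used this exact variant is an assumption, and the example values are hypothetical:

```python
def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uU_ml: float) -> float:
    """Homeostatic Model Assessment of insulin resistance (HOMA-IR).

    Standard approximation: glucose (mmol/L) x insulin (uU/mL) / 22.5.
    """
    return fasting_glucose_mmol_l * fasting_insulin_uU_ml / 22.5

# Hypothetical values illustrating how the reported drop in fasting insulin
# (18.9 -> 11.7 uU/mL) lowers HOMA-IR at an unchanged glucose of 5.5 mmol/L:
before = homa_ir(5.5, 18.9)  # ~4.62
after = homa_ir(5.5, 11.7)   # ~2.86
```

This makes explicit why the observed decline in fasting insulin alone can account for most of the drop in HOMA-IR reported in the Results.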
Rationale: The clinical relevance of sensitization to Aspergillus (A) fumigatus in cystic fibrosis (CF) is unclear. Some researchers propose that specific A fumigatus IgE is an innocent bystander, whereas others describe it as the major cause of TH‐2‐driven asthma‐like disease.
Objectives: Lung function parameters in mild CF patients may be different in patients with and without A fumigatus sensitization. We aimed to ascertain whether allergen exposure to A fumigatus by bronchial allergen provocation (BAP) induces TH‐2 inflammation comparable to an asthma‐like disease.
Methods: A total of 35 patients, aged 14.8 ± 8.5 years, and 20 healthy controls were investigated prospectively. The patients were divided into two groups: group 1 (n = 18): specific (s)IgE negative, and group 2 (n = 17): sIgE positive (≥0.7 KU/L) for A fumigatus. Lung function, exhaled NO, and induced sputum were analysed. All sensitized patients with an FEV1 > 75% (n = 13) underwent BAP with A fumigatus, and cell counts, and the expression of IL‐5, IL‐13, INF‐γ, and IL‐8 as well as transcription factors T‐bet, GATA‐3, and FoxP3, were measured.
Results: Lung function parameters decreased significantly compared to controls, but not within the CF patient group. After BAP, 8 of 13 patients (61%) had a significant asthmatic response and increased eNO 24 hours later. In addition, marked TH‐2‐mediated inflammation involving eosinophils, IL‐5, IL‐13, and FoxP3 became apparent in induced sputum cells.
Conclusion: Our study demonstrated the clinical relevance of A fumigatus for the majority of sensitized CF patients. A distinct IgE/TH‐2‐dominated inflammation was found in induced sputum after A fumigatus exposure.
Purpose: The prevalence of "local allergic rhinitis" among individuals suffering from perennial rhinitis remains uncertain, and such patients are usually diagnosed with non-allergic rhinitis. The aim of this study was to evaluate the prevalence of a potential "local allergic rhinitis" in subjects suffering from non-allergic rhinitis in a non-selected group of young students.
Methods: 131 students (age 25.0 ± 5.1 years) with possible allergic rhinitis and 25 non-allergic controls without rhinitis symptoms (age 22.0 ± 2.0 years) were recruited by public postings. 97 of the 131 students with rhinitis tested positive (≥3 mm) on prick testing with 17 frequent allergens at visit 1. Twenty-four subjects with a house dust mite (HDM) allergy, 21 subjects with non-allergic rhinitis, and 18 non-allergic controls were further investigated at visit 2. Blood samples were taken, and nasal secretion was examined. In addition, all groups underwent a nasal provocation test with HDM.
Results: In serum and nasal secretion, total IgE and house dust mite-specific IgE differed significantly between HDM-positive subjects and controls. However, no differences between non-allergic subjects and control subjects were quantifiable. Neither the nasal provocation test nor nasal IgE to HDM allergens showed a measurable positive response in any of the non-allergic rhinitis subjects or the healthy controls, while being positive in 13 subjects with HDM allergy.
Conclusions: Nasal IgE is present in subjects with HDM allergy, but not in non-allergic rhinitis. In the investigated non-selected population, exclusive local production of IgE is absent. By implication, therefore, our findings challenge the emerging concept of local allergic rhinitis.
Study identifier at ClinicalTrials.gov: NCT02810535.
Introduction: The number of individuals requesting medical treatment for gender dysphoria has increased significantly within the past years. Our purpose was to examine current biographic and socio‐demographic characteristics and aspects of legal gender reassignment.
Design: Medical files from n = 350 individuals of a German endocrine outpatient clinic were collected from 2009 to 2017 and analysed retrospectively.
Results: The ratio of transwomen to transmen was 1:1.89, with a remarkable increase of transmen by the year 2013, showing a reversal of the gender distribution compared with previous studies for the first time. Use of illegal substances or self-initiated hormone therapy was rare (4.6% and 2.1%, respectively). Satisfaction with gender-affirming hormone therapy was significantly higher in transmen than in transwomen (100% vs 96.2%, P = .005). Use of antidepressants declined significantly after onset of hormone treatment in transmen (13% vs 7%; P = .007). The number of individuals with a graduation diploma was only about half as high as in the general population (14.3% vs 27.3%), whereas the unemployment rate was more than twice as high (14% vs 6.9%). The median latency between application for legal gender reassignment and the definitive court decision was 9 months.
Conclusions: Our data provide possible indications for a decline of psychosocial burden in individuals diagnosed with gender dysphoria over the last years. However, affected individuals are still limited in their occupational and financial opportunities as well as by a complex and expensive procedure of legal gender reassignment in Germany.
Introduction: Colorectal cancers (CRCs) deficient in the DNA mismatch repair protein MutL homolog 1 (MLH1) display distinct clinicopathological features and require a different therapeutic approach compared to CRCs with MLH1 proficiency. However, the molecular basis of this fundamental difference remains elusive. Here, we report that MLH1-deficient CRCs exhibit reduced levels of the cytoskeletal scaffolding protein non-erythroid spectrin αII (SPTAN1), and that tumor progression and metastasis of CRCs correlate with SPTAN1 levels.
Methods and results: To investigate the link between MLH1 and SPTAN1 in cancer progression, a cohort of 189 patients with CRC was analyzed by immunohistochemistry. Compared with the surrounding normal mucosa, SPTAN1 expression was reduced in MLH1-deficient CRCs, whereas MLH1-proficient CRCs showed a significant upregulation of SPTAN1. Overall, we identified a strong correlation between MLH1 status and SPTAN1 expression. When comparing TNM classification and SPTAN1 levels, we found higher SPTAN1 levels in stage I CRCs, while stages II to IV showed a gradual reduction of SPTAN1 expression. In addition, SPTAN1 expression was lower in metastatic compared with non-metastatic CRCs. Knockdown of SPTAN1 in CRC cell lines resulted in decreased cell viability, impaired cellular mobility and reduced cell-cell contact formation, indicating that SPTAN1 plays an important role in cell growth and cell attachment. The observed weakened cell-cell contact of SPTAN1 knockdown cells might indicate that tumor cells expressing low levels of SPTAN1 detach from their primary tumor and metastasize more easily.
Conclusion: Taken together, we demonstrate that MLH1 deficiency, low SPTAN1 expression, and tumor progression and metastasis are in close relation. We conclude that SPTAN1 is a candidate molecule explaining the tumor progression and metastasis of MLH1-deficient CRCs. The detailed analysis of SPTAN1 is now mandatory to substantiate its relevance and its potential value as a candidate protein for targeted therapy, and as a predictive marker of cancer aggressiveness.
Introduction: Cell salvage (CS) is an integral part of patient blood management (PBM) and aims to reduce allogeneic red blood cell (RBC) transfusion.
Material and methods: This observational study analysed patients scheduled for elective cardiac surgery requiring cardiopulmonary bypass (CPB) between November 2015 and October 2018. Patients were divided into a CS group (patients receiving CS) and a control group (no CS). Primary endpoints were the number of patients exposed to allogeneic RBC transfusions and the number of RBC units transfused per patient.
Results: A total of 704 patients undergoing cardiac surgery were analysed, of whom 338 underwent surgery with CS (CS group) and 366 without CS (control group). Intraoperatively, 152 patients (45%) in the CS group and 93 patients (25%) in the control group were exposed to allogeneic RBC transfusions (P < 0.001). Considering the amount of intraoperative blood loss, regression analysis revealed a significant association between blood loss and increased use of RBC units in the control group compared to the CS group (1000 mL: 1.0 vs. 0.6 RBC units; 2000 mL: 2.2 vs. 1.1 RBC units; 3000 mL: 3.4 vs. 1.6 RBC units). Thus, CS was significantly associated with a reduction in allogeneic RBC use of 40% at 1000 mL, 49% at 2000 mL, and 52% at 3000 mL of blood loss compared to patients without CS.
Conclusions: Cell salvage was significantly associated with a reduced number of allogeneic RBC transfusions. It supports the beneficial effect of CS in cardiac surgical patients as an individual measure in a comprehensive PBM program.
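The relative reductions reported above follow from the per-patient RBC unit counts at each blood-loss level; a minimal sketch of that arithmetic is below. Note that the published figures (40%, 49%, 52%) were presumably derived from unrounded regression estimates, so the rounded unit counts quoted in the abstract reproduce them only approximately:

```python
def relative_reduction(control_units: float, cs_units: float) -> float:
    """Relative reduction (%) of transfused RBC units with cell salvage."""
    return 100.0 * (control_units - cs_units) / control_units

# Rounded unit counts from the abstract (control vs. CS group):
for blood_loss_ml, control, cs in [(1000, 1.0, 0.6), (2000, 2.2, 1.1), (3000, 3.4, 1.6)]:
    print(blood_loss_ml, round(relative_reduction(control, cs)))
```

For 1000 mL this gives exactly 40%; for 2000 and 3000 mL the rounded inputs give roughly 50% and 53%, close to the published 49% and 52%.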
Six dentin adhesives were tested in vitro for their cytotoxicity on human fibroblasts. The adhesives Hybrid Bond, One-up Bond F Plus, AdheSE, Clearfil SE Bond, Optibond Solo Plus and Syntac were eluted with culture medium, as single or sequentially applied adhesive parts, for 24 h. 75 Petri dishes were produced per group. They were evaluated in a triangulated fashion: a quantitative evaluation (105 dishes) classified cells as "viable", "dead" or "debris" using a cell counter, and the reactivity index was determined from a qualitative assessment (420 dishes). One-up Bond F Plus, AdheSE and Clearfil SE Bond showed a statistically significant difference in viable cells compared to the cell control. For One-up Bond F Plus, statistically significant differences compared to Hybrid Bond and Syntac were also found. All adhesives except One-up Bond F Plus showed significant differences between the single and sequentially applied adhesive parts in the quantitative evaluation. The test materials showed a moderate grade of cytotoxicity. In conclusion, a statistically significant difference in cytotoxicity between the self-etch and etch-and-rinse adhesives could not be demonstrated by the qualitative evaluation and the reactivity index, but differences between sequentially and singly applied components could be shown.
Background: Ribavirin (RBV) remains part of several interferon-free treatment strategies even though its mechanisms of action are still not fully understood. One hypothesis is that RBV increases responsiveness to type I interferons. Pegylated Interferon alpha (PEG-IFNa) has recently been shown to alter natural killer (NK) cell function possibly contributing to control of hepatitis C virus (HCV) infection. However, the effects of ribavirin alone or in combination with IFNa on NK cells are unknown.
Methods: Extensive ex vivo phenotyping and functional analysis of NK cells from hepatitis C patients was performed during antiviral therapy. Patients were treated for 6 weeks with RBV monotherapy (n = 11), placebo (n = 13) or PEG-IFNa-2a alone (n = 6) followed by PEG-IFNa/RBV combination therapy. The effects of RBV and PEG-IFNa-2a on NK cells were also studied in vitro after co-culture with K562 or Huh7.5 cells.
Results: Ribavirin monotherapy had no obvious effects on NK cell phenotype or function, neither ex vivo in patients nor in vitro. In contrast, PEG-IFNa-2a therapy was associated with an increase of CD56bright cells and distinct changes in expression profiles leading to an activated NK cell phenotype, increased functionality and decline of terminally differentiated NK cells. Ribavirin combination therapy reduced some of the IFN effects. An activated NK cell phenotype during therapy was inversely correlated with HCV viral load.
Conclusions: PEG-IFNa activates NK cells possibly contributing to virological responses independently of RBV. The role of NK cells during future IFN-free combination therapies including RBV remains to be determined.
Background: Many patients suffering from exercise-induced asthma (EIA) have normal lung function at rest and show symptoms and a decline in FEV1 during sports or an exercise challenge. Long-chain polyunsaturated fatty acids (LCPUFA) have been described as potentially exerting a protective effect on EIA.
Methods: In this study, the protective effect of supplementation with a special combination of n-3 and n-6 LCPUFA (sc-LCPUFA; total 1.19 g/day) was investigated in an EIA cold air provocation model. Primary outcome measure: decrease in FEV1 after exercise challenge; secondary outcome measure: anti-inflammatory effects monitored by exhaled NO (eNO) before and after sc-LCPUFA supplementation versus placebo.
Results: Ninety-nine patients with exercise-induced symptoms aged 10 to 45 years were screened by a standardized exercise challenge in a cold air chamber at 4 °C. Seventy-three patients fulfilled the inclusion criterion of an FEV1 decrease > 15% and were treated in a double-blind, placebo-controlled fashion for 4 weeks with either sc-LCPUFA or placebo. Thirty-two patients in each group completed the study. Mean FEV1 decrease after cold air exercise challenge and eNO were unchanged after 4 weeks of sc-LCPUFA supplementation.
Conclusion: Supplementation with sc-LCPUFA at a dose of 1.19 g/day did not have any broncho-protective or anti-inflammatory effects on EIA.
Trial registration: Clinical trial registration number NCT02410096. Registered 7 February 2015 at ClinicalTrials.gov.
The lipid status in patients with ulcerative colitis: Sphingolipids are disease-dependent regulated
(2019)
The factors that contribute to the development of ulcerative colitis (UC) are still not fully identified. Disruption of the colon barrier is one of the first events, leading to invasion of bacteria and activation of the immune system. The colon barrier is strongly influenced by sphingolipids, which impact cell–cell contacts and function as second messengers. We collected blood and colon tissue samples from UC patients and healthy controls and investigated the sphingolipids and other lipids by LC-MS/MS or LC-QTOFMS. The expression of enzymes of the sphingolipid pathway was determined by RT-PCR and immunohistochemistry. In inflamed colon tissue, de novo synthesis of sphingolipids was reduced, whereas lactosylceramides were increased. The reduction of dihydroceramides was due to posttranslational inhibition rather than altered serine palmitoyl transferase or ceramide synthase expression in inflamed colon tissue. Furthermore, in plasma from UC patients, several sphingolipids changed significantly in comparison to healthy controls. Besides sphingolipids, free fatty acids, lysophosphatidylcholines and triglycerides changed significantly in the blood of colitis patients depending on disease severity. Our data indicate that the reduction of sphingolipid de novo synthesis in colon tissue might be an important trigger for UC. Several lipids changed significantly in the blood and might be used as biomarkers for disease control; however, diet-related variability needs to be considered.
In-line filtration of intravenous infusion may reduce organ dysfunction of adult critical patients
(2019)
Background: The potential harmful effects of particle-contaminated infusions for critically ill adult patients are as yet unclear. So far, significantly improved outcomes with the use of in-line filters have been demonstrated only in critically ill children and newborns; for adult patients, evidence is still missing.
Methods: This single-centre, retrospective controlled cohort study assessed the effect of in-line filtration of intravenous fluids with finer 0.2 or 1.2 μm vs 5.0 μm filters in critically ill adult patients. From a total of n = 3215 adult patients, n = 3012 patients were selected by propensity score matching (adjusting for sex, age, and surgery group) and assigned to either a fine filter cohort (with 0.2/1.2 μm filters, n = 1506, time period from February 2013 to January 2014) or a control filter cohort (with 5.0 μm filters, n = 1506, time period from April 2014 to March 2015). The cohorts were compared regarding the occurrence of severe vasoplegia, organ dysfunctions (lung, kidney, and brain), inflammation, in-hospital complications (myocardial infarction, ischemic stroke, pneumonia, and sepsis), in-hospital mortality, and length of ICU and hospital stay.
Results: Comparing the fine filter vs the control filter cohort, respiratory dysfunction (Horowitz index 206 (119–290) vs 191 (104.75–280); P = 0.04), pneumonia (11.4% vs 14.4%; P = 0.02), sepsis (9.6% vs 12.2%; P = 0.03), interleukin-6 (471.5 (258.8–1062.8) ng/l vs 540.5 (284.5–1147.5) ng/l; P = 0.01), and length of ICU stay (1.2 (0.6–4.9) vs 1.7 (0.8–6.9) days; P < 0.01) and hospital stay (14.0 (9.2–22.2) vs 14.8 (10.0–26.8) days; P = 0.01) were reduced. Rates of severe vasoplegia (21.0% vs 19.6%; P > 0.20) and acute kidney injury (11.8% vs 13.7%; P = 0.11) were not significantly different between the cohorts.
Conclusions: In-line filtration with finer 0.2 and 1.2 μm filters may be associated with less organ dysfunction and less inflammation in critically ill adult patients.
Trial registration: The study was registered at ClinicalTrials.gov (number: NCT02281604).
Ataxia telangiectasia (A-T) is a devastating multi-system disorder characterized by progressive cerebellar ataxia, immunodeficiency, genetic instability, premature aging and growth retardation. Owing to better care, patients now reach older ages than in the past, and new disease entities such as disturbed glucose tolerance and liver disease emerge. The objective of the present investigation was to determine the evolution of liver disease and its relation to age and neurological deterioration. The study included 67 patients aged 1 to 38 years with classical A-T. At least two measurements of liver enzymes were performed within a minimum interval of 6 months in 56 patients. The median follow-up period was 4 years (1–16 years). A total of 316 liver enzyme measurements were performed. For analysis, patients were divided into two age groups (group 1: <12 years; group 2: ≥12 years). In addition, ultrasound of the liver and the Klockgether Ataxia Score (KAS) were analyzed. We found significantly higher levels of alpha-fetoprotein (AFP) (226.8 ± 20.87 ng/ml vs. 565.1 ± 24.3 ng/ml, p < 0.0001) and of liver enzymes such as ALT (23.52 ± 0.77 IU/L vs. 87.83 ± 5.31 IU/L, p < 0.0001) in patients in group 2. In addition, we could show a significant correlation between age and AFP, GGT, and KAS. Ultrasound revealed hepatic steatosis in 11/19 (57.9%) patients in group 2. One female patient aged 37 years died of a hepatocellular carcinoma (HCC). Liver disease is present in the majority of older A-T patients. Structural changes, non-alcoholic fatty liver disease and fibrosis are frequent findings. Progression of liver disease is concomitant with neurological deterioration.
Aim. To compare the efficacy, safety, and patient’s perception of two prostaglandin E2 application methods for induction of labour.
Method. Above 36th weeks of gestation, all women, who were admitted to hospital for induction of labour, were prospectively randomised to intravaginal 1 mg or intracervical 0.5 mg irrespective of cervical Bishop score. The main outcome variables were induction-to-delivery interval, number of foetal blood samples, PDA rate, rate of oxytocin augmentation, rate of vaginal delivery, and patient’s perception using semantic differential questionnaire.
Results. Thirty-nine patients were enrolled in this study. There was no statistically significant difference between the two groups with regard to perceptions of induction. The median induction-to-delivery time was 29.9 hours for intravaginal versus 12.8 hours for intracervical administration. No statistically significant difference between the groups was detected with regard to parity, gestational age, cervical Bishop score, number of foetal blood samples, PDA rate, rate of oxytocin augmentation, or mode of birth.
Summary. Irrespective of the cervical Bishop score, intracervical gel achieved a shorter induction-to-delivery time without impairing the women's perception of induction.
Aim: It can be challenging to distinguish COVID-19 in children from other common infections. We set out to determine the rate at which children consulting a primary care paediatrician with an acute infection are infected with SARS-CoV-2 and to compare distinct findings. Method: In seven out-patient clinics, children aged 0–13 years with any new respiratory or gastrointestinal symptoms and presumed infection were invited to be tested for SARS-CoV-2. Factors that were correlated with testing positive were determined. Samples were collected from 25 January 2021 to 01 April 2021. Results: Seven hundred and eighty-three children participated in the study (median age 3 years and 0 months, range 1 month to 12 years and 11 months). Three hundred and fifty-eight were female (45.7%). SARS-CoV-2 RNA was detected in 19 (2.4%). The most common symptoms in children with as well as without detectable SARS-CoV-2 RNA were rhinitis, fever and cough. Known recent exposure to a case of COVID-19 was significantly correlated with testing positive, but symptoms or clinical findings were not. Conclusion: COVID-19 among the children with symptoms of an acute infection was uncommon, and the clinical presentation did not differ significantly between children with and without evidence of an infection with SARS-CoV-2.
Objective: To assess the prevalence of prenatal screening and of adverse outcome in high-risk pregnancies due to maternal HIV infection.
Study design: The prevalence of prenatal screening in 330 pregnancies of HIV-positive women attending the department for prenatal screening and/or during labour between January 1, 2002 and December 31, 2012, was recorded. Screening results were compared with the postnatal outcome and maternal morbidity, and mother-to-child transmission (MTCT) was evaluated.
Results: One hundred of 330 women (30.5%) had an early anomaly scan, 252 (74.5%) had a detailed scan at 20–22 weeks, 18 (5.5%) had a detailed scan prior to birth, and three (0.9%) had an amniocentesis. In seven cases (2.12%), a fetal anomaly was detected prenatally and confirmed postnatally, while in eight (2.42%) an anomaly was only detected postnatally, even though a prenatal scan was performed. There were no anomalies in the unscreened group. MTCT occurred in three cases (0.9%) and seven fetal and neonatal deaths (2.1%) were reported.
Conclusion: The overall prevalence of prenatal ultrasound screening in our cohort was 74.5%, but the opportunity for prenatal ultrasonography in the first trimester was often missed. In general, the aim should be to offer first-trimester prenatal ultrasonography in all pregnancies. This allows early reassurance or, if fetal disease is suspected, timely further steps.
Estimating the age of the developmental stages of the blow fly Calliphora vicina (Diptera: Calliphoridae) is of forensic relevance for the determination of the minimum post-mortem interval (PMImin). Fly eggs and larvae can be aged using anatomical and morphological characters and their modification during development; however, such methods can hardly be applied to fly pupae. A previous study described age estimation of C. vicina pupae using gene expression, but only for pupae reared at constant temperatures, whereas fluctuating temperatures represent a more realistic scenario at a crime scene. Therefore, age-dependent gene expression of C. vicina pupae was compared at 3 fluctuating and 3 constant temperatures, the latter representing the mean values of the fluctuating profiles. The chosen marker genes showed uniform expression patterns during metamorphosis of C. vicina pupae bred at different temperature conditions (constant or fluctuating) but the same mean temperature (e.g. constant 10 °C vs. fluctuating 5–15 °C). We present an R-based statistical tool that enables estimation of the age of an examined pupa from the analysed gene expression data.
Long-term effects on cirrhosis and portal hypertension of direct antiviral agent (DAA)-based eradication of hepatitis C virus (HCV) are still under debate. We analysed dynamics of liver and spleen elastography to assess potential regression of cirrhosis and portal hypertension 3 years post-treatment. Fifty-four patients with HCV-associated cirrhosis and DAA-induced SVR were included. Liver and spleen stiffness were measured at baseline (BL), end of treatment (EOT), 24 weeks after EOT (FU24) and 1, 2 and 3 (FU144) years post-treatment by transient liver elastography (L-TE) and point shear wave elastography (pSWE) using acoustic radiation force impulse (ARFI) of the liver (L-ARFI) and spleen (S-ARFI). Biochemical, virological and clinical data were also obtained. Liver stiffness assessed by L-TE decreased between BL [median (range), 32.5(9.1–75) kPa] and EOT [21.3(6.7–73.5) kPa; p < .0001] and EOT and FU144 [16(4.1–75) kPa; p = .006]. L-ARFI values improved between EOT [2.5(1.2–4.1) m/s] and FU144 [1.7(0.9–4.1) m/s; p = .001], while spleen stiffness remained unchanged. Overall, L-TE improved in 38 of 54 (70.4%) patients at EOT and 29 of 38 (76.3%) declined further until FU144, whereas L-ARFI values decreased in 30/54 (55.6%) patients at EOT and continued to decrease in 28/30 (93.3%) patients at FU144. Low bilirubin and high albumin levels at BL were associated with improved L-ARFI values (p = .048) at EOT or regression of cirrhosis (<12.5 kPa) by L-TE at FU144 (p = .005), respectively. Liver stiffness, but not spleen stiffness, continued to decline in a considerable proportion of patients with advanced liver disease after HCV eradication.
Background/aims: Hepatocellular carcinoma (HCC) is a leading indication for liver transplantation (LT) worldwide. Early identification of patients at risk for HCC recurrence is of paramount importance since early treatment of recurrent HCC after LT may be associated with increased survival. We evaluated incidence of and predictors for HCC recurrence, with a focus on the course of AFP levels.
Methods: We performed a retrospective, single-center study of 99 HCC patients who underwent LT between January 28th, 1997 and May 11th, 2016. A multi-stage proportional hazards model with three stages was used to evaluate potential predictive markers, both by univariate and multivariable analysis, for influences on 1) recurrence after transplantation, 2) mortality without HCC recurrence, and 3) mortality after recurrence.
Results: 19/99 HCC patients showed recurrence after LT. Waiting time was not associated with overall HCC recurrence (HR = 1, p = 0.979). Similarly, waiting time did not affect mortality in LT recipients both with (HR = 0.97, p = 0.282) or without (HR = 0.99, p = 0.685) HCC recurrence. Log10-transformed AFP values at the time of LT (HR 1.75, p = 0.023) as well as after LT (HR 2.07, p = 0.037) were significantly associated with recurrence. Median survival in patients with a ratio (AFP at recurrence divided by AFP 3 months before recurrence) of 0.5 was greater than 70 months, as compared to a median of only 8 months in patients with a ratio of 5.
Conclusion: A rise in AFP levels rather than an absolute threshold could help to identify patients at short-term risk for HCC recurrence post LT, which may allow intensification of the surveillance strategy on an individualized basis.
Objective. To examine the effects of clinical hypnosis versus NLP intervention on the success rate of ECV procedures in comparison to a control group.
Methods. A prospective off-centre randomised trial comparing a clinical hypnosis intervention with NLP in women with a singleton breech fetus at or after 37 0/7 weeks of gestation (259 days) and a normal amniotic fluid index. All 80 participants heard a 20-minute recorded intervention via headphones. The main outcome assessed was the success rate of ECV. The intervention groups were compared with a control group receiving standard medical care alone (n=122).
Results. The 42 women who received the hypnosis intervention prior to ECV had a successful ECV rate of 40.5% (n=17), whereas the 38 women who received NLP had a successful ECV rate of 44.7% (n=17) (P > 0.05). The control group had similar patient characteristics compared to the intervention groups (P > 0.05). The success rate in the control group (n = 122) of 27.3% (n = 33) was statistically significantly lower than that of the NLP group (P = 0.05) and of the hypnosis and NLP groups combined (P = 0.03).
Conclusions. These findings suggest that prior clinical hypnosis and NLP have similar success rates of ECV procedures and are both superior to standard medical care alone.
One of the major challenges of allogeneic stem cell transplantation (allo-SCT) is to reduce the risk of graft-versus-host disease (GVHD) while boosting the graft-versus-leukemia (GVL) effect. The reconstitution of natural killer (NK) cells following allo-SCT is of notable interest due to their known capability to induce GVL without GVHD. In this study, we investigated the association between the incidence and severity of acute graft-versus-host disease (aGVHD) and the early reconstitution of NK cell subsets following allo-SCT. We analyzed 342 samples from 107 patients using flow cytometry, with a focus on immature CD56high and mature cytotoxic CD56dim NK cells. Longitudinal analysis of immune reconstitution after allo-SCT showed that the incidence of aGVHD was associated with a delayed expansion of the entire NK cell population, in particular the CD56high subset. Notably, the disturbed reconstitution of the CD56high NK cells also correlated with the severity of aGVHD.
Background: The aim of this meta-analysis was to evaluate efficacy and safety of first-line chemotherapy with or without a monoclonal antibody in elderly patients ( ≥ 70 years) with metastatic colorectal cancer (mCRC), since they are frequently underrepresented in clinical trials.
Materials and Methods: PubMed and Cochrane Library searches were performed on 29 April 2013 and studies published up to this date were included. Authors were contacted to request progression-free survival (PFS) and overall survival (OS) data as well as patient data on treatment regimens, age, sex and potential signs of toxicity in patients ≥ 70 years of age.
Results: Individual data from 10 studies were included. From a total of 3271 patients, 604 patients (18%) were ≥ 70 years (median 73 years, range 70–88). Of these, 335 patients were treated with a bevacizumab-based first-line regimen and 265 were treated with chemotherapy only. The median PFS was 8.2 vs. 6.5 months and the median OS was 16.7 vs. 13.0 months in patients treated with and without bevacizumab, respectively. The safety profile of bevacizumab in combination with first-line chemotherapy did not differ from published clinical trials.
Conclusions: This meta-analysis suggests that the addition of bevacizumab to standard first-line chemotherapy improves clinical outcome in elderly patients with mCRC and is well tolerated.
Introduction Patients undergoing heart valve surgery are predominantly transferred postoperatively to the intensive care unit (ICU) under continuous sedation. Volatile anaesthetics are an increasingly used alternative to intravenous substances in the ICU. Owing to their inhalational uptake and elimination, their pharmacological benefits have been repeatedly demonstrated. Volatile anaesthetics therefore appear suitable to meet the growing demands of fast-track cardiac surgery. However, their use requires special preparation at the bedside and trained medical and nursing staff, which might limit the pharmacological benefits. The aim of our work is to assess whether the temporal advantages of recovery under volatile sedation outweigh the higher effort of special preparation.
Methods and analysis The study is designed to evaluate the differences between intravenous sedatives (n=48) and volatile sedatives (n=48) in continued intensive care sedation. It will be conducted as a prospective, randomised, controlled, single-blinded, monocentre trial in consenting adult patients undergoing heart valve surgery at a German university hospital. The study will examine the necessary preparation time, staff consultation and overall feasibility of the chosen sedation method. For this purpose, continuation of sedation in the ICU with volatile sedatives is considered as one study arm, with intravenous sedatives as the comparison group. Because of rapid elimination and quick awakening after termination of sedation, closer consultation between the attending physician and the ICU nursing staff is required, in addition to a prolonged setup time. The study analysis will include the required setup time, the time from admission to extubation as the primary outcome, and neurocognitive assessability. In addition, possible operation-specific factors (blood loss, complications), treatment parameters (catecholamine dosages, lung function) and laboratory results (acute kidney injury, acid-base balance (lactataemia), liver failure) will be collected as influencing factors. The study-relevant data will be extracted from the continuous digital records of the patient data management system after the patient has been discharged from the ICU. For the statistical evaluation, 95% CIs will be calculated for the median time to extubation and neurocognitive assessability, and the association will be assessed with a Cox regression model. In addition, secondary binary outcome measures will be evaluated using Fisher's exact tests. Further descriptive and exploratory statistical analyses are also planned.
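The secondary binary endpoints mentioned above are to be compared with Fisher's exact test. As a minimal illustration (the trial's actual analysis software is not specified in the protocol), the two-sided p-value for a 2×2 outcome table can be computed directly from the hypergeometric distribution:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table
    [[a, b], [c, d]], conditioning on fixed margins (hypergeometric model)."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d
    denom = comb(n, col1)

    def pmf(x):
        # hypergeometric probability of x counts in the top-left cell
        return comb(row1, x) * comb(row2, col1 - x) / denom

    p_obs = pmf(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # sum over all tables at least as unlikely as the observed one
    return sum(pmf(x) for x in range(lo, hi + 1) if pmf(x) <= p_obs * (1 + 1e-9))

# e.g. fisher_exact_two_sided(8, 2, 1, 5) gives p ~ 0.035
```

In practice a statistics package would be used; this sketch only shows the definition of the test applied to the binary outcomes.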
Ethics and dissemination The study was approved by the Institutional Ethics Board of the University of Frankfurt, Germany (#20-1050). Informed consent of all individual patients will be obtained before randomisation. Results will be disseminated via publication in peer-reviewed journals.
Rapid immune reconstitution (IR) following stem cell transplantation (SCT) is essential for a favorable outcome. The optimization of graft composition should not only enable a sufficient IR but also improve graft vs. leukemia/tumor effects, overcome infectious complications and, finally, improve patient survival. Especially in haploidentical SCT, the optimization of graft composition is controversial. Therefore, we analyzed the influence of graft manipulation on IR in 40 patients with acute leukemia in remission. We examined the cell recovery post haploidentical SCT in patients receiving a CD34+-selected or CD3/CD19-depleted graft, considering the applied conditioning regimen. We used joint model analysis for overall survival (OS) and analyzed the dynamics of age-adjusted leukocytes; lymphocytes; monocytes; CD3+, CD3+CD4+, and CD3+CD8+ T cells; natural killer (NK) cells; and B cells over the course of time after SCT. Lymphocytes, NK cells, and B cells expanded more rapidly after SCT with CD34+-selected grafts (P = 0.036, P = 0.002, and P < 0.001, respectively). In contrast, CD3+CD4+ helper T cells recovered more slowly in the CD34+-selected group (P = 0.026). Furthermore, reduced-intensity conditioning facilitated faster immune recovery of lymphocytes and T cells and their subsets (P < 0.001). However, the immune recovery for NK cells and B cells was comparable for patients who received reduced-intensity or full preparative regimens. Dynamics of all cell types had a significant influence on OS, which did not differ between patients receiving CD34+-selected and those receiving CD3/CD19-depleted grafts. In conclusion, cell reconstitution dynamics showed complex diversity with regard to the graft manufacturing procedure and conditioning regimen.
Background: Patients with liver cirrhosis have a highly elevated risk of developing bacterial infections that significantly decrease survival rates. One of the most relevant infections is spontaneous bacterial peritonitis (SBP). Recently, NOD2 germline variants were found to be potential predictors of the development of infectious complications and mortality in patients with cirrhosis. The aim of the INCA (Impact of NOD2 genotype-guided antibiotic prevention on survival in patients with liver Cirrhosis and Ascites) trial is to investigate whether survival of this genetically defined high-risk group of patients with cirrhosis, characterized by the presence of NOD2 variants, is improved by primary antibiotic prophylaxis of SBP.
Methods/Design: The INCA trial is a double-blind, placebo-controlled clinical trial with two parallel treatment arms (arm 1: norfloxacin 400 mg once daily; arm 2: placebo once daily; 12-month treatment and observational period). Balanced randomization of 186 eligible patients with stratification for the protein content of the ascites (<15 versus ≥15 g/L) and the study site is planned. In this multicenter national study, patients are recruited in at least 13 centers throughout Germany. The key inclusion criterion is the presence of a NOD2 risk variant in patients with decompensated liver cirrhosis. The most important exclusion criteria are current SBP or previous history of SBP and any long-term antibiotic prophylaxis. The primary endpoint is overall survival after 12 months of treatment. Secondary objectives are to evaluate whether the frequencies of SBP and other clinically relevant infections necessitating antibiotic treatment, as well as the total duration of unplanned hospitalization due to cirrhosis, differ in both study arms. Recruitment started in February 2014.
Discussion: Preventive strategies are required to avoid life-threatening infections in patients with liver cirrhosis, but unselected use of antibiotics can trigger resistant bacteria and worsen outcome. Thus, individualized approaches that direct intervention only to patients with the highest risk are urgently needed. This trial meets this need by suggesting stratified prevention based on genetic risk assessment. To our knowledge, the INCA trial is first in the field of hepatology aimed at rapidly transferring and validating information on individual genetic risk into clinical decision algorithms.
Trial registrations: German Clinical Trials Register DRKS00005616. Registered 22 January 2014. EU Clinical Trials Register EudraCT 2013-001626-26. Registered 26 January 2015.
Rationale: Postinfectious bronchiolitis obliterans (PIBO) is a rare, chronic respiratory condition, which follows an acute insult due to a severe infection of the lower airways. Objectives: The objective of this study was to investigate the long-term course of bronchial inflammation and pulmonary function testing in children with PIBO. Methods: Medical charts of 21 children with PIBO at the Children's University Hospital Frankfurt/Main, Germany, were analyzed retrospectively. Pulmonary function tests (PFTs) with an interval of at least 1 month were studied between 2002 and 2019. A total of 382 PFTs were analyzed retrospectively and, for each year, the two best PFTs (217 in total) were evaluated. Additionally, 56 sputum analyses were assessed and sputum neutrophils were evaluated. Results: The evaluation of the 217 PFTs showed a decrease in FEV1 of 1.07% and a loss in z score of −0.075 per year. FEV1/FVC decreased by 1.44 per year. FVC remained stable, showing a nonsignificant increase of 0.006 in z score per year. However, FEV1 and FVC in litres increased significantly with height (FEV1 by 0.032 L/cm, FVC by 0.048 L/cm). Sputum neutrophils showed a significant increase of 2.12% per year. Conclusion: Our results demonstrated that pulmonary function in patients with PIBO decreased significantly, showing persistent obstruction over an average follow-up period of 8 years. However, persistent lung growth was revealed. In addition, pulmonary inflammation persisted, clearly showing an increasing amount of neutrophils in induced sputum. Patients did not present with a general susceptibility to respiratory infections.
High sedation needs of critically ill COVID-19 ARDS patients - a monocentric observational study
(2021)
Background: Therapy of severely affected coronavirus patients requiring intubation and sedation remains challenging. Recently, difficulties in sedating these patients have been discussed. This study aims to describe sedation practices in patients with 2019 coronavirus disease (COVID-19)-induced acute respiratory distress syndrome (ARDS). Methods: We performed a retrospective monocentric analysis of sedation regimens in critically ill intubated patients with respiratory failure who required sedation in our mixed 32-bed university intensive care unit. All mechanically ventilated adults with COVID-19-induced ARDS requiring continuously infused sedative therapy admitted between April 4, 2020, and June 30, 2020 were included. We recorded demographic data, sedative dosages, prone positioning, sedation levels and duration. Descriptive data analysis was performed; for additional analysis, a logistic regression with mixed effects was used. Results: In total, 56 patients (mean age 67 (±14) years) were included. The mean observed sedation period was 224 (±139) hours. To achieve the prescribed sedation level, two or three sedatives were needed in 48.7% and 12.8% of the cases, respectively. In cases with a triple sedation regimen, the combination of clonidine, esketamine and midazolam was observed most often (75.7%). Analgesia was achieved using sufentanil in 98.6% of the cases. The analysis showed that the majority of COVID-19 patients required unusually high sedative doses compared to those reported in the literature. Conclusion: The global pandemic continues to produce severely affected patients requiring ventilation and sedation, but optimal sedation strategies are still lacking. The findings of our observation suggest unusually high dosages of sedatives in mechanically ventilated patients with COVID-19.
Prescribed sedation levels appear to be achievable only with combinations of several sedatives in most critically ill patients suffering from COVID-19-induced ARDS, and a potential association with the sophisticated critical care these patients often require, including prone positioning and ECMO treatment, seems conceivable.
Background: The development of robotic systems has provided an alternative to frame-based stereotactic procedures. The aim of this experimental phantom study was to compare the mechanical accuracy of the Robotic Surgery Assistant (ROSA) and the Leksell stereotactic frame by reducing clinical and procedural factors to a minimum.
Methods: To precisely compare mechanical accuracy, a stereotactic system was chosen as reference for both methods. A thin layer CT scan with an acrylic phantom fixed to the frame and a localizer enabling the software to recognize the coordinate system was performed. For each of the five phantom targets, two different trajectories were planned, resulting in 10 trajectories. A series of five repetitions was performed, each time based on a new CT scan. Hence, 50 trajectories were analyzed for each method. X-rays of the final cannula position were fused with the planning data. The coordinates of the target point and the endpoint of the robot- or frame-guided probe were visually determined using the robotic software. The target point error (TPE) was calculated applying the Euclidian distance. The depth deviation along the trajectory and the lateral deviation were separately calculated.
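The accuracy metrics defined above can be sketched as a small geometric computation: the TPE is the Euclidean distance between the planned target and the actual probe endpoint, the depth deviation is the signed projection of that error onto the planned trajectory, and the lateral deviation is the perpendicular remainder. This is a minimal illustration; coordinates and names are hypothetical, not taken from the ROSA or planning software:

```python
import math

def deviation_components(entry, target, endpoint):
    """Decompose the target point error (TPE) of a guided probe into a
    signed depth deviation along the planned trajectory and a lateral
    (perpendicular) deviation. All coordinates are (x, y, z) in mm."""
    # unit vector of the planned trajectory (entry -> target)
    traj = [t - e for t, e in zip(target, entry)]
    length = math.sqrt(sum(v * v for v in traj))
    u = [v / length for v in traj]
    # error vector from planned target to actual probe endpoint
    err = [p - t for p, t in zip(endpoint, target)]
    tpe = math.sqrt(sum(v * v for v in err))    # Euclidean distance
    depth = sum(e * c for e, c in zip(err, u))  # signed, along trajectory
    lateral = math.sqrt(max(tpe ** 2 - depth ** 2, 0.0))
    return tpe, depth, lateral
```

For a trajectory along the z-axis with the probe ending 0.3 mm to the side and 0.4 mm too deep, this returns a TPE of 0.5 mm, a depth deviation of +0.4 mm and a lateral deviation of 0.3 mm.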
Results: Robotics was significantly more accurate, with an arithmetic TPE mean of 0.53 mm (95% CI 0.41–0.55 mm) compared to 0.72 mm (95% CI 0.63–0.8 mm) in stereotaxy (p < 0.05). In robotics, the mean depth deviation along the trajectory was −0.22 mm (95% CI −0.25 to −0.14 mm). The mean lateral deviation was 0.43 mm (95% CI 0.32–0.49 mm). In frame-based stereotaxy, the mean depth deviation amounted to −0.20 mm (95% CI −0.26 to −0.14 mm), the mean lateral deviation to 0.65 mm (95% CI 0.55–0.74 mm).
Conclusion: Both the robotic and frame-based approach proved accurate. The robotic procedure showed significantly higher accuracy. For both methods, procedural factors occurring during surgery might have a more relevant impact on overall accuracy.
Background: IL28B gene polymorphism is the best baseline predictor of response to interferon alfa-based antiviral therapies in chronic hepatitis C. Recently, a new IFN-L4 polymorphism was identified as first potential functional variant for induction of IL28B expression. Individualization of interferon alfa-based therapies based on a combination of IL28B/IFN-L4 polymorphisms may help to optimize virologic outcome and economic resources.
Methods: Optimization of treatment outcome prediction was assessed by combination of different IL28B and IFN-L4 polymorphisms in patients with chronic HCV genotype 1 (n = 385), 2/3 (n = 267), and 4 (n = 220) infection treated with pegylated interferon alfa (PEG-IFN) and ribavirin with (n = 79) or without telaprevir. Healthy people from Germany (n = 283) and Egypt (n = 96) served as controls.
Results: Frequencies of beneficial IL28B rs12979860 C/C genotypes were lower in HCV genotype 1/4 infected patients in comparison to controls (20–35% vs. 46–47%); this was also true for ss469415590 TT/TT (20–35% vs. 45–47%). Single interferon-lambda SNPs (rs12979860, rs8099917, ss469415590) correlated with sustained virologic response (SVR) in genotype 1, 3, and 4 infected patients, while no association was observed for genotype 2. Interestingly, in genotype 3 infected patients, the best SVR prediction was based on the IFN-L4 genotype. Prediction of SVR with high accuracy (71–96%) was possible in genotype 1, 2, 3 and 4 infected patients who received PEG-IFN/ribavirin combination therapy by selection of beneficial IL28B rs12979860 C/C and/or ss469415590 TT/TT genotypes (p<0.001). For triple therapy with first-generation protease inhibitors (PIs) (boceprevir, telaprevir), prediction of high SVR rates (90%) was based on the presence of at least one beneficial genotype of the 3 IFN-lambda SNPs.
Conclusion: IFN-L4 seems to be the best single predictor of SVR in genotype 3 infected patients. For optimized prediction of SVR by treatment with dual combination or first generation PI triple therapies, grouping of interferon-lambda haplotypes may be helpful with positive predictive values of 71–96%.
Triple therapy of chronic hepatitis C virus (HCV) infection with boceprevir (BOC) or telaprevir (TVR) leads to virologic failure in many patients, which is often associated with the selection of resistance-associated variants (RAVs). These resistance profiles are important for the selection of potential rescue treatment options. In this study, we performed population-based sequencing of baseline NS3 RAVs and investigated the sensitivity of NS3 phenotypes in an HCV replicon assay, together with clinical factors, for prediction of treatment response in a cohort of 165 German and Swiss patients treated with BOC- or TVR-based triple therapy. Overall, the prevalence of baseline RAVs was low, although the frequency of RAVs was higher in patients with virologic failure compared to those who achieved a sustained virologic response (SVR) (7% versus 1%, P = 0.06). The occurrence of RAVs was associated with a resistant NS3 quasispecies phenotype (P<0.001), but the sensitivity of phenotypes was not associated with treatment outcome (P = 0.2). The majority of single viral and host predictors of SVR were only weakly associated with treatment response. In multivariate analyses, low AST levels, female sex and an IFNL4 CC genotype were independently associated with SVR. However, a combined analysis of negative predictors revealed a significantly lower overall number of negative predictors in patients with SVR in comparison to individuals with virologic failure (P<0.0001), and the presence of 2 or fewer negative predictors was indicative of SVR. These results demonstrate that most single baseline viral and host parameters have a weak influence on the response to triple therapy, whereas the overall number of negative predictors has a high predictive value for SVR.
Interleukin-22 predicts severity and death in advanced liver cirrhosis: a prospective cohort study
(2012)
Background: Interleukin-22 (IL-22), recently identified as a crucial parameter of pathology in experimental liver damage, may determine survival in clinical end-stage liver disease. Systematic analysis of serum IL-22 in relation to morbidity and mortality of patients with advanced liver cirrhosis has not been performed so far.
Methods: This is a prospective cohort study including 120 liver cirrhosis patients and 40 healthy donors to analyze systemic levels of IL-22 in relation to survival and hepatic complications.
Results: A total of 71% of patients displayed liver cirrhosis-related complications at study inclusion. A total of 23% of the patients died during a mean follow-up of 196 ± 165 days. Systemic IL-22 was detectable in 74% of patients but only in 10% of healthy donors (P < 0.001). Elevated levels of IL-22 were associated with ascites (P = 0.006), hepatorenal syndrome (P < 0.0001), and spontaneous bacterial peritonitis (P = 0.001). Patients with elevated IL-22 (>18 pg/ml, n = 57) showed significantly reduced survival compared to patients with regular (≤18 pg/ml) levels of IL-22 (321 days versus 526 days, P = 0.003). Other factors associated with overall survival were high CRP (≥2.9 mg/dl, P = 0.005, hazard ratio (HR) 0.314, confidence interval (CI) 0.141 to 0.702), elevated serum creatinine (P = 0.05, HR 0.453, CI 0.203 to 1.012), presence of liver-related complications (P = 0.028, HR 0.258, CI 0.077 to 0.862), model of end-stage liver disease (MELD) score ≥20 (P = 0.017, HR 0.364, CI 0.159 to 0.835) and age (P = 0.011, HR 1.047, CI 1.011 to 1.085). Adjusted multivariate Cox proportional-hazards analysis identified elevated systemic IL-22 levels as independent predictors of reduced survival (P = 0.007, HR 0.218, CI 0.072 to 0.662).
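The hazard ratios and confidence intervals reported above follow the standard Cox-model relationship HR = exp(β) with 95% CI = exp(β ± 1.96·SE). As a sanity check, the reported CRP hazard ratio of 0.314 with CI (0.141, 0.702) is consistent with a coefficient standard error of roughly 0.41; that SE is a back-calculation for illustration, not a figure from the study:

```python
import math

def hazard_ratio_ci(beta, se, z=1.959964):
    """Hazard ratio and 95% confidence interval from a Cox regression
    coefficient (beta) and its standard error:
    HR = exp(beta), CI = (exp(beta - z*SE), exp(beta + z*SE))."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# back-calculated SE of ~0.41 reproduces the reported CRP interval
hr, lo, hi = hazard_ratio_ci(math.log(0.314), 0.41)
```

With these inputs the function returns HR ≈ 0.314 and a CI of approximately (0.141, 0.701), matching the reported values to rounding.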
Conclusions: In patients with liver cirrhosis, elevated systemic IL-22 levels are predictive for reduced survival independently from age, liver-related complications, CRP, creatinine and the MELD score. Thus, processes that lead to a rise in systemic interleukin-22 may be relevant for prognosis of advanced liver cirrhosis.
Background and Aims: In patients with advanced liver cirrhosis due to chronic hepatitis C virus (HCV) infection, antiviral therapy with peginterferon and ribavirin is feasible in selected cases only, due to potentially life-threatening side effects. However, predictive factors associated with hepatic decompensation during antiviral therapy are poorly defined.
Methods: In a retrospective cohort study, 68 patients with HCV-associated liver cirrhosis (mean MELD score 9.18 ± 2.72) were treated with peginterferon and ribavirin. Clinical events indicating hepatic decompensation (onset of ascites, hepatic encephalopathy, upper gastrointestinal bleeding, hospitalization) as well as laboratory data were recorded at baseline and during a follow-up period of 72 weeks after initiation of antiviral therapy. To monitor long-term sequelae of end-stage liver disease, an extended follow-up for HCC development, transplantation and death was applied (240 ± 136 weeks).
Results: Eighteen patients (26.5%) achieved a sustained virologic response. During the observational period, hepatic decompensation was observed in 36.8%. Patients with hepatic decompensation had higher MELD scores (10.84 vs. 8.23, p<0.001) and higher mean bilirubin levels (26.74 vs. 14.63 µmol/l, p<0.001), as well as lower serum albumin levels (38.2 vs. 41.1 g/l, p = 0.015), mean platelets (102.64 vs. 138.95/nl, p = 0.014) and mean leukocytes (4.02 vs. 5.68/nl, p = 0.002) at baseline compared to those without decompensation. In the multivariate analysis, the MELD score remained independently associated with hepatic decompensation (OR 1.56, 1.18–2.07; p = 0.002). When the patients were grouped according to their baseline MELD scores, hepatic decompensation occurred in 22%, 59%, and 83% of patients with MELD scores of 6–9, 10–13, and ≥14, respectively. The baseline MELD score was significantly associated with the risk for transplantation/death (p<0.001).
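The MELD stratification reported above can be captured by a small lookup helper. This is an illustrative sketch only: the function name is ours, and the top stratum is assumed to start at 14 so that the groups are contiguous.

```python
def meld_risk_group(meld: int) -> str:
    """Bucket a baseline MELD score into the strata reported in the abstract.

    Observed rates of hepatic decompensation during antiviral therapy were
    22% (MELD 6-9), 59% (MELD 10-13) and 83% (top stratum, assumed >=14
    here so the strata are contiguous).
    """
    if 6 <= meld <= 9:
        return "6-9 (22% decompensation)"
    if 10 <= meld <= 13:
        return "10-13 (59% decompensation)"
    if meld >= 14:
        return ">=14 (83% decompensation)"
    raise ValueError(f"MELD score outside the study's range: {meld}")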
Conclusions: Our data suggest that the baseline MELD score predicts the risk of hepatic decompensation during antiviral therapy and thus contributes to decision making when antiviral therapy is discussed in HCV patients with advanced liver cirrhosis.
CD4+ T cell lymphopenia predicts mortality from Pneumocystis pneumonia in kidney transplant patients
(2020)
Background: Pneumocystis jirovecii pneumonia (PcP) remains a life-threatening opportunistic infection after solid organ transplantation, even in the era of Pneumocystis prophylaxis. The association between the risk of developing PcP and low CD4+ T cell counts has been well established. However, it is unknown whether lymphopenia in the context of post-renal transplant PcP increases the risk of mortality. Methods: We carried out a retrospective analysis of a cohort of kidney transplant patients with PcP (n = 49) to determine the risk factors for mortality associated with PcP. We correlated clinical and demographic data with the outcome of the disease. For CD4+ T cell counts, we used the Wilcoxon rank sum test for in-hospital mortality and a Cox proportional-hazards regression model for 60-day mortality. Results: In univariate analyses, high CRP, high neutrophil counts, CD4+ T cell lymphopenia, mechanical ventilation, and high Acute Kidney Injury Network stage were associated with in-hospital mortality following presentation with PcP. In a receiver-operating characteristic (ROC) analysis, an optimum cutoff of ≤200 CD4+ T cells/µL predicted in-hospital mortality; CD4+ T cell lymphopenia remained a risk factor in a Cox regression model. Conclusions: A low CD4+ T cell count in kidney transplant recipients is a biomarker for disease severity and a risk factor for in-hospital mortality following presentation with PcP.
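The ROC-derived cutoff of ≤200 CD4+ T cells/µL amounts to evaluating a binary predictor at a single threshold. A hedged sketch of that evaluation follows; the function name and example counts are ours, not the cohort's.

```python
def cutoff_performance(cd4_counts, died, cutoff=200):
    """Sensitivity and specificity of the rule 'CD4 <= cutoff predicts
    in-hospital death'. Inputs are parallel lists; `died` holds booleans."""
    tp = sum(1 for c, d in zip(cd4_counts, died) if d and c <= cutoff)
    fn = sum(1 for c, d in zip(cd4_counts, died) if d and c > cutoff)
    tn = sum(1 for c, d in zip(cd4_counts, died) if not d and c > cutoff)
    fp = sum(1 for c, d in zip(cd4_counts, died) if not d and c <= cutoff)
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity
```

A full ROC analysis repeats this computation over all candidate cutoffs and selects an optimum, e.g. by Youden's index.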
Introduction: The German PID-NET registry was founded in 2009, serving as the first national registry of patients with primary immunodeficiencies (PID) in Germany. It is part of the European Society for Immunodeficiencies (ESID) registry. The primary purpose of the registry is to gather data on the epidemiology, diagnostic delay, diagnosis, and treatment of PIDs.
Methods: Clinical and laboratory data were collected from 2,453 patients from 36 German PID centres in an online registry. Data were analysed with the software Stata® and Excel.
Results: The minimum prevalence of PID in Germany is 2.72 per 100,000 inhabitants. Among patients aged 1–25, there was a clear predominance of males. The median age of living patients ranged between 7 and 40 years, depending on the respective PID. Predominantly antibody disorders were the most prevalent group, accounting for 57% of all 2,453 PID patients (including 728 CVID patients). A gene defect was identified in 36% of patients. Familial cases were observed in 21% of patients. The age of onset for presenting symptoms ranged from birth to late adulthood (range 0–88 years). Presenting symptoms comprised infections (74%) and immune dysregulation (22%). Ninety-three patients were diagnosed without prior clinical symptoms. Regarding the general and clinical diagnostic delay, no more than a slight decrease was observed for any PID within the last decade. However, both SCID and hyper-IgE syndrome showed a substantial improvement in shortening the time between onset of symptoms and genetic diagnosis. Regarding treatment, 49% of all patients received immunoglobulin G (IgG) substitution (70% subcutaneous; 29% intravenous; 1% unknown). Three hundred patients underwent at least one hematopoietic stem cell transplantation (HSCT). Five patients had gene therapy.
Conclusion: The German PID-NET registry is a valuable tool for physicians, researchers, the pharmaceutical industry, politicians, and ultimately the patients, for whom the outcomes will eventually lead to a more timely diagnosis and better treatment.
Eosinophilic cholangitis is a potentially underdiagnosed etiology in indeterminate biliary stricture
(2017)
AIM: To investigate presence and extent of eosinophilic cholangitis (EC) as well as IgG4-related disease in patients with indeterminate biliary stricture (IBS).
METHODS: All patients with a diagnosis of sclerosing cholangitis (SC) and histopathological samples, such as biopsies or surgical specimens, at University Hospital Frankfurt from 2005-2015 were included. Histopathological diagnoses as well as the further clinical course were reviewed. Tissue samples of patients without a definite diagnosis after complete diagnostic work-up were reviewed regarding the presence of eosinophilic infiltration and IgG4-positive plasma cells. Eosinophilic infiltration was also assessed in a control group of liver transplant donors and patients with primary sclerosing cholangitis.
RESULTS: One hundred and thirty-five patients with SC were included. In 10/135 (7.4%) patients, no potential cause of IBS could be identified after complete diagnostic work-up and further clinical course. After histopathological review, a post-hoc diagnosis of EC was established in three patients, resulting in a prevalence of 2.2% (3/135) among all patients with SC and of 30% (3/10) among patients in whom no cause of IBS was identified. Two of the three patients with a post-hoc diagnosis of EC underwent surgical resection for suspicion of malignancy. A diagnosis of IgG4-related cholangitis was observed in 7/135 patients (5.1%), of which 3 cases were discovered in the post-hoc analysis. 6/7 cases with IgG4-related cholangitis (85.7%) presented with eosinophilic infiltration in addition to IgG4-positive plasma cells. There was no patient with eosinophilic infiltration in the control group of liver transplant donors (n = 27) and patients with primary sclerosing cholangitis (n = 14).
CONCLUSION: EC is an underdiagnosed benign etiology of SC and IBS, which has to be considered in differential diagnosis of IBS.
Seroconversion rates following influenza vaccination in patients with hematologic malignancies after hematopoietic stem cell transplantation (HSCT) are known to be lower compared to healthy adults. The aim of our diagnostic study was to determine the rate of seroconversion after 1 or 2 doses of a novel split virion, inactivated, AS03-adjuvanted pandemic H1N1 influenza vaccine (A/California/7/2009) in HSCT recipients (ClinicalTrials.gov Identifier: NCT01017172). Blood samples were taken before and 21 days after a first dose and 21 days after a second dose of the vaccine. Antibody (AB) titers were determined by hemagglutination inhibition assay. Seroconversion was defined by either an AB titer of ≤1:10 before and ≥1:40 after vaccination, or ≥1:10 before and a ≥4-fold increase in AB titer 21 days after vaccination. Seventeen patients (14 allogeneic, 3 autologous HSCT) received 1 dose and 11 of these patients received 2 doses of the vaccine. The rate of seroconversion was 41.2% (95% confidence interval [CI] 18.4-67.1) after the first and 81.8% (95% CI 48.2-97.7) after the second dose. Patients who failed to seroconvert after 1 dose of the vaccine were more likely to be receiving an immunosuppressive agent (P = .003), whereas time elapsed since HSCT, type of HSCT, age, sex, and chronic graft-versus-host disease did not differ compared to patients with seroconversion. In patients with hematologic malignancies after HSCT, the rate of seroconversion after a first dose of an adjuvanted H1N1 influenza A vaccine was poor but increased after a second dose.
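The seroconversion definition above is a two-branch rule that can be expressed directly in code. This is an illustrative sketch; the function name and the reciprocal-titer convention are our own.

```python
def seroconverted(pre_titer: int, post_titer: int) -> bool:
    """Seroconversion per the study definition: pre-vaccination titer
    <= 1:10 and post-vaccination titer >= 1:40, OR pre >= 1:10 with a
    >= 4-fold rise. Titers are passed as reciprocals (1:40 -> 40).
    The boundary case pre == 1:10 is covered by the first clause,
    which coincides with a 4-fold rise from 1:10 anyway."""
    if pre_titer <= 10:
        return post_titer >= 40
    return post_titer >= 4 * pre_titer
```

The reported rates (41.2% after one dose, 81.8% after two) are then simply the fraction of patients for whom this predicate is true at the respective sampling time point.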
Testing for Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) by RT-PCR is a vital public health tool in the pandemic. Self-collected samples are increasingly used as an alternative to nasopharyngeal swabs. Several studies have suggested that they are sufficiently sensitive to be a useful alternative. However, there are limited data directly comparing several different types of self-collected materials to determine which material is preferable. A total of 102 predominantly symptomatic adults with a confirmed SARS-CoV-2 infection self-collected native saliva, a tongue swab, a mid-turbinate nasal swab, saliva obtained by chewing a cotton pad and gargle lavage, within 48 h of initial diagnosis. Sample collection was unsupervised. Both native saliva and gargling with tap water had high diagnostic sensitivity of 92.8% and 89.1%, respectively. Nasal swabs had a sensitivity of 85.1%, which was not significantly inferior to saliva (p = 0.092), but 16.6% of participants reported difficulty with self-collection of this sample. A tongue swab and saliva obtained by chewing a cotton pad had significantly lower sensitivities of 74.2% and 70.2%, respectively. Diagnostic sensitivity was not related to the presence of clinical symptoms or to age. When comparing self-collected specimens of different materials, saliva, gargle lavage or mid-turbinate nasal swabs may be considered for most symptomatic patients. However, complementary experiments are required to verify that the differences in performance observed among the five sampling modes were not attributable to collection impairment.
Association of mortality and early tracheostomy in patients with COVID-19: a retrospective analysis
(2022)
COVID-19 adds to the complexity of optimal timing for tracheostomy. Over the course of the pandemic, with expanding knowledge of the disease, many centers have changed their operating procedures and performed early tracheostomies. We studied the data on early and delayed tracheostomy with regard to patient outcomes such as mortality. We performed a retrospective analysis of all tracheostomies at our institution in patients diagnosed with COVID-19 from March 2020 to June 2021. Time from intubation to tracheostomy and mortality of early (≤ 10 days) vs. late (> 10 days) tracheostomy were the primary objectives of this study. We used mixed Cox regression models to calculate the effect of distinct variables on events. We studied 117 tracheostomies. The interval from intubation to tracheostomy shortened significantly (Spearman's correlation coefficient; rho = −0.44, p ≤ 0.001) during the course of the pandemic. Early tracheostomy was associated with a significant increase in mortality in uni- and multivariate analyses (hazard ratio 1.83, 95% CI 1.07–3.17, p = 0.029). The timing of tracheostomy in COVID-19 patients has a potentially critical impact on mortality. The timing of tracheostomy has changed during this pandemic, tending toward earlier performance. Future prospective research is necessary to substantiate these results.
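The reported trend (rho = −0.44) is a Spearman rank correlation between calendar time and the intubation-to-tracheostomy interval. As a self-contained sketch of that statistic — in practice one would call scipy.stats.spearmanr; the toy data and function name below are ours — consider:

```python
def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    with average ranks assigned to ties."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        result = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg_rank = (i + j) / 2 + 1  # average 1-based rank for the tie group
            for k in range(i, j + 1):
                result[order[k]] = avg_rank
            i = j + 1
        return result

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

A negative rho, as in the study, indicates that later admission dates were associated with shorter intubation-to-tracheostomy intervals.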
Compressive knee joint contact force during walking is thought to be related to initiation and progression of knee osteoarthritis. However, joint loading is often evaluated with surrogate measures, like the external knee adduction moment, due to the complexity of computing joint contact forces. Statistical models have shown promising correlations between medial knee joint contact forces and knee adduction moments, particularly in individuals with knee osteoarthritis or after total knee replacement (R2 = 0.44–0.60). The purpose of this study was to evaluate how accurately model-based predictions of peak medial and lateral knee joint contact forces during walking could be estimated by linear mixed-effects models including joint moments for children and adolescents with and without valgus malalignment. Peak knee joint moments were strongly correlated (R2 > 0.85, p < 0.001) with both peak medial and lateral knee joint contact forces. The knee flexion and adduction moments were significant covariates in the models, strengthening the understanding of the statistical relationship between both moments and medial and lateral knee joint contact forces. In the future, these models could be used to evaluate peak knee joint contact forces from musculoskeletal simulations using peak joint moments from motion capture software, obviating the need for time-consuming musculoskeletal simulations.
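The R² values above describe how much of the variance in peak contact force the joint moments explain. As the simplest possible sketch of such a relationship — a one-predictor ordinary least-squares fit with its R², ignoring the per-subject (mixed) effects the study's models include — one might write:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = intercept + slope * x.

    Returns (slope, intercept, r_squared). Illustrative stand-in for the
    study's linear mixed-effects models, which additionally include
    random per-subject effects."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((a - mean_x) ** 2 for a in x)
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - mean_y) ** 2 for b in y)
    r_squared = 1 - ss_res / ss_tot
    return slope, intercept, r_squared
```

A mixed-effects version would add random intercepts per participant, e.g. via statsmodels' MixedLM, which is closer to what the study actually fitted.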
The immune response is known to wane after vaccination with BNT162b2, but the role of age, morbidity and body composition is not well understood. We conducted a cross-sectional study in long-term care facilities (LTCFs) for the elderly. All study participants had completed two-dose vaccination with BNT162b2 five to 7 months before sample collection. In 298 residents (median age 86 years, range 75–101), anti-SARS-CoV-2 receptor-binding domain IgG antibody (anti-RBD-IgG) concentrations were low and inversely correlated with age (mean 51.60 BAU/ml). We compared the results to healthcare workers (HCWs) aged 18–70 years (n = 114, median age: 53 years), who had a higher mean anti-RBD-IgG concentration of 156.99 BAU/ml. Neutralization against the Delta variant was low in both groups (9.5% in LTCF residents and 31.6% in HCWs). The Charlson Comorbidity Index was inversely correlated with anti-RBD-IgG, but the body mass index (BMI) was not. A control group of 14 LTCF residents with known breakthrough infections had significantly higher antibody concentrations (mean 3,199.65 BAU/ml), and 85.7% had detectable neutralization against the Delta variant. Our results demonstrate low but recoverable markers of immunity in LTCF residents five to 7 months after vaccination.
After myocardial infarction in the adult heart, the remaining, non-infarcted tissue adapts to compensate for the loss of functional tissue. This adaptation requires changes in gene expression networks, which are mostly controlled by transcription-regulating proteins. Long non-coding transcripts (lncRNAs) take part in fine-tuning such gene programs. We describe and characterize the cardiomyocyte-specific lncRNA Sweetheart RNA (Swhtr), an approximately 10 kb long transcript divergently expressed from the cardiac core transcription factor gene Nkx2-5. We show that Swhtr is dispensable for normal heart development and function but becomes essential for the tissue adaptation process after myocardial infarction in murine males. Re-expressing Swhtr from an exogenous locus rescues the Swhtr-null phenotype. Genes that depend on Swhtr after cardiac stress are significantly occupied, and therefore most likely regulated, by NKX2-5. The Swhtr transcript interacts with NKX2-5 and disperses upon hypoxic stress in cardiomyocytes, indicating an auxiliary role of Swhtr in NKX2-5 function during tissue adaptation after myocardial injury.
Estimating intraoperative blood loss is one of the daily challenges for clinicians. Despite the known inaccuracy of visual estimation by anaesthetists and surgeons, it is still the mainstay for estimating surgical blood loss. This review aims to highlight the strengths and weaknesses of currently used measurement methods. A systematic review of studies on the estimation of blood loss was carried out. Studies investigating the accuracy of techniques for quantifying blood loss in vivo and in vitro were included. We excluded non-human trials and studies using only monitoring parameters to estimate blood loss. A meta-analysis was performed to evaluate systematic measurement errors of the different methods. Only studies that were compared with a validated reference, e.g. a haemoglobin extraction assay, were included. Ninety studies met the inclusion criteria for the systematic review and were analyzed. Six studies were included in the meta-analysis, as only these were conducted with a validated reference. The mixed-effects meta-analysis showed the highest correlation with the reference for colorimetric methods (0.93, 95% CI 0.91–0.96), followed by gravimetric (0.77, 95% CI 0.61–0.93) and finally visual methods (0.61, 95% CI 0.40–0.82). The bias in estimated blood loss (ml) was lowest for colorimetric methods (57.59, 95% CI 23.88–91.3) compared to the reference, followed by gravimetric (326.36, 95% CI 201.65–450.86) and visual methods (456.51, 95% CI 395.19–517.83). Of the many studies included, only a few were compared with a validated reference. The majority of the studies chose known imprecise procedures as the method of comparison. Colorimetric methods offer the highest degree of accuracy in blood loss estimation. Systems that use colorimetric techniques have a significant advantage in the real-time assessment of blood loss.
Background: The objective of the STREAM Trial was to evaluate the effect of simulation training on process times in acute stroke care.
Methods: The multicenter prospective interventional STREAM Trial was conducted between 10/2017 and 04/2019 at seven tertiary care neurocenters in Germany with a pre- and post-interventional observation phase. We recorded patient characteristics, acute stroke care process times, stroke team composition and simulation experience for consecutive direct-to-center patients receiving intravenous thrombolysis (IVT) and/or endovascular therapy (EVT). The intervention consisted of a composite intervention centered around stroke-specific in situ simulation training. The primary outcome measure was the 'door-to-needle' time (DTN) for IVT. Secondary outcome measures included process times of EVT and measures taken to streamline the pre-existing treatment algorithm.
Results: The effect of the STREAM intervention on the process times of all acute stroke operations was neutral. However, secondary analyses showed a DTN reduction of 5 min from 38 min pre-intervention (interquartile range [IQR] 25–43 min) to 33 min (IQR 23–39 min, p = 0.03) post-intervention achieved by simulation-experienced stroke teams. Concerning EVT, we found significantly shorter door-to-groin times in patients who were treated by teams with simulation experience as compared to simulation-naive teams in the post-interventional phase (−21 min, simulation-naive: 95 min, IQR 69–111 vs. simulation-experienced: 74 min, IQR 51–92, p = 0.04).
Conclusion: An intervention combining workflow refinement and simulation-based stroke team training has the potential to improve process times in acute stroke care.
The coronavirus pandemic continues to challenge global healthcare. Severely affected patients are often in need of high doses of analgesics and sedatives. Sedation requirements in critically ill coronavirus disease 2019 (COVID-19) patients were studied in this prospective monocentric analysis. COVID-19 acute respiratory distress syndrome (ARDS) patients admitted between 1 April and 1 December 2020 were enrolled in the study. A statistical analysis of impeded sedation using mixed-effects linear regression models was performed. Overall, 114 patients were enrolled, requiring unusually high levels of sedatives. During 67.9% of the observation period, a combination of sedatives was required in addition to continuous analgesia. During ARDS therapy, 85.1% (n = 97) underwent prone positioning. Veno-venous extracorporeal membrane oxygenation (vv-ECMO) was required in 20.2% (n = 23) of all patients. vv-ECMO patients showed significantly higher sedation needs (p < 0.001). Patients with hepatic (p = 0.01) or renal (p = 0.01) dysfunction showed significantly lower sedation requirements. Except for patient age (p = 0.01), we could not find any significant influence of pre-existing conditions. Age, vv-ECMO therapy and additional organ failure were demonstrated to be factors influencing sedation needs. Young patients and those receiving vv-ECMO usually require increased sedation for intensive care therapy. However, further studies are needed to elucidate the causes and mechanisms of impeded sedation.
After myocardial infarction in the adult heart, the remaining, non-infarcted tissue adapts to compensate for the loss of functional tissue. This adaptation requires changes in gene expression networks, which are mostly controlled by transcription-regulating proteins. Long non-coding transcripts (lncRNAs) are now recognized as taking part in fine-tuning such gene programs. We identified and characterized the cardiomyocyte-specific lncRNA Sweetheart RNA (Swhtr), an approximately 10 kb long transcript divergently expressed from the cardiac core transcription factor gene Nkx2-5. We show that Swhtr is dispensable for normal heart development and function but becomes essential for the tissue adaptation process after myocardial infarction. Re-expressing Swhtr from an exogenous locus rescues the Swhtr-null phenotype. Genes depending on Swhtr after cardiac stress are significantly occupied, and therefore most likely regulated, by NKX2-5. Our results indicate a synergistic role for Swhtr and the developmentally essential transcription factor NKX2-5 in tissue adaptation after myocardial injury.
Objectives: Regarding reactogenicity and immunogenicity, heterologous COVID-19 vaccination regimens are considered an alternative to conventional immunization schemes.
Methods: Individuals receiving either heterologous (ChAdOx1-S [AstraZeneca, Cambridge, UK]/BNT162b2 [Pfizer-BioNTech, Mainz, Germany]; n = 306) or homologous (messenger RNA [mRNA]-1273 [Moderna, Cambridge, Massachusetts, USA]; n = 139) vaccination were asked to participate when receiving their second dose. Reactogenicity was assessed after 1 month, immunogenicity after 1, 3, and/or 6 months, including a third dose, through SARS-CoV-2 antispike immunoglobulin G, surrogate virus neutralization test, and a plaque reduction neutralization test against the Delta (B.1.167.2) and Omicron (B.1.1.529; BA.1) variants of concern.
Results: The overall reactogenicity was lower after heterologous vaccination. In both cohorts, SARS-CoV-2 antispike immunoglobulin G concentrations waned over time, with heterologous vaccination demonstrating higher neutralizing activity than homologous mRNA vaccination after 3 months, declining to low neutralizing levels in the Delta plaque reduction neutralization test after 6 months. At this point, 3.2% of the heterologous and 11.4% of the homologous cohort yielded low neutralizing activity against Omicron. After a third dose of an mRNA vaccine, ≥99% of vaccinees demonstrated positive neutralizing activity against Delta. Depending on the vaccination scheme, 60% to 87.5% of vaccinees demonstrated positive neutralizing activity against Omicron.
Conclusion: ChAdOx1-S/BNT162b2 vaccination demonstrated an acceptable reactogenicity and immunogenicity profile. A third dose of an mRNA vaccine is necessary to maintain neutralizing activity against SARS-CoV-2. However, variants of concern-adapted versions of the vaccines would be desirable.
Intrahepatic cholangiocarcinoma (iCCA) is the most frequent subtype of cholangiocarcinoma (CCA), and its incidence has increased globally in recent years. In contrast to surgically treated iCCA, data on the impact of fibrosis on survival in patients undergoing palliative chemotherapy are lacking. We retrospectively analyzed the cases of 70 patients diagnosed with iCCA between 2007 and 2020 in our tertiary hospital. Histopathological assessment of fibrosis was performed by an expert hepatobiliary pathologist. Additionally, the fibrosis-4 score (FIB-4) was calculated as a non-invasive surrogate marker for liver fibrosis. For overall survival (OS) and progression-free survival (PFS), Kaplan–Meier curves and Cox regression analyses were performed. Subgroup analyses revealed a median OS of 21 months (95% CI = 16.7–25.2 months) and 16 months (95% CI = 7.6–24.4 months) for low and high fibrosis, respectively (p = 0.152). In non-cirrhotic patients, the median OS was 21.8 months (95% CI = 17.1–26.4 months), compared with 9.5 months (95% CI = 4.6–14.3 months) in cirrhotic patients (p = 0.007). In conclusion, patients with iCCA and cirrhosis receiving palliative chemotherapy have decreased OS rates, while fibrosis has no significant impact on OS or PFS. These patients should not be prevented from state-of-the-art first-line chemotherapy.
Standard monitoring of heart rate, blood pressure and arterial oxygen saturation during endoscopy is recommended by current guidelines on procedural sedation. A number of studies have indicated a reduction of hypoxic (art. oxygenation < 90% for > 15 s) and severe hypoxic events (art. oxygenation < 85%) with the additional use of capnography. Therefore, the U.S. and European guidelines note that additional capnography monitoring can be considered for long or deep sedation. The Integrated Pulmonary Index® (IPI) is an algorithm-based monitoring parameter that combines oxygenation measured by pulse oximetry (art. oxygenation, heart rate) and ventilation measured by capnography (respiratory rate, apnea > 10 s, partial pressure of end-tidal carbon dioxide [PetCO2]). The aim of this paper was to analyze the value of the IPI as a parameter for monitoring the respiratory status of patients receiving propofol sedation during the PEG procedure. Patients presenting for PEG placement under sedation were randomized 1:1 to either a standard monitoring group (SM) or a capnography monitoring group including the IPI (IM). Heart rate, blood pressure and arterial oxygen saturation were monitored in the SM group. In the IM group, additional monitoring of PetCO2, respiratory rate and IPI was performed. Capnography and IPI values were recorded for all patients but were visible to the endoscopic team only for the IM group. IPI values range between 1 and 10 (10 = normal; 8–9 = within normal range; 7 = close to normal range, requires attention; 5–6 = requires attention and may require intervention; 3–4 = requires intervention; 1–2 = requires immediate intervention). Results on capnography versus standard monitoring in the same study population were published previously. A total of 147 patients (74 in SM and 73 in IM) were included in the present study. Hypoxic events occurred in 62 patients (42%) and severe hypoxic events in 44 patients (29%). Baseline characteristics were equally distributed in both groups.
IPI = 1 and IPI < 7, as well as the parameters PetCO2 = 0 mmHg and apnea > 10 s, had a high sensitivity for hypoxic and severe hypoxic events, respectively (IPI = 1: 81%/81% [hypoxic/severe hypoxic event], IPI < 7: 82%/88%, PetCO2: 69%/68%, apnea > 10 s: 84%/84%). All four parameters had a low specificity for both hypoxic and severe hypoxic events (IPI = 1: 13%/12%, IPI < 7: 7%/7%, PetCO2: 29%/27%, apnea > 10 s: 7%/7%). In multivariate analysis, only SM and PetCO2 = 0 mmHg were independent risk factors for hypoxia. The IPI (IPI = 1 and IPI < 7), as well as the individual parameters PetCO2 = 0 mmHg and apnea > 10 s, allow a fast and convenient assessment of a patient's respiratory status in a morbid patient population. Sensitivity is good for most parameters, but specificity is poor. In conclusion, the IPI can be a useful metric to assess respiratory status during propofol sedation for PEG placement. However, the IPI was not superior to PetCO2 and apnea > 10 s.
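The IPI interpretation bands quoted in the methods map directly to a small classifier. A sketch follows; the function name and return strings are ours, paraphrasing the bands given in the text.

```python
def ipi_action(ipi: int) -> str:
    """Map an Integrated Pulmonary Index value (defined on 1-10) to the
    clinical interpretation bands quoted in the text."""
    if not 1 <= ipi <= 10:
        raise ValueError("IPI is defined on the range 1-10")
    if ipi == 10:
        return "normal"
    if ipi >= 8:          # 8-9
        return "within normal range"
    if ipi == 7:
        return "close to normal range, requires attention"
    if ipi >= 5:          # 5-6
        return "requires attention and may require intervention"
    if ipi >= 3:          # 3-4
        return "requires intervention"
    return "requires immediate intervention"   # 1-2
```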
Objectives: In this early retrospective cohort study, a total of 26 patients with SARS-CoV-2 were treated with bamlanivimab or casirivimab/imdevimab, and the reduction of the viral load in relation to the clinical symptoms that developed was analyzed.
Methods: Patients in the intervention groups received bamlanivimab or casirivimab/imdevimab. Patients without treatment served as control. Outcomes were assessed by clinical symptoms and change in log viral load from baseline based on the cycle threshold over a period of 18 days.
Results: The median log viral load decline was greater in both intervention groups after 3 and 6 days compared to the control group. However, at later time points the decline of the viral load was more distinct in the control group. Mild symptoms of COVID-19 were observed in 6.3% of the intervention groups and in no patient of the control group. No patient treated with bamlanivimab, 18.8% of those treated with casirivimab/imdevimab, and 14.2% of the control group developed moderate symptoms. Severe symptoms were recorded only in the control group (14.2%), including one related death.
Conclusion: Treatment with monoclonal SARS-CoV-2 antibodies seems to accelerate the decline of viral loads, especially in the first 6 days after administration, compared to the control. This may be associated with a reduced likelihood of a severe course of COVID-19.
Background: Epileptic seizures are common clinical features in patients with acute subdural hematoma (aSDH); however, diagnostic feasibility and therapeutic monitoring remain limited. Surface electroencephalography (EEG) is the major diagnostic tool for the detection of seizures, but it might not be sensitive enough to detect all subclinical or nonconvulsive seizures or status epilepticus. Therefore, we have planned a clinical trial to evaluate a novel treatment modality by perioperatively implanting subdural EEG electrodes to diagnose seizures; we will then treat the seizures under therapeutic monitoring and analyze the clinical benefit.
Methods: In a prospective nonrandomized trial, we aim to include 110 patients with aSDH. Only patients undergoing surgical removal of aSDH will be included; one arm will be treated according to the guidelines of the Brain Trauma Foundation, while the other arm will additionally receive a subdural grid electrode. The study's primary outcome is the comparison of the incidence of seizures and the time-to-seizure between the interventional and control arms. Invasive therapeutic monitoring will guide treatment with antiseizure drugs (ASDs). The secondary outcome will be the functional outcome for both groups as assessed via the Glasgow Outcome Scale and modified Rankin Scale both at discharge and during 6 months of follow-up. The tertiary outcome will be the evaluation of chronic epilepsy within 2–4 years of follow-up.
Discussion: The implantation of a subdural EEG grid electrode in patients with aSDH is expected to be effective in diagnosing seizures in a timely manner, facilitating treatment with ASDs and monitoring of treatment success. Moreover, the occurrence of epileptiform discharges prior to the manifestation of seizure patterns could be evaluated in order to identify high-risk patients who might benefit from prophylactic treatment with ASDs.
Trial registration: ClinicalTrials.gov identifier no. NCT04211233.
Introduction: Recommendations for venous thromboembolism and deep venous thrombosis (DVT) prophylaxis using graduated compression stockings (GCS) are historically based and have been critically examined in recent publications. Existing guidelines are inconclusive as to whether to recommend the general use of GCS.
Patients/Methods: 24 273 inpatients (general surgery and orthopedic patients) undergoing surgery between 2006 and 2016 were included in a retrospective single-center analysis. From January 2006 to January 2011, perioperative GCS were employed in addition to drug prophylaxis; from February 2011 to March 2016, patients received drug prophylaxis alone. In accordance with German guidelines, all patients received venous thromboembolism prophylaxis with weight-adapted low-molecular-weight heparin (LMWH). Risk stratification (low risk, moderate risk, high risk) was based on the guideline of the American College of Chest Physicians. Data analysis was performed before and after propensity matching (PM). The defined primary endpoint was the incidence of symptomatic or fatal pulmonary embolism (PE); a secondary endpoint was the incidence of deep venous thrombosis (DVT).
Results: After risk stratification (low risk n = 16 483; moderate risk n = 4464; high risk n = 3326), a total of 24 273 patients were analyzed. Before PM, the relative risk for the occurrence of PE or DVT was not increased by abstaining from GCS. After PM, two groups of 11 312 patients each, one with and one without GCS application, were formed. When comparing the two groups, the relative risk (RR) for the occurrence of pulmonary embolism was: low risk 0.99 [95% CI 0.998–1.000]; moderate risk 0.999 [95% CI 0.95–1.003]; high risk 0.996 [95% CI 0.992–1.000] (p > 0.05). The incidence of PE in the LMWH-alone group overall was 0.1% (n = 16); in the LMWH + GCS group, it was 0.3% (n = 29). RR after PM was 0.999 [95% CI 0.998–1.00].
Conclusion: In contrast to prior studies with only small patient numbers, our trial, based on a large group of patients at moderate and high risk of developing VTE, supports the view that abstaining from GCS use does not increase the incidence of symptomatic or fatal PE or of symptomatic DVT.
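A relative risk and its 95% confidence interval of the kind reported above are conventionally computed on the log scale. A minimal sketch follows; the counts in the usage example are purely illustrative, not the study data:

```python
import math

def relative_risk(events_exposed, n_exposed, events_control, n_control, z=1.96):
    """Relative risk with a Wald-type confidence interval on the log scale."""
    risk_exp = events_exposed / n_exposed
    risk_ctl = events_control / n_control
    rr = risk_exp / risk_ctl
    # Standard error of ln(RR) for two independent binomial proportions
    se = math.sqrt(1 / events_exposed - 1 / n_exposed
                   + 1 / events_control - 1 / n_control)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts for illustration only (group sizes mirror the
# post-matching groups of 11 312 patients, event counts are invented):
rr, lower, upper = relative_risk(12, 11312, 12, 11312)
```

With identical event rates in both groups the point estimate is exactly 1, and the interval straddles 1, i.e. no detectable risk difference.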
Background: Postoperative complication rates using 3D visualization are rarely reported. The primary aim of our study was to detect a possible advantage of 3D visualization with respect to postoperative complication rates in a real-world setting.
Method: Based on a sample size calculation for a medium effect size (3D significantly reducing postoperative complications), data from 287 patients with 3D visualization and 832 with the 2D procedure were screened. The groups underwent exact propensity score matching to ensure comparability. The comprehensive complication index (CCI) was calculated for every procedure, and operation time was determined.
Results: Of the 1078 patients included in the study, 213 exact propensity score-matched pairs could finally be established. Concerning overall CCI (3D: 5.70 ± 13.63 vs. 2D: 3.37 ± 9.89; p = 0.076) and operation time (3D: 103.98 ± 93.26 min vs. 2D: 88.60 ± 69.32 min; p = 0.2569), there was no significant difference between the groups.
Conclusion: Our study shows no advantage of 3D over 2D laparoscopy regarding postoperative complications in a real-world setting; the secondary endpoint, operation time, was likewise not influenced by 3D overall.
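The propensity matching used in this and the preceding study can be sketched as greedy 1:1 nearest-neighbour matching on precomputed propensity scores. This is a simplification: the authors report exact matching, and the `caliper` value below is an assumption for illustration, not taken from either study:

```python
def greedy_match(treated_scores, control_scores, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    Returns (treated_index, control_index) pairs; each control is used
    at most once, and matches farther apart than `caliper` are rejected.
    """
    available = list(enumerate(control_scores))  # unmatched (index, score)
    pairs = []
    for i, score in enumerate(treated_scores):
        if not available:
            break
        # Closest remaining control by absolute score distance
        pos, (j, ctrl_score) = min(enumerate(available),
                                   key=lambda t: abs(t[1][1] - score))
        if abs(ctrl_score - score) <= caliper:
            pairs.append((i, j))
            available.pop(pos)
    return pairs

pairs = greedy_match([0.20, 0.80], [0.21, 0.50, 0.79])  # → [(0, 0), (1, 2)]
```

Outcomes (here, CCI or operation time) are then compared only within the matched pairs, which removes the measured-covariate imbalance between the 3D and 2D groups.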
Keywords: 3D laparoscopy; Comprehensive complication index; Propensity score matching
Background: In recent months, Omicron variants of SARS-CoV-2 have become dominant in many regions of the world, and case numbers with Omicron subvariants BA.1 and BA.2 continue to increase. Due to numerous mutations in the spike protein, the efficacy of currently available vaccines, which are based on Wuhan-Hu 1 isolate of SARS-CoV-2, is reduced, leading to breakthrough infections. Efficacy of monoclonal antibody therapy is also likely impaired.
Methods: In our in vitro study using A549-AT cells constitutively expressing ACE2 and TMPRSS2, we determined and compared the neutralizing capacity of vaccine-elicited sera, convalescent sera and monoclonal antibodies against authentic SARS-CoV-2 Omicron BA.1 and BA.2 compared with Delta.
Findings: Almost no neutralisation of Omicron BA.1 and BA.2 was observed using sera from individuals vaccinated with two doses 6 months earlier, regardless of the vaccine used. Shortly after the booster dose, most sera from triple BNT162b2-vaccinated individuals were able to neutralise both Omicron variants. In line with waning antibody levels three months after the booster, only weak residual neutralisation was observed for BA.1 (26%, n = 34, median NT50 of 0) and BA.2 (44%, n = 34, median NT50 of 0). In addition, BA.1 but not BA.2 was resistant to the neutralising monoclonal antibodies casirivimab/imdevimab, while BA.2 almost completely evaded neutralisation by sotrovimab.
Interpretation: Both SARS-CoV-2 Omicron subvariants BA.1 and BA.2 escape antibody-mediated neutralisation elicited by vaccination, previous infection with SARS-CoV-2, and monoclonal antibodies. Waning immunity renders the majority of tested sera obtained three months after booster vaccination negative in BA.1 and BA.2 neutralisation. Omicron subvariant specific resistance to the monoclonal antibodies casirivimab/imdevimab and sotrovimab emphasizes the importance of genotype-surveillance and guided application.
Funding: This study was supported in part by the Goethe-Corona-Fund of the Goethe University Frankfurt (M.W.) and by the Federal Ministry of Education and Research (COVIDready; grant 02WRS1621C) (M.W.).
Background: Thyroid Imaging Reporting and Data System (TIRADS) was developed to improve patient management and cost-effectiveness by avoiding unnecessary fine needle aspiration biopsy (FNAB) in patients with thyroid nodules. However, its clinical use is still very limited. Strain elastography (SE) enables the determination of tissue elasticity and has shown promising results for the differentiation of thyroid nodules.
Methods: The aim of the present study was to evaluate the interobserver agreement (IA) of TIRADS developed by Horvath et al. and SE. Three blinded observers independently scored stored images of TIRADS and SE in 114 thyroid nodules (114 patients). Cytology and/or histology was available for all benign (n = 99) and histology for all malignant nodules (n = 15).
Results: The IA between the 3 observers was only fair for TIRADS categories 2–5 (Cohen's kappa = 0.27, p = 0.000001) and TIRADS categories 2/3 versus 4/5 (ck = 0.25, p = 0.0020). The IA was substantial for SE scores 1–4 (ck = 0.66, p < 0.000001) and very good for SE scores 1/2 versus 3/4 (ck = 0.81, p < 0.000001). 92–100% of patients with TIRADS-2 had benign lesions, while 28–42% with TIRADS-5 had malignant cytology/histology. The negative predictive value (NPV) for the diagnosis of malignancy was 92–100% for TIRADS using categories 4 & 5 and 96–98% for SE using scores ES-3 & 4, respectively. However, only 11–42% of nodules were in TIRADS categories 2 & 3, as compared to 58–60% with ES-1 & 2.
Conclusions: IA of TIRADS developed by Horvath et al. is only fair. TIRADS and SE have high NPV for excluding malignancy in the diagnostic work-up of thyroid nodules.
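Cohen's kappa, the agreement statistic reported above, corrects the raw proportion of agreement between two raters for the agreement expected by chance from their marginal rating frequencies. A minimal two-rater sketch (the rating vectors below are invented examples, not the study's ratings):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same set of cases."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed proportion of agreement
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from each rater's marginal category frequencies
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(count_a) | set(count_b)
    p_chance = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)

kappa = cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0])  # → 0.5
```

On the usual verbal scale, values around 0.2–0.4 are "fair" (the TIRADS result here), 0.6–0.8 "substantial", and above 0.8 "very good" (the dichotomized SE result).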
Background: Acoustic Radiation Force Impulse (ARFI)-imaging is an ultrasound-based elastography method enabling quantitative measurement of tissue stiffness. The aim of the present study was to evaluate sensitivity and specificity of ARFI-imaging for differentiation of thyroid nodules and to compare it to the well evaluated qualitative real-time elastography (RTE).
Methods: ARFI-imaging involves the mechanical excitation of tissue using acoustic pulses to generate localized displacements resulting in shear-wave propagation, which is tracked using correlation-based methods and recorded in m/s. Inclusion criteria were: nodules ≥ 5 mm and cytological/histological assessment. All patients received conventional ultrasound, real-time elastography (RTE) and ARFI-imaging.
Results: One hundred fifty-eight nodules in 138 patients were available for analysis; 137 nodules were benign on cytology/histology, and 21 nodules were malignant. The median velocity of ARFI-imaging in healthy thyroid tissue, benign thyroid nodules and malignant thyroid nodules was 1.76 m/s, 1.90 m/s, and 2.69 m/s, respectively. While no significant difference in median velocity was found between healthy thyroid tissue and benign thyroid nodules, a significant difference was found between malignant thyroid nodules on the one hand and healthy thyroid tissue (p = 0.0019) or benign thyroid nodules (p = 0.0039) on the other hand. No significant difference in diagnostic accuracy for the diagnosis of malignant thyroid nodules was found between RTE and ARFI-imaging (0.74 vs. 0.69, p = 0.54). The combination of RTE with ARFI did not improve diagnostic accuracy.
Conclusions: ARFI can be used as an additional tool in the diagnostic work up of thyroid nodules with high negative predictive value and comparable results to RTE.
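The negative predictive value emphasized in both elastography abstracts, together with sensitivity and specificity, follows directly from the 2×2 table of test result versus cytology/histology. A minimal sketch; the counts in the usage example are hypothetical, not the study data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and NPV from a 2x2 diagnostic table.

    tp/fn: malignant nodules classified positive/negative by the test,
    fp/tn: benign nodules classified positive/negative by the test.
    """
    sensitivity = tp / (tp + fn)   # malignant nodules correctly flagged
    specificity = tn / (tn + fp)   # benign nodules correctly cleared
    npv = tn / (tn + fn)           # probability a negative test is truly benign
    return sensitivity, specificity, npv

# Hypothetical counts for illustration only:
sens, spec, npv = diagnostic_metrics(tp=18, fp=20, fn=3, tn=117)
```

A high NPV is what justifies using these methods to rule out malignancy and avoid unnecessary fine needle aspiration biopsies; note that NPV, unlike sensitivity and specificity, depends on the prevalence of malignancy in the cohort.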
The genetic make-up of an individual contributes to the susceptibility and response to viral infection. Although environmental, clinical and social factors have a role in the chance of exposure to SARS-CoV-2 and the severity of COVID-19 [1,2], host genetics may also be important. Identifying host-specific genetic factors may reveal biological mechanisms of therapeutic relevance and clarify causal relationships of modifiable environmental risk factors for SARS-CoV-2 infection and outcomes. We formed a global network of researchers to investigate the role of human genetics in SARS-CoV-2 infection and COVID-19 severity. Here we describe the results of three genome-wide association meta-analyses that consist of up to 49,562 patients with COVID-19 from 46 studies across 19 countries. We report 13 genome-wide significant loci that are associated with SARS-CoV-2 infection or severe manifestations of COVID-19. Several of these loci correspond to previously documented associations to lung or autoimmune and inflammatory diseases [3,4,5,6,7]. They also represent potentially actionable mechanisms in response to infection. Mendelian randomization analyses support a causal role for smoking and body-mass index for severe COVID-19, although not for type II diabetes. The identification of novel host genetic factors associated with COVID-19 was made possible by the community of human genetics researchers coming together to prioritize the sharing of data, results, resources and analytical frameworks. This working model of international collaboration underscores what is possible for future genetic discoveries in emerging pandemics, or indeed for any complex human disease.
Background: Rising call-out numbers are being recorded both in the emergency medical services and in the prehospital emergency physician system in Germany. Non-indicated deployments for mildly ill or injured patients are often suspected to be a growing problem. The present study tests the hypothesis of rising deployment numbers accompanied by a simultaneous increase in potentially non-indicated deployments.
Materials and methods: A retrospective analysis was performed of the emergency physician deployments of the physician-staffed emergency response vehicle stationed at University Hospital Frankfurt am Main from 2014 to 2019. The analysis also considered factors such as the medical interventions performed by the emergency physician, treatment priority, type of alerting, and patient age.
Results: Over the observed period, the number of emergency physician deployments increased by more than 20%. The largest increase was seen in deployments in which no medical intervention by the emergency physician was necessary (+80%). Deployments of the lowest treatment priority (+61%) as well as of the highest treatment priority (+61%) also increased significantly.
Discussion: The present figures support the hypothesis that, alongside a significantly increased deployment volume, emergency physicians must handle more deployments for which, in retrospect, they would not have been necessary. Nevertheless, there are also more patients who require immediate physician contact. The resulting higher deployment frequency can lead to an increased workload and make timely dispatching of the emergency physician resource more difficult.
A 23-year observational follow-up clinical evaluation of direct posterior composite restorations
(2023)
The purpose of this observational follow-up clinical study was to evaluate the quality of posterior composite restorations more than 23 years after placement. A total of 22 patients, 13 male and 9 female (mean age 66.1 years, range 50–84), with a total of 42 restorations attended the first and second follow-up examinations. The restorations were examined by one operator using modified FDI criteria. Statistical analysis was performed with the Wilcoxon Mann–Whitney U test and the Wilcoxon exact matched-pairs test at a significance level of p = 0.05; the Bonferroni–Holm correction with an adjusted significance level of alpha = 0.05 was applied. With the exception of approximal anatomical form, significantly worse scores were seen for six out of seven criteria at the second follow-up evaluation. There was no significant difference between the first and second follow-up evaluations in the grades of the restorations with regard to placement in the maxilla or mandible, or for one-surface versus multiple-surface restorations. The approximal anatomical form showed significantly worse grades at the second follow-up when placed in molars. In conclusion, the study results show that significant differences regarding FDI criteria in posterior composite restorations occur after more than 23 years of service. Further studies with extended follow-up times at regular and short intervals are recommended.