Background/aims: Hepatocellular carcinoma (HCC) is a leading indication for liver transplantation (LT) worldwide. Early identification of patients at risk for HCC recurrence is of paramount importance since early treatment of recurrent HCC after LT may be associated with increased survival. We evaluated incidence of and predictors for HCC recurrence, with a focus on the course of AFP levels.
Methods: We performed a retrospective, single-center study of 99 HCC patients who underwent LT between January 28th, 1997 and May 11th, 2016. A three-stage proportional hazards model was used to evaluate potential predictive markers, by both univariate and multivariable analysis, for influences on 1) recurrence after transplantation, 2) mortality without HCC recurrence, and 3) mortality after recurrence.
Results: 19/99 HCC patients showed recurrence after LT. Waiting time was not associated with overall HCC recurrence (HR = 1, p = 0.979). Similarly, waiting time did not affect mortality in LT recipients both with (HR = 0.97, p = 0.282) or without (HR = 0.99, p = 0.685) HCC recurrence. Log10-transformed AFP values at the time of LT (HR 1.75, p = 0.023) as well as after LT (HR 2.07, p = 0.037) were significantly associated with recurrence. Median survival in patients with a ratio (AFP at recurrence divided by AFP 3 months before recurrence) of 0.5 was greater than 70 months, as compared to a median of only 8 months in patients with a ratio of 5.
Conclusion: A rise in AFP levels rather than an absolute threshold could help to identify patients at short-term risk for HCC recurrence post LT, which may allow intensification of the surveillance strategy on an individualized basis.
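The AFP dynamics described above reduce to simple arithmetic; a minimal Python sketch of the two quantities used (the patient values below are hypothetical, for illustration only):

```python
import math

def afp_ratio(afp_at_recurrence, afp_3_months_prior):
    """Ratio of AFP at recurrence to the value ~3 months earlier.

    Per the abstract, a ratio around 0.5 was associated with a median
    survival above 70 months, a ratio around 5 with only 8 months.
    """
    return afp_at_recurrence / afp_3_months_prior

def log10_afp(afp):
    """Log10-transformed AFP, the scale on which the hazard ratios
    (HR 1.75 at LT, HR 2.07 after LT) were estimated."""
    return math.log10(afp)

# Hypothetical AFP values in ng/ml:
rising = afp_ratio(50.0, 10.0)   # 5.0 -- rising AFP, high short-term risk
falling = afp_ratio(5.0, 10.0)   # 0.5 -- falling AFP
```

The point of the ratio is that it captures the trend rather than an absolute threshold, which is exactly the distinction the conclusion draws.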
Objectives: Rising prevalence of multidrug-resistant organisms (MDRO) is a major health problem in patients with liver cirrhosis. The impact of MDRO colonization in liver transplantation (LT) candidates and recipients on mortality has not been determined in detail.
Methods: Patients consecutively evaluated and listed for LT in a tertiary German liver transplant center from 2008 to 2018 underwent screening for MDRO colonization, including methicillin-resistant Staphylococcus aureus (MRSA), multidrug-resistant gram-negative bacteria (MDRGN), and vancomycin-resistant enterococci (VRE). MDRO colonization and infection status were obtained at LT evaluation, at planned and unplanned hospitalizations, three months after graft allocation, or at the last follow-up on the waiting list.
Results: In total, 351 patients were listed for LT, of whom 164 (47%) underwent LT after a median of 249 (range 0–1662) days. The incidence of MDRO colonization increased during waiting time for LT, and MDRO colonization was associated with increased mortality on the waiting list (HR = 2.57, p<0.0001). One patient was colonized with a carbapenem-resistant strain at listing, 9 patients acquired carbapenem-resistant gram-negative bacteria (CRGN) on the waiting list, and 4 more after LT. In total, 10 of these 14 patients died.
Conclusions: Colonization with MDRO is associated with increased mortality on the waiting list, but not in short-term follow-up after LT. Moreover, colonization with CRGN seems associated with high mortality in liver transplant candidates and recipients.
Background: Essential tremor (ET) is a progressive neurological disorder characterized by postural and kinetic tremor most commonly affecting the hands and arms. Medically intractable ET can be treated by deep brain stimulation (DBS) of the ventral intermediate nucleus of the thalamus (VIM). We investigated whether the location of the effective contact (the contact providing the most tremor suppression with the least side effects) in VIM-DBS for ET changes over time, indicating a distinct mechanism of loss of efficacy that goes beyond progression of tremor severity or a mere reduction of DBS efficacy.
Methods: We performed programming sessions in 10 patients who underwent bilateral VIM-DBS surgery between 2009 and 2017 at our department. In addition to the intraoperative session (T1) and the first clinical programming session (T2), a third programming session (T3) was performed to assess the effect and side-effect thresholds (the minimum voltage at which tremor suppression or side effects occurred). Additionally, we compared the choice of the effective contact between T1 and T2, which might be affected by a surgically induced “brain shift.”
Discussion: Over a time span of about 4 years, VIM-DBS in ET showed continuous efficacy in tremor suppression during stim-ON compared to stim-OFF. Compared to immediate postoperative programming sessions in ET patients with DBS, long-term evaluation showed no relevant change in the choice of contact with respect to side effects and efficacy. In the majority of cases, the active contact at T2 did not correspond to the most effective intraoperative stimulation site T1, which might be explained by a brain shift due to cerebrospinal fluid loss after the neurosurgical procedure.
Fit to play: posture and seating position analysis with professional musicians - a study protocol
(2017)
Background: Musical performance-associated musculoskeletal disorders (MSD) are a common health problem among professional musicians. Considering the manifold consequences arising for the musicians, they can be seen as a threat to their professional activity. String players are the group of musicians most affected in this matter. Faults in upper body posture while playing the instrument, causing unergonomic static strain on the back and unergonomic limb movements, are a main cause of musculoskeletal disorders and pain syndromes.
Methods: A total of 66 professional musicians, divided into three groups, are measured.
The division is based on average duration of performance, intensity of daily exercise, and professional experience. Video raster stereography, a three-dimensional analysis of body posture, is used to analyse the instrument-specific posture. Furthermore, the pressure distribution during seating is analysed. Measurements are performed while the musician is sitting on various music chairs differing in structure and/or construction of the seating surface. The measurements take place in the habitual seating position as well as while playing the instrument.
Results: To analyse the influence of different chairs, repeated-measures ANOVA or the Friedman test is used, depending on normality assumptions. Comparison of posture between amateur musicians, students, and professional orchestral musicians is carried out using the non-parametric Jonckheere-Terpstra test.
Conclusions: Our method aims to give musicians guidance in choosing the right music chair by analyzing the chair concepts, so that MSD can be preemptively reduced or prevented.
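The Friedman test named in the analysis plan is a rank-based analogue of repeated-measures ANOVA; a minimal pure-Python sketch of its test statistic (in practice a statistics package would be used, and the chair scores below are hypothetical):

```python
def average_ranks(row):
    """Within-subject ranks, averaging over ties."""
    order = sorted(range(len(row)), key=lambda i: row[i])
    ranks = [0.0] * len(row)
    i = 0
    while i < len(row):
        j = i
        while j + 1 < len(row) and row[order[j + 1]] == row[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of tied rank positions (1-based)
        for idx in order[i:j + 1]:
            ranks[idx] = avg
        i = j + 1
    return ranks

def friedman_statistic(data):
    """Friedman chi-square statistic.

    data: one row per subject, one column per condition (e.g. chair).
    The p-value would come from a chi-square distribution with
    k - 1 degrees of freedom (not computed here).
    """
    n, k = len(data), len(data[0])
    col_sums = [0.0] * k
    for row in data:
        for j, r in enumerate(average_ranks(row)):
            col_sums[j] += r
    return 12.0 / (n * k * (k + 1)) * sum(s * s for s in col_sums) - 3.0 * n * (k + 1)

# Hypothetical posture scores for 4 musicians on 3 chairs:
scores = [[2.1, 3.0, 1.2], [1.8, 2.9, 1.1], [2.5, 3.1, 1.0], [2.0, 2.8, 1.3]]
q = friedman_statistic(scores)  # large Q -> chairs differ consistently
```

Because it only uses within-subject ranks, the test needs no normality assumption, which is why the protocol reserves it for the non-normal case.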
Background: Intestinal perforation or leakage increases the morbidity and mortality of surgical and endoscopic interventions. We identified criteria for the use of fully covered, extractable self-expanding metal stents (cSEMS) vs. over-the-scope clips (OTSC) for leak closure.
Methods: Patients who underwent endoscopic treatment for postoperative leakage, endoscopic perforation, or spontaneous rupture of the upper gastrointestinal tract between 2006 and 2013 were identified at four tertiary endoscopic centers. Technical success, outcome (e.g. duration of hospitalization, in-hospital mortality), and complications were assessed and analyzed with respect to etiology, size and location of leakage.
Results: Of 106 patients (male: 75 (71%); female: 31 (29%); age (mean ± SD): 62.5 ± 1.3 years), 72 (69%) were treated by cSEMS and 34 (31%) by OTSC. For cSEMS vs. OTSC, mean treatment duration was 41.1 vs. 25 days, p<0.001, leakage size was 10 (1-50) vs. 5 (1-30) mm (median (range)), and complications were observed in 68% vs. 8.8%, p<0.001, respectively. Clinical success for primary interventional treatment was observed in 29/72 (40%) vs. 24/34 (70%, p = 0.006), and clinical success at the end of follow-up was 46/72 (64%) vs. 29/34 (85%) for patients treated by cSEMS vs. OTSC; p = 0.04.
Conclusion: OTSC is preferred in small-sized lesions and in perforation caused by endoscopic interventions, cSEMS in patients with concomitant local infection or abscess. cSEMS is associated with a higher frequency of complications. Therefore, OTSC might be preferred if technically feasible. Indication criteria for cSEMS vs. OTSC vary and might impede design of randomized studies.
Influence of antibiotic regimens on intensive care unit mortality, and liver cirrhosis as a risk factor
(2016)
AIM: To assess the rate of infection, the appropriateness of antimicrobial therapy, and mortality in the intensive care unit (ICU), with special focus on patients with liver cirrhosis.
METHODS: The study was approved by the local ethics committee. All patients admitted to the internal medicine ICU between April 1, 2007 and December 31, 2009 were included. Data on infection, microbiological laboratory reports, diagnosis, and therapy were extracted retrospectively from patient charts and electronic documentation. Owing to the large hepatology department and liver transplantation center, special interest lay in the subgroup of patients with liver cirrhosis. The primary statistical endpoint was the evaluation of the influence of appropriate versus inappropriate antimicrobial therapy on in-hospital mortality.
RESULTS: Charts of 1979 patients were available. The overall infection rate was 53%. Multiresistant bacteria were present in 23% of patients with infection and were associated with increased mortality (P < 0.000001). Patients with infection had significantly increased in-hospital mortality (34% vs 17%, P < 0.000001). Only 9% of patients with infection received inappropriate initial antimicrobial therapy, and no influence on mortality was observed. Independent risk factors for in-hospital mortality were the presence of septic shock, prior chemotherapy for malignancy, and infection with Pseudomonas spp. Infection and mortality rates among the 175 patients with liver cirrhosis were significantly higher than in patients without liver cirrhosis. Infection increased mortality 2.24-fold in patients with cirrhosis. Patients with liver cirrhosis were also at increased risk of receiving inappropriate initial antimicrobial therapy.
CONCLUSION: The results of the present study indicate successful implementation of early goal-directed therapy. Patients with liver cirrhosis are at increased risk of infection, mortality, and receiving inappropriate therapy. Multiresistant bacteria are an increasing burden.
Interleukin-22 predicts severity and death in advanced liver cirrhosis: a prospective cohort study
(2012)
Background: Interleukin-22 (IL-22), recently identified as a crucial parameter of pathology in experimental liver damage, may determine survival in clinical end-stage liver disease. Systematic analysis of serum IL-22 in relation to morbidity and mortality of patients with advanced liver cirrhosis has not been performed so far.
Methods: This is a prospective cohort study including 120 liver cirrhosis patients and 40 healthy donors to analyze systemic levels of IL-22 in relation to survival and hepatic complications.
Results: A total of 71% of patients displayed liver cirrhosis-related complications at study inclusion. A total of 23% of the patients died during a mean follow-up of 196 ± 165 days. Systemic IL-22 was detectable in 74% of patients but only in 10% of healthy donors (P < 0.001). Elevated levels of IL-22 were associated with ascites (P = 0.006), hepatorenal syndrome (P < 0.0001), and spontaneous bacterial peritonitis (P = 0.001). Patients with elevated IL-22 (>18 pg/ml, n = 57) showed significantly reduced survival compared to patients with regular (≤18 pg/ml) levels of IL-22 (321 days versus 526 days, P = 0.003). Other factors associated with overall survival were high CRP (≥2.9 mg/dl; P = 0.005, hazard ratio (HR) 0.314, confidence interval (CI) 0.141 to 0.702), elevated serum creatinine (P = 0.05, HR 0.453, CI 0.203 to 1.012), presence of liver-related complications (P = 0.028, HR 0.258, CI 0.077 to 0.862), model for end-stage liver disease (MELD) score ≥20 (P = 0.017, HR 0.364, CI 0.159 to 0.835), and age (P = 0.011, HR 1.047, CI 1.011 to 1.085). Adjusted multivariate Cox proportional-hazards analysis identified elevated systemic IL-22 levels as an independent predictor of reduced survival (P = 0.007, HR 0.218, CI 0.072 to 0.662).
Conclusions: In patients with liver cirrhosis, elevated systemic IL-22 levels are predictive for reduced survival independently from age, liver-related complications, CRP, creatinine and the MELD score. Thus, processes that lead to a rise in systemic interleukin-22 may be relevant for prognosis of advanced liver cirrhosis.
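The survival comparisons above rest on standard time-to-event estimation; a minimal pure-Python Kaplan-Meier sketch (the follow-up data below are invented, and a real analysis would add the log-rank test and the Cox model used in the study):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times:  follow-up time (e.g. days) per patient
    events: 1 if the patient died at that time, 0 if censored
    Returns a list of (time, survival probability) at each death time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = seen = 0
        while i < len(data) and data[i][0] == t:
            seen += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            survival *= 1.0 - deaths / at_risk
            curve.append((t, survival))
        at_risk -= seen  # drop both deaths and censored cases from risk set
    return curve

# Invented follow-up (days) for five patients; 0 marks censoring.
curve = kaplan_meier([100, 200, 250, 400, 500], [1, 1, 0, 1, 0])
```

Censored patients (still alive at last follow-up) leave the risk set without forcing the curve down, which is what makes the estimator valid for incomplete follow-up of the kind reported here.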
Objectives: Four-dimensional ultrasound (4D-US) enables imaging of the aortic segment and simultaneous determination of wall expansion. The method offers high spatial and temporal resolution, but its in vivo reliability for low measured values is so far unknown. The present study determines the intraobserver repeatability and interobserver reproducibility of 4D-US in the atherosclerotic and non-atherosclerotic infrarenal aorta.
Methods: In all, 22 patients with a non-aneurysmal aorta were examined by an experienced examiner and a medical student. After registration of 4D images, both examiners marked the aortic wall manually before the commercially implemented speckle-tracking algorithm was applied. The cyclic changes of the aortic diameter and the circumferential strain were determined with the help of custom-made software. The reliability of 4D-US was tested by the intraclass correlation coefficient (ICC).
Results: The 4D-US measurements showed very good reliability for the maximum aortic diameter and the circumferential strain for all patients and for the non-atherosclerotic aortae (ICC >0.7), but low reliability for circumferential strain in calcified aortae (ICC = 0.29). The observer- and masking-related variances for both maximum diameter and circumferential strain were close to zero.
Conclusions: Despite the low measured values, the high spatial and temporal resolution of 4D-US enables a reliable evaluation of cyclic diameter changes and circumferential strain in non-aneurysmal aortae independently of observer experience, with some limitations for calcified aortae. 4D-US opens up a new perspective for noninvasive, in vivo assessment of the kinematic properties of the vessel wall in the abdominal aorta.
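The intraclass correlation coefficient used above follows from an ANOVA decomposition; a minimal pure-Python sketch of the one-way form ICC(1,1) (the abstract does not state which ICC form was applied, and the diameter readings below are invented):

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1).

    ratings: one list per subject, each with k repeated measurements.
    ICC = (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB and MSW are
    the between- and within-subject mean squares.
    """
    n, k = len(ratings), len(ratings[0])
    grand_mean = sum(sum(r) for r in ratings) / (n * k)
    means = [sum(r) / k for r in ratings]
    msb = k * sum((m - grand_mean) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for r, m in zip(ratings, means) for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Invented max-diameter readings (mm), two observers per patient:
readings = [[18.2, 18.4], [20.1, 19.9], [22.0, 22.3], [17.5, 17.4]]
icc = icc_oneway(readings)  # close to 1 -> good observer agreement
```

Intuitively, the ICC is high when subjects differ much more from each other (MSB) than repeated measurements of the same subject differ among themselves (MSW), which is exactly the sense in which the ICC >0.7 and ICC = 0.29 figures above quantify reliability.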
Triple therapy of chronic hepatitis C virus (HCV) infection with boceprevir (BOC) or telaprevir (TVR) leads to virologic failure in many patients, which is often associated with the selection of resistance-associated variants (RAVs). These resistance profiles are of importance for the selection of potential rescue treatment options. In this study, we performed population-based sequencing of baseline NS3 RAVs and investigated the sensitivity of NS3 phenotypes in an HCV replicon assay, together with clinical factors, for the prediction of treatment response in a cohort of 165 German and Swiss patients treated with BOC- or TVR-based triple therapy. Overall, the prevalence of baseline RAVs was low, although the frequency of RAVs was higher in patients with virologic failure than in those who achieved a sustained virologic response (SVR) (7% versus 1%, P = 0.06). The occurrence of RAVs was associated with a resistant NS3 quasispecies phenotype (P<0.001), but the sensitivity of phenotypes was not associated with treatment outcome (P = 0.2). The majority of single viral and host predictors of SVR were only weakly associated with treatment response. In multivariate analyses, low AST levels, female sex, and an IFNL4 CC genotype were independently associated with SVR. However, a combined analysis of negative predictors revealed a significantly lower overall number of negative predictors in patients with SVR than in individuals with virologic failure (P<0.0001), and the presence of 2 or fewer negative predictors was indicative of SVR. These results demonstrate that most single baseline viral and host parameters have a weak influence on the response to triple therapy, whereas the overall number of negative predictors has a high predictive value for SVR.
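The combined-predictor rule described above is a simple count; a Python sketch (the predictor names are illustrative placeholders, not the study's exact variable definitions):

```python
def likely_svr(negative_predictors):
    """Rule from the abstract: the presence of 2 or fewer negative
    predictors was indicative of SVR.

    negative_predictors: mapping of predictor name -> True if present.
    """
    count = sum(bool(v) for v in negative_predictors.values())
    return count <= 2

# Hypothetical patient: high AST and male sex as negative predictors,
# favourable IFNL4 CC genotype (names are illustrative only).
patient = {"high_AST": True, "male_sex": True, "IFNL4_non_CC": False}
prediction = likely_svr(patient)  # two negative predictors -> SVR likely
```

The strength of such a composite score, as the abstract notes, is that it outperforms each weak single predictor taken alone.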
Background and aims: Spontaneous bacterial peritonitis (SBP) is a severe complication of decompensated cirrhosis. The prevalence of multidrug-resistant organisms (MDROs) in patients with cirrhosis is increasing. Identification of patients at risk for SBP due to MDROs (ie, SBP with the evidence of MDROs or Stenotrophomonas maltophilia in ascitic culture, MDRO-SBP) is crucial to the early adaptation of antibiotic treatment in such patients. We therefore investigated whether MDROs found in ascitic cultures can also be found in specimens determined by noninvasive screening procedures.
Patients and methods: This retrospective study was conducted at the liver center of the University Hospital Frankfurt, Germany. Between 2011 and 2016, patients with cirrhosis were included upon diagnosis of SBP and sample collection of aerobic/anaerobic ascitic cultures. Furthermore, the performance of at least one complete MDRO screening was mandatory for study inclusion.
Results: Of 133 patients diagnosed with SBP, 75 (56.4%) had culture-positive SBP and 22 (16.5%) had MDRO-SBP. Multidrug-resistant Escherichia coli (10/22; 45.5%) and vancomycin-resistant enterococci (7/22; 36.4%) represented the major causative organisms of MDRO-SBP. Rectal swabs identified MDROs in 17 of 22 patients (77.3%) who developed MDRO-SBP, with a time-dependent sensitivity of 77% and 87% at 30 and 90 days after testing, while the negative predictive value was 83% and 76%, respectively. The majority of patients were included from the intensive care unit or intermediate care unit.
Conclusion: MDRO screening may serve as a noninvasive diagnostic tool to identify patients at risk for MDRO-SBP. Patients with decompensated cirrhosis should be screened for MDROs from the first day of inpatient treatment onward.
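The sensitivity and negative predictive value reported above derive from a standard 2×2 screening table; a minimal sketch (only the 17 detected of 22 MDRO-SBP cases comes from the abstract, the false-positive and true-negative counts below are hypothetical):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity and negative predictive value of a screening test.

    tp: screened positive and developed MDRO-SBP
    fp: screened positive, no MDRO-SBP
    fn: screened negative but developed MDRO-SBP
    tn: screened negative, no MDRO-SBP
    """
    sensitivity = tp / (tp + fn)
    npv = tn / (tn + fn)
    return sensitivity, npv

# 17 of 22 MDRO-SBP cases had a positive rectal swab (abstract);
# fp and tn counts below are hypothetical illustrations.
sens, npv = screening_metrics(tp=17, fp=30, fn=5, tn=81)
```

Note that sensitivity is fixed by the diseased column alone, while NPV also depends on how many screen-negative patients never develop MDRO-SBP, which is why the abstract reports the two measures separately over time.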