Purpose: Colorectal cancer (CRC) is the second most common cancer in Germany. Around 60,000 people were diagnosed with CRC in Germany in 2016. Since 2019, screening colonoscopies have been offered in Germany to men from the age of 50 and to women from the age of 55. It is currently under discussion whether women should also be offered a screening colonoscopy from the age of 50 and whether there are any predictors of developing CRC.
Methods: Colonoscopies of 1553 symptomatic patients younger than 55 years were compared with colonoscopies of 1075 symptomatic patients older than 55 years. We analyzed whether there were any significant differences between the two groups in the prevalence of CRC and its precursor lesions, or between symptomatic men and women. We also evaluated whether there was a correlation between abdominal symptoms and the prevalence of CRC.
Results: In 164 of 1553 symptomatic patients, 194 polyps (12.5%) were detected. In total, six colorectal carcinomas (0.4%) were detected. There were no significant differences between men and women. In symptomatic patients ≥ 55 years, significantly more polyps were found (p<0.0001; 26.6% vs. 12.5%): in total, 286 polyps (26.6%) were removed in the 1075 symptomatic patients older than 55 years. Anorectal bleeding was the only abdominal symptom that was a significant indicator of colorectal cancer in both groups (p=0.03, OR=2.73, 95% CI [1.11; 6.70]), but with only low sensitivity (44%).
Conclusion: Since there were no significant differences between men and women, we recommend offering screening colonoscopy to women from the age of 50 as well.
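For readers unfamiliar with the reported measures, the sketch below shows how an odds ratio with confidence interval and a sensitivity can be derived from a 2x2 contingency table relating a symptom to a diagnosis. The counts are invented for illustration only (chosen so that the sensitivity comes out at 44%); they are not the study's raw data, and the abstract does not state which software the authors used.

```python
# Illustrative only: hypothetical 2x2 table relating anorectal bleeding to CRC.
#                  CRC present   CRC absent
# bleeding yes         tp            fp
# bleeding no          fn            tn
import statsmodels.api as sm

tp, fp, fn, tn = 4, 300, 5, 1244            # made-up counts, not study data
table = sm.stats.Table2x2([[tp, fp], [fn, tn]])

odds_ratio = table.oddsratio                 # OR for bleeding vs. CRC
ci_low, ci_high = table.oddsratio_confint()  # 95% confidence interval
sensitivity = tp / (tp + fn)                 # share of CRC cases that had bleeding

print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_low:.2f}; {ci_high:.2f}], "
      f"sensitivity = {sensitivity:.0%}")
```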
Interleukin-22 predicts severity and death in advanced liver cirrhosis: a prospective cohort study
(2012)
Background: Interleukin-22 (IL-22), recently identified as a crucial parameter of pathology in experimental liver damage, may determine survival in clinical end-stage liver disease. Systematic analysis of serum IL-22 in relation to morbidity and mortality of patients with advanced liver cirrhosis has not been performed so far.
Methods: This is a prospective cohort study including 120 liver cirrhosis patients and 40 healthy donors to analyze systemic levels of IL-22 in relation to survival and hepatic complications.
Results: A total of 71% of patients displayed liver cirrhosis-related complications at study inclusion. A total of 23% of the patients died during a mean follow-up of 196 ± 165 days. Systemic IL-22 was detectable in 74% of patients but only in 10% of healthy donors (P < 0.001). Elevated levels of IL-22 were associated with ascites (P = 0.006), hepatorenal syndrome (P < 0.0001), and spontaneous bacterial peritonitis (P = 0.001). Patients with elevated IL-22 (>18 pg/ml, n = 57) showed significantly reduced survival compared to patients with regular (≤18 pg/ml) levels of IL-22 (321 days versus 526 days, P = 0.003). Other factors associated with overall survival were high CRP (≥2.9 mg/dl, P = 0.005, hazard ratio (HR) 0.314, confidence interval (CI) 0.141 to 0.702), elevated serum creatinine (P = 0.05, HR 0.453, CI 0.203 to 1.012), presence of liver-related complications (P = 0.028, HR 0.258, CI 0.077 to 0.862), model for end-stage liver disease (MELD) score ≥20 (P = 0.017, HR 0.364, CI 0.159 to 0.835) and age (P = 0.011, HR 1.047, CI 1.011 to 1.085). Adjusted multivariate Cox proportional-hazards analysis identified elevated systemic IL-22 levels as an independent predictor of reduced survival (P = 0.007, HR 0.218, CI 0.072 to 0.662).
Conclusions: In patients with liver cirrhosis, elevated systemic IL-22 levels are predictive of reduced survival independently of age, liver-related complications, CRP, creatinine and the MELD score. Thus, processes that lead to a rise in systemic IL-22 may be relevant to the prognosis of advanced liver cirrhosis.
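As an illustration of the kind of multivariate Cox proportional-hazards analysis mentioned in the results, the following is a minimal sketch using the lifelines package. The data frame, column names and values are hypothetical stand-ins for the study variables (IL-22 elevation, MELD score, age); the original analysis is not reproduced here.

```python
# Minimal Cox proportional-hazards sketch; all data below are invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_days": [321, 526, 180, 90, 400, 365, 250, 600, 150, 480],  # time to death/censoring
    "died":          [1,   0,   1,   1,  0,   1,   0,   0,   1,   0],    # 1 = event, 0 = censored
    "il22_elevated": [1,   0,   1,   1,  0,   0,   1,   0,   1,   0],    # IL-22 > 18 pg/ml
    "meld_ge_20":    [1,   0,   0,   1,  1,   0,   0,   0,   1,   0],
    "age":           [61,  54,  67,  72, 49,  58,  63,  51,  69,  47],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_days", event_col="died")
cph.print_summary()   # hazard ratios with confidence intervals per covariate
```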
Background and aims: Spontaneous bacterial peritonitis (SBP) is a severe complication of decompensated cirrhosis. The prevalence of multidrug-resistant organisms (MDROs) in patients with cirrhosis is increasing. Identification of patients at risk for SBP due to MDROs (i.e., SBP with evidence of MDROs or Stenotrophomonas maltophilia in ascitic culture; MDRO-SBP) is crucial for the early adaptation of antibiotic treatment in such patients. We therefore investigated whether MDROs found in ascitic cultures can also be found in specimens obtained by noninvasive screening procedures.
Patients and methods: This retrospective study was conducted at the liver center of the University Hospital Frankfurt, Germany. Between 2011 and 2016, patients with cirrhosis were included upon diagnosis of SBP and sample collection of aerobic/anaerobic ascitic cultures. Furthermore, the performance of at least one complete MDRO screening was mandatory for study inclusion.
Results: Of 133 patients diagnosed with SBP, 75 (56.4%) had culture-positive SBP and 22 (16.5%) had MDRO-SBP. Multidrug-resistant Escherichia coli (10/22; 45.5%) and vancomycin-resistant enterococci (7/22; 36.4%) were the major causative organisms of MDRO-SBP. Rectal swabs identified MDROs in 17 of 22 patients (77.3%) who developed MDRO-SBP, with a time-dependent sensitivity of 77% and 87% at 30 and 90 days after testing, while the negative predictive value was 83% and 76%, respectively. The majority of patients were included from the intensive care unit or intermediate care unit.
Conclusion: MDRO screening may serve as a noninvasive diagnostic tool to identify patients at risk for MDRO-SBP. Patients with decompensated cirrhosis should be screened for MDROs from the first day of inpatient treatment onward.
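The time-dependent sensitivity and negative predictive value reported above follow the usual definitions for a screening test. A small sketch of those two formulas is given below, using the 17/22 swab-positive MDRO-SBP patients from the abstract and otherwise made-up counts.

```python
# Sensitivity and negative predictive value of a screening test (illustrative).
def sensitivity_npv(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float]:
    """tp/fn: MDRO-SBP cases with positive/negative prior swab; fp/tn: no MDRO-SBP."""
    sensitivity = tp / (tp + fn)   # swab-positive among true MDRO-SBP cases
    npv = tn / (tn + fn)           # truly negative among swab-negative patients
    return sensitivity, npv

# 17 of 22 MDRO-SBP patients had a positive swab (from the abstract);
# fp and tn are invented for the example.
sens, npv = sensitivity_npv(tp=17, fp=20, fn=5, tn=91)
print(f"sensitivity = {sens:.0%}, NPV = {npv:.0%}")
```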
Background: Patients with head and neck cancer (HNC) are at high risk for malnutrition because of tumour localisation and therapy. Prophylactic percutaneous endoscopic gastrostomy (PEG) tube placement is common practice to prevent malnutrition.
Objective: To investigate the benefits of prophylactic PEG tube placement for HNC patients in terms of its influence on patients’ nutritional status, utilisation rate and complications, and to identify predictors of PEG tube utilisation.
Methods: All consecutive HNC patients who underwent prophylactic PEG tube insertion between 1 January 2011 and 31 December 2012 prior to therapy were enrolled. The PEG tube utilisation rate, complications, the patients’ nutritional status and tumour therapy were evaluated with the help of electronic patient charts and telephone interviews.
Results: A total of 181 patients (48 female; median age 67.5 years) were included. The PEG utilisation rate in the entire cohort was 91.7%. One hundred and forty-nine patients (82.3%) used the PEG tube for total enteral nutrition, 17 patients (9.4%) for supplemental nutrition and 15 patients (8.3%) made no use of the PEG tube. Peristomal wound infections were the most common complications (40.3%) in this study. A high Nutritional Risk Screening (NRS) score prior to tube insertion was found to be independently associated with PEG utilisation. No significant weight changes were observed across the three patient subgroups.
Conclusions: The overall PEG tube utilisation rate was high in this study. However, given the high rate of infections, diligent patient selection is crucial in order to determine which patients benefit most from prophylactic PEG tube insertion.
Objectives: Four-dimensional ultrasound (4D-US) enables imaging of an aortic segment and simultaneous determination of wall expansion. The method offers high spatial and temporal resolution, but its in vivo reliability for low measured values has so far been unknown. The present study determines the intraobserver repeatability and interobserver reproducibility of 4D-US in the atherosclerotic and non-atherosclerotic infrarenal aorta.
Methods: In all, 22 patients with a non-aneurysmal aorta were examined by an experienced examiner and a medical student. After registration of 4D images, both examiners marked the aortic wall manually before the commercially implemented speckle-tracking algorithm was applied. The cyclic changes of the aortic diameter and the circumferential strain were determined with the help of custom-made software. The reliability of 4D-US was tested by the intraclass correlation coefficient (ICC).
Results: The 4D-US measurements showed very good reliability for the maximum aortic diameter and the circumferential strain for all patients and for the non-atherosclerotic aortae (ICC > 0.7), but low reliability for circumferential strain in calcified aortae (ICC = 0.29). The observer- and masking-related variances for both maximum diameter and circumferential strain were close to zero.
Conclusions: Despite the low measured values, the high spatial and temporal resolution of 4D-US enables a reliable evaluation of cyclic diameter changes and circumferential strain in non-aneurysmal aortae independently of observer experience, but with some limitations for calcified aortae. 4D-US opens up a new perspective with regard to noninvasive, in vivo assessment of the kinematic properties of the vessel wall in the abdominal aorta.
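As a rough illustration of how an interobserver ICC of the kind reported above can be computed, the sketch below uses the pingouin package on invented paired measurements from two observers; the variable names and values are hypothetical and not taken from the study.

```python
# Interobserver ICC sketch on invented paired measurements (two observers).
import pandas as pd
import pingouin as pg

df = pd.DataFrame({
    "patient":  [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "observer": ["examiner", "student"] * 6,
    "diameter_mm": [18.2, 18.5, 20.1, 19.7, 16.9, 17.3,
                    21.4, 21.0, 19.0, 19.4, 17.6, 17.2],
})

icc = pg.intraclass_corr(data=df, targets="patient", raters="observer",
                         ratings="diameter_mm")
print(icc[["Type", "ICC", "CI95%"]])   # e.g. ICC2 for absolute agreement
```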
Background: Hypoxia is a key driver for infiltrative growth in experimental gliomas. It has remained elusive whether tumor hypoxia in glioblastoma patients contributes to distant or diffuse recurrences. We therefore investigated the influence of perioperative cerebral ischemia on patterns of progression in glioblastoma patients.
Methods: We retrospectively screened MRI scans of 245 patients with newly diagnosed glioblastoma undergoing resection for perioperative ischemia near the resection cavity. Of these, 46 showed relevant ischemia near the resection cavity. A control cohort without perioperative ischemia was generated by 1:1 matching using an algorithm based on gender, age and adjuvant treatment. Both cohorts were analyzed for patterns of progression by a blinded neuroradiologist.
Results: The percentage of diffuse or distant recurrences at first relapse was significantly higher in the cohort with perioperative ischemia (61.1%) compared to the control cohort (19.4%). The results of the control cohort matched well with historical data. The change in patterns of progression was not associated with a difference in survival.
Conclusions: This study reveals a previously unrecognized association of perioperative cerebral ischemia with distant or diffuse recurrence in glioblastoma. It is the first clinical study supporting the concept that hypoxia is a key driver of infiltrative tumor growth in glioblastoma patients.
Triple therapy of chronic hepatitis C virus (HCV) infection with boceprevir (BOC) or telaprevir (TVR) leads to virologic failure in many patients, which is often associated with the selection of resistance-associated variants (RAVs). These resistance profiles are important for the selection of potential rescue treatment options. In this study, we performed population-based sequencing of baseline NS3 RAVs and investigated the sensitivity of NS3 phenotypes in an HCV replicon assay, together with clinical factors, for the prediction of treatment response in a cohort of 165 German and Swiss patients treated with a BOC- or TVR-based triple therapy. Overall, the prevalence of baseline RAVs was low, although the frequency of RAVs was higher in patients with virologic failure than in those who achieved a sustained virologic response (SVR) (7% versus 1%, P = 0.06). The occurrence of RAVs was associated with a resistant NS3 quasispecies phenotype (P < 0.001), but the sensitivity of phenotypes was not associated with treatment outcome (P = 0.2). Most single viral and host predictors of SVR were only weakly associated with treatment response. In multivariate analyses, low AST levels, female sex and an IFNL4 CC genotype were independently associated with SVR. However, a combined analysis of negative predictors revealed a significantly lower overall number of negative predictors in patients with SVR compared to individuals with virologic failure (P < 0.0001), and the presence of two or fewer negative predictors was indicative of SVR. These results demonstrate that most single baseline viral and host parameters have a weak influence on the response to triple therapy, whereas the overall number of negative predictors has a high predictive value for SVR.
Background: Intestinal perforation or leakage increases the morbidity and mortality of surgical and endoscopic interventions. We identified criteria for the use of fully covered, extractable self-expanding metal stents (cSEMS) versus over-the-scope clips (OTSC) for leak closure.
Methods: Patients who underwent endoscopic treatment for postoperative leakage, endoscopic perforation, or spontaneous rupture of the upper gastrointestinal tract between 2006 and 2013 were identified at four tertiary endoscopic centers. Technical success, outcome (e.g. duration of hospitalization, in-hospital mortality), and complications were assessed and analyzed with respect to etiology, size and location of leakage.
Results: Of 106 patients (male: 75 (71%), female: 31 (29%); age (mean ± SD): 62.5 ± 1.3 years), 72 (69%) were treated with cSEMS and 34 (31%) with OTSC. For cSEMS vs. OTSC, the mean treatment duration was 41.1 vs. 25 days (p<0.001), the median (range) leakage size was 10 (1-50) vs. 5 (1-30) mm, and complications were observed in 68% vs. 8.8% (p<0.001), respectively. Clinical success of the primary interventional treatment was observed in 29/72 (40%) vs. 24/34 (70%) patients (p = 0.006), and clinical success at the end of follow-up in 46/72 (64%) vs. 29/34 (85%) patients treated by cSEMS vs. OTSC (p = 0.04).
Conclusion: OTSC is preferred for small lesions and for perforations caused by endoscopic interventions, cSEMS for patients with concomitant local infection or abscess. cSEMS is associated with a higher frequency of complications; therefore, OTSC might be preferred if technically feasible. Indication criteria for cSEMS vs. OTSC vary and might impede the design of randomized studies.
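For orientation, the sketch below shows one standard way to test a difference in proportions such as the primary clinical success rates above (29/72 for cSEMS vs. 24/34 for OTSC), here with Fisher's exact test from SciPy. The abstract does not state which test the authors used, so this is an illustrative recomputation, not the study's analysis.

```python
# Compare two success proportions with Fisher's exact test (illustrative).
from scipy.stats import fisher_exact

#        successes  failures
csems = [29, 72 - 29]
otsc  = [24, 34 - 24]

odds_ratio, p_value = fisher_exact([csems, otsc])
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```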
Introduction Occurrence of inaccurate or delayed diagnoses is a significant concern in patient care, particularly in emergency medicine, where decision making is often constrained by high throughput and inaccurate admission diagnoses. Artificial intelligence-based diagnostic decision support systems have been developed to enhance clinical performance by suggesting differential diagnoses for a given case, based on an integrated medical knowledge base and machine learning techniques. The purpose of the study is to evaluate the diagnostic accuracy of Ada, an app-based diagnostic tool, and its impact on patient outcome.
Methods and analysis The eRadaR trial is a prospective, double-blinded study of patients presenting to the emergency room (ER) with abdominal pain. At initial contact in the ER, a structured interview will be performed using the Ada-App, and both patients and attending physicians will be blinded to the proposed diagnosis lists until trial completion. Throughout the study, clinical data relating to diagnostic findings and types of therapy will be obtained, and the follow-up until day 90 will comprise the occurrence of complications and overall survival of patients. The primary efficacy endpoint of the trial is defined as the percentage of correct diagnoses suggested by Ada compared with the final discharge diagnosis. Further, the accuracy and timing of diagnosis will be compared with the decision making of the classical doctor–patient interaction. Secondary objectives are complications, length of hospital stay and overall survival.
Ethics and dissemination Ethical approval was obtained from the independent ethics committee (IEC) of the Goethe-University Frankfurt on 9 April 2020, including the patient information material and informed consent form. All protocol amendments must be reported to and approved by the IEC. The results from this study will be submitted to peer-reviewed journals and reported at suitable national and international meetings.
Trial registration number DRKS00019098.
Objectives: The rising prevalence of multidrug-resistant organisms (MDRO) is a major health problem in patients with liver cirrhosis. The impact of MDRO colonization on mortality in liver transplantation (LT) candidates and recipients has not been determined in detail.
Methods: Patients consecutively evaluated and listed for LT in a tertiary German liver transplant center from 2008 to 2018 underwent screening for MDRO colonization including methicillin-resistant Staphylococcus aureus (MRSA), multidrug-resistant gram-negative bacteria (MDRGN), and vancomycin-resistant enterococci (VRE). MDRO colonization and infection status were obtained at LT evaluation, planned and unplanned hospitalization, three months upon graft allocation, or at last follow-up on the waiting list.
Results: In total, 351 patients were listed for LT, of whom 164 (47%) underwent LT after a median of 249 (range 0–1662) days. The incidence of MDRO colonization increased during the waiting time for LT, and MDRO colonization was associated with increased mortality on the waiting list (HR = 2.57, p<0.0001). One patient was colonized with a carbapenem-resistant strain at listing, 9 patients acquired carbapenem-resistant gram-negative bacteria (CRGN) on the waiting list, and 4 more after LT. In total, 10 of these 14 patients died.
Conclusions: Colonization with MDRO is associated with increased mortality on the waiting list, but not during short-term follow-up after LT. Moreover, colonization with CRGN appears to be associated with high mortality in liver transplant candidates and recipients.
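As an illustration of a waiting-list survival comparison by colonization status, summarized above by a hazard ratio, the following minimal sketch uses the lifelines package with invented durations, events and column names; it is not the study's analysis.

```python
# Kaplan-Meier and log-rank sketch for waiting-list mortality by MDRO status;
# all values and column names below are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "days_on_list": [249, 400, 120, 90, 600, 30, 510, 200],
    "died":         [0,   0,   1,   1,  0,   1,  0,   1],
    "mdro":         [0,   0,   1,   1,  0,   1,  0,   0],
})

colonized, free = df[df.mdro == 1], df[df.mdro == 0]

# Log-rank test comparing waiting-list survival between the two groups.
result = logrank_test(colonized.days_on_list, free.days_on_list,
                      event_observed_A=colonized.died, event_observed_B=free.died)
print(f"log-rank p = {result.p_value:.3f}")

# Kaplan-Meier estimate for the colonized group.
kmf = KaplanMeierFitter()
kmf.fit(colonized.days_on_list, colonized.died, label="MDRO colonized")
print(f"median waiting-list survival: {kmf.median_survival_time_} days")
```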