Objectives: Rising prevalence of multidrug-resistant organisms (MDRO) is a major health problem in patients with liver cirrhosis. The impact of MDRO colonization in liver transplantation (LT) candidates and recipients on mortality has not been determined in detail.
Methods: Patients consecutively evaluated and listed for LT in a tertiary German liver transplant center from 2008 to 2018 underwent screening for MDRO colonization, including methicillin-resistant Staphylococcus aureus (MRSA), multidrug-resistant gram-negative bacteria (MDRGN), and vancomycin-resistant enterococci (VRE). MDRO colonization and infection status were obtained at LT evaluation, at planned and unplanned hospitalizations, three months after graft allocation, or at last follow-up on the waiting list.
Results: In total, 351 patients were listed for LT, of whom 164 (47%) underwent LT after a median of 249 (range 0–1662) days. The incidence of MDRO colonization increased during waiting time for LT, and MDRO colonization was associated with increased mortality on the waiting list (HR = 2.57, p < 0.0001). One patient was colonized with a carbapenem-resistant strain at listing, 9 patients acquired carbapenem-resistant gram-negative bacteria (CRGN) on the waiting list, and 4 more after LT. In total, 10 of these 14 patients died.
Conclusions: Colonization with MDRO is associated with increased mortality on the waiting list, but not in short-term follow-up after LT. Moreover, colonization with CRGN seems associated with high mortality in liver transplant candidates and recipients.
Fit to play: posture and seating position analysis with professional musicians - a study protocol
(2017)
Background: Musical performance-associated musculoskeletal disorders (MSD) are a common health problem among professional musicians. Considering the manifold consequences arising for the musicians, they can be seen as a threat to their professional activity. String players are the most affected group of musicians in this respect. Faults in upper body posture while playing the instrument, causing unergonomic static strain on the back and unergonomic limb movements, are a main cause of musculoskeletal disorders and pain syndromes.
Methods: A total of 66 professional musicians, divided into three groups, are measured.
The division is based on average duration of performance, intensity of daily exercise and professional experience. Video raster stereography, a three-dimensional analysis of body posture, is used to analyse the instrument-specific posture. Furthermore, the pressure distribution during seating is analysed. Measurements are performed while the musician is sitting on various music chairs differing in structure and/or construction of the seating surface. The measurements take place in the habitual seating position as well as during playing the instrument.
Results: To analyse the influence of the different chairs, ANOVA for repeated measurements or the Friedman test is used, depending on normality assumptions. Comparison of posture between amateur musicians, students, and professional orchestral musicians is carried out using the non-parametric Jonckheere-Terpstra test.
Conclusions: By analyzing the chair concepts, our method attempts to give musicians guidance on the right choice of music chair, so that MSD can be preemptively reduced or prevented.
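The statistical plan above (repeated-measures ANOVA when normality holds, otherwise the Friedman test) can be sketched in a few lines of Python. This is an illustrative sketch only, not the authors' code; the column names ("subject", "chair", "posture_param") are assumptions.

import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

def compare_chairs(df: pd.DataFrame) -> None:
    # df: one row per subject x chair with one posture parameter.
    # Shapiro-Wilk normality check per chair condition.
    normal = all(
        stats.shapiro(g["posture_param"]).pvalue > 0.05
        for _, g in df.groupby("chair")
    )
    if normal:
        # Parametric route: repeated-measures ANOVA across chairs.
        print(AnovaRM(df, depvar="posture_param",
                      subject="subject", within=["chair"]).fit())
    else:
        # Non-parametric route: Friedman test on wide-format data.
        wide = df.pivot(index="subject", columns="chair",
                        values="posture_param")
        stat, p = stats.friedmanchisquare(*[wide[c] for c in wide.columns])
        print(f"Friedman chi2 = {stat:.2f}, p = {p:.4f}")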
Influence of antibiotic-regimens on intensive-care unit-mortality and liver-cirrhosis as risk factor
(2016)
AIM: To assess the rate of infection, appropriateness of antimicrobial therapy and mortality on the intensive care unit (ICU). Special focus was placed on patients with liver cirrhosis.
METHODS: The study was approved by the local ethics committee. All patients admitted to the Internal Medicine ICU between April 1, 2007 and December 31, 2009 were included. Data on infection, microbiological laboratory reports, diagnosis and therapy were extracted retrospectively from patient charts and electronic documentation. Because of the large hepatology department and liver transplantation center, special interest was on the subgroup of patients with liver cirrhosis. The primary statistical endpoint was the evaluation of the influence of appropriate versus inappropriate antimicrobial therapy on in-hospital mortality.
RESULTS: Charts of 1979 patients were available. The overall infection rate was 53%. Multiresistant bacteria were present in 23% of patients with infection and were associated with increased mortality (P < 0.000001). Patients with infection had significantly increased in-hospital mortality (34% vs 17%, P < 0.000001). Only 9% of patients with infection received inappropriate initial antimicrobial therapy, and no influence on mortality was observed. Independent risk factors for in-hospital mortality were the presence of septic shock, prior chemotherapy for malignoma and infection with Pseudomonas spp. Infection and mortality rates among 175 patients with liver cirrhosis were significantly higher than in patients without liver cirrhosis. Infection increased mortality 2.24-fold in patients with cirrhosis. Patients with liver cirrhosis were at increased risk of receiving inappropriate initial antimicrobial therapy.
CONCLUSION: The results of the present study indicate successful implementation of early goal-directed therapy. Patients with liver cirrhosis are at increased risk of infection, mortality and of receiving inappropriate therapy. Multiresistant bacteria are an increasing burden.
Purpose: Colorectal cancer (CRC) is the second most common cancer in Germany. Around 60,000 people were diagnosed with CRC in Germany in 2016. Since 2019, screening colonoscopies have been offered in Germany to men from the age of 50 and to women from the age of 55. It is currently under discussion whether women should also undergo a screening colonoscopy from the age of 50 and whether there are any predictors for developing CRC.
Methods: Colonoscopies of 1553 symptomatic patients younger than 55 years were compared with colonoscopies of 1075 symptomatic patients older than 55 years. We analyzed whether there were any significant differences between these two groups in the prevalence of CRC and its precursor lesions, or between symptomatic men and women. We also evaluated whether there is a correlation between abdominal symptoms and the prevalence of CRC.
Results: In 164/1553 symptomatic patients, 194 (12.5%) polyps were detected. In total, six colorectal carcinomas (0.4%) were detected. There were no significant differences between men and women. In symptomatic patients ≥ 55 years, significantly more polyps were found (p < 0.0001; 26.6% vs. 12.5%). In total, 286 polyps (26.6%) were removed in the 1075 symptomatic patients older than 55 years. Anorectal bleeding was the only abdominal symptom that was a significant indicator for the occurrence of colorectal cancer in both groups (p = 0.03, OR = 2.73, 95% CI [1.11; 6.70]), but with only low sensitivity (44%).
Conclusion: As there were no significant differences between men and women, we recommend offering screening colonoscopies to women from the age of 50 as well.
Chronic viral hepatitis is associated with substantial morbidity and mortality worldwide. The aim of our study was to assess the ability of point shear-wave elastography (pSWE) using acoustic radiation force impulse imaging to predict the following liver-related events (LREs): new diagnosis of HCC, liver transplantation, or liver-related death (hepatic decompensation was not included as an LRE). pSWE was performed at study inclusion and compared with liver histology, transient elastography (TE), and serologic biomarkers (aspartate aminotransferase to platelet ratio index, Fibrosis-4, FibroTest). The performance of pSWE and TE to predict LREs was assessed by calculating the area under the receiver operating characteristic curve and a Cox proportional-hazards regression model. A total of 254 patients with a median follow-up of 78 months were included in the study. LREs occurred in 28 patients (11%) during follow-up. In both patients with hepatitis B virus and hepatitis C virus (HCV), pSWE showed significant correlations with noninvasive tests and TE, and median pSWE and TE values were significantly different between patients with LREs and patients without LREs (both P < 0.0001). In patients with HCV, the areas under the receiver operating characteristic curve for pSWE and TE to predict LREs were comparable: 0.859 (95% confidence interval [CI], 0.747-0.969) and 0.852 (95% CI, 0.737-0.967) (P = 0.93). In Cox regression analysis, pSWE independently predicted LREs in all patients with HCV (hazard ratio, 17.9; 95% CI, 5.21-61.17; P < 0.0001) and in those who later received direct-acting antiviral therapy (hazard ratio, 17.11; 95% CI, 3.88-75.55; P = 0.0002). Conclusion: Our study shows good comparability between pSWE and TE. pSWE is a promising tool for the prediction of LREs in patients with viral hepatitis, particularly those with chronic HCV. Further studies are needed to confirm our data and assess their prognostic value in other liver diseases.
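Both analyses named above, discrimination by area under the ROC curve and a Cox proportional-hazards model for time to a liver-related event, follow a standard pattern. A minimal Python sketch is given below, assuming a hypothetical table with columns "pswe_kpa" (stiffness), "lre" (event indicator) and "time_months"; it is not the authors' code.

import pandas as pd
from sklearn.metrics import roc_auc_score
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical cohort file

# Discrimination: AUROC of pSWE stiffness for LRE occurrence.
print("AUROC:", roc_auc_score(df["lre"], df["pswe_kpa"]))

# Time-to-event: Cox proportional-hazards model with pSWE as predictor;
# the exponentiated coefficient is the hazard ratio per stiffness unit.
cph = CoxPHFitter()
cph.fit(df[["time_months", "lre", "pswe_kpa"]],
        duration_col="time_months", event_col="lre")
cph.print_summary()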
Background: Musculoskeletal disorders (MSD) are a common health problem among dentists. Dental treatment is mainly performed in a sitting position. The aim of the study was to quantify the effect of different ergonomic chairs on the sitting position. In addition, it was tested whether the sitting position of experienced workers differs from that of a non-dental group.
Methods: A total of 59 (28 m/31 f) subjects, divided into two dentist groups according to their work experience (students and dentists (9 m/11 f) < 10 years, dentists (9 m/10 f) ≥ 10 years) and a control group (10 m/10 f), were measured. A three-dimensional back scanner captured the bare back of all subjects sitting on six dentist's chairs of different design. Initially, inter-group comparisons per chair, firstly in the habitual and secondly in the working postures, were carried out. Furthermore, inter-chair comparison was conducted for the habitual as well as for the working postures of all subjects and for each group. Finally, a comparison between the habitual sitting posture and the working posture for each respective chair (intra-chair comparison) was conducted (for all subjects and for each group). In addition, a subjective assessment of each chair was made.
For the statistical analysis, non-parametric tests were conducted and the level of significance was set at 5%.
Results: When comparing the three subject groups, all chairs caused a more pronounced spinal kyphosis in experienced dentists. In both conditions (habitual and working postures), a symmetrical sitting position was assumed on each chair.
The inter-chair comparisons showed no differences regarding the ergonomic design of the chairs. The significant differences found in the inter-chair comparisons were all within the measurement error and could, therefore, be classified as clinically irrelevant.
The intra-chair comparison (habitual sitting position vs. working sitting position) illustrated position-related changes in the sagittal, but not in the transverse, plane. These changes were only position-related (forward-leaning working posture) and were not influenced by the ergonomic seating design of the respective chair. There were no differences between the groups in the subjective assessment of each chair.
Conclusions: Regardless of the group or dental experience, the ergonomic design of the dentist's chair had only a marginal influence on the upper body posture in both the habitual and working sitting postures. Consequently, in order to minimize MSD, the focus should be on adopting a symmetrical sitting posture rather than on the ergonomic design of the dentist's chair.
Background: Patients with head and neck cancer (HNC) are at high risk for malnutrition because of tumour localisation and therapy. Prophylactic percutaneous endoscopic gastrostomy (PEG) tube placement is common practice to prevent malnutrition.
Objective: To investigate the benefits of prophylactic PEG tube placement for HNC patients in terms of the influence on patients' nutritional status, the utilisation rate and complications, and to identify predictors of PEG tube utilisation.
Methods: All consecutive HNC patients who underwent prophylactic PEG tube insertion between 1 January 2011 and 31 December 2012 prior to therapy were enrolled. The PEG tube utilisation rate, complications, the patients’ nutritional status and tumour therapy were evaluated with the help of electronic patient charts and telephone interviews.
Results: A total of 181 patients (48 female, median age 67.5 years) were included. The PEG utilisation rate in the entire cohort was 91.7%. One hundred and forty-nine patients (82.3%) used the PEG tube for total enteral nutrition, 17 patients (9.4%) for supplemental nutrition and 15 patients (8.3%) made no use of the PEG tube. Peristomal wound infections were the most common complication (40.3%) in this study. A high Nutritional Risk Screening (NRS) score prior to tube insertion was found to be independently associated with PEG utilisation. No significant weight changes were observed across the three patient subgroups.
Conclusions: The overall PEG tube utilisation rate was high in this study. However, given the high rate of infections, diligent patient selection is crucial in order to determine which patients benefit most from prophylactic PEG tube insertion.
Background and Aims: The IL-12/23 inhibitor ustekinumab (UST) opened up new treatment options for patients with Crohn’s disease (CD). Due to the recent approval, real-world German data on long-term efficacy and safety are lacking. This study aimed to assess the clinical course of CD patients under UST therapy and to identify potential predictive markers.
Methods: Patients with CD receiving UST treatment in three hospitals and two outpatient centers were included and retrospectively analyzed. Rates of short- and long-term remission and response were analyzed with the help of clinical (Harvey–Bradshaw Index (HBI)) and biochemical (C-reactive protein (CRP), fecal calprotectin (fCal)) parameters of disease activity.
Results: Data from 180 patients were evaluated. One hundred and six patients had a follow-up of at least eight weeks and were included. 96.2% of the patients were pre-exposed to anti-TNFα agents and 34.4% to both anti-TNFα and anti-integrin antibodies. The median follow-up was 49.1 weeks (95% CI 42.03–56.25). At week 8, 51 patients (54.8%) showed response to UST, and 24 (24.7%) were in remission. At week 48, 48 (51.6%) responded to UST, and 25 patients (26.9%) were in remission. Steroid-free response and remission at week 8 were achieved by 30.1% and 19.3% of patients, respectively. At week 48, 37.6% showed steroid-free response to UST, and 20.4% of the initial patient population was in steroid-free remission.
Conclusion: Our study confirms short- and long-term UST effectiveness and tolerability in a cohort of multi-treatment-exposed patients.
Background and Aims: Vitamin D has an inhibitory role in the inflammatory signaling pathways and supports the integrity of the intestinal barrier. Due to its immunomodulatory effect, vitamin D plays a role in chronic inflammatory bowel disease (IBD), and a deficiency is associated with an increased risk of a flare. We aimed to investigate to what extent the 25-hydroxyvitamin D (25(OH)D3) level correlates with disease activity and whether a cut-off value can be defined that discriminates between active disease and remission. Methods: Patients with IBD treated at the University Hospital Frankfurt were analyzed retrospectively. The 25(OH)D3 levels were correlated with clinical activity indices and laboratory activity parameters. A deficiency was defined as a 25(OH)D3 level <30 ng/mL. Results: A total of 470 patients (257 female) with IBD were included, 272 (57.9%) with Crohn's disease (CD) and 198 (42.1%) with ulcerative colitis (UC). The median age of the patients was 41 years (range 18–84). In 283 patients (60.2%), a vitamin D deficiency was detected. 245 (53.6%) patients received oral vitamin D supplementation, and supplemented patients had significantly higher vitamin D levels (p < 0.0001). In regression analysis, remission, vitamin D substitution, and male gender were independently associated with the 25(OH)D3 serum concentration in our cohort. A 25(OH)D3 serum concentration of 27.5 ng/mL was the optimal cut-off value. Conclusion: Vitamin D deficiency is common in IBD patients and appears to be associated with increased disease activity. In our study, vitamin D levels were inversely associated with disease activity. Thus, close monitoring should be established, and optimized supplementation should take place.
To date, there is insufficient insight into inflammatory bowel disease (IBD)-associated stress, recognized disability, and contact with the social care system. We aimed to assess these parameters in IBD patients and a non-IBD control group, who were invited to participate in an online survey developed specifically for this study (www.soscisurvey.de) with the help of IBD patients. 505 IBD patients and 166 volunteers (i.e., control group) participated in the survey. IBD patients reported significantly increased levels of stress within the last six months and five years (p<0.0001) and were more likely to have a recognized disability (p<0.0001). A low academic status was the strongest indicator of a disability (p = 0.006). Only 153 IBD patients (30.3%) reported contact with the social care system, and a disability was the strongest indicator for this (p<0.0001). Our study provides data on stress and disability in a large unselected German IBD cohort. We showed that patients with IBD suffer more often from emotional stress and more often have a recognized disability. As only about 1/3 of the patients had come into contact with the social care system and the corresponding support, this patient group is undersupplied in this area.
Background: Essential tremor (ET) is a progressive neurological disorder characterized by postural and kinetic tremor, most commonly affecting the hands and arms. Medically intractable ET can be treated by deep brain stimulation (DBS) of the ventral intermediate nucleus of the thalamus (VIM). We investigated whether the location of the effective contact (most tremor suppression with the least side effects) in VIM-DBS for ET changes over time, indicating a distinct mechanism of loss of efficacy that goes beyond progression of tremor severity or a mere reduction of DBS efficacy.
Methods: We performed programming sessions in 10 patients who underwent bilateral VIM-DBS surgery between 2009 and 2017 at our department. In addition to the intraoperative (T1) and first clinical programming session (T2), a third programming session (T3) was performed to assess the effect and side-effect thresholds (minimum voltage at which tremor suppression or side effects occurred). Additionally, we compared the choice of the effective contact between T1 and T2, which might be affected by a surgically induced "brain shift."
Discussion: Over a time span of about 4 years, VIM-DBS in ET showed continuous efficacy in tremor suppression during stim-ON compared to stim-OFF. Compared to immediate postoperative programming sessions in ET patients with DBS, long-term evaluation showed no relevant change in the choice of contact with respect to side effects and efficacy. In the majority of cases, the active contact at T2 did not correspond to the most effective intraoperative stimulation site T1, which might be explained by a brain shift due to cerebrospinal fluid loss after the neurosurgical procedure.
Background: Ribavirin (RBV) remains part of several interferon-free treatment strategies even though its mechanisms of action are still not fully understood. One hypothesis is that RBV increases responsiveness to type I interferons. Pegylated Interferon alpha (PEG-IFNa) has recently been shown to alter natural killer (NK) cell function possibly contributing to control of hepatitis C virus (HCV) infection. However, the effects of ribavirin alone or in combination with IFNa on NK cells are unknown.
Methods: Extensive ex vivo phenotyping and functional analysis of NK cells from hepatitis C patients was performed during antiviral therapy. Patients were treated for 6 weeks with RBV monotherapy (n = 11), placebo (n = 13) or PEG-IFNa-2a alone (n = 6) followed by PEG-IFNa/RBV combination therapy. The effects of RBV and PEG-IFNa-2a on NK cells were also studied in vitro after co-culture with K562 or Huh7.5 cells.
Results: Ribavirin monotherapy had no obvious effects on NK cell phenotype or function, neither ex vivo in patients nor in vitro. In contrast, PEG-IFNa-2a therapy was associated with an increase of CD56bright cells and distinct changes in expression profiles leading to an activated NK cell phenotype, increased functionality and decline of terminally differentiated NK cells. Ribavirin combination therapy reduced some of the IFN effects. An activated NK cell phenotype during therapy was inversely correlated with HCV viral load.
Conclusions: PEG-IFNa activates NK cells possibly contributing to virological responses independently of RBV. The role of NK cells during future IFN-free combination therapies including RBV remains to be determined.
Background and aims: Spontaneous bacterial peritonitis (SBP) is a severe complication of decompensated cirrhosis. The prevalence of multidrug-resistant organisms (MDROs) in patients with cirrhosis is increasing. Identification of patients at risk for SBP due to MDROs (ie, SBP with the evidence of MDROs or Stenotrophomonas maltophilia in ascitic culture, MDRO-SBP) is crucial to the early adaptation of antibiotic treatment in such patients. We therefore investigated whether MDROs found in ascitic cultures can also be found in specimens determined by noninvasive screening procedures.
Patients and methods: This retrospective study was conducted at the liver center of the University Hospital Frankfurt, Germany. Between 2011 and 2016, patients with cirrhosis were included upon diagnosis of SBP and sample collection of aerobic/anaerobic ascitic cultures. Furthermore, the performance of at least one complete MDRO screening was mandatory for study inclusion.
Results: Of 133 patients diagnosed with SBP, 75 (56.4%) had culture-positive SBP and 22 (16.5%) had MDRO-SBP. Multidrug-resistant Escherichia coli (10/22; 45.5%) and vancomycin-resistant enterococci (7/22; 36.4%) were the major causative organisms of MDRO-SBP. Rectal swabs identified MDROs in 17 of 22 patients (77.3%) who developed MDRO-SBP, with a time-dependent sensitivity of 77% and 87% at 30 and 90 days after testing, while the negative predictive value was 83% and 76%, respectively. The majority of patients were included from the intensive care unit or intermediate care unit.
Conclusion: MDRO screening may serve as a noninvasive diagnostic tool to identify patients at risk for MDRO-SBP. Patients with decompensated cirrhosis should be screened for MDROs from the first day of inpatient treatment onward.
Background: Due to the coronavirus disease 2019 (COVID-19) pandemic, interventions in the upper airways are considered high-risk procedures for otolaryngologists and their colleagues. The purpose of this study was to evaluate limitations in hearing and communication when using a powered air-purifying respirator (PAPR) system to protect against severe acute respiratory syndrome coronavirus type 2 (SARS-CoV-2) transmission and to assess the benefit of a headset. Methods: Acoustic properties of the PAPR system were measured using a head and torso simulator. Audiological tests (tone audiometry, Freiburg speech test, Oldenburg sentence test (OLSA)) were performed in normal-hearing subjects (n = 10) to assess hearing with PAPR. The audiological test setup also included simulation of conditions in which the target speaker used either a PAPR, a filtering face piece (FFP) 3 respirator, or a surgical face mask. Results: Audiological measurements revealed that sound insulation by the PAPR headtop and noise, generated by the blower-assisted respiratory protection system, resulted in significantly deteriorated hearing thresholds (4.0 ± 7.2 dB hearing level (HL) vs. 49.2 ± 11.0
Background and objectives: The severe acute respiratory syndrome coronavirus type 2 (SARS-CoV-2) pandemic has fundamentally changed the education of medical students. The need for contact restrictions and the resulting demand for distance learning meant that digital teaching formats had to be implemented within a short time. The aim of this work was to evaluate student evaluation results for virtual teaching in otorhinolaryngology during the SARS-CoV-2 pandemic and to compare them with the evaluation results previously obtained under classroom conditions.
Materials and methods: We examined the evaluation results for the block practical courses in the winter semester 2020/21 and the summer semester 2021, which were conducted in a virtual format with a short in-person phase, as well as those of the practical courses conducted entirely in the conventional classroom format from the summer semester 2018 to the winter semester 2019/20. The anonymous student survey covered various aspects of the course, such as organization, didactics and learning atmosphere.
Results: Of 16 surveyed categories, 14 (87.5%) showed significantly better evaluation results for the virtual practical courses compared with the courses previously conducted in the classroom format. This very positive assessment of the digital teaching offering showed no significant change over the two semesters of the pandemic.
Conclusion: The present data demonstrate the high acceptance of digital teaching in otorhinolaryngology among students. Even though indispensable components of medical training, such as bedside teaching and the acquisition of practical clinical skills, can still only be realized in an in-person format, the results suggest that digital elements could continue to play a role in medical education after the SARS-CoV-2 pandemic.
Background and Aim: The main disadvantage of plastic stents is the high rate of stent occlusion. The usual replacement interval for biliary plastic stents is 3 months. This study aimed to investigate whether a shorter interval of 6–8 weeks affects the median premature exchange rate (mPER) in benign and malignant biliary strictures.
Methods: All cases with endoscopic retrograde cholangiopancreatography (ERCP) and plastic stent placement since the establishment of an elective replacement interval of 6–8 weeks at our institution were retrospectively analyzed, and the mPER was determined.
Results: A total of 3979 ERCPs (1199 patients) were analyzed, including 1262 (31.7%) malignant and 2717 (68.3%) benign cases. The median stent patency (mSP) was 41 days (range 14–120) for scheduled stent exchanges, whereas it was 17 days (1–75) for prematurely exchanged stents. The mPER was significantly higher for malignant (28.1%, 35–50%) than for benign strictures (15.2%, 10–28%; P < 0.0001). The mSP was significantly shorter in cases with only one stent (34 days [1–87] vs 41 days [1–120]) and in cases with only a 7-Fr stent (28 days [2–79]) compared with a larger stent (34 days [1–87], P = 0.001). Correspondingly, the mPER was significantly higher in cases with only one stent (23% vs 16.2%, P < 0.0001) and with only a 7-Fr stent (31.3% vs 22.4%, P = 0.03).
Conclusion: A shorter replacement interval does not seem to lead to a clinically meaningful reduction of the mPER in benign and malignant strictures. Large stents and multiple stenting should be favored where possible.
Patients with neuroendocrine tumors (NET) often go through a long phase between onset of symptoms and initial diagnosis. We assessed time to diagnosis and the pre-clinical pathway in patients with gastroenteropancreatic NET (GEP-NET) with regard to metastases and symptoms. We retrospectively analyzed patients with GEP-NET at a tertiary referral center from 1984 to 2019; inclusion criteria were age ≥18 years and a diagnosis of GEP-NET; statistical analysis used non-parametric methods. Four hundred eighty-six patients with 488 tumors were identified; the median age at first diagnosis (478/486, 8 unknown) was 59 years; 52.9% were male. Pancreatic NET accounted for 143/488 tumors (29.3%) and small intestinal NET for 145/488 tumors (29.7%). 128/303 patients (42.2%) showed NET-specific symptoms and 122/486 patients (25%) other tumor-specific symptoms. 222/279 patients had distant metastases at initial diagnosis (187/222 liver metastases). 154/488 (31.6%) of GEP-NET were incidental findings. The median time from tumor manifestation (e.g., symptoms related to NET) to initial diagnosis across all entities was 19.5 (95% CI: 12–28) days. There was no significant difference between patients with and without distant metastases (median 73 vs 105 days, P = .42). A large proportion of GEP-NET are incidental findings, and only about half of all patients are symptomatic at the time of diagnosis. We did not find a significant influence of the presence of metastases on time to diagnosis, which shows a large variability with a median of <30 days.
Background/aims: Hepatocellular carcinoma (HCC) is a leading indication for liver transplantation (LT) worldwide. Early identification of patients at risk for HCC recurrence is of paramount importance since early treatment of recurrent HCC after LT may be associated with increased survival. We evaluated incidence of and predictors for HCC recurrence, with a focus on the course of AFP levels.
Methods: We performed a retrospective, single-center study of 99 HCC patients who underwent LT between January 28, 1997 and May 11, 2016. A three-stage proportional hazards model was used to evaluate potential predictive markers, both by univariate and multivariable analysis, for influences on 1) recurrence after transplantation, 2) mortality without HCC recurrence, and 3) mortality after recurrence.
Results: 19/99 HCC patients showed recurrence after LT. Waiting time was not associated with overall HCC recurrence (HR = 1, p = 0.979). Similarly, waiting time did not affect mortality in LT recipients both with (HR = 0.97, p = 0.282) or without (HR = 0.99, p = 0.685) HCC recurrence. Log10-transformed AFP values at the time of LT (HR 1.75, p = 0.023) as well as after LT (HR 2.07, p = 0.037) were significantly associated with recurrence. Median survival in patients with a ratio (AFP at recurrence divided by AFP 3 months before recurrence) of 0.5 was greater than 70 months, as compared to a median of only 8 months in patients with a ratio of 5.
Conclusion: A rise in AFP levels rather than an absolute threshold could help to identify patients at short-term risk for HCC recurrence post LT, which may allow intensification of the surveillance strategy on an individualized basis.
Penile squamous cell carcinomas are rare tumor entities throughout Europe. Early lymph node spread calls for aggressive therapeutic approaches in advanced tumor stages. Therefore, understanding tumor biology and its microenvironment, and their correlation with known survival data, is of substantial interest in order to establish treatment strategies adapted to the individual patient. Fifty-five therapy-naïve squamous cell carcinomas from patients aged 41 to 85 years with known clinicopathological data were investigated using tissue microarrays (TMA) with regard to tumor-associated immune cell infiltrate density (ICID). Slides were stained with antibodies against CD3, CD8 and CD20. Image analysis software was applied for evaluation. Data were correlated with clinicopathological characteristics and overall survival. There was a significant increase of ICID in squamous cell carcinomas of the penis compared with adjacent physiological tissue. Higher CD3-positive ICID was significantly associated with lower tumor stage in our cohort. The ICID was not associated with overall survival. Our data sharpen the view on the tumor-associated immune cell infiltrate in penile squamous cell carcinomas with an unbiased, digital and automated cell count. Further investigations of the immune cell infiltrate and its prognostic and possible therapeutic impact are needed.
Objectives: Four-dimensional ultrasound (4D-US) enables imaging of the aortic segment and simultaneous determination of wall expansion. The method has a high spatial and temporal resolution, but its in vivo reliability for low measurement values has so far been unknown. The present study determines the intraobserver repeatability and interobserver reproducibility of 4D-US in the atherosclerotic and non-atherosclerotic infrarenal aorta. Methods: In all, 22 patients with a non-aneurysmal aorta were examined by an experienced examiner and a medical student. After registration of the 4D images, both examiners marked the aortic wall manually before the commercially implemented speckle-tracking algorithm was applied. The cyclic changes of the aortic diameter and the circumferential strain were determined with the help of custom-made software. The reliability of 4D-US was tested by the intraclass correlation coefficient (ICC). Results: The 4D-US measurements showed very good reliability for the maximum aortic diameter and the circumferential strain for all patients and for the non-atherosclerotic aortae (ICC > 0.7), but low reliability for circumferential strain in calcified aortae (ICC = 0.29). The observer- and masking-related variances for both maximum diameter and circumferential strain were close to zero. Conclusions: Despite the low measured values, the high spatial and temporal resolution of 4D-US enables a reliable evaluation of cyclic diameter changes and circumferential strain in non-aneurysmal aortae independently of observer experience, but with some limitations for calcified aortae. 4D-US opens up a new perspective with regard to noninvasive, in vivo assessment of the kinematic properties of the vessel wall in the abdominal aorta.
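The reliability analysis described above rests on the intraclass correlation coefficient. The following minimal Python sketch shows how an interobserver ICC of this kind is typically computed from long-format ratings; the data and column names are invented for illustration and are not from the study.

import pandas as pd
import pingouin as pg

# One row per (patient, observer) with the measured maximum aortic diameter.
ratings = pd.DataFrame({
    "patient":     [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
    "observer":    ["A", "B"] * 5,
    "diameter_mm": [18.2, 18.5, 20.1, 19.8, 17.4,
                    17.6, 21.0, 21.3, 19.2, 19.0],
})

icc = pg.intraclass_corr(data=ratings, targets="patient",
                         raters="observer", ratings="diameter_mm")
# ICC > 0.7 would be read as good reliability, the criterion used above.
print(icc[["Type", "ICC", "CI95%"]])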
In the application of range of motion (ROM) tests, there is little agreement on the number of repetitions to be measured and on preceding warm-up protocols. In stretch training, a plateau in ROM gains can be seen after four to five repetitions. With an increasing number of repetitions, the gain in ROM is reduced. This study examines whether such an effect occurs in common ROM tests. Twenty-two healthy sport students (10 m/12 f) with an average age of 25.3 ± 1.94 years (average height 174.1 ± 9.8 cm, weight 66.6 ± 11.3 kg and BMI 21.9 ± 2.0 kg/m2) volunteered for this study. Each subject performed five ROM tests in a randomized order, measured either via a tape measure or a digital inclinometer: the tape measure was used to evaluate the Fingertip-to-Floor test (FtF) and the Lateral Inclination test (LI); retroflexion of the trunk modified after Janda (RF), the Thomas test (TT) and a shoulder test modified after Janda (ST) were evaluated with a digital inclinometer. In order to show general acute effects within 20 repetitions, we performed ANOVA/Friedman tests with multiple comparisons. A non-linear regression was then performed to identify a plateau formation. The significance level was set at 5%. In seven out of eight ROM tests (five tests in total, with three tests measured on both the left and right sides), significant flexibility gains were observed (FtF: p < 0.001; LI-left/right: p < 0.001/0.001; RF: p = 0.009; ST-left/right: p < 0.001/p = 0.003; TT-left: p < 0.001). A non-linear regression with random effects was successfully applied to FtF, RF, LI-left/right, ST-left and TT-left and thus indicates a gradual decline in the amount of gained ROM. An acute effect was observed in most ROM tests, characterized by a gradual decline of ROM gain. For those tests, we can state that the acute effect described in the stretching literature also applies to the performance of typical ROM tests. Since a non-linear behavior was shown, it is the practitioner's decision to weigh measurement accuracy against expenditure. Researchers and practitioners should consider this when applying ROM assessments to healthy young adults.
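The plateau behaviour described above is commonly modelled with an exponential-saturation curve, ROM(n) = a + b(1 − exp(−c·n)), so that the gain per repetition shrinks toward a plateau of a + b. The Python sketch below fits such a model to synthetic data; the parameter values and names are illustrative only and do not reproduce the study's random-effects regression.

import numpy as np
from scipy.optimize import curve_fit

def plateau(n, a, b, c):
    # a: initial ROM, b: total gain, c: rate of approach to the plateau
    return a + b * (1.0 - np.exp(-c * n))

reps = np.arange(1, 21)                    # 20 repetitions, as in the study
rom = 40 + 5 * (1 - np.exp(-0.3 * reps))   # synthetic example measurements
rom += np.random.default_rng(0).normal(0, 0.4, reps.size)

(a, b, c), _ = curve_fit(plateau, reps, rom, p0=[40.0, 5.0, 0.3])
# ~95% of the total gain is reached once c*n = ln(20).
print(f"plateau ~ {a + b:.1f}, ~95% reached after {np.log(20) / c:.1f} reps")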
In the context of workplace health promotion, physical activity programs have been shown to reduce musculoskeletal diseases and stress and to improve quality of life. The aim of this study was to examine the effects of using the "five-Business" stretch training device on the quality of life of office workers. A total of 313 office workers (173 m/137 f) participated voluntarily in this intervention–control study, with an average age of 43.37 ± 11.24 (SD) years, height of 175.37 ± 9.35 cm and weight of 75.76 ± 15.23 kg, with an average BMI of 24.5 ± 3.81 kg/m2. The participants completed the stretch training twice a week for approximately 10 minutes over a duration of 12 weeks. The SF-36 questionnaire was used to evaluate the effectiveness of the intervention at baseline and after 12 weeks. Significantly improved outcomes in the mental sum score (p = 0.008), physical functioning (p < 0.001), bodily pain (p = 0.01), vitality (p = 0.025), role limitations due to physical problems (p = 0.018) and mental health (p = 0.012) were shown after the stretching training. The results suggest that a 12-week stretching program for office desk workers is suitable for significantly improving their health-related quality of life.
Background: Intestinal perforation or leakage increases the morbidity and mortality of surgical and endoscopic interventions. We identified criteria for the use of fully covered, extractable self-expanding metal stents (cSEMS) vs. over-the-scope clips (OTSC) for leak closure.
Methods: Patients who underwent endoscopic treatment for postoperative leakage, endoscopic perforation, or spontaneous rupture of the upper gastrointestinal tract between 2006 and 2013 were identified at four tertiary endoscopic centers. Technical success, outcome (e.g. duration of hospitalization, in-hospital mortality), and complications were assessed and analyzed with respect to etiology, size and location of leakage.
Results: Of 106 patients (male: 75 (71%), female: 31 (29%); age (mean ± SD): 62.5 ± 1.3 years), 72 (69%) were treated by cSEMS and 34 (31%) by OTSC. For cSEMS vs. OTSC, mean treatment duration was 41.1 vs. 25 days (p < 0.001), leakage size was 10 (1–50) vs. 5 (1–30) mm (median (range)), and complications were observed in 68% vs. 8.8% (p < 0.001), respectively. Clinical success for primary interventional treatment was observed in 29/72 (40%) vs. 24/34 (70%, p = 0.006), and clinical success at the end of follow-up was 46/72 (64%) vs. 29/34 (85%) for patients treated by cSEMS vs. OTSC (p = 0.04).
Conclusion: OTSC is preferred in small-sized lesions and in perforation caused by endoscopic interventions, cSEMS in patients with concomitant local infection or abscess. cSEMS is associated with a higher frequency of complications. Therefore, OTSC might be preferred if technically feasible. Indication criteria for cSEMS vs. OTSC vary and might impede design of randomized studies.
Background: Hypoxia is a key driver for infiltrative growth in experimental gliomas. It has remained elusive whether tumor hypoxia in glioblastoma patients contributes to distant or diffuse recurrences. We therefore investigated the influence of perioperative cerebral ischemia on patterns of progression in glioblastoma patients.
Methods: We retrospectively screened MRI scans of 245 patients with newly diagnosed glioblastoma undergoing resection for perioperative ischemia near the resection cavity. Forty-six patients showed relevant ischemia near the resection cavity. A control cohort without perioperative ischemia was generated by 1:1 matching using an algorithm based on gender, age and adjuvant treatment. Both cohorts were analyzed for patterns of progression by a blinded neuroradiologist.
Results: The percentage of diffuse or distant recurrences at first relapse was significantly higher in the cohort with perioperative ischemia (61.1%) compared to the control cohort (19.4%). The results of the control cohort matched well with historical data. The change in patterns of progression was not associated with a difference in survival.
Conclusions: This study reveals an unrecognized association of perioperative cerebral ischemia with distant or diffuse recurrence in glioblastoma. It is the first clinical study supporting the concept that hypoxia is a key driver of infiltrative tumor growth in glioblastoma patients.
Introduction Inaccurate or delayed diagnoses are a significant concern in patient care, particularly in emergency medicine, where decision making is often constrained by high throughput and inaccurate admission diagnoses. Artificial intelligence-based diagnostic decision support systems have been developed to enhance clinical performance by suggesting differential diagnoses for a given case, based on an integrated medical knowledge base and machine learning techniques. The purpose of the study is to evaluate the diagnostic accuracy of Ada, an app-based diagnostic tool, and its impact on patient outcome.
Methods and analysis The eRadaR trial is a prospective, double-blinded study of patients presenting to the emergency room (ER) with abdominal pain. At initial contact in the ER, a structured interview will be performed using the Ada-App, and both patients and attending physicians will be blinded to the proposed diagnosis lists until trial completion. Throughout the study, clinical data relating to diagnostic findings and types of therapy will be obtained, and the follow-up until day 90 will comprise occurrence of complications and overall survival of patients. The primary efficacy endpoint of the trial is the percentage of correct diagnoses suggested by Ada compared with the final discharge diagnosis. Further, accuracy and timing of diagnosis will be compared with the decision making of classical doctor–patient interaction. Secondary objectives are complications, length of hospital stay and overall survival.
Ethics and dissemination Ethical approval was obtained from the independent ethics committee (IEC) of the Goethe-University Frankfurt on 9 April 2020, including for the patient information material and informed consent form. All protocol amendments must be reported to and approved by the IEC. The results from this study will be submitted to peer-reviewed journals and reported at suitable national and international meetings.
Trial registration number DRKS00019098.
Background: To study neoadjuvant chemoradiotherapy (nCRT) and potential predictive factors for response in locally advanced oral cavity cancer (LA-OCC).
Methods: The INVERT trial is an ongoing single-center, prospective phase 2, proof-of-principle trial. Operable patients with stage III–IVA squamous cell carcinomas of the oral cavity were eligible and received nCRT consisting of 60 Gy with concomitant cisplatin and 5-fluorouracil. Surgery was scheduled 6–8 weeks after completion of nCRT. Explorative multiplex immunohistochemistry (IHC) was performed on pretreatment tumor specimens, and diffusion-weighted magnetic resonance imaging (DW-MRI) was conducted prior to nCRT, during nCRT (day 15), and before surgery to identify potential predictive biomarkers and imaging features. The primary endpoint was the pathological complete response (pCR) rate.
Results: Seventeen patients with stage IVA OCC were included in this interim analysis. All patients completed nCRT. One patient died from pneumonia 10 weeks after nCRT before surgery. Complete tumor resection (R0) was achieved in 16/17 patients, of whom 7 (41%, 95% CI: 18-67%) showed pCR. According to the Clavien-Dindo classification, grade 3a and 3b complications were found in 4 (25%) and 5 (31%) patients, respectively; grade 4-5 complications did not occur. Increased changes in the apparent diffusion coefficient signal intensities between MRI at day 15 of nCRT and before surgery were associated with better response (p=0.022). Higher abundances of programmed cell death protein 1 (PD1) positive cytotoxic T-cells (p=0.012), PD1+ macrophages (p=0.046), and cancer-associated fibroblasts (CAFs, p=0.036) were associated with incomplete response to nCRT.
Conclusion: nCRT for LA-OCC followed by radical surgery is feasible and shows high response rates. Larger patient cohorts from randomized trials are needed to further investigate nCRT and predictive biomarkers such as changes in DW-MRI signal intensities, tumor infiltrating immune cells, and CAFs.
Background and aims: Patients with gastric cancer often show signs of malnutrition. We sought to evaluate the influence of sarcopenia on morbidity and mortality in patients with locally advanced, non-metastatic gastric or gastro-esophageal junction (GEJ) cancer undergoing curative treatment (perioperative chemotherapy and surgery), in order to identify patients in need of nutritional intervention.
Patients and methods: This was a two-centre study conducted at the Frankfurt University Clinic and Krankenhaus Nordwest (Frankfurt) as part of the University Cancer Center Frankfurt (UCT). 47/83 patients were treated in the FLOT trial (NCT01216644). Patients' charts were reviewed for clinical data. Two consecutive CT scans were retrospectively analyzed to determine the degree of sarcopenia. Survival was calculated using the Kaplan-Meier method, and multivariate analysis was performed using Cox regression.
Results: Sixty patients (72.3%) were male and 23 (27.7%) female. 45 patients (54.2%) had GEJ type 1–3 and 38 (45.8%) gastric tumors, respectively. Sarcopenic patients were significantly older than non-sarcopenic patients (mean age 65.1 years vs. 59.5 years, p = 0.042), terminated chemotherapy prematurely significantly more often (50% vs. 22.6%, p = 0.037) and showed higher Clavien-Dindo scores, indicating more severe perioperative complications (score ≥3: 43.3% vs. 17.0%, p = 0.019). Sarcopenic patients had significantly shorter survival than non-sarcopenic patients (139.6 ± 19.5 [95% CI, 101.3–177.9] vs. 206.7 ± 13.8 [95% CI, 179.5–233.8] weeks, p = 0.004). Multivariate Cox regression analysis showed that, besides UICC stage, sarcopenia significantly influenced survival.
Conclusion: Sarcopenia is present in a large proportion of patients with locally advanced gastric or GEJ cancer and significantly influences tolerability of chemotherapy, surgical complications and survival.
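The survival comparison reported above (Kaplan-Meier estimates with a group contrast) can be reproduced in outline with the lifelines package. The sketch below is illustrative, not the authors' analysis; the file name and columns ("weeks", "died", "sarcopenia") are assumptions.

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("gastric_cohort.csv")   # hypothetical data file
sarc, nosarc = df[df["sarcopenia"] == 1], df[df["sarcopenia"] == 0]

kmf = KaplanMeierFitter()
for label, grp in [("sarcopenic", sarc), ("non-sarcopenic", nosarc)]:
    kmf.fit(grp["weeks"], event_observed=grp["died"], label=label)
    print(label, "median survival (weeks):", kmf.median_survival_time_)

# Log-rank test for the difference between the two survival curves.
res = logrank_test(sarc["weeks"], nosarc["weeks"],
                   event_observed_A=sarc["died"],
                   event_observed_B=nosarc["died"])
print("log-rank p =", res.p_value)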
Objectives: Stenosis of the biliary anastomosis predisposes liver graft recipients to bacterial cholangitis. Antibiotic therapy (AT) is administered according to individual clinical judgment, but the optimal duration of AT remains unclear.
Methods: All liver graft recipients with acute cholangitis according to the Tokyo criteria grade 1 and 2 after endoscopic retrograde cholangiography (ERC) were included. Outcome of patients treated with short AT (<7 days) was compared to long AT (>6 days). Recurrent cholangitis (RC) within 28 days was the primary end point.
Results: In total, 30 patients were included with a median of 313 (range 34–9849) days after liver transplantation until first proven cholangitis. Among 62 cases in total, 51/62 (82%) were graded as Tokyo-1 and 11/62 (18%) as Tokyo-2. Overall median duration of AT was 6 days (1–14) with 36 cases (58%) receiving short AT and 26 (42%) receiving long AT. RC was observed in 10 (16%) cases, without significant difference in occurrence of RC in short versus long AT cases. CRP and bilirubin were significantly higher in patients with long AT, while low serum albumin and low platelets were associated with risk of RC.
Conclusion: An antibiotic course shorter than 7 days shows good results in selected ERC-treated patients with post-transplantation biliary strictures.
Introduction: Scarce data exist for therapy regimens other than somatostatin analogues (SSA) and peptide receptor radiotherapy (PRRT) for small intestinal NET (siNET). We analyzed real-world data for differences in survival according to therapy. Patients and methods: Analysis of 145 patients diagnosed between 1993 and 2018 at a single institution, divided into treatment groups. Group (gr.) 0: no treatment (n = 10); gr. 1: TACE and/or PRRT (n = 26); gr. 2: SSA (n = 32); gr. 3: SSA/PRRT (n = 8); gr. 4: chemotherapy (n = 8); gr. 5: not metastasized (at diagnosis), surgery only (n = 53); gr. 6: metastasized (at diagnosis), surgery only (n = 10). Results: 45.5% were female; the median age was 60 years (range 27–84). A total of 125/145 patients underwent resection of the primary tumor. For all patients, 1-year OS (%) was 93.8 (95% CI: 90–98), 3-year OS 84.3 (CI: 78–90) and 5-year OS 77.5 (CI: 70–85). For the analysis of survival according to therapy, only stage IV patients (at baseline) who received treatment were included. Compared with reference gr. 2 (SSA only), the HR for OS was 1.49 (p = 0.47) for gr. 1, 0.72 (p = 0.69) for gr. 3 and 2.34 (p = 0.19) for gr. 4. The 5-year OS rate of patients whose primary tumor was resected (n = 125) was 73.1%, versus 33.3% without primary tumor resection (PTR) (HR: 4.31; p = 0.003). Individual patients are represented in swimmer plots. Conclusions: For the stage IV patients in this analysis (limited by low patient numbers in gr. 3/4), multimodal treatment did not significantly improve survival over SSA treatment alone. Resection of the primary tumor significantly improves survival.
Triple therapy of chronic hepatitis C virus (HCV) infection with boceprevir (BOC) or telaprevir (TVR) leads to virologic failure in many patients, which is often associated with the selection of resistance-associated variants (RAVs). These resistance profiles are of importance for the selection of potential rescue treatment options. In this study, we performed population-based sequencing of baseline NS3 RAVs and investigated the sensitivity of NS3 phenotypes in an HCV replicon assay, together with clinical factors, for a prediction of treatment response in a cohort of 165 German and Swiss patients treated with BOC- or TVR-based triple therapy. Overall, the prevalence of baseline RAVs was low, although the frequency of RAVs was higher in patients with virologic failure compared with those who achieved a sustained virologic response (SVR) (7% versus 1%, P = 0.06). The occurrence of RAVs was associated with a resistant NS3 quasispecies phenotype (P < 0.001), but the sensitivity of phenotypes was not associated with treatment outcome (P = 0.2). The majority of single viral and host predictors of SVR were only weakly associated with treatment response. In multivariate analyses, low AST levels, female sex and an IFNL4 CC genotype were independently associated with SVR. However, a combined analysis of negative predictors revealed a significantly lower overall number of negative predictors in patients with SVR in comparison with individuals with virologic failure (P < 0.0001), and the presence of two or fewer negative predictors was indicative of SVR. These results demonstrate that most single baseline viral and host parameters have a weak influence on the response to triple therapy, whereas the overall number of negative predictors has a high predictive value for SVR.
Interleukin-22 predicts severity and death in advanced liver cirrhosis: a prospective cohort study
(2012)
Background: Interleukin-22 (IL-22), recently identified as a crucial parameter of pathology in experimental liver damage, may determine survival in clinical end-stage liver disease. Systematic analysis of serum IL-22 in relation to morbidity and mortality of patients with advanced liver cirrhosis has not been performed so far.
Methods: This is a prospective cohort study including 120 liver cirrhosis patients and 40 healthy donors to analyze systemic levels of IL-22 in relation to survival and hepatic complications.
Results: A total of 71% of patients displayed liver cirrhosis-related complications at study inclusion. A total of 23% of the patients died during a mean follow-up of 196 ± 165 days. Systemic IL-22 was detectable in 74% of patients but only in 10% of healthy donors (P < 0.001). Elevated levels of IL-22 were associated with ascites (P = 0.006), hepatorenal syndrome (P < 0.0001), and spontaneous bacterial peritonitis (P = 0.001). Patients with elevated IL-22 (>18 pg/ml, n = 57) showed significantly reduced survival compared to patients with regular (≤18 pg/ml) levels of IL-22 (321 days versus 526 days, P = 0.003). Other factors associated with overall survival were high CRP (≥2.9 mg/dl, P = 0.005, hazard ratio (HR) 0.314, confidence interval (CI) 0.141 to 0.702), elevated serum creatinine (P = 0.05, HR 0.453, CI 0.203 to 1.012), presence of liver-related complications (P = 0.028, HR 0.258, CI 0.077 to 0.862), model for end-stage liver disease (MELD) score ≥20 (P = 0.017, HR 0.364, CI 0.159 to 0.835) and age (P = 0.011, HR 1.047, CI 1.011 to 1.085). Adjusted multivariate Cox proportional-hazards analysis identified elevated systemic IL-22 levels as independent predictors of reduced survival (P = 0.007, HR 0.218, CI 0.072 to 0.662).
Conclusions: In patients with liver cirrhosis, elevated systemic IL-22 levels are predictive for reduced survival independently from age, liver-related complications, CRP, creatinine and the MELD score. Thus, processes that lead to a rise in systemic interleukin-22 may be relevant for prognosis of advanced liver cirrhosis.
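Several abstracts in this list (MDRO colonization, sarcopenia, pSWE, IL-22) report hazard ratios from Cox proportional-hazards regression. As a generic reminder of what these HRs estimate (the notation below is standard and not taken from any of the papers):

\[ h(t \mid x) = h_0(t)\, \exp(\beta_1 x_1 + \dots + \beta_p x_p), \qquad \mathrm{HR}_j = e^{\beta_j} \]

Here h_0(t) is an unspecified baseline hazard, so an HR such as 2.57 for MDRO colonization means a 2.57-fold instantaneous risk of the event at any time t, with all other covariates held equal.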