Objectives: Rising prevalence of multidrug-resistant organisms (MDRO) is a major health problem in patients with liver cirrhosis. The impact of MDRO colonization in liver transplantation (LT) candidates and recipients on mortality has not been determined in detail.
Methods: Patients consecutively evaluated and listed for LT in a tertiary German liver transplant center from 2008 to 2018 underwent screening for MDRO colonization, including methicillin-resistant Staphylococcus aureus (MRSA), multidrug-resistant gram-negative bacteria (MDRGN), and vancomycin-resistant enterococci (VRE). MDRO colonization and infection status were obtained at LT evaluation, at planned and unplanned hospitalizations, three months after graft allocation, or at last follow-up on the waiting list.
Results: In total, 351 patients were listed for LT, of whom 164 (47%) underwent LT after a median of 249 (range 0–1662) days. The incidence of MDRO colonization increased during the waiting time for LT, and MDRO colonization was associated with increased mortality on the waiting list (HR = 2.57, p < 0.0001). One patient was colonized with a carbapenem-resistant strain at listing, 9 patients acquired carbapenem-resistant gram-negative bacteria (CRGN) on the waiting list, and 4 more after LT. In total, 10 of these 14 patients died.
Conclusions: Colonization with MDRO is associated with increased mortality on the waiting list, but not in short-term follow-up after LT. Moreover, colonization with CRGN seems to be associated with high mortality in liver transplant candidates and recipients.
Fit to play: posture and seating position analysis with professional musicians - a study protocol
(2017)
Background: Musical performance-associated musculoskeletal disorders (MSD) are a common health problem among professional musicians. Considering the manifold consequences for the musicians, they can be seen as a threat to their professional activity. String players are the most affected group of musicians in this respect. Faults in upper body posture while playing the instrument, causing unergonomic static strain on the back and unergonomic limb movements, are a main cause of musculoskeletal disorders and pain syndromes.
Methods: A total of 66 professional musicians, divided into three groups, are measured.
The division is based on average duration of performance, intensity of daily practice, and professional experience. Video raster stereography, a three-dimensional analysis of body posture, is used to analyse the instrument-specific posture. Furthermore, the pressure distribution during seating is analysed. Measurements are performed while the musician is sitting on different music chairs that differ in the structure and/or construction of the seating surface. The measurements take place in the habitual seating position as well as while playing the instrument.
Results: To analyse the influence of the different chairs, ANOVA for repeated measurements or the Friedman test is used, depending on normality assumptions. The comparison of posture between amateur musicians, students, and professional orchestral musicians is carried out using the non-parametric Jonckheere-Terpstra test.
Conclusions: Our method aims to give musicians guidance on choosing the right music chair by analysing the different chair concepts, so that MSD can be preemptively reduced or prevented.
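As a concrete illustration of the statistical plan above, the following Python sketch chooses between repeated-measures ANOVA and the Friedman test based on a Shapiro-Wilk normality check. All data, group sizes, and variable names (e.g., trunk_angle and the three chair labels) are simulated placeholders, not the study's actual design or code.

```python
# Hypothetical sketch of the chair comparison described above: one posture
# measure recorded for the same musicians on three chairs, with the test
# chosen according to a normality check.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n = 22  # illustrative number of musicians
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 3),
    "chair": np.tile(["flat", "sloped", "saddle"], n),
    "trunk_angle": rng.normal(12, 3, 3 * n),  # simulated posture parameter
})

# Shapiro-Wilk per chair condition decides parametric vs. non-parametric.
normal = all(
    stats.shapiro(df.loc[df.chair == c, "trunk_angle"]).pvalue > 0.05
    for c in df.chair.unique()
)

if normal:
    # Parametric choice: ANOVA for repeated measurements
    print(AnovaRM(df, depvar="trunk_angle", subject="subject",
                  within=["chair"]).fit())
else:
    # Non-parametric choice: Friedman test
    wide = df.pivot(index="subject", columns="chair", values="trunk_angle")
    stat, p = stats.friedmanchisquare(*(wide[c] for c in wide.columns))
    print(f"Friedman chi2 = {stat:.2f}, p = {p:.3f}")
```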
Influence of antibiotic-regimens on intensive-care unit-mortality and liver-cirrhosis as risk factor
(2016)
AIM: To assess the rate of infection, the appropriateness of antimicrobial therapy, and mortality on the intensive care unit (ICU), with special focus on patients with liver cirrhosis.
METHODS: The study was approved by the local ethics committee. All patients admitted to the Internal Medicine ICU between April 1, 2007 and December 31, 2009 were included. Data on infection, microbiological laboratory reports, diagnosis, and therapy were extracted retrospectively from patient charts and electronic documentation. Because of the large hepatology department and liver transplantation center, special interest was placed on the subgroup of patients with liver cirrhosis. The primary statistical endpoint was the evaluation of the influence of appropriate versus inappropriate antimicrobial therapy on in-hospital mortality.
RESULTS: Charts of 1979 patients were available. The overall infection rate was 53%. Multiresistant bacteria were present in 23% of patients with infection and were associated with increased mortality (P < 0.000001). Patients with infection had significantly increased in-hospital mortality (34% vs 17%, P < 0.000001). Only 9% of patients with infection received inappropriate initial antimicrobial therapy; no influence on mortality was observed. Independent risk factors for in-hospital mortality were the presence of septic shock, prior chemotherapy for malignoma, and infection with Pseudomonas spp. Infection and mortality rates among the 175 patients with liver cirrhosis were significantly higher than in patients without liver cirrhosis. Infection increased mortality 2.24-fold in patients with cirrhosis. Patients with liver cirrhosis were also at increased risk of receiving inappropriate initial antimicrobial therapy.
CONCLUSION: The results of the present study document the successful implementation of early goal-directed therapy. Patients with liver cirrhosis are at increased risk of infection and mortality and are more likely to receive inappropriate therapy. Multiresistant bacteria are an increasing burden.
Chronic viral hepatitis is associated with substantial morbidity and mortality worldwide. The aim of our study was to assess the ability of point shear-wave elastography (pSWE) using acoustic radiation force impulse imaging to predict the following liver-related events (LREs): new diagnosis of hepatocellular carcinoma (HCC), liver transplantation, or liver-related death (hepatic decompensation was not included as an LRE). pSWE was performed at study inclusion and compared with liver histology, transient elastography (TE), and serologic biomarkers (aspartate aminotransferase to platelet ratio index, Fibrosis-4, FibroTest). The performance of pSWE and TE in predicting LREs was assessed by calculating the area under the receiver operating characteristic curve and by a Cox proportional-hazards regression model. A total of 254 patients with a median follow-up of 78 months were included in the study. LREs occurred in 28 patients (11%) during follow-up. In both patients with hepatitis B virus and patients with hepatitis C virus (HCV), pSWE showed significant correlations with noninvasive tests and TE, and median pSWE and TE values were significantly different between patients with LREs and patients without LREs (both P < 0.0001). In patients with HCV, the areas under the receiver operating characteristic curve for pSWE and TE to predict LREs were comparable: 0.859 (95% confidence interval [CI], 0.747-0.969) and 0.852 (95% CI, 0.737-0.967) (P = 0.93). In Cox regression analysis, pSWE independently predicted LREs in all patients with HCV (hazard ratio, 17.9; 95% CI, 5.21-61.17; P < 0.0001) and in those who later received direct-acting antiviral therapy (hazard ratio, 17.11; 95% CI, 3.88-75.55; P = 0.0002). Conclusion: Our study shows good comparability between pSWE and TE. pSWE is a promising tool for the prediction of LREs in patients with viral hepatitis, particularly those with chronic HCV. Further studies are needed to confirm our data and to assess its prognostic value in other liver diseases.
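The abstract above assesses a continuous stiffness measure both by the area under the ROC curve and in a Cox proportional-hazards model. The sketch below shows, on simulated data with hypothetical column names, how these two analyses are commonly set up in Python (scikit-learn for the AUROC, lifelines for the Cox model); it is not the study's code, and the formal AUROC comparison (e.g., a DeLong test) is omitted.

```python
# Minimal sketch: discrimination (AUROC) and time-to-event analysis (Cox)
# for a continuous liver-stiffness marker. All data are simulated.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 254
stiffness = rng.lognormal(mean=0.5, sigma=0.4, size=n)       # kPa-like values
event = rng.binomial(1, np.clip(stiffness / 10, 0.01, 0.5))  # LRE yes/no
time = rng.exponential(scale=78, size=n)                     # months

# Discrimination for the binary LRE outcome
print("AUROC:", round(roc_auc_score(event, stiffness), 3))

# Cox proportional-hazards model with stiffness as predictor
df = pd.DataFrame({"stiffness": stiffness, "time": time, "event": event})
CoxPHFitter().fit(df, duration_col="time", event_col="event").print_summary()
```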
Background: Patients with head and neck cancer (HNC) are at high risk for malnutrition because of tumour localisation and therapy. Prophylactic percutaneous endoscopic gastrostomy (PEG) tube placement is common practice to prevent malnutrition.
Objective: To investigate the benefits of prophylactic PEG tube placement for HNC patients in terms of its influence on patients’ nutritional status, the utilisation rate and complications, and to identify predictors of PEG tube utilisation.
Methods: All consecutive HNC patients who underwent prophylactic PEG tube insertion between 1 January 2011 and 31 December 2012 prior to therapy were enrolled. The PEG tube utilisation rate, complications, the patients’ nutritional status and tumour therapy were evaluated with the help of electronic patient charts and telephone interviews.
Results: A total of 181 patients (48 female; median age 67.5 years) were included. The PEG utilisation rate in the entire cohort was 91.7%. One hundred and forty-nine patients (82.3%) used the PEG tube for total enteral nutrition, 17 patients (9.4%) for supplemental nutrition, and 15 patients (8.3%) made no use of the PEG tube. Peristomal wound infections were the most common complication (40.3%) in this study. A high Nutritional Risk Screening (NRS) score prior to tube insertion was found to be independently associated with PEG utilisation. No significant weight changes were observed across the three patient subgroups.
Conclusions: The overall PEG tube utilisation rate was high in this study. However, given the high rate of infections, diligent patient selection is crucial in order to determine which patients benefit most from prophylactic PEG tube insertion.
Background and Aims: The IL-12/23 inhibitor ustekinumab (UST) opened up new treatment options for patients with Crohn’s disease (CD). Due to the recent approval, real-world German data on long-term efficacy and safety are lacking. This study aimed to assess the clinical course of CD patients under UST therapy and to identify potential predictive markers.
Methods: Patients with CD receiving UST treatment in three hospitals and two outpatient centers were included and retrospectively analyzed. Rates of short- and long-term remission and response were analyzed with the help of clinical (Harvey–Bradshaw Index (HBI)) and biochemical (C-reactive protein (CRP), fecal calprotectin (fCal)) parameters of disease activity.
Results: Data from 180 patients were evaluated; 106 patients had a follow-up of at least eight weeks and were included. 96.2% of the patients were pre-exposed to anti-TNFα agents and 34.4% to both anti-TNFα and anti-integrin antibodies. The median follow-up was 49.1 weeks (95% CI 42.03–56.25). At week 8, 51 patients (54.8%) showed response to UST, and 24 (24.7%) were in remission. At week 48, 48 patients (51.6%) responded to UST, and 25 patients (26.9%) were in remission. Steroid-free response and remission at week eight were achieved by 30.1% and 19.3% of patients, respectively. At week 48, 37.6% showed steroid-free response to UST, and 20.4% of the initial patient population was in steroid-free remission.
Conclusion: Our study confirms short- and long-term UST effectiveness and tolerability in a cohort of multi-treatment-exposed patients.
Background and Aims: Vitamin D has an inhibitory role in the inflammatory signaling pathways and supports the integrity of the intestinal barrier. Due to its immunomodulatory effect, vitamin D plays a role in chronic inflammatory bowel disease (IBD), and a deficiency is associated with an increased risk of a flare. We aimed to investigate to what extent the 25-hydroxyvitamin D (25(OH)D3) level correlates with disease activity and whether a cut-off value can be defined that discriminates between active disease and remission.
Methods: Patients with IBD treated at the University Hospital Frankfurt were analyzed retrospectively. The 25(OH)D3 levels were correlated with clinical activity indices and laboratory activity parameters. A deficiency was defined as a 25(OH)D3 level <30 ng/mL.
Results: A total of 470 (257 female) patients with IBD were included, 272 (57.9%) with Crohn’s disease (CD) and 198 (42.1%) with ulcerative colitis (UC). The median age of the patients was 41 years (range 18–84). In 283 patients (60.2%), a vitamin D deficiency was detected. 245 (53.6%) patients received oral vitamin D supplementation, and supplemented patients had significantly higher vitamin D levels (p < 0.0001). In regression analysis, remission, vitamin D substitution, and male gender were independently associated with the 25(OH)D3 serum concentration in our cohort. A 25(OH)D3 serum concentration of 27.5 ng/mL was the optimal cut-off value.
Conclusion: Vitamin D deficiency is common in IBD patients and appears to be associated with increased disease activity. In our study, vitamin D levels were inversely associated with disease activity. Thus, close monitoring should be established, and optimized supplementation should take place.
To date, there is insufficient insight into inflammatory bowel disease (IBD)-associated stress, recognized disability, and contact with the social care system. We aimed to assess these parameters in IBD patients and a non-IBD control group, who were invited to participate in an online survey developed specifically for this study (www.soscisurvey.de) with the help of IBD patients. 505 IBD patients and 166 volunteers (i.e., the control group) participated in the survey. IBD patients reported significantly increased levels of stress within the last six months and the last five years (p<0.0001) and were more likely to have a recognized disability (p<0.0001). A low academic status was the strongest indicator of a disability (p = 0.006). Only 153 IBD patients (30.3%) reported contact with the social care system, and a disability was the strongest indicator for this (p<0.0001). Our study provides data on stress and disability in a large unselected German IBD cohort. We showed that patients with IBD suffer more often from emotional stress and more often have a recognized disability. As only about one third of the patients had come into contact with the social care system and the corresponding support services, this patient group is underserved in this area.
Background: Essential tremor (ET) is a progressive neurological disorder characterized by postural and kinetic tremor, most commonly affecting the hands and arms. Medically intractable ET can be treated by deep brain stimulation (DBS) of the ventral intermediate nucleus of the thalamus (VIM). We investigated whether the location of the effective contact (most tremor suppression with the least side effects) in VIM-DBS for ET changes over time, which would indicate a distinct mechanism of loss of efficacy that goes beyond progression of tremor severity or a mere reduction of DBS efficacy.
Methods: We performed programming sessions in 10 patients who underwent bilateral VIM-DBS surgery between 2009 and 2017 at our department. In addition to the intraoperative (T1) and first clinical programming sessions (T2), a third programming session (T3) was performed to assess the effect and side-effect thresholds (the minimum voltage at which tremor suppression or side effects occurred). Additionally, we compared the choice of the effective contact between T1 and T2, which might be affected by a surgically induced “brain shift.”
Discussion: Over a time span of about 4 years, VIM-DBS in ET showed continuous efficacy in tremor suppression during stim-ON compared to stim-OFF. Compared to the immediate postoperative programming sessions in ET patients with DBS, long-term evaluation showed no relevant change in the choice of contact with respect to side effects and efficacy. In the majority of cases, the active contact at T2 did not correspond to the most effective intraoperative stimulation site T1, which might be explained by a brain shift due to cerebrospinal fluid loss after the neurosurgical procedure.
Background: Ribavirin (RBV) remains part of several interferon-free treatment strategies even though its mechanisms of action are still not fully understood. One hypothesis is that RBV increases responsiveness to type I interferons. Pegylated Interferon alpha (PEG-IFNa) has recently been shown to alter natural killer (NK) cell function possibly contributing to control of hepatitis C virus (HCV) infection. However, the effects of ribavirin alone or in combination with IFNa on NK cells are unknown.
Methods: Extensive ex vivo phenotyping and functional analysis of NK cells from hepatitis C patients was performed during antiviral therapy. Patients were treated for 6 weeks with RBV monotherapy (n = 11), placebo (n = 13) or PEG-IFNa-2a alone (n = 6) followed by PEG-IFNa/RBV combination therapy. The effects of RBV and PEG-IFNa-2a on NK cells were also studied in vitro after co-culture with K562 or Huh7.5 cells.
Results: Ribavirin monotherapy had no obvious effects on NK cell phenotype or function, neither ex vivo in patients nor in vitro. In contrast, PEG-IFNa-2a therapy was associated with an increase of CD56bright cells and distinct changes in expression profiles leading to an activated NK cell phenotype, increased functionality and decline of terminally differentiated NK cells. Ribavirin combination therapy reduced some of the IFN effects. An activated NK cell phenotype during therapy was inversely correlated with HCV viral load.
Conclusions: PEG-IFNa activates NK cells possibly contributing to virological responses independently of RBV. The role of NK cells during future IFN-free combination therapies including RBV remains to be determined.
Background and aims: Spontaneous bacterial peritonitis (SBP) is a severe complication of decompensated cirrhosis. The prevalence of multidrug-resistant organisms (MDROs) in patients with cirrhosis is increasing. Identification of patients at risk for SBP due to MDROs (i.e., SBP with evidence of MDROs or Stenotrophomonas maltophilia in ascitic culture; MDRO-SBP) is crucial for the early adaptation of antibiotic treatment in such patients. We therefore investigated whether MDROs found in ascitic cultures can also be found in specimens obtained by noninvasive screening procedures.
Patients and methods: This retrospective study was conducted at the liver center of the University Hospital Frankfurt, Germany. Between 2011 and 2016, patients with cirrhosis were included upon diagnosis of SBP and sample collection of aerobic/anaerobic ascitic cultures. Furthermore, the performance of at least one complete MDRO screening was mandatory for study inclusion.
Results: Of 133 patients diagnosed with SBP, 75 (56.4%) had culture-positive SBP and 22 (16.5%) had MDRO-SBP. Multidrug-resistant Escherichia coli (10/22; 45.5%) and vancomycin-resistant enterococci (7/22; 36.4%) were the major causative organisms of MDRO-SBP. Rectal swabs identified MDROs in 17 of the 22 patients (77.3%) who developed MDRO-SBP, with a time-dependent sensitivity of 77% and 87% within 30 and 90 days of testing, respectively, while the negative predictive value was 83% and 76%, respectively. The majority of patients were included from the intensive care unit or the intermediate care unit.
Conclusion: MDRO screening may serve as a noninvasive diagnostic tool to identify patients at risk for MDRO-SBP. Patients with decompensated cirrhosis should be screened for MDROs from the first day of inpatient treatment onward.
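To make the reported screening metrics concrete, here is a small Python sketch of how a time-dependent sensitivity and negative predictive value can be tabulated from swab results and subsequent MDRO-SBP episodes. The table layout and all numbers are hypothetical; the study's actual analysis may differ.

```python
# Toy 2x2 computation: rectal-swab MDRO status vs. MDRO-SBP occurring
# within a given window after testing. All numbers are made up.
import pandas as pd

df = pd.DataFrame({
    "swab_mdro":   [1, 1, 0, 0, 1, 0, 1, 0],     # screening result
    "sbp_mdro":    [1, 0, 1, 0, 1, 0, 0, 0],     # MDRO-SBP ever observed
    "days_to_sbp": [12, None, 25, None, 80, None, None, None],
})

def window_metrics(df, days):
    # count an MDRO-SBP case only if it occurred within the window
    case = df.sbp_mdro.eq(1) & df.days_to_sbp.le(days)
    tp = (case & df.swab_mdro.eq(1)).sum()
    fn = (case & df.swab_mdro.eq(0)).sum()
    tn = (~case & df.swab_mdro.eq(0)).sum()
    return tp / (tp + fn), tn / (tn + fn)   # sensitivity, NPV

for d in (30, 90):
    sens, npv = window_metrics(df, d)
    print(f"{d} days: sensitivity = {sens:.2f}, NPV = {npv:.2f}")
```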
Background: Due to the coronavirus disease 2019 (COVID-19) pandemic, interventions in the upper airways are considered high-risk procedures for otolaryngologists and their colleagues. The purpose of this study was to evaluate limitations in hearing and communication when using a powered air-purifying respirator (PAPR) system to protect against severe acute respiratory syndrome coronavirus type 2 (SARS-CoV-2) transmission and to assess the benefit of a headset.
Methods: Acoustic properties of the PAPR system were measured using a head and torso simulator. Audiological tests (tone audiometry, Freiburg speech test, Oldenburg sentence test (OLSA)) were performed in normal-hearing subjects (n = 10) to assess hearing with PAPR. The audiological test setup also included simulation of conditions in which the target speaker used either a PAPR, a filtering face piece (FFP) 3 respirator, or a surgical face mask.
Results: Audiological measurements revealed that sound insulation by the PAPR headtop and the noise generated by the blower-assisted respiratory protection system resulted in significantly deteriorated hearing thresholds (4.0 ± 7.2 dB hearing level (HL) vs. 49.2 ± 11.0 dB HL).
Background: Intestinal perforation or leakage increases the morbidity and mortality of surgical and endoscopic interventions. We identified criteria for the use of fully covered, extractable self-expanding metal stents (cSEMS) vs. over-the-scope clips (OTSC) for leak closure.
Methods: Patients who underwent endoscopic treatment for postoperative leakage, endoscopic perforation, or spontaneous rupture of the upper gastrointestinal tract between 2006 and 2013 were identified at four tertiary endoscopic centers. Technical success, outcome (e.g. duration of hospitalization, in-hospital mortality), and complications were assessed and analyzed with respect to etiology, size and location of leakage.
Results: Of 106 patients (male: 75 (71%); female: 31 (29%); age (mean ± SD): 62.5 ± 1.3 years), 72 (69%) were treated by cSEMS and 34 (31%) by OTSC. For cSEMS vs. OTSC, mean treatment duration was 41.1 vs. 25 days (p < 0.001), median (range) leakage size was 10 (1–50) vs. 5 (1–30) mm, and complications were observed in 68% vs. 8.8% (p < 0.001), respectively. Clinical success of primary interventional treatment was observed in 29/72 (40%) vs. 24/34 (70%; p = 0.006) patients, and clinical success at the end of follow-up in 46/72 (64%) vs. 29/34 (85%) of patients treated by cSEMS vs. OTSC (p = 0.04).
Conclusion: OTSC is preferred for small lesions and for perforations caused by endoscopic interventions, cSEMS for patients with concomitant local infection or abscess. cSEMS is associated with a higher frequency of complications; therefore, OTSC might be preferred whenever technically feasible. Indication criteria for cSEMS vs. OTSC vary and might impede the design of randomized studies.
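The success rates above (29/72 vs. 24/34) are compared with a p-value, although the abstract does not name the test used. A Fisher's exact test on the corresponding 2×2 table is a standard choice for such counts and is sketched below for illustration only.

```python
# 2x2 comparison of clinical success for cSEMS vs. OTSC (counts from the
# abstract; the choice of Fisher's exact test is ours, not the authors').
from scipy import stats

table = [[29, 72 - 29],   # cSEMS: success, failure
         [24, 34 - 24]]   # OTSC:  success, failure
odds_ratio, p = stats.fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p:.4f}")
```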
Introduction: The occurrence of inaccurate or delayed diagnoses is a significant concern in patient care, particularly in emergency medicine, where decision making is often constrained by high throughput and inaccurate admission diagnoses. Artificial intelligence-based diagnostic decision support systems have been developed to enhance clinical performance by suggesting differential diagnoses for a given case, based on an integrated medical knowledge base and machine learning techniques. The purpose of this study is to evaluate the diagnostic accuracy of Ada, an app-based diagnostic tool, and its impact on patient outcome.
Methods and analysis: The eRadaR trial is a prospective, double-blinded study of patients presenting to the emergency room (ER) with abdominal pain. At initial contact in the ER, a structured interview will be performed using the Ada-App, and both patients and attending physicians will be blinded to the proposed diagnosis lists until trial completion. Throughout the study, clinical data relating to diagnostic findings and types of therapy will be obtained, and the follow-up until day 90 will comprise occurrence of complications and overall survival of patients. The primary efficacy endpoint of the trial is the percentage of correct diagnoses suggested by Ada compared with the final discharge diagnosis. Further, accuracy and timing of diagnosis will be compared with decision making in classical doctor–patient interaction. Secondary objectives are complications, length of hospital stay and overall survival.
Ethics and dissemination: Ethical approval was received from the independent ethics committee (IEC) of the Goethe-University Frankfurt on 9 April 2020, including the patient information material and informed consent form. All protocol amendments must be reported to and approved by the IEC. The results from this study will be submitted to peer-reviewed journals and reported at suitable national and international meetings.
Trial registration number: DRKS00019098.
Background/aims: Hepatocellular carcinoma (HCC) is a leading indication for liver transplantation (LT) worldwide. Early identification of patients at risk for HCC recurrence is of paramount importance, since early treatment of recurrent HCC after LT may be associated with increased survival. We evaluated the incidence of and predictors for HCC recurrence, with a focus on the course of alpha-fetoprotein (AFP) levels.
Methods: We performed a retrospective, single-center study of 99 HCC patients who underwent LT between January 28, 1997 and May 11, 2016. A three-stage proportional hazards model was used to evaluate potential predictive markers, by both univariate and multivariable analysis, for influences on 1) recurrence after transplantation, 2) mortality without HCC recurrence, and 3) mortality after recurrence.
Results: 19/99 HCC patients showed recurrence after LT. Waiting time was not associated with overall HCC recurrence (HR = 1, p = 0.979). Similarly, waiting time did not affect mortality in LT recipients with (HR = 0.97, p = 0.282) or without (HR = 0.99, p = 0.685) HCC recurrence. Log10-transformed AFP values at the time of LT (HR = 1.75, p = 0.023) as well as after LT (HR = 2.07, p = 0.037) were significantly associated with recurrence. Median survival in patients with a ratio (AFP at recurrence divided by AFP 3 months before recurrence) of 0.5 was greater than 70 months, compared with a median of only 8 months in patients with a ratio of 5.
Conclusion: A rise in AFP levels rather than an absolute threshold could help to identify patients at short-term risk for HCC recurrence post LT, which may allow intensification of the surveillance strategy on an individualized basis.
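As a worked illustration of the AFP kinetics argument above, the following sketch computes the recurrence-to-baseline AFP ratio and shows how a hazard ratio per log10 AFP unit scales with an AFP rise. The numbers are invented, and applying the reported HR of 1.75 per log10 unit in this way is our simplifying assumption, not the authors' calculation.

```python
# AFP ratio and log10 transform as used for Cox covariates (toy numbers).
import numpy as np

afp_3m_before = 12.0                       # ng/mL, three months before event
afp_at_recurrence = 60.0                   # ng/mL at recurrence
ratio = afp_at_recurrence / afp_3m_before  # 5.0 -> the poor-prognosis group

# Under a Cox model with log10(AFP) as covariate, a rise by delta log10
# units multiplies the hazard by HR ** delta.
hr_per_log10 = 1.75                        # reported estimate at time of LT
delta = np.log10(ratio)                    # log10-unit increase, ~0.70
print(f"ratio = {ratio:.1f}, hazard scaling = {hr_per_log10 ** delta:.2f}")
```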
In the context of workplace health promotion, physical activity programs have been shown to reduce musculoskeletal diseases and stress and to improve quality of life. The aim of this study was to examine the effects of using the “five-Business” stretch training device on the quality of life of office workers. A total of 313 office workers (173 male/137 female) participated voluntarily in this intervention-control study, with an average age of 43.37 ± 11.24 (SD) years, an average height of 175.37 ± 9.35 cm, an average weight of 75.76 ± 15.23 kg, and an average BMI of 24.5 ± 3.81 kg/m2. The participants completed the stretch training twice a week for approximately 10 minutes over a duration of 12 weeks. The SF-36 questionnaire was used to evaluate the effectiveness of the intervention at baseline and after 12 weeks. Significantly improved outcomes in the mental sum score (p = 0.008), physical functioning (p < 0.001), bodily pain (p = 0.01), vitality (p = 0.025), role limitations due to physical problems (p = 0.018), and mental health (p = 0.012) were shown after the stretching training. The results suggest that a 12-week stretching program for office desk workers is suitable to significantly improve their health-related quality of life.
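The evaluation above compares SF-36 scores at baseline and after 12 weeks within the same participants. A minimal Python sketch of such a pre/post comparison with a paired t-test follows; the scores are simulated, and the study's actual statistical procedure is not described in the abstract.

```python
# Paired pre/post comparison of a simulated SF-36 subscale (0-100 scale).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 313
baseline = rng.normal(70, 15, n)            # baseline subscale scores
followup = baseline + rng.normal(3, 10, n)  # modest average improvement

t, p = stats.ttest_rel(followup, baseline)
print(f"paired t = {t:.2f}, p = {p:.4f}")
```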
Objectives: Four-dimensional ultrasound (4D-US) enables imaging of the aortic segment and simultaneous determination of wall expansion. The method has a high spatial and temporal resolution, but its in vivo reliability for low measurement values has so far been unknown. The present study determines the intraobserver repeatability and interobserver reproducibility of 4D-US in the atherosclerotic and non-atherosclerotic infrarenal aorta.
Methods: In all, 22 patients with a non-aneurysmal aorta were examined by an experienced examiner and a medical student. After registration of the 4D images, both examiners marked the aortic wall manually before the commercially implemented speckle-tracking algorithm was applied. The cyclic changes of the aortic diameter and the circumferential strain were determined with the help of custom-made software. The reliability of 4D-US was tested by the intraclass correlation coefficient (ICC).
Results: The 4D-US measurements showed very good reliability for the maximum aortic diameter and the circumferential strain for all patients and for the non-atherosclerotic aortae (ICC > 0.7), but low reliability for circumferential strain in calcified aortae (ICC = 0.29). The observer- and masking-related variances for both maximum diameter and circumferential strain were close to zero.
Conclusions: Despite the low measured values, the high spatial and temporal resolution of 4D-US enables a reliable evaluation of cyclic diameter changes and circumferential strain in non-aneurysmal aortae, independent of observer experience but with some limitations for calcified aortae. 4D-US opens up a new perspective with regard to noninvasive, in vivo assessment of the kinematic properties of the vessel wall in the abdominal aorta.
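The reliability analysis above rests on the intraclass correlation coefficient. The sketch below shows one common way to compute ICCs in Python using the pingouin package on simulated two-observer diameter measurements; the column names and data are illustrative, and the appropriate ICC type depends on the study design.

```python
# Interobserver agreement for a simulated maximum aortic diameter (mm),
# measured by two observers in the same 22 patients.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(2)
n = 22
true_diam = rng.normal(18, 2, n)  # simulated per-patient diameters
obs = pd.DataFrame({
    "patient": np.tile(np.arange(n), 2),
    "observer": np.repeat(["expert", "student"], n),
    "diameter": np.concatenate([true_diam + rng.normal(0, 0.5, n),
                                true_diam + rng.normal(0, 0.5, n)]),
})

icc = pg.intraclass_corr(data=obs, targets="patient",
                         raters="observer", ratings="diameter")
print(icc[["Type", "ICC", "CI95%"]])
```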
Background and aims: Patients with gastric cancer often show signs of malnutrition. We sought to evaluate the influence of sarcopenia on morbidity and mortality in patients with locally advanced, non-metastatic gastric or gastro-esophageal junction (GEJ) cancer undergoing curative treatment (perioperative chemotherapy and surgery), in order to identify patients in need of nutritional intervention.
Patients and methods: This two-centre study was conducted at the Frankfurt University Clinic and Krankenhaus Nordwest (Frankfurt) as part of the University Cancer Center Frankfurt (UCT). 47/83 patients were treated within the FLOT trial (NCT01216644). Patients' charts were reviewed for clinical data. Two consecutive CT scans were retrospectively analyzed to determine the degree of sarcopenia. Survival was calculated using the Kaplan-Meier method; multivariate analysis was performed using Cox regression.
Results: 60 patients (72.3%) were male and 23 (27.7%) female. 45 patients (54.2%) had GEJ type 1–3 tumors and 38 (45.8%) had gastric tumors. Sarcopenic patients were significantly older than non-sarcopenic patients (mean age 65.1 vs. 59.5 years, p = 0.042), terminated chemotherapy early significantly more often (50% vs. 22.6%, p = 0.037), and showed higher Clavien-Dindo scores, indicating more severe perioperative complications (score ≥ 3: 43.3% vs. 17.0%, p = 0.019). Sarcopenic patients had significantly shorter survival than non-sarcopenic patients (139.6 ± 19.5 [95% CI, 101.3–177.9] vs. 206.7 ± 13.8 [95% CI, 179.5–233.8] weeks, p = 0.004). Multivariate Cox regression analysis showed that, besides UICC stage, sarcopenia significantly influenced survival.
Conclusion: Sarcopenia is present in a large proportion of patients with locally advanced gastric or GEJ cancer and significantly influences tolerability of chemotherapy, surgical complications and survival.
Background: Hypoxia is a key driver for infiltrative growth in experimental gliomas. It has remained elusive whether tumor hypoxia in glioblastoma patients contributes to distant or diffuse recurrences. We therefore investigated the influence of perioperative cerebral ischemia on patterns of progression in glioblastoma patients.
Methods: We retrospectively screened MRI scans of 245 patients with newly diagnosed glioblastoma undergoing resection for perioperative ischemia near the resection cavity; 46 patients showed relevant ischemia. A control cohort without perioperative ischemia was generated by 1:1 matching using an algorithm based on gender, age, and adjuvant treatment. Both cohorts were analyzed for patterns of progression by a blinded neuroradiologist.
Results: The percentage of diffuse or distant recurrences at first relapse was significantly higher in the cohort with perioperative ischemia (61.1%) compared to the control cohort (19.4%). The results of the control cohort matched well with historical data. The change in patterns of progression was not associated with a difference in survival.
Conclusions: This study reveals an unrecognized association of perioperative cerebral ischemia with distant or diffuse recurrence in glioblastoma. It is the first clinical study supporting the concept that hypoxia is a key driver of infiltrative tumor growth in glioblastoma patients.
Triple therapy of chronic hepatitis C virus (HCV) infection with boceprevir (BOC) or telaprevir (TVR) leads to virologic failure in many patients, which is often associated with the selection of resistance-associated variants (RAVs). These resistance profiles are of importance for the selection of potential rescue treatment options. In this study, we performed population-based sequencing of baseline NS3 RAVs and investigated the sensitivity of NS3 phenotypes in an HCV replicon assay, together with clinical factors, for the prediction of treatment response in a cohort of 165 German and Swiss patients treated with BOC- or TVR-based triple therapy. Overall, the prevalence of baseline RAVs was low, although the frequency of RAVs was higher in patients with virologic failure compared with those who achieved a sustained virologic response (SVR) (7% versus 1%, P = 0.06). The occurrence of RAVs was associated with a resistant NS3 quasispecies phenotype (P < 0.001), but the sensitivity of phenotypes was not associated with treatment outcome (P = 0.2). The majority of single viral and host predictors of SVR were only weakly associated with treatment response. In multivariate analyses, low AST levels, female sex, and an IFNL4 CC genotype were independently associated with SVR. However, a combined analysis of negative predictors revealed a significantly lower overall number of negative predictors in patients with SVR in comparison with individuals with virologic failure (P < 0.0001), and the presence of 2 or fewer negative predictors was indicative of SVR. These results demonstrate that most single baseline viral and host parameters have a weak influence on the response to triple therapy, whereas the overall number of negative predictors has a high predictive value for SVR.
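To illustrate the combined-predictor rule described above, the short sketch below counts negative baseline predictors per patient and flags two or fewer as indicative of SVR. The indicator columns are hypothetical stand-ins for the predictors named in the abstract; the coding and threshold application are ours.

```python
# Count negative predictors per patient; <= 2 is taken as indicative of SVR.
import pandas as pd

patients = pd.DataFrame({
    "high_ast":     [1, 0, 1, 0],   # 1 = negative predictor present
    "male_sex":     [1, 1, 0, 0],
    "ifnl4_non_cc": [1, 0, 1, 0],
    "baseline_rav": [0, 0, 1, 0],
})
patients["n_negative"] = patients.sum(axis=1)
patients["predicted_svr"] = patients["n_negative"] <= 2
print(patients)
```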