Objectives: Rising prevalence of multidrug-resistant organisms (MDRO) is a major health problem in patients with liver cirrhosis. The impact of MDRO colonization in liver transplantation (LT) candidates and recipients on mortality has not been determined in detail.
Methods: Patients consecutively evaluated and listed for LT in a tertiary German liver transplant center from 2008 to 2018 underwent screening for MDRO colonization including methicillin-resistant Staphylococcus aureus (MRSA), multidrug-resistant gram-negative bacteria (MDRGN), and vancomycin-resistant enterococci (VRE). MDRO colonization and infection status were obtained at LT evaluation, planned and unplanned hospitalization, three months upon graft allocation, or at last follow-up on the waiting list.
Results: In total, 351 patients were listed for LT, of whom 164 (47%) underwent LT after a median of 249 (range 0–1662) days. The incidence of MDRO colonization increased during the waiting time for LT, and MDRO colonization was associated with increased mortality on the waiting list (HR = 2.57, p < 0.0001). One patient was colonized with a carbapenem-resistant strain at listing, 9 patients acquired carbapenem-resistant gram-negative bacteria (CRGN) on the waiting list, and 4 more after LT. In total, 10 of these 14 patients died.
Conclusions: Colonization with MDRO is associated with increased mortality on the waiting list, but not in short-term follow-up after LT. Moreover, colonization with CRGN seems associated with high mortality in liver transplant candidates and recipients.
Fit to play: posture and seating position analysis with professional musicians - a study protocol
(2017)
Background: Musical performance-associated musculoskeletal disorders (MSD) are a common health problem among professional musicians. Considering the manifold consequences arising for the musicians, they can be seen as a threat to their professional activity. String players are the group of musicians most affected in this matter. Faulty upper body posture while playing the instrument, which causes unergonomic static strain on the back and unergonomic limb movements, is a main cause of musculoskeletal disorders and pain syndromes.
Methods: A total of 66 professional musicians, divided into three groups, are measured.
The division is based on average duration of performance, intensity of daily exercise, and professional experience. Video raster stereography, a three-dimensional analysis of body posture, is used to analyse the instrument-specific posture. Furthermore, the pressure distribution during seating is analysed. Measurements are performed while the musician sits on various music chairs differing in structure and/or construction of the seating surface. The measurements take place in the habitual seating position as well as during playing of the instrument.
Results: To analyse the influence of different chairs, repeated-measures ANOVA or the Friedman test is used, depending on normality assumptions. Comparison of posture between amateur musicians, students, and professional orchestral musicians is carried out using the non-parametric Jonckheere-Terpstra test.
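The normality-dependent choice between repeated-measures ANOVA and the Friedman test can be illustrated with a minimal sketch of the Friedman statistic. The posture scores below are invented, and the implementation omits tie correction; this is an illustration, not the study's analysis code.

```python
import numpy as np

def friedman_statistic(data):
    """Friedman chi-square statistic for a (subjects x conditions) array.

    Ranks each subject's measurements across conditions, then compares
    the rank sums. Assumes no tied values within a subject.
    """
    n, k = data.shape
    # Rank within each row (subject) across the k conditions.
    ranks = np.apply_along_axis(
        lambda row: np.argsort(np.argsort(row)) + 1.0, 1, data
    )
    rank_sums = ranks.sum(axis=0)
    # Classical Friedman chi-square formula.
    return 12.0 / (n * k * (k + 1)) * np.sum(rank_sums ** 2) - 3 * n * (k + 1)

# Hypothetical posture scores for 4 musicians on 3 chairs.
scores = np.array([
    [5.0, 6.0, 7.0],
    [4.0, 5.5, 6.5],
    [6.0, 6.5, 8.0],
    [5.5, 6.0, 7.5],
])
chi2 = friedman_statistic(scores)
```

Because every subject here ranks the chairs identically, the statistic reaches its maximum for n = 4, k = 3; in practice the value would be compared against the chi-square distribution with k − 1 degrees of freedom.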
Conclusions: By analysing the chair concepts, our method aims to give musicians guidance in choosing the right music chair, so that MSD can be preemptively reduced or prevented.
Influence of antibiotic-regimens on intensive-care unit-mortality and liver-cirrhosis as risk factor
(2016)
AIM: To assess the rate of infection, the appropriateness of antimicrobial therapy, and mortality on the intensive care unit (ICU). Special focus was placed on patients with liver cirrhosis.
METHODS: The study was approved by the local ethics committee. All patients admitted to the Internal Medicine ICU between April 1, 2007 and December 31, 2009 were included. Data on infection, microbiological laboratory reports, diagnosis, and therapy were extracted retrospectively from patient charts and electronic documentation. Because of the large hepatology department and liver transplantation center, special interest was on the subgroup of patients with liver cirrhosis. The primary statistical endpoint was the influence of appropriate versus inappropriate antimicrobial therapy on in-hospital mortality.
RESULTS: Charts of 1979 patients were available. The overall infection rate was 53%. Multiresistant bacteria were present in 23% of patients with infection and were associated with increased mortality (P < 0.000001). Patients with infection had significantly increased in-hospital mortality (34% vs 17%, P < 0.000001). Only 9% of patients with infection received inappropriate initial antimicrobial therapy; no influence on mortality was observed. Independent risk factors for in-hospital mortality were the presence of septic shock, prior chemotherapy for malignoma, and infection with Pseudomonas spp. Infection and mortality rates among 175 patients with liver cirrhosis were significantly higher than in patients without liver cirrhosis. Infection increased mortality 2.24-fold in patients with cirrhosis. Patients with liver cirrhosis were at an increased risk of receiving inappropriate initial antimicrobial therapy.
CONCLUSION: The results of the present study document the successful implementation of early goal-directed therapy. Patients with liver cirrhosis are at increased risk of infection, mortality, and receiving inappropriate therapy. Multiresistant bacteria are an increasing burden.
Purpose: Colorectal cancer (CRC) is the second most common cancer in Germany. Around 60,000 people were diagnosed with CRC in Germany in 2016. Since 2019, screening colonoscopies have been offered in Germany to men from the age of 50 and to women from the age of 55. It is currently being discussed whether women should also undergo a screening colonoscopy from the age of 50 and whether there are any predictors for developing CRC.
Methods: Colonoscopies of 1553 symptomatic patients younger than 55 years were compared with colonoscopies of 1075 symptomatic patients older than 55 years. We analyzed if there are any significant differences between those two groups in the prevalence of CRC and its precursor lesions or between symptomatic men and women. We evaluated if there is a correlation between abdominal symptoms and the prevalence of CRC.
Results: In 164/1553 symptomatic patients, 194 (12.5%) polyps were detected. In total, six colorectal carcinomas (0.4%) were detected. There were no significant differences between men and women. In symptomatic patients ≥ 55 years, significantly more polyps were found (p < 0.0001; 26.6% vs. 12.5%). In total, 286 polyps (26.6%) were removed in the 1075 symptomatic patients older than 55 years. Anorectal bleeding was the only abdominal symptom that was a significant indicator of the occurrence of colorectal cancer in both groups (p = 0.03, OR = 2.73, 95% CI [1.11; 6.70]), but with only low sensitivity (44%).
Conclusion: Since there were no significant differences between men and women, we recommend offering screening colonoscopies to women from the age of 50 as well.
Chronic viral hepatitis is associated with substantial morbidity and mortality worldwide. The aim of our study was to assess the ability of point shear-wave elastography (pSWE) using acoustic radiation force impulse imaging to predict the following liver-related events (LREs): new diagnosis of HCC, liver transplantation, or liver-related death (hepatic decompensation was not included as an LRE). pSWE was performed at study inclusion and compared with liver histology, transient elastography (TE), and serologic biomarkers (aspartate aminotransferase to platelet ratio index, Fibrosis-4, FibroTest). The performance of pSWE and TE in predicting LREs was assessed by calculating the area under the receiver operating characteristic curve and a Cox proportional-hazards regression model. A total of 254 patients with a median follow-up of 78 months were included in the study. LREs occurred in 28 patients (11%) during follow-up. In both patients with hepatitis B virus and patients with hepatitis C virus (HCV), pSWE showed significant correlations with noninvasive tests and TE, and median pSWE and TE values were significantly different between patients with LREs and patients without LREs (both P < 0.0001). In patients with HCV, the areas under the receiver operating characteristic curve for pSWE and TE to predict LREs were comparable: 0.859 (95% confidence interval [CI], 0.747-0.969) and 0.852 (95% CI, 0.737-0.967) (P = 0.93). In Cox regression analysis, pSWE independently predicted LREs in all patients with HCV (hazard ratio, 17.9; 95% CI, 5.21-61.17; P < 0.0001) and in those who later received direct-acting antiviral therapy (hazard ratio, 17.11; 95% CI, 3.88-75.55; P = 0.0002). Conclusion: Our study shows good comparability between pSWE and TE. pSWE is a promising tool for the prediction of LREs in patients with viral hepatitis, particularly those with chronic HCV. Further studies are needed to confirm our data and assess its prognostic value in other liver diseases.
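The AUROC used above to compare pSWE and TE has a simple rank interpretation: it is the probability that a randomly chosen patient with an LRE has a higher stiffness value than a randomly chosen patient without one (ties counted half). A minimal sketch with invented stiffness values, not the study data:

```python
import numpy as np

def auroc(scores_pos, scores_neg):
    """AUROC as the rank-based probability that a positive case
    scores higher than a negative case (ties count half)."""
    pos = np.asarray(scores_pos, dtype=float)
    neg = np.asarray(scores_neg, dtype=float)
    # Explicit pairwise comparison; fine for illustration-sized samples.
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical stiffness values: patients with vs. without an LRE.
lre = [12.1, 15.3, 9.8, 18.4]
no_lre = [6.2, 7.5, 9.8, 5.1, 8.0]
auc = auroc(lre, no_lre)
```

This pairwise form is equivalent to the Mann-Whitney U statistic divided by the number of positive-negative pairs, which is why rank tests and AUROC comparisons go hand in hand.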
Background: Musculoskeletal disorders (MSD) are a common health problem among dentists. Dental treatment is mainly performed in a sitting position. The aim of the study was to quantify the effect of different ergonomic chairs on the sitting position. In addition, it was tested whether the sitting position of experienced workers differs from that of a non-dental group.
Methods: A total of 59 (28 m/31 f) subjects, divided into two dentist groups according to their work experience (students and dentists (9 m/11 f) < 10 years, dentists (9 m/10 f) ≥ 10 years) and a control group (10 m/10 f), were measured. A three-dimensional back scanner captured the bare back of all subjects sitting on six dentist's chairs of different design. Initially, inter-group comparisons per chair, firstly in the habitual and secondly in the working postures, were carried out. Furthermore, inter-chair comparisons were conducted for the habitual as well as for the working postures of all subjects and of each group. Finally, a comparison between the habitual sitting posture and the working posture for each respective chair (intra-chair comparison) was conducted (for all subjects and for each group). In addition, a subjective assessment of each chair was made.
For the statistical analysis, non-parametric tests were conducted and the level of significance was set at 5%.
Results: When comparing the three subject groups, all chairs caused a more pronounced spinal kyphosis in experienced dentists. In both conditions (habitual and working postures), a symmetrical sitting position was assumed on each chair.
The inter-chair comparisons showed no differences regarding the ergonomic design of the chairs. The significant differences found in the inter-chair comparisons were all within the measurement error and could, therefore, be classified as clinically irrelevant.
The intra-chair comparison (habitual sitting position vs. working sitting position) illustrated position-related changes in the sagittal, but not in the transverse, plane. These changes were only position-related (forward leaned working posture) and were not influenced by the ergonomic sitting design of the respective chair. There are no differences between the groups in the subjective assessment of each chair.
Conclusions: Regardless of the group or the dental experience, the ergonomic design of the dentist's chair had only a marginal influence on the upper body posture in both the habitual and working sitting postures. Consequently, in order to minimize MSD, the focus should be on adopting a symmetrical sitting posture rather than on the ergonomic design of the dentist's chair.
Background: Patients with head and neck cancer (HNC) are at high risk for malnutrition because of tumour localisation and therapy. Prophylactic percutaneous endoscopic gastrostomy (PEG) tube placement is common practice to prevent malnutrition.
Objective: To investigate the benefits of prophylactic PEG tube placement for HNC patients in terms of the influence on patients’ nutritional status, utilisation rate, complications and to identify the predictors of PEG tube utilisation.
Methods: All consecutive HNC patients who underwent prophylactic PEG tube insertion between 1 January 2011 and 31 December 2012 prior to therapy were enrolled. The PEG tube utilisation rate, complications, the patients’ nutritional status and tumour therapy were evaluated with the help of electronic patient charts and telephone interviews.
Results: A total of 181 patients (48 female, median 67.5 years) were included. The PEG utilisation rate in the entire cohort was 91.7%. One hundred and forty‐nine patients (82.3%) used the PEG tube for total enteral nutrition, 17 patients (9.4%) for supplemental nutrition and 15 patients (8.3%) made no use of the PEG tube. Peristomal wound infections were the most common complications (40.3%) in this study. A high Nutritional Risk Screening (NRS) score prior to tube insertion was found to be independently associated with PEG utilisation. No significant weight changes were observed across the three patient subgroups.
Conclusions: The overall PEG tube utilisation rate was high in this study. However, given the high rate of infections, diligent patient selection is crucial in order to determine which patients benefit most from prophylactic PEG tube insertion.
Background and Aims: The IL-12/23 inhibitor ustekinumab (UST) opened up new treatment options for patients with Crohn’s disease (CD). Due to the recent approval, real-world German data on long-term efficacy and safety are lacking. This study aimed to assess the clinical course of CD patients under UST therapy and to identify potential predictive markers.
Methods: Patients with CD receiving UST treatment in three hospitals and two outpatient centers were included and retrospectively analyzed. Rates for short- and long-term remission and response were analyzed with the help of clinical (Harvey–Bradshaw Index (HBI)) and biochemical (C-reactive protein (CRP), Fecal calprotectin (fCal)) parameters for disease activity.
Results: Data from 180 patients were evaluated. One hundred and six patients had a follow-up of at least eight weeks and were included. 96.2% of the patients were pre-exposed to anti-TNFα agents and 34.4% to both anti-TNFα and anti-integrin antibodies. The median follow-up was 49.1 weeks (95% CI 42.03-56.25). At week 8, 51 patients (54.8%) showed response to UST, and 24 (24.7%) were in remission. At week 48, 48 (51.6%) responded to UST, and 25 patients (26.9%) were in remission. Steroid-free response and remission at week eight were achieved by 30.1% and 19.3% of patients, respectively. At week 48, 37.6% showed steroid-free response to UST, and 20.4% of the initial patient population was in steroid-free remission.
Conclusion: Our study confirms short- and long-term UST effectiveness and tolerability in a cohort of multi-treatment-exposed patients.
Background and Aims: Vitamin D has an inhibitory role in inflammatory signaling pathways and supports the integrity of the intestinal barrier. Due to its immunomodulatory effect, vitamin D plays a role in chronic inflammatory bowel disease (IBD), and a deficiency is associated with an increased risk of a flare. We aimed to investigate to what extent the 25-hydroxyvitamin D (25(OH)D3) level correlates with disease activity and whether a cut-off value can be defined that discriminates between active disease and remission. Methods: Patients with IBD treated at the University Hospital Frankfurt were analyzed retrospectively. The 25(OH)D3 levels were correlated with clinical activity indices and laboratory activity parameters. A deficiency was defined as a 25(OH)D3 level <30 ng/mL. Results: A total of 470 (257 female) patients with IBD were included, 272 (57.9%) with Crohn's disease (CD) and 198 (42.1%) with ulcerative colitis (UC). The median age of the patients was 41 years (range 18–84). In 283 patients (60.2%), a vitamin D deficiency was detected. 245 (53.6%) patients received oral vitamin D supplementation, and supplemented patients had significantly higher vitamin D levels (p < 0.0001). In regression analysis, remission, vitamin D substitution, and male gender were independently associated with the 25(OH)D3 serum concentration in our cohort. A 25(OH)D3 serum concentration of 27.5 ng/mL was the optimal cut-off value. Conclusion: Vitamin D deficiency is common in IBD patients and appears to be associated with increased disease activity. In our study, vitamin D levels were inversely associated with disease activity. Thus, close monitoring should be established, and optimized supplementation should take place.
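An "optimal cut-off" such as the 27.5 ng/mL reported above is typically chosen by maximizing Youden's J (sensitivity + specificity − 1) over candidate thresholds. A minimal sketch with invented vitamin D levels and disease-activity labels, not the study data; here a value below the threshold counts as "test positive":

```python
import numpy as np

def youden_cutoff(values, active):
    """Pick the threshold maximizing Youden's J = sensitivity + specificity - 1.

    A value BELOW the threshold is treated as 'test positive'
    (low vitamin D predicting active disease).
    """
    values = np.asarray(values, dtype=float)
    active = np.asarray(active, dtype=bool)
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        pred_pos = values < cut
        sens = (pred_pos & active).sum() / active.sum()
        spec = (~pred_pos & ~active).sum() / (~active).sum()
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Hypothetical 25(OH)D3 levels (ng/mL) and flare labels (1 = active disease).
vitd  = [12, 18, 22, 25, 28, 31, 35, 40, 45, 50]
flare = [ 1,  1,  1,  1,  0,  0,  0,  1,  0,  0]
cut, j = youden_cutoff(vitd, flare)
```

Scanning only the observed values is sufficient because sensitivity and specificity change only when the threshold crosses a data point.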
To date, there is insufficient insight into inflammatory bowel disease (IBD)-associated stress, recognized disability, and contact with the social care system. We aimed to assess these parameters in IBD patients and a non-IBD control group, who were invited to participate in an online survey developed specifically for this study (www.soscisurvey.de) with the help of IBD patients. 505 IBD patients and 166 volunteers (i.e., control group) participated in the survey. IBD patients reported significantly increased levels of stress within the last six months and five years (p<0.0001) and were more likely to have a recognized disability (p<0.0001). A low academic status was the strongest indicator of a disability (p = 0.006). Only 153 IBD patients (30.3%) reported contact with the social care system, and a disability was the strongest indicator for this (p<0.0001). Our study provides data on stress and disability in a large unselected German IBD cohort. We showed that patients with IBD suffer more often from emotional stress and more often have a recognized disability. As only about 1/3 of the patients had come into contact with the social care system and the corresponding support, this patient group is undersupplied in this area.
Background: Essential tremor (ET) is a progressive neurological disorder characterized by postural and kinetic tremor most commonly affecting the hands and arms. Medically intractable ET can be treated by deep brain stimulation (DBS) of the ventral intermediate nucleus of the thalamus (VIM). We investigated whether the location of the effective contact (most tremor suppression with the least side effects) in VIM-DBS for ET changes over time, indicating a distinct mechanism of loss of efficacy that goes beyond progression of tremor severity or a mere reduction of DBS efficacy.
Methods: We performed programming sessions in 10 patients who underwent bilateral VIM-DBS surgery between 2009 and 2017 at our department. In addition to the intraoperative (T1) and first clinical programming session (T2), a third programming session (T3) was performed to assess the effect and side-effect thresholds (the minimum voltage at which tremor suppression or side effects occurred). Additionally, we compared the choice of the effective contact between T1 and T2, which might be affected by a surgically induced “brain shift.”
Discussion: Over a time span of about 4 years, VIM-DBS in ET showed continuous efficacy in tremor suppression during stim-ON compared to stim-OFF. Compared to immediate postoperative programming sessions in ET patients with DBS, long-term evaluation showed no relevant change in the choice of contact with respect to side effects and efficacy. In the majority of the cases, the active contact at T2 did not correspond to the most effective intraoperative stimulation site T1, which might be explained by a brain shift due to cerebrospinal fluid loss after the neurosurgical procedure.
Background: Ribavirin (RBV) remains part of several interferon-free treatment strategies even though its mechanisms of action are still not fully understood. One hypothesis is that RBV increases responsiveness to type I interferons. Pegylated Interferon alpha (PEG-IFNa) has recently been shown to alter natural killer (NK) cell function possibly contributing to control of hepatitis C virus (HCV) infection. However, the effects of ribavirin alone or in combination with IFNa on NK cells are unknown.
Methods: Extensive ex vivo phenotyping and functional analysis of NK cells from hepatitis C patients was performed during antiviral therapy. Patients were treated for 6 weeks with RBV monotherapy (n = 11), placebo (n = 13) or PEG-IFNa-2a alone (n = 6) followed by PEG-IFNa/RBV combination therapy. The effects of RBV and PEG-IFNa-2a on NK cells were also studied in vitro after co-culture with K562 or Huh7.5 cells.
Results: Ribavirin monotherapy had no obvious effects on NK cell phenotype or function, neither ex vivo in patients nor in vitro. In contrast, PEG-IFNa-2a therapy was associated with an increase of CD56bright cells and distinct changes in expression profiles leading to an activated NK cell phenotype, increased functionality and decline of terminally differentiated NK cells. Ribavirin combination therapy reduced some of the IFN effects. An activated NK cell phenotype during therapy was inversely correlated with HCV viral load.
Conclusions: PEG-IFNa activates NK cells possibly contributing to virological responses independently of RBV. The role of NK cells during future IFN-free combination therapies including RBV remains to be determined.
Background and aims: Spontaneous bacterial peritonitis (SBP) is a severe complication of decompensated cirrhosis. The prevalence of multidrug-resistant organisms (MDROs) in patients with cirrhosis is increasing. Identification of patients at risk for SBP due to MDROs (ie, SBP with the evidence of MDROs or Stenotrophomonas maltophilia in ascitic culture, MDRO-SBP) is crucial to the early adaptation of antibiotic treatment in such patients. We therefore investigated whether MDROs found in ascitic cultures can also be found in specimens determined by noninvasive screening procedures.
Patients and methods: This retrospective study was conducted at the liver center of the University Hospital Frankfurt, Germany. Between 2011 and 2016, patients with cirrhosis were included upon diagnosis of SBP and sample collection of aerobic/anaerobic ascitic cultures. Furthermore, the performance of at least one complete MDRO screening was mandatory for study inclusion.
Results: Of 133 patients diagnosed with SBP, 75 (56.4%) had culture-positive SBP and 22 (16.5%) had MDRO-SBP. Multidrug-resistant Escherichia coli (10/22; 45.5%) and vancomycin-resistant enterococci (7/22; 36.4%) were the major causative organisms of MDRO-SBP. Rectal swabs identified MDROs in 17 of 22 patients (77.3%) who developed MDRO-SBP, with a time-dependent sensitivity of 77% and 87% at 30 and 90 days after testing, while the negative predictive value was 83% and 76%, respectively. The majority of patients were included from the intensive care unit or intermediate care unit.
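The screening figures above follow directly from 2×2 counts: 17 of 22 MDRO-SBP patients with a positive prior swab gives the reported 77% sensitivity. A small helper makes the arithmetic explicit; the true-negative and false-positive counts below are invented for illustration (the source reports only the 17/22 ratio):

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity and negative predictive value from 2x2 screening counts."""
    sensitivity = tp / (tp + fn)  # fraction of true cases the screen catches
    npv = tn / (tn + fn)          # reassurance value of a negative screen
    return sensitivity, npv

# 17 of 22 MDRO-SBP cases had a positive prior rectal swab (from the study);
# tn and fp below are made-up numbers for the sake of the example.
sens, npv = screening_metrics(tp=17, fn=5, tn=80, fp=20)
```

Note that sensitivity ignores the false positives entirely, which is why a screen can be sensitive yet still generate many unnecessary isolation measures.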
Conclusion: MDRO screening may serve as a noninvasive diagnostic tool to identify patients at risk for MDRO-SBP. Patients with decompensated cirrhosis should be screened for MDROs from the first day of inpatient treatment onward.
Background: Due to the coronavirus disease 2019 (COVID-19) pandemic, interventions in the upper airways are considered high-risk procedures for otolaryngologists and their colleagues. The purpose of this study was to evaluate limitations in hearing and communication when using a powered air-purifying respirator (PAPR) system to protect against severe acute respiratory syndrome coronavirus type 2 (SARS-CoV-2) transmission and to assess the benefit of a headset. Methods: Acoustic properties of the PAPR system were measured using a head and torso simulator. Audiological tests (tone audiometry, Freiburg speech test, Oldenburg sentence test (OLSA)) were performed in normal-hearing subjects (n = 10) to assess hearing with PAPR. The audiological test setup also included simulation of conditions in which the target speaker used either a PAPR, a filtering face piece (FFP) 3 respirator, or a surgical face mask. Results: Audiological measurements revealed that sound insulation by the PAPR headtop and noise, generated by the blower-assisted respiratory protection system, resulted in significantly deteriorated hearing thresholds (4.0 ± 7.2 dB hearing level (HL) vs. 49.2 ± 11.0
Background and objective: The severe acute respiratory syndrome coronavirus type 2 (SARS-CoV-2) pandemic has fundamentally changed the education of medical students. The need for contact restrictions and the resulting demand for distance learning meant that digital teaching formats had to be implemented within a short time. The aim of this work was to evaluate the student evaluation results for virtual teaching in otorhinolaryngology during the SARS-CoV-2 pandemic and to compare them with the evaluation results previously obtained under classroom conditions.
Materials and methods: We examined the evaluation results for the block practical courses in the winter semester 2020/21 and the summer semester 2021, which were conducted in a virtual format with a short in-person phase, as well as those of the practical courses conducted entirely in the conventional classroom format from the summer semester 2018 to the winter semester 2019/20. The anonymous student survey covered various aspects of the course, such as organization, didactics, and learning atmosphere.
Results: Of 16 surveyed categories, 14 (87.5%) showed significantly better evaluation results for the virtual practical courses compared with the courses previously held in the classroom format. This very positive assessment of the digital teaching offer showed no significant change over the course of two semesters of the pandemic.
Conclusion: The present data demonstrate the high acceptance of digital teaching in otorhinolaryngology among students. Even though indispensable components of medical training, such as bedside teaching and the acquisition of practical clinical skills, can still only be realized in an in-person format, the results suggest that digital elements could play a role in medical education even after the SARS-CoV-2 pandemic.
Patients with neuroendocrine tumors (NET) often go through a long phase between the onset of symptoms and initial diagnosis. We assessed time to diagnosis and the pre-clinical pathway in patients with gastroenteropancreatic NET (GEP-NET) with regard to metastases and symptoms, in a retrospective analysis of patients with GEP-NET at a tertiary referral center from 1984 to 2019. Inclusion criteria were age ≥18 years and a diagnosis of GEP-NET; statistical analysis used non-parametric methods. Four hundred eighty-six patients with 488 tumors were identified; the median age at first diagnosis (478/486, 8 unknown) was 59 years; 52.9% were male. Pancreatic NET accounted for 143/488 tumors (29.3%) and small intestinal NET for 145/488 tumors (29.7%). 128/303 patients (42.2%) showed NET-specific symptoms and 122/486 patients (25%) other tumor-specific symptoms. 222/279 patients had distant metastases at initial diagnosis (187/222 liver metastases). 154/488 (31.6%) of GEP-NET were incidental findings. The median time from tumor manifestation (e.g., symptoms related to NET) to initial diagnosis across all entities was 19.5 (95% CI: 12–28) days, with no significant difference between patients with and without distant metastases (median 73 vs 105 days, P = .42). A large proportion of GEP-NET are incidental findings and only about half of all patients are symptomatic at the time of diagnosis. We did not find a significant influence of the presence of metastases on time to diagnosis, which shows large variability with a median of <30 days.
Background/aims: Hepatocellular carcinoma (HCC) is a leading indication for liver transplantation (LT) worldwide. Early identification of patients at risk for HCC recurrence is of paramount importance since early treatment of recurrent HCC after LT may be associated with increased survival. We evaluated incidence of and predictors for HCC recurrence, with a focus on the course of AFP levels.
Methods: We performed a retrospective, single-center study of 99 HCC patients who underwent LT between January 28th, 1997 and May 11th, 2016. A multi-stage proportional hazards model with three stages was used to evaluate potential predictive markers, both by univariate and multivariable analysis, for influences on 1) recurrence after transplantation, 2) mortality without HCC recurrence, and 3) mortality after recurrence.
Results: 19/99 HCC patients showed recurrence after LT. Waiting time was not associated with overall HCC recurrence (HR = 1, p = 0.979). Similarly, waiting time did not affect mortality in LT recipients both with (HR = 0.97, p = 0.282) or without (HR = 0.99, p = 0.685) HCC recurrence. Log10-transformed AFP values at the time of LT (HR 1.75, p = 0.023) as well as after LT (HR 2.07, p = 0.037) were significantly associated with recurrence. Median survival in patients with a ratio (AFP at recurrence divided by AFP 3 months before recurrence) of 0.5 was greater than 70 months, as compared to a median of only 8 months in patients with a ratio of 5.
Conclusion: A rise in AFP levels rather than an absolute threshold could help to identify patients at short-term risk for HCC recurrence post LT, which may allow intensification of the surveillance strategy on an individualized basis.
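The group-wise median survival figures above come from Kaplan-Meier estimation: the median is the first time at which the estimated survival function drops to 0.5 or below. A minimal sketch with invented post-recurrence follow-up times (not the study data); tied event times are handled sequentially here, which yields the same product as the usual grouped form:

```python
import numpy as np

def km_median(times, events):
    """Kaplan-Meier median survival time.

    times: follow-up in months; events: 1 = death observed, 0 = censored.
    Returns the first time where S(t) <= 0.5, or None if never reached.
    """
    order = np.argsort(times)
    t, e = np.asarray(times)[order], np.asarray(events)[order]
    n = len(t)
    s = 1.0
    for i in range(n):
        at_risk = n - i           # subjects still under observation
        if e[i]:                  # survival only drops at observed events
            s *= (at_risk - 1) / at_risk
            if s <= 0.5:
                return t[i]
    return None

# Hypothetical follow-up (months) for a fast-AFP-rise group; last subject censored.
fast_rise = km_median([2, 5, 8, 8, 12, 30], [1, 1, 1, 1, 1, 0])
```

With heavy censoring the curve may never reach 0.5, which is why studies report medians like "greater than 70 months" rather than a point estimate.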
Penile squamous cell carcinomas are rare tumor entities throughout Europe. Early lymph node spread calls for aggressive therapeutic approaches in advanced tumor stages. Therefore, understanding tumor biology and its microenvironment and their correlation with known survival data is of substantial interest in order to establish treatment strategies adapted to the individual patient. Fifty-five therapy-naïve squamous cell carcinomas (patient age range 41–85 years) with known clinicopathological data were investigated with the use of tissue microarrays (TMA) regarding tumor-associated immune cell infiltrate density (ICID). Slides were stained with antibodies against CD3, CD8 and CD20. Image analysis software was applied for evaluation. Data were correlated with clinicopathological characteristics and overall survival. There was a significant increase of ICID in squamous cell carcinomas of the penis in relation to tumor-adjacent physiological tissue. Higher CD3-positive ICID was significantly associated with lower tumor stage in our cohort. The ICID was not associated with overall survival. Our data sharpen the view on the tumor-associated immune cell infiltrate in penile squamous cell carcinomas with an unbiased digital and automated cell count. Further investigations on the immune cell infiltrate and its prognostic and possible therapeutic impact are needed.
Objectives: Four-dimensional ultrasound (4D-US) enables imaging of the aortic segment and simultaneous determination of wall expansion. The method has high spatial and temporal resolution, but its in vivo reliability for low measurement values has so far been unknown. The present study determines the intraobserver repeatability and interobserver reproducibility of 4D-US in the atherosclerotic and non-atherosclerotic infrarenal aorta. Methods: In all, 22 patients with a non-aneurysmal aorta were examined by an experienced examiner and a medical student. After registration of 4D images, both examiners marked the aortic wall manually before the commercially implemented speckle-tracking algorithm was applied. The cyclic changes of the aortic diameter and the circumferential strain were determined with the help of custom-made software. The reliability of 4D-US was tested using the intraclass correlation coefficient (ICC). Results: The 4D-US measurements showed very good reliability for the maximum aortic diameter and the circumferential strain for all patients and for the non-atherosclerotic aortae (ICC > 0.7), but low reliability for circumferential strain in calcified aortae (ICC = 0.29). The observer- and masking-related variances for both maximum diameter and circumferential strain were close to zero. Conclusions: Despite the low measured values, the high spatial and temporal resolution of 4D-US enables a reliable evaluation of cyclic diameter changes and circumferential strain in non-aneurysmal aortae independent of observer experience, but with some limitations for calcified aortae. 4D-US opens up a new perspective with regard to noninvasive, in vivo assessment of the kinematic properties of the vessel wall in the abdominal aorta.
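One common form of the ICC for two raters measuring the same subjects is the two-way random-effects, single-measure ICC(2,1), computed from ANOVA mean squares of a subjects-by-raters table; the abstract does not state which ICC form was used, so this is a sketch of one plausible choice, with invented diameters:

```python
import numpy as np

def icc_2_1(m):
    """Single-measure, two-way random-effects ICC(2,1), absolute agreement.

    m is an (n subjects x k raters) array; the coefficient is built from
    the between-subject, between-rater, and residual mean squares.
    """
    n, k = m.shape
    grand = m.mean()
    ss_rows = k * ((m.mean(axis=1) - grand) ** 2).sum()  # between subjects
    ss_cols = n * ((m.mean(axis=0) - grand) ** 2).sum()  # between raters
    ss_tot = ((m - grand) ** 2).sum()
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = (ss_tot - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical maximum aortic diameters (mm): four patients, two observers;
# observer 2 reads consistently 1 mm higher (a pure systematic offset).
diam = np.array([[20.0, 21.0],
                 [25.0, 26.0],
                 [30.0, 31.0],
                 [35.0, 36.0]])
icc = icc_2_1(diam)
```

Because ICC(2,1) penalizes systematic rater offsets through the MSC term, even this perfectly consistent +1 mm bias pulls the coefficient slightly below 1.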
In the application of range of motion (ROM) tests there is little agreement on the number of repetitions to be measured and on preceding warm-up protocols. In stretch training, a plateau in ROM gains can be seen after four to five repetitions. With an increasing number of repetitions, the gain in ROM is reduced. This study examines the question of whether such an effect occurs in common ROM tests. Twenty-two healthy sport students (10 m/12 f) with an average age of 25.3 ± 1.94 years (average height 174.1 ± 9.8 cm; weight 66.6 ± 11.3 kg; BMI 21.9 ± 2.0 kg/m²) volunteered for this study. Each subject performed five ROM tests in a randomized order, measured either via a tape measure or a digital inclinometer: the tape measure was used to evaluate the Fingertip-to-Floor test (FtF) and the Lateral Inclination test (LI); the Retroflexion of the trunk modified after Janda (RF), the Thomas test (TT) and a Shoulder test modified after Janda (ST) were evaluated with a digital inclinometer. In order to show general acute effects within 20 repetitions, we performed ANOVA or the Friedman test with multiple comparisons. A non-linear regression was then performed to identify plateau formation. The significance level was set at 5%. In seven out of eight ROM tests (five tests in total, with three tests measured on both left and right sides), significant flexibility gains were observed (FtF: p < 0.001; LI-left/right: p < 0.001/0.001; RF: p = 0.009; ST-left/right: p < 0.001/p = 0.003; TT-left: p < 0.001). A non-linear regression with random effects was successfully applied to FtF, RF, LI-left/right, ST-left and TT-left and thus indicates a gradual decline in the amount of gained ROM. An acute effect was observed in most ROM tests, which is characterized by a gradual decline of ROM gain. For those tests, we can state that the acute effect described in the stretching literature also applies to the performance of typical ROM tests.
Since a non-linear behavior was shown, it is the practitioner's decision to weigh measurement accuracy against expenditure. Researchers and practitioners should consider this when applying ROM assessments to healthy young adults.
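The "gradual decline of ROM gain" can be made concrete with a toy plateau model: if each repetition recovers a fixed fraction of the remaining distance to the plateau, the per-repetition gains decay geometrically and the log-gains are linear in repetition number, so plain least squares recovers the decay rate. All numbers below (plateau 50, initial deficit 10, rate 0.7) are invented; this illustrates the shape of the effect, not the study's random-effects regression.

```python
import numpy as np

# Synthetic ROM series following a plateau model: rom(n) = 50 - 10 * 0.7**n,
# i.e. each repetition closes 30% of the remaining distance to the plateau.
reps = np.arange(20)
rom = 50 - 10 * 0.7 ** reps

# Per-repetition gains then decay geometrically: gain(n) = 3 * 0.7**n.
gains = np.diff(rom)

# Log-gains are linear in n; the fitted slope is log(decay rate).
slope, intercept = np.polyfit(reps[:-1], np.log(gains), 1)
decay = np.exp(slope)
```

Under this model most of the attainable ROM is reached within the first few repetitions, which matches the four-to-five-repetition plateau reported in the stretching literature.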