Background: The adequate allocation of inpatient care resources requires assumptions about the need for health care and how this need will be met. However, in current practice, these assumptions are often based on outdated methods (e.g. Hill-Burton Formula). This study evaluated floating catchment area (FCA) methods, which have been applied as measures of spatial accessibility, focusing on their ability to predict the need for health care in the inpatient sector in Germany.
Methods: We tested three FCA methods (enhanced (E2SFCA), modified (M2SFCA) and integrated (iFCA)) for their accuracy in predicting hospital visits for six medical diagnoses (atrial flutter/fibrillation, heart failure, femoral fracture, gonarthrosis, stroke, and epilepsy) at the national level in Germany. We further used the closest provider approach for benchmark purposes. The predicted visits were compared with the actual visits for all six diagnoses using a correlation analysis and a maximum error from the actual visits of ± 5%, ± 10% and ± 15%.
Results: The analysis of 229 million distances between hospitals and population locations revealed a high and significant correlation of predicted with actual visits for all three FCA methods across all six diagnoses, up to ρ = 0.79 (p < 0.001). Overall, all FCA methods showed a substantially higher correlation with actual hospital visits than the closest provider approach (up to ρ = 0.51; p < 0.001). Allowing a 5% error of the absolute values, the analysis revealed up to 13.4% correctly predicted hospital visits using the FCA methods (15% error: up to 32.5% correctly predicted hospital visits). Finally, the potential of the FCA methods could be revealed by using the actual hospital visits as the measure of hospital attractiveness, which returned very strong correlations with the actual hospital visits, up to ρ = 0.99 (p < 0.001).
Conclusion: We were able to demonstrate the potential of FCA measures for predicting hospital visits in non-emergency settings, and their superiority over commonly used methods (i.e. closest provider). However, hospital beds were inadequate as the measure of hospital attractiveness, resulting in low accuracy of predicted hospital visits. More reliable measures must be integrated within the proposed methods. Still, this study strengthens the case for FCA methods in health care planning beyond their original application in measuring spatial accessibility.
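The two-step logic shared by the FCA variants evaluated above can be sketched in Python. This is a minimal, illustrative implementation of the enhanced method (E2SFCA), assuming a Gaussian distance-decay weight; the hospital capacities, populations, distances and catchment radius `d0` below are all hypothetical, not the study's data:

```python
import math

def gaussian_weight(d, d0):
    """Gaussian distance-decay weight; zero beyond the catchment radius d0."""
    if d > d0:
        return 0.0
    return math.exp(-0.5 * (d / d0) ** 2)

def e2sfca(demand, supply, dist, d0):
    """Enhanced two-step floating catchment area accessibility.

    demand: {location: population}, supply: {hospital: capacity, e.g. beds},
    dist: {(location, hospital): travel distance}, d0: catchment radius.
    Returns {location: accessibility score}.
    """
    # Step 1: distance-weighted provider-to-population ratio per hospital
    ratio = {}
    for j, s in supply.items():
        weighted_pop = sum(p * gaussian_weight(dist[(i, j)], d0)
                           for i, p in demand.items())
        ratio[j] = s / weighted_pop if weighted_pop > 0 else 0.0
    # Step 2: sum the weighted ratios of all hospitals reachable from i
    return {i: sum(ratio[j] * gaussian_weight(dist[(i, j)], d0)
                   for j in supply)
            for i in demand}

# Toy example: one population location 10 km from one 200-bed hospital
access = e2sfca({"A": 10000}, {"H": 200}, {("A", "H"): 10.0}, d0=30.0)
```

The modified (M2SFCA) and integrated (iFCA) variants differ mainly in how the decay weights enter the two steps; the skeleton above is the common core.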
Background: The aim of this study was to collect standard reference values for body weight distribution and maximum pressure distribution in healthy adults aged 18–65 years and to investigate the influence of constitutional parameters on them.
Methods: A total of 416 healthy subjects (208 male / 208 female) aged between 18 and 65 years (Ø 38.3 ± 14.1 years) participated in this study, conducted 2015–2019 in Heidelberg. The age-specific evaluation is based on 4 age groups (G1, 18–30 years; G2, 31–40 years; G3, 41–50 years; G4, 51–65 years). An FDM-S pressure measuring plate (Zebris, Isny, Germany) was used to collect the body weight distribution and maximum pressure distribution of the left and right foot and of the left and right forefoot/rearfoot, respectively.
Results: Body weight distribution of the left (50.07%) and right (50.12%) foot was balanced. There was a higher load on the rearfoot (left 54.14%; right 55.09%) than on the forefoot (left 45.49%; right 44.26%). The pressure in the rearfoot was higher than in the forefoot (rearfoot left 9.60 N/cm2, rearfoot right 9.51 N/cm2; forefoot left 8.23 N/cm2, forefoot right 8.59 N/cm2). With increasing age, the load on the left foot shifted from the rearfoot to the forefoot, as did the maximum pressure (p ≤ 0.02 and 0.03; weak effect size). With increasing BMI, the body weight shifted to the left and right rearfoot (p ≤ 0.001, weak effect size). As BMI increased, so did the maximum pressure in all areas (p ≤ 0.001 and 0.03, weak to moderate effect size). There were significant differences in weight and maximum pressure distribution in the forefoot and rearfoot across the age groups, especially between younger (18–40 years) and older (41–65 years) subjects.
Discussion: Healthy individuals aged 18 to 65 years were found to have a balanced left-to-right weight distribution, with an approximately 20% greater load on the rearfoot. Age and BMI were found to be influencing factors on the weight and maximum pressure distribution, especially between younger and older subjects. The collected standard reference values allow comparisons with other studies and can serve as a guideline in clinical practice and scientific studies.
Selective sympathetic and parasympathetic pathways that act on target organs represent the terminal actors in the neurobiology of homeostasis and often become compromised during a range of neurodegenerative and traumatic disorders. Here, we delineate several neurotransmitter and neuromodulator phenotypes found in diverse parasympathetic and sympathetic ganglia in humans and rodent species. The comparative approach reveals evolutionarily conserved and non-conserved phenotypic marker constellations. A developmental analysis examining the acquisition of selected neurotransmitter properties has provided a detailed, but still incomplete, understanding of the origins of a set of noradrenergic and cholinergic sympathetic neuron populations, found in the cervical and trunk region. A corresponding analysis examining cholinergic and nitrergic parasympathetic neurons in the head, and a range of pelvic neuron populations, with noradrenergic, cholinergic, nitrergic, and mixed transmitter phenotypes, remains open. Of particular interest are the molecular mechanisms and nuclear processes that are responsible for the correlated expression of the various genes required to achieve the noradrenergic phenotype, the segregation of cholinergic locus gene expression, and the regulation of genes that are necessary to generate a nitrergic phenotype. Unraveling the neuron population-specific expression of adhesion molecules, which are involved in axonal outgrowth, pathway selection, and synaptic organization, will advance the study of target-selective autonomic pathway generation.
Replacement of a stenotic aortic valve immediately reduces the ventricular-to-aortic gradient and is expected to improve diastolic and systolic left ventricular function over the long term. However, the hemodynamic changes immediately after valve implantation are so far poorly understood. In this pilot study, we performed an invasive pressure-volume loop analysis to describe the early hemodynamic changes after transcatheter aortic valve implantation (TAVI) with self-expandable prostheses. Invasive left ventricular pressure-volume loop analysis was performed in 8 patients with aortic stenosis (mean age 81.3 years) prior to and immediately after transfemoral TAVI with a self-expandable valve system (St. Jude Medical Portico Valve). Parameters for global hemodynamics, afterload, contractility and the interaction of the cardiovascular system were analyzed. Left ventricular ejection fraction (53.9% vs. 44.8%, p = 0.018), preload-recruitable stroke work (68.5 vs. 44.8 mmHg, p = 0.012) and end-systolic elastance (3.55 vs. 2.17, p = 0.036) — the latter two both markers of myocardial contractility — declined significantly compared to baseline. As a sign of impaired diastolic function, TAU, a preload-independent measure of isovolumic relaxation (37.3 vs. 41.8 ms, p = 0.018), and end-diastolic pressure (13.1 vs. 16.4 mmHg, p = 0.015) rose after valve implantation. Conversely, a smaller ratio of end-systolic to arterial elastance (ventricular-arterial coupling) indicated an improvement of global cardiovascular energy efficiency (1.40 vs. 0.97, p = 0.036). Arterial elastance correlated strongly with the number of conducted rapid ventricular pacings (Pearson correlation coefficient, r = 0.772, p = 0.025). Invasive left ventricular pressure-volume loop analysis revealed impaired systolic and diastolic function in the early phase after TAVI with a self-expandable valve for the treatment of severe aortic stenosis.
Conversely, we found indications of an early improvement of global cardiovascular energy efficiency.
The nuclear factor kappa B (NFκB) signaling pathway plays an important role in liver homeostasis and cancer development. Tax1-binding protein 1 (Tax1BP1) is a regulator of the NFκB signaling pathway, but its role in the liver and in hepatocellular carcinoma (HCC) is presently unknown. Here we investigated the role of Tax1BP1 in liver cells and in murine models of HCC and liver fibrosis. We applied the diethylnitrosamine (DEN) model of experimental hepatocarcinogenesis in Tax1BP1+/+ and Tax1BP1−/− mice. The amount and subsets of non-parenchymal liver cells in Tax1BP1+/+ and Tax1BP1−/− mice were determined, and activation of NFκB and stress-induced signaling pathways was assessed. Differential expression of mRNA and miRNA was determined. Tax1BP1−/− mice showed increased numbers of inflammatory cells in the liver. Furthermore, sustained activation of the NFκB signaling pathway was found in hepatocytes, as well as increased transcription of proinflammatory cytokines in Kupffer cells isolated from Tax1BP1−/− mice. Several differentially expressed mRNAs and miRNAs were found in livers of Tax1BP1−/− mice, which are regulators of inflammation or are involved in cancer development or progression. Furthermore, Tax1BP1−/− mice developed more HCCs than their Tax1BP1+/+ littermates. We conclude that Tax1BP1 protects from liver cancer development by limiting proinflammatory signaling.
Background: To detect deviations from normal postural control, standard values can be helpful for comparison purposes. Since postural control is influenced by gender and age, the aim of the present study was to collect standard values for women between 31 and 40 years of age.
Methods: For the study, 106 female, subjectively healthy, German subjects aged between 31 and 40 years (35 ± 2.98 years) were measured using a pressure measuring platform.
Results: Their average BMI was 21.60 ± 4.65 kg/m2. The load distribution between left and right foot was almost evenly balanced, with a median 51.46% load on the left [tolerance interval (TR) 37.02%/65.90%; confidence interval (CI) 50.06/52.85%] and 48.54% [TR 43.10/62.97%; CI 47.14/49.93%] on the right foot. The median forefoot load was 33.84% [TR 20.68/54.73%; CI 31.67/37.33%] and the rearfoot load was measured at 66.16% [TR 45.27/79.33%; CI 62.67/68.33%]. The median body sway was 12 mm in the sagittal plane [TR 5.45/23.44 mm; CI 11.00/14.00 mm] and 8.17 mm in the frontal plane [TR 3.33/19.08 mm; CI 7.67/9.33 mm]. The median ellipse area was 0.72 cm2 [TR 0.15/3.69 cm2; CI 0.54/0.89 cm2]. The ellipse width had a median of 0.66 cm [TR 0.30/1.77 cm; CI 0.61/0.78 cm] and the height a median of 0.33 cm [TR 0.13/0.71 cm; CI 0.30/0.37 cm]. The ellipse angle (sway, left forefoot to right rearfoot) had a mean of − 19.34° [TR − 59.21/− 0.44°; CI − 22.52/− 16.16°] and the ellipse angle of sway from right forefoot to left rearfoot had a mean of 12.75° [TR 0.09/59.09°; CI 9.00/16.33°].
Conclusion: The right-to-left ratio is balanced. The forefoot-to-rearfoot ratio is approximately 1:2. Also, the body sway, at 12 and 8 mm, can be classified as normal. The direction of fluctuation is either approx. 19° from the left forefoot to the right rearfoot or approx. 13° in the opposite direction. Body weight, height, and BMI were comparable to the German average of women in a similar age group, so the measured standard values are representative and might serve as a baseline for the normal function of the balance system in order to support the diagnosis of possible dysfunctions in postural control.
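For readers reproducing such reference ranges, a percentile-based interval is one common nonparametric choice. The sketch below is illustrative only, since the abstract does not state how the tolerance intervals (TR) were actually computed:

```python
def percentile_tolerance_interval(data, lower=2.5, upper=97.5):
    """Nonparametric interval from sample percentiles with linear
    interpolation. An approximation for illustration; the study's exact
    method for its tolerance intervals is not specified in the abstract."""
    xs = sorted(data)

    def pct(p):
        # fractional rank into the sorted sample, then interpolate
        k = (len(xs) - 1) * p / 100.0
        f = int(k)
        c = min(f + 1, len(xs) - 1)
        return xs[f] + (xs[c] - xs[f]) * (k - f)

    return pct(lower), pct(upper)
```

Applied to a sample of, say, left-foot load percentages, this returns the central 95% range; narrower confidence intervals for the median would typically be obtained separately, e.g. via bootstrapping.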
Background: Patients with rare diseases (RDs) are often diagnosed too late or not at all. Clinical decision support systems (CDSSs) could support the diagnosis of RDs. The MIRACUM (Medical Informatics in Research and Care in University Medicine) consortium, one of four funded consortia in the German Medical Informatics Initiative, will develop a CDSS for RDs based on distributed clinical data from ten university hospitals. This qualitative study aims to investigate (1) the relevant organizational conditions for the operation of a CDSS for RDs when diagnosing patients (e.g. the diagnosis workflow), (2) which data are necessary for decision support, and (3) the appropriate user group for such a CDSS.
Methods: Interviews were carried out with RD experts. Participants were recruited from staff physicians at the Rare Disease Centers (RDCs) at the MIRACUM locations, which offer diagnosis and treatment of RDs.
An interview guide was developed with a category-guided deductive approach. The interviews were recorded on an audio device and then transcribed into written form. We continued data collection until all interviews were completed. Afterwards, data analysis was performed using Mayring’s qualitative content analysis approach.
Results: A total of seven experts were included in the study. The results show that medical center guides and physicians from RDC B-centers (with a focus on different RDs) are involved in the diagnostic process. Furthermore, interdisciplinary case discussions between physicians are conducted.
The experts explained that RDs exist which cannot be fully differentiated, but rather described only by their overall symptoms or findings: diagnosis is dependent on the disease or disease group. At the end of the diagnostic process, most centers prepare a summary of the patient case. Furthermore, the experts considered both physicians and experts from the B-centers to be potential users of a CDSS. The experts also have different experiences with CDSS for RDs.
Conclusions: This qualitative study is a first step towards establishing the requirements for the development of a CDSS for RDs. Further research is necessary to create solutions by also including the experts on RDs.
Of the 16 non-structural proteins (Nsps) encoded by SARS-CoV-2, Nsp3 is the largest and plays important roles in the viral life cycle. Being a large, multidomain, transmembrane protein, Nsp3 has been the most challenging Nsp to characterize. Encoded within Nsp3 is the papain-like protease PLpro domain, which cleaves not only the viral polyprotein but also polyubiquitin and the ubiquitin-like modifier ISG15 from host cell proteins. Here we compare the interactors of PLpro and Nsp3 and find a largely overlapping interactome. Intriguingly, we find that near-full-length Nsp3 is a more active protease than the minimal catalytic domain of PLpro. Using a MALDI-TOF-based assay, we screen 1971 approved clinical compounds and identify five compounds that inhibit PLpro with IC50s in the low micromolar range but show cross-reactivity with other human deubiquitinases and no significant antiviral activity in cellular SARS-CoV-2 infection assays. We therefore looked for alternative methods to block PLpro activity and engineered competitive nanobodies that bind PLpro at the substrate-binding site with nanomolar affinity, thus inhibiting the enzyme. Our work highlights the importance of studying Nsp3 and provides tools and valuable insights to investigate Nsp3 biology during the viral infection cycle.
Previous studies reported on the safety and applicability of mesenchymal stem/stromal cells (MSCs) to ameliorate pulmonary inflammation in acute respiratory distress syndrome (ARDS). Thus, multiple clinical trials assessing the potential of MSCs for COVID-19 treatment are underway. Yet, as SARS-inducing coronaviruses infect stem/progenitor cells, it is unclear whether MSCs could be infected by SARS-CoV-2 upon transplantation to COVID-19 patients. We found that MSCs from bone marrow, amniotic fluid, and adipose tissue carry angiotensin-converting enzyme 2 and transmembrane protease serine subtype 2 at low levels on the cell surface under steady-state and inflammatory conditions. We did not observe SARS-CoV-2 infection or replication in MSCs at steady state, under inflammatory conditions, or in direct contact with SARS-CoV-2-infected Caco-2 cells. Further, indoleamine 2,3-dioxygenase 1 production in MSCs was not impaired in the presence of SARS-CoV-2. We show that MSCs are resistant to SARS-CoV-2 infection and retain their immunomodulation potential, supporting their potential applicability for COVID-19 treatment.
Background: Rare diseases (RDs), which are defined as diseases affecting no more than 5 out of 10,000 people, are often severe, chronic and life-threatening. A main problem is the delay in diagnosing RDs. Clinical decision support systems (CDSSs) for RDs are software systems that support clinicians in the diagnosis of patients with RDs. Due to their clinical importance, we conducted a scoping review to determine which CDSSs are available to support the diagnosis of RD patients, whether the CDSSs are available for use by clinicians, and which functionalities and data are used to provide decision support.
Methods: We searched PubMed for CDSSs in RDs published between December 16, 2008 and December 16, 2018. Only English articles, original peer reviewed journals and conference papers describing a clinical prototype or a routine use of CDSSs were included. For data charting, we used the data items “Objective and background of the publication/project”, “System or project name”, “Functionality”, “Type of clinical data”, “Rare Diseases covered”, “Development status”, “System availability”, “Data entry and integration”, “Last software update” and “Clinical usage”.
Results: The search identified 636 articles. After title and abstract screening, as well as assessing the eligibility criteria for full-text screening, 22 articles describing 19 different CDSSs were identified. Three types of CDSSs were classified: "analysis or comparison of genetic and phenotypic data", "machine learning" and "information retrieval". Twelve of nineteen CDSSs use phenotypic and genetic data, followed by clinical data, literature databases and patient questionnaires. Fourteen of nineteen CDSSs are fully developed systems and therefore publicly available. Data can be entered or uploaded manually in six CDSSs, whereas for four CDSSs no information on data integration was available. Only seven CDSSs allow further ways of data integration. Thirteen CDSSs do not provide information about clinical usage.
Conclusions: Different CDSSs for various purposes are available, yet clinicians have to determine which is best for their patient. To allow more precise usage, future research has to focus on data integration, clinical usage and the updating of clinical knowledge in CDSSs for RDs. It remains to be seen which of the CDSSs will be used and maintained in the future.
Background: Combined inhibition of phosphatidylinositol 3-kinase (PI3K) and the mammalian target of rapamycin (mTOR) complexes may be an efficient treatment for acute leukemia. The primary objective of this phase I single center open label study was to determine the maximum tolerated dose (MTD) and recommended phase II dose (RP2D) of the dual pan-class I PI3K and mTOR inhibitor BEZ235 in patients with advanced leukemia.
Methods: Patients > 18 years of age with relapsed or refractory leukemia were treated with BEZ235 (orally at 300–400 mg BID (cohort − 1/1)) to assess safety, tolerability, preliminary efficacy and pharmacokinetics (PK). Adverse event data and serious adverse events were analyzed, and haematological and clinical biochemistry toxicities were assessed from laboratory test parameters. Response was assessed for the first time at the end of cycle 1 (day 29) and after every subsequent cycle. Pharmacokinetic and pharmacodynamic analyses of BEZ235 were also included (BEZ235 plasma levels; phosphorylation of AKT, S6 and 4EBP1). Statistically, this trial was a multiple ascending dose study following a variant of the 3 + 3 rule ("Rolling Six"), in which a minimum of 6 and a maximum of 12 patients were recruited for the dose escalation and another 5 were planned for the expansion phase.
Results: Twenty-four patients with ALL (n = 11), AML (n = 12) or CML-BP (n = 1) were enrolled. All patients had failed one (n = 5) or more lines of therapy (n = 5), and 14 patients were in refractory/refractory relapse. No formal MTD was defined; stomatitis and gastrointestinal toxicity at the 400 mg BID dose was considered incompatible with prolonged treatment. The RP2D of BEZ235 was defined as 300 mg BID. Four of 24 patients showed clinical benefit. Twenty-two of 24 patients discontinued because of progression (median time to progression 27 days, range 4–112 days). There was no association between PK parameters and efficacy or tolerability.
Conclusions: Combined inhibition of PI3K and mTOR inhibits a clinically meaningful driver pathway in a small subset of patients with ALL, with no benefit in patients with AML.
Trial registration: ClinicalTrials.gov, identifier NCT01756118; retrospectively registered 19th December 2012, https://clinicaltrials.gov/ct2/show/NCT01756118.
Background: The feedback given to students plays an important role in how efficiently they learn practical skills. In the present study, diverse feedback modalities were investigated. Our hypothesis was that individualized and unsupervised video feedback can produce a learning experience similar to that of performing practical skills in an oral and maxillofacial surgery setting with conventional direct expert feedback (control group).
Methods: This prospective, randomized, controlled, and blinded study compared direct expert feedback (DEF), individualized video feedback (IVF) and unsupervised video feedback (UVF). The participants were fourth-year dental students from Goethe University Frankfurt. The students were assigned to one of the three feedback methods (n = 20 per group) using simple randomization. All participants watched an instruction video for an interdental ('Ernst') ligature and peripheral venous catheterization. Next, the students were video recorded performing the tasks by themselves (pre-test). Following this, every student received feedback using one of the above-mentioned feedback modalities. The participants then performed the same task again while being video recorded (post-test) to measure the acquired competence. Six weeks later, the students participated in an objective structured clinical examination (OSCE) to evaluate their long-term knowledge retention. All examiners were blinded regarding the students' instructional approach and their learning-group affiliation.
Results: For the interdental ligature, we found significant improvements in performance in each feedback modality group between the pre-test and post-test (p < 0.001). UVF had the strongest effect on performance time. The comparison between each group in the post-test showed no significant differences between the three groups.
Conclusion: This study showed that IVF and UVF can be considered an alternative or adjunct to conventional methods (i.e. DEF) when learning procedural skills in oral and maxillofacial surgery. However, DEF proved to be the most effective feedback method and is therefore preferable in teaching.
Background: With refinements in the diagnosis and therapy of gliomas, the importance of survival time as the sole outcome parameter has decreased, and patient-centered outcome parameters have gained interest. Pursuing a profession is an indispensable component of human happiness. The aim of this study was to analyze patients' professional outcomes, in addition to their neuro-oncological and functional evaluation, after surgery for gliomas in eloquent areas.
Methods: We assessed neuro-oncological and functional outcomes of patients with gliomas WHO grades II and III undergoing surgery between 2012 and 2018. All patients underwent routine follow-up and adjuvant treatment. Treatment and survival parameters were collected prospectively. Repercussions of the disease on the patients’ professional status, socio-economic situation, and neurocognitive function were evaluated retrospectively with questionnaires.
Results: We analyzed data of 58 patients with gliomas (WHO II: 9; III: 49). Median patient age was 35.8 years (range 21–63 years). Awake surgery techniques were applied in 32 patients (55.2%). Gross total and subtotal tumor resections were achieved in 33 (56.9%) and 17 (29.3%) patients, respectively, whereas in 8 patients (13.8%) resection had to remain partial. Most patients (n = 46; 79.3%) received adjuvant treatment. Median follow-up was 43.8 months (range 11–82 months). After treatment, 41 patients (70.7%) were able to resume a working life. Median time until returning to work was 8.0 months (range 0.2–22.0 months). Being younger than 40 at the time of surgery was associated with a higher probability of returning to work (p < .001). Multivariable regression analysis showed that patient age < 40 years as well as occupational group and self-reported fatigue were factors independently associated with the ability to return to work.
Conclusion: The ability to resume professional activities following brain tumor surgery is an important patient-oriented outcome parameter. We found that the majority of patients with gliomas were able to return to work following surgical and adjuvant treatment. Preservation of neurological function is of utmost relevance for individual patients' quality of life.
Background: Healthy volunteer registry donors have become the backbone of stem cell transplantation programs. While most registrants will never become actual donors, a small minority are called upon twice, most commonly for the same patient because of poor graft function. Anecdotal evidence provides no hard reasons to disallow second-time mobilized apheresis, but few centers have treated enough two-time donors for definitive conclusions. Moreover, for reasons unknown, the efficiency of G-CSF varies greatly between donations.
Methods: Comparison of outcomes of first vs. second donations can formally confirm G-CSF responsiveness as intrinsically, likely genetically, determined. In our database, we identified 60 donors (1.3%) who received two cycles of G-CSF 24 days to 4 years apart and systematically compared mobilization outcomes.
Results: First and second mobilization and collection proceeded without severe or unusual adverse effects. First-time mobilization efficiency was highly predictive of second-time mobilization. Neither mobilization efficiency nor time lag between donations affected the similarity of first- and second-time mobilization outcomes.
Conclusions: With the caveat that only donors with an unremarkable first donation were cleared for a second, our data indicate that a second donation is feasible, equally tolerable as a first donation, and efficient. Moreover, the data strongly support the notion of donor-intrinsic variables dictating mobilization response and argue against relevant damage to the stem cell compartment during mobilization with rhG-CSF.
Standard monitoring of heart rate, blood pressure and arterial oxygen saturation during endoscopy is recommended by current guidelines on procedural sedation. A number of studies indicated a reduction of hypoxic (arterial oxygenation < 90% for > 15 s) and severe hypoxic events (arterial oxygenation < 85%) by additional use of capnography. Therefore, the U.S. and European guidelines state that additional capnography monitoring can be considered in long or deep sedation. The Integrated Pulmonary Index® (IPI) is an algorithm-based monitoring parameter that combines oxygenation measured by pulse oximetry (arterial oxygenation, heart rate) and ventilation measured by capnography (respiratory rate, apnea > 10 s, partial pressure of end-tidal carbon dioxide [PetCO2]). The aim of this paper was to analyze the value of the IPI as a parameter to monitor the respiratory status of patients receiving propofol sedation during the PEG procedure. Patients presenting for PEG placement under sedation were randomized 1:1 into either the standard monitoring group (SM) or the capnography monitoring group including IPI (IM). Heart rate, blood pressure and arterial oxygen saturation were monitored in SM. In IM, additional monitoring was performed measuring PetCO2, respiratory rate and IPI. Capnography and IPI values were recorded for all patients but were only visible to the endoscopic team for the IM group. IPI values range between 1 and 10 (10 = normal; 8–9 = within normal range; 7 = close to normal range, requires attention; 5–6 = requires attention and may require intervention; 3–4 = requires intervention; 1–2 = requires immediate intervention). Results on capnography versus standard monitoring of the same study population were published previously. A total of 147 patients (74 in SM and 73 in IM) were included in the present study. Hypoxic events occurred in 62 patients (42%) and severe hypoxic events in 44 patients (29%). Baseline characteristics were equally distributed in both groups.
IPI = 1 and IPI < 7, as well as the parameters PetCO2 = 0 mmHg and apnea > 10 s, had a high sensitivity for hypoxic and severe hypoxic events, respectively (IPI = 1: 81%/81% [hypoxic/severe hypoxic event]; IPI < 7: 82%/88%; PetCO2: 69%/68%; apnea > 10 s: 84%/84%). All four parameters had a low specificity for both hypoxic and severe hypoxic events (IPI = 1: 13%/12%; IPI < 7: 7%/7%; PetCO2: 29%/27%; apnea > 10 s: 7%/7%). In multivariate analysis, only SM and PetCO2 = 0 mmHg were independent risk factors for hypoxia. The IPI (IPI = 1 and IPI < 7), as well as the individual parameters PetCO2 = 0 mmHg and apnea > 10 s, allows a fast and convenient assessment of the respiratory status in a morbid patient population. Sensitivity is good for most parameters, but specificity is poor. In conclusion, the IPI can be a useful metric to assess respiratory status during propofol sedation for PEG placement. However, the IPI was not superior to PetCO2 and apnea > 10 s.
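The sensitivity and specificity figures above follow from the standard 2×2 definitions. A minimal helper is sketched below; the counts in the comments are purely hypothetical for illustration, since the study's raw 2×2 tables are not given in the abstract:

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: flagged events / all actual events."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: correctly cleared cases / all non-events."""
    return tn / (tn + fp)

# Hypothetical illustration: if a marker flagged 81 of 100 hypoxic events
# (tp=81, fn=19) and cleared only 13 of 100 non-events (tn=13, fp=87),
# it would reproduce a sensitivity of 81% and a specificity of 13%.
```

A marker like apnea > 10 s, which fires frequently, naturally trades high sensitivity for low specificity in exactly this way.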
Aims: Stroke is a major complication after transcatheter aortic valve implantation (TAVI). Although multifactorial, it remains unknown whether the valve deployment system itself has an impact on the incidence of early stroke. We performed a meta- and network analysis to investigate the 30-day stroke incidence of self-expandable (SEV) and balloon-expandable (BEV) valves after transfemoral TAVI.
Methods and results: Overall, 2723 articles were screened for direct comparisons of SEV and BEV performance after transfemoral TAVI, of which 9 were included (3086 patients). Random effects models were used for meta- and network meta-analysis based on a frequentist framework. Thirty-day incidence of stroke was 1.8% in SEV and 3.1% in BEV (risk ratio of 0.62, 95% confidence interval (CI) 0.49–0.80, p = 0.004). Treatment ranking based on network analysis (P-score) revealed CoreValve with the best performance for 30-day stroke incidence (75.2%), whereas SAPIEN had the worst (19.0%). However, network analysis showed no inferiority of SAPIEN compared with CoreValve (odds ratio 2.24, 95% CI 0.70–7.2).
Conclusion: Our analysis indicates higher 30-day stroke incidence after transfemoral TAVI with BEV compared to SEV. We could not find evidence for superiority of a specific valve system. More randomized controlled trials with head-to-head comparison of SEV and BEV are needed to address this open question.
Congenital diaphragmatic hernia (CDH) is a relatively common and life-threatening birth defect, characterized by incomplete formation of the diaphragm. Because CDH herniation occurs at the same time as preacinar airway branching, normal lung development becomes severely disrupted, resulting almost invariably in pulmonary hypoplasia. Despite various research efforts over the past decades, the pathogenesis of CDH and associated lung hypoplasia remains poorly understood. With the advent of molecular techniques, transgenic animal models of CDH have generated a large number of candidate genes, thus providing a novel basis for future research and treatment. This review article offers a comprehensive overview of genes and signaling pathways implicated in CDH etiology, whilst also discussing strengths and limitations of transgenic animal models in relation to the human condition.
In the application of range of motion (ROM) tests, there is little agreement on the number of repetitions to be measured and on preceding warm-up protocols. In stretch training, a plateau in ROM gains can be seen after four to five repetitions; with an increasing number of repetitions, the gain in ROM diminishes. This study examines whether such an effect also occurs in common ROM tests. Twenty-two healthy sport students (10 male/12 female) with an average age of 25.3 ± 1.94 years (average height 174.1 ± 9.8 cm; weight 66.6 ± 11.3 kg; BMI 21.9 ± 2.0 kg/m2) volunteered for this study. Each subject performed five ROM tests in randomized order, measured either with a tape measure or a digital inclinometer: the tape measure was used to evaluate the Fingertip-to-Floor test (FtF) and the Lateral Inclination test (LI); retroflexion of the trunk modified after Janda (RF), the Thomas test (TT) and a shoulder test modified after Janda (ST) were evaluated with a digital inclinometer. To detect general acute effects within 20 repetitions, we performed ANOVA/Friedman tests with multiple comparisons. A non-linear regression was then performed to identify plateau formation. The significance level was set at 5%. In seven out of eight ROM tests (five tests in total, with three tests measured on both left and right sides), significant flexibility gains were observed (FtF: p < 0.001; LI-left/right: p < 0.001/0.001; RF: p = 0.009; ST-left/right: p < 0.001/p = 0.003; TT-left: p < 0.001). A non-linear regression with random effects was successfully applied to FtF, RF, LI-left/right, ST-left and TT-left, indicating a gradual decline in the amount of ROM gained. An acute effect was observed in most ROM tests, characterized by a gradual decline of ROM gain. For those tests, we can state that the acute effect described in the stretching literature also applies to the performance of typical ROM tests.
Since non-linear behavior was shown, practitioners must weigh measurement accuracy against time expenditure. Researchers and practitioners should consider this when applying ROM assessments to healthy young adults.
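A gradual decline toward a plateau of this kind is often modelled as an exponential approach to an asymptote. A sketch under that assumption (the model form and all parameter values below are illustrative, not the authors' fitted model):

```python
import math

def plateau_model(rep, asymptote, span, rate):
    """Exponential approach to a plateau: predicted ROM after `rep` repetitions."""
    return asymptote - span * math.exp(-rate * rep)

# Hypothetical fit: ROM gain saturates at 12 cm, with rate 0.3 per repetition
gains = [plateau_model(r, 12.0, 12.0, 0.3) - plateau_model(r - 1, 12.0, 12.0, 0.3)
         for r in range(1, 21)]
# Under this model each repetition adds exp(-0.3) ≈ 74% of the previous
# repetition's gain, i.e. the per-repetition ROM gain declines geometrically.
assert all(later < earlier for earlier, later in zip(gains, gains[1:]))
```

This is why most of the measurable gain occurs within the first few repetitions, which is the practical trade-off noted above.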
Survivin is a drug target and the survivin suppressant YM155 a drug candidate for high-risk neuroblastoma. Findings from one YM155-adapted subline of the neuroblastoma cell line UKF-NB-3 had suggested that increased ABCB1 (mediates YM155 efflux) levels, decreased SLC35F2 (mediates YM155 uptake) levels, decreased survivin levels, and TP53 mutations indicate YM155 resistance. Here, the investigation of ten additional YM155-adapted UKF-NB-3 sublines only confirmed the roles of ABCB1 and SLC35F2. However, cellular ABCB1 and SLC35F2 levels did not indicate YM155 sensitivity in YM155-naïve cells, as indicated by drug response data derived from the Cancer Therapeutics Response Portal (CTRP) and the Genomics of Drug Sensitivity in Cancer (GDSC) databases. Moreover, the resistant sublines were characterised by a remarkable heterogeneity. Only seven sublines developed on-target resistance as indicated by resistance to RNAi-mediated survivin depletion. The sublines also varied in their response to other anti-cancer drugs. In conclusion, cancer cell populations of limited intrinsic heterogeneity can develop various resistance phenotypes in response to treatment. Therefore, individualised therapies will require monitoring of cancer cell evolution in response to treatment. Moreover, biomarkers can indicate resistance formation in the acquired resistance setting, even when they are not predictive in the intrinsic resistance setting.
Introduction: Recommendations for venous thromboembolism (VTE) and deep venous thrombosis (DVT) prophylaxis using graduated compression stockings (GCS) are historically based and have been critically examined in recent publications. Existing guidelines are inconclusive as to whether the general use of GCS should be recommended.
Patients/Methods: 24 273 inpatients (general surgery and orthopedic patients) undergoing surgery between 2006 and 2016 were included in a retrospective single-center analysis. From January 2006 to January 2011, perioperative GCS were employed in addition to drug prophylaxis; from February 2011 to March 2016, patients received drug prophylaxis alone. In accordance with German guidelines, all patients received venous thromboembolism prophylaxis with weight-adapted low-molecular-weight heparin (LMWH). Risk stratification (low risk, moderate risk, high risk) was based on the guideline of the American College of Chest Physicians. Data analysis was performed before and after propensity matching (PM). The primary endpoint was the incidence of symptomatic or fatal pulmonary embolism (PE); a secondary endpoint was the incidence of deep venous thrombosis (DVT).
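Propensity matching of this kind is commonly implemented as nearest-neighbour pairing on the estimated propensity score. A minimal 1:1 greedy sketch, where the function, the caliper value and the toy data are illustrative assumptions rather than the study's actual matching procedure:

```python
def greedy_match(treated, control, caliper=0.05):
    """1:1 nearest-neighbour matching on propensity scores, without replacement.

    treated, control: lists of (patient_id, propensity_score) tuples.
    Returns a list of (treated_id, control_id) pairs within the caliper."""
    pool = list(control)
    pairs = []
    for tid, score in sorted(treated, key=lambda t: t[1]):
        if not pool:
            break
        best = min(pool, key=lambda c: abs(c[1] - score))  # closest remaining control
        if abs(best[1] - score) <= caliper:
            pairs.append((tid, best[0]))
            pool.remove(best)  # without replacement: each control used once
    return pairs

# Toy example: two GCS patients matched to the closest no-GCS patients
gcs = [("t1", 0.30), ("t2", 0.70)]
no_gcs = [("c1", 0.31), ("c2", 0.50), ("c3", 0.69)]
print(greedy_match(gcs, no_gcs))  # → [('t1', 'c1'), ('t2', 'c3')]
```

Matching on the propensity score balances the two groups on the covariates entering the score, which is what allows the before/after-PM risk comparison reported in the results.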
Results: After risk stratification (low risk n = 16 483; moderate risk n = 4464; high risk n = 3326), a total of 24 273 patients were analyzed. Before PM, the relative risk for the occurrence of a PE or DVT was not increased by abstaining from GCS. After PM, two groups of 11 312 patients each, one with and one without GCS application, were formed. When comparing the two groups, the relative risk (RR) for the occurrence of a pulmonary embolism was: low risk 0.99 [95% CI 0.998–1.000]; moderate risk 0.999 [95% CI 0.95–1.003]; high risk 0.996 [95% CI 0.992–1.000] (p > 0.05). The incidence of PE in the LMWH-alone group overall was 0.1% (n = 16); in the LMWH + GCS group, it was 0.3% (n = 29). The RR after PM was 0.999 [95% CI 0.998–1.00].
Conclusion: In contrast to prior studies with only small numbers of patients, our analysis of a large cohort of patients at moderate and high risk of developing VTE supports the view that abstaining from GCS use does not increase the incidence of symptomatic or fatal PE or of symptomatic DVT.