University Publications
Neurogenic/neuropathic bowel dysfunction (NBD) is common in children affected by congenital and acquired neurological disease and negatively impacts quality of life. In the past, NBD received less attention than neurogenic bladder, generally being considered only in spina bifida (the most common cause of pediatric NBD). Many methods of conservative and medical management of NBD have been reported, including, relatively recently, transanal irrigation (TAI). Based on the literature and personal experience, an expert group (pediatric urologists/surgeons/gastroenterologists with specific experience in NBD) focused on NBD in children and adolescents. A statement document was created using a modified Delphi method. The range of causes of pediatric NBD is discussed in this paper, and the various therapeutic approaches are presented to improve clinical management. The population of children and adolescents with NBD is increasing, owing both to higher survival rates and to better diagnosis. While NBD is relatively predictable in producing constipation, fecal incontinence, or both, its effects on each patient depend on a wide range of underlying causes and accompanying comorbidities. For this reason, management of NBD should be tailored individually, with a combined multidisciplinary therapy appropriate for the status of the affected child and caregivers.
Postoperative Psychosen
(1916)
Background: Myocardial efficiency should remain stable under light-to-moderate stress conditions, but ischemia puts the myocardium at risk of impaired functionality. Additionally, measuring this efficiency typically requires invasive heart catheterization and exposure to ionizing radiation. In this work, we aimed to non-invasively assess myocardial power and the resulting efficiency during pharmacological stress testing and ischemia induction. Methods: In a cohort of n = 10 healthy Landrace pigs, dobutamine stress testing was performed, followed by verapamil-induced ischemia, alongside cardiac magnetic resonance (CMR) imaging. External myocardial power, internal myocardial power, and myocardial efficiency were assessed non-invasively using geometrical and functional parameters from CMR volumetric, blood flow, and pressure measurements. Results: External myocardial power increased significantly under dobutamine stress [2.3 (1.6–3.1) W/m2 vs. 1.3 (1.1–1.6) W/m2, p = 0.005] and decreased significantly under verapamil-induced ischemia [0.8 (0.5–0.9) W/m2, p = 0.005]. Internal myocardial power [baseline: 5.9 (4.6–8.5) W/m2] was affected by neither dobutamine [7.5 (6.9–9.0) W/m2, p = 0.241] nor verapamil [5.8 (4.7–8.8) W/m2, p = 0.878]. Myocardial efficiency did not change from baseline to dobutamine [21% (15–27) vs. 31% (20–44), p = 0.059] but decreased significantly during verapamil-induced ischemia [10% (8–13), p = 0.005]. Conclusion: In healthy Landrace pigs, dobutamine stress increased external myocardial power, whereas myocardial efficiency remained stable. In contrast, verapamil-induced ischemia substantially decreased external myocardial power and myocardial efficiency. Non-invasive CMR was able to quantify these efficiency losses and might be useful for future clinical studies evaluating the effects of therapeutic interventions on myocardial energetics.
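As an illustration of how such power and efficiency values can be derived from non-invasive measurements, the following Python sketch assumes the common simplification that external myocardial power is mean arterial pressure times cardiac output and that efficiency is the ratio of external to internal power. The variable names, unit conversions, and the internal-power input are illustrative assumptions and are not taken from the study's exact CMR-based formulas.

# Minimal sketch, assuming external power ~ MAP x cardiac output and
# efficiency = external power / internal power (illustrative only).
MMHG_TO_PA = 133.322

def external_power_w(map_mmhg: float, cardiac_output_l_min: float) -> float:
    """External myocardial power in watts from MAP (mmHg) and CO (L/min)."""
    flow_m3_s = cardiac_output_l_min * 1e-3 / 60.0
    return map_mmhg * MMHG_TO_PA * flow_m3_s

def efficiency(external_w: float, internal_w: float) -> float:
    """Myocardial efficiency as the ratio of external to internal power."""
    return external_w / internal_w

# Example values (hypothetical): MAP 90 mmHg, CO 5 L/min, internal power 12 W,
# body surface area 1.8 m^2 for indexing to W/m^2.
ext = external_power_w(90, 5.0)           # about 1.0 W
print(ext / 1.8, efficiency(ext, 12.0))   # indexed power (W/m^2), efficiency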
The effects of exercise interventions on non-specific chronic low back pain (CLBP) have been investigated in many studies, but the results are inconclusive regarding exercise types, efficiency, and sustainability. This may be because the influence of psychosocial factors on exercise-induced adaptation in CLBP has been neglected. Therefore, this study assessed psychosocial characteristics that moderate and mediate the effects of sensorimotor exercise on LBP. A single-blind, 3-arm, multicenter randomized controlled trial was conducted over 12 weeks. 662 volunteers were randomly assigned to one of three groups: sensorimotor exercise (SMT), sensorimotor and behavioral training (SMT-BT), and regular routines (CG). Primary outcomes (pain intensity and disability) and psychosocial characteristics were assessed at baseline (M1) and follow-up (3/6/12/24 weeks, M2-M5). Multiple regression models were used to analyze whether psychosocial characteristics are moderators of the relationship between exercise and pain, meaning that psychosocial factors and exercise interact. Causal mediation analyses were conducted to analyze whether psychosocial characteristics mediate the exercise effect on pain (see the sketch after this abstract). A total of 453 participants with intermittent pain (mean age = 39.5 ± 12.2 years, 62% female) completed the training. Depressive symptomatology (at M4, M5), vital exhaustion (at M4), and perceived social support (at M5) were significant moderators of the relationship between exercise and the reduction of pain intensity. Furthermore, depressive mood (at M4), social satisfaction (at M4), and anxiety (at M5, SMT) significantly moderated the exercise effect on pain disability. The amount of moderation was of clinical relevance. In contrast, no psychosocial variables mediated the exercise effects on pain. In conclusion, psychosocial variables can moderate the effect of sensorimotor exercise on CLBP, which may explain conflicting results in the past regarding the merit of exercise interventions in CLBP. The results further suggest that early identification of psychosocial risk factors by diagnostic tools may substantially support the planning of personalized exercise therapy.
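To make the moderation analysis concrete, a minimal Python sketch is given below: a moderator is tested as a treatment-by-moderator interaction term in a regression model on one follow-up time point. The file and column names are hypothetical and do not reproduce the study's actual model specification.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per participant and time point,
# with pain intensity, treatment group, a candidate psychosocial moderator,
# and baseline pain as a covariate.
df = pd.read_csv("clbp_followup.csv")
m4 = df[df["timepoint"] == "M4"]

# Moderation = a significant group-by-moderator interaction: the exercise
# effect on pain depends on the level of the psychosocial variable.
model = smf.ols(
    "pain_intensity ~ C(group) * depressive_symptoms + baseline_pain",
    data=m4,
).fit()
print(model.summary())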
Specific protocols define eligibility, conditioning, donor selection, graft composition, and prophylaxis of graft vs. host disease for children and young adults undergoing hematopoietic stem cell transplant (HSCT). However, international protocols rarely, if ever, detail supportive care, including pharmaceutical infection prophylaxis, physical protection with face masks and cohort isolation, or food restrictions. Supportive care suffers from a lack of scientific evidence, and the implementation of such practices in transplant centers imposes extensive restrictions on the child's and family's daily life after HSCT. Therefore, the Board of the Pediatric Diseases Working Party (PDWP) of the European Society for Blood and Marrow Transplantation (EBMT) has held a series of dedicated workshops since 2017 with the aim of initiating the production of a set of minimal recommendations. The present paper describes the consensus reached within the field of infection prophylaxis.
The bile acid pool with its individual bile acids (BA) is modulated in the enterohepatic circulation by the liver as the primary site of synthesis, by the motility of the gallbladder and of the intestinal tract, and by bacterial enzymes in the intestine. The nuclear receptor farnesoid X receptor (FXR) and Gpbar1 (TGR5) are key regulators in this process. Bile acids have a vasodilatory effect, at least according to in vitro studies. The present review examines to what extent the increase in plasma bile acids could be responsible for the hyperdynamic circulatory disturbance of liver cirrhosis, and whether modulation of the bile acid pool, for example via administration of ursodeoxycholic acid (UDCA) or via modulation of the dysbiosis present in liver cirrhosis, could influence this hemodynamic disorder. According to our analysis, the evidence for this is limited. Long-term studies on this question are lacking.
Optimal distribution-preserving downsampling of large biomedical data sets (opdisDownsampling)
(2021)
Motivation: The size of today's biomedical data sets pushes computer equipment to its limits, even for seemingly standard analysis tasks such as data projection or clustering. Reducing large biomedical data sets by downsampling is therefore a common early step in data processing, often performed as random uniform class-proportional downsampling. In this report, we hypothesized that this can be optimized to obtain samples that better reflect the entire data set than those obtained using the current standard method. Results: By repeating the random sampling and comparing the distribution of the drawn sample with the distribution of the original data, it was possible to establish a method for obtaining subsets of data that reflect the entire data set better than taking only the first randomly selected subsample, as is the current standard. Experiments on artificial and real biomedical data sets showed that the reconstruction of the remaining, non-sampled data of the original data set from the downsampled subset improved significantly. This was observed with both principal component analysis and autoencoding neural networks. The fidelity depended both on the number of cases drawn from the original data set and on the number of samples drawn. Conclusions: Optimal distribution-preserving class-proportional downsampling yields data subsets that reflect the structure of the entire data set better than those obtained with the standard method. Because distributional similarity is the only selection criterion, the proposed method does not in any way affect the results of a later planned analysis.
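The core idea described above, repeating class-proportional random sampling and keeping the subsample whose distribution most closely matches the full data, can be sketched in a few lines of Python. The sketch below uses the per-feature Kolmogorov-Smirnov statistic as the distributional-similarity criterion; this criterion and all names are illustrative assumptions and this is not the published opdisDownsampling implementation.

import numpy as np
from scipy.stats import ks_2samp

def opdis_downsample(X, y, frac=0.1, n_trials=100, seed=0):
    """Draw several class-proportional random subsamples of X (cases x
    features) and return the indices of the one whose per-feature
    distributions best match the full data (smallest maximum KS statistic)."""
    rng = np.random.default_rng(seed)
    best_idx, best_score = None, np.inf
    for _ in range(n_trials):
        idx = []
        for cls in np.unique(y):
            members = np.flatnonzero(y == cls)
            k = max(1, int(round(frac * members.size)))
            idx.append(rng.choice(members, size=k, replace=False))
        idx = np.concatenate(idx)
        # Distributional similarity: worst-case KS statistic over features.
        score = max(ks_2samp(X[idx, j], X[:, j]).statistic
                    for j in range(X.shape[1]))
        if score < best_score:
            best_idx, best_score = idx, score
    return best_idx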
High sedation needs of critically ill COVID-19 ARDS patients - a monocentric observational study
(2021)
Background: Therapy of severely affected coronavirus patients requiring intubation and sedation is still challenging. Recently, difficulties in sedating these patients have been discussed. This study aims to describe sedation practices in patients with 2019 coronavirus disease (COVID-19)-induced acute respiratory distress syndrome (ARDS). Methods: We performed a retrospective monocentric analysis of sedation regimens in critically ill intubated patients with respiratory failure who required sedation in our mixed 32-bed university intensive care unit. All mechanically ventilated adults with COVID-19-induced ARDS requiring continuously infused sedative therapy admitted between April 4, 2020, and June 30, 2020 were included. We recorded demographic data, sedative dosages, prone positioning, sedation levels, and duration. Descriptive data analysis was performed; for additional analysis, a logistic regression with mixed effects was used. Results: In total, 56 patients (mean age 67 (±14) years) were included. The mean observed sedation period was 224 (±139) hours. To achieve the prescribed sedation level, two or three sedatives were needed in 48.7% and 12.8% of the cases, respectively. In cases with a triple sedation regimen, the combination of clonidine, esketamine, and midazolam was observed most often (75.7%). Analgesia was achieved using sufentanil in 98.6% of the cases. The analysis showed that the majority of COVID-19 patients required unusually high sedation doses compared with those reported in the literature. Conclusion: The global pandemic continues to affect patients severely enough to require ventilation and sedation, but optimal sedation strategies are still lacking. Our observations suggest unusually high dosages of sedatives in mechanically ventilated patients with COVID-19. Prescribed sedation levels appear to be achievable only with combinations of several sedatives in most critically ill patients suffering from COVID-19-induced ARDS, and an association with the often required advanced critical care measures, including prone positioning and ECMO treatment, seems conceivable.
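A mixed-effects logistic regression of this kind could, for example, model whether the prescribed sedation level was reached at each assessment, with fixed effects for prone positioning and ECMO and a random intercept per patient. The Python sketch below only illustrates that model class; the file name, column names, and model terms are hypothetical and do not reproduce the study's actual analysis.

import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical long-format data: one row per sedation assessment, with a
# binary outcome (prescribed sedation level reached) and a patient identifier.
df = pd.read_csv("sedation_assessments.csv")

# Logistic (binomial) mixed model with a patient-level random intercept.
model = BinomialBayesMixedGLM.from_formula(
    "target_reached ~ prone + ecmo + day",
    vc_formulas={"patient": "0 + C(patient_id)"},
    data=df,
)
result = model.fit_vb()   # variational Bayes fit of the mixed-effects logit
print(result.summary())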
Due to their physiological role in removing damaged cells, natural killer (NK) cells represent ideal candidates for cellular immunotherapy in the treatment of cancer. The cytotoxicity of NK cells is regulated by signals on both the NK cells and the targeted tumor cells, and the interplay and balance of these signals determine the killing capacity of NK cells. One promising avenue in cancer treatment is therefore the combination of NK cell therapy with agents that either help to increase the killing capacity of NK cells or sensitize tumor cells to an NK cell-mediated attack. In this mini-review, we present different strategies that can be explored to unleash the potential of NK cell immunotherapy. In particular, we summarize how modulation of apoptosis signaling within tumor cells can be exploited to sensitize tumor cells to NK cell-mediated cytotoxicity.
Background: Salivary gland cancer (SGC) is a rare and heterogeneous type of cancer. Prospective randomized trials are lacking, and no guideline focusing on standard procedures of radiotherapy (RT) in the treatment of SGC exists. Therefore, we surveyed the members of the German Society of Radiation Oncology (DEGRO) to gain information about current therapeutic strategies for SGC. Methods: An anonymous questionnaire was designed and made available on the online platform umfrageonline.com. The corresponding link was sent to all DEGRO members who had provided their user data for contact purposes; alternatively, a PDF printout version was sent. Frequency distributions of responses for each question were calculated. The data were also analyzed by type of institution. Results: Sixty-seven responses were received, including answers from 21 university departments, 22 non-university institutions, and 24 radiation oncology practices. Six participants reported that their departments (practice: n = 5, non-university hospital: n = 1) did not treat SGC, and therefore the questionnaire was not completed. Concerning radiation techniques, target volume definition, and concomitant chemotherapy, treatment strategies varied greatly among the participants. Compared with non-university institutions, university hospitals treat significantly more patients with SGC per year and initiate more molecular pathological diagnostics. Conclusion: SGC represents a major challenge for clinicians, as reflected by the inhomogeneous survey results regarding diagnostics, RT approaches, and systemic therapy. Future prospective, multicenter clinical trials are warranted to improve and homogenize the treatment of SGC and to individualize treatment according to histologic subtypes and risk factors.
Introduction: Disseminated infection due to non-tuberculous mycobacteria (NTM) has been a major cause of mortality and comorbidity in HIV patients. Until 2018, U.S. guidelines recommended antimycobacterial prophylaxis in patients with low CD4 cell counts, a practice that has not been adopted in Europe. This study aimed to examine the impact of disseminated NTM disease on clinical outcome in German HIV patients with severe immunodeficiency. Materials and methods: In this retrospective case-control study, HIV patients with disseminated NTM disease were identified by retrospective chart review and matched by their CD4 cell counts to HIV patients without NTM infection in a 1:1 allocation. Primary endpoints were mortality and time to first rehospitalisation. In addition, other opportunistic diseases as well as antimycobacterial and antiretroviral treatments were examined. Results: Between 2006 and 2016, we identified 37 HIV patients with disseminated NTM disease. Most of them were suffering from infections due to M. avium complex (n = 31, 77.5%). Time-to-event analysis showed a non-significant trend toward higher mortality in patients with disseminated NTM disease (p = 0.24). Rehospitalisation took place significantly earlier in patients with disseminated NTM infections (median 40.5 days vs. 109 days, p < 0.0001). Conclusion: In this retrospective case-control study, we could demonstrate that mortality is not significantly higher in HIV patients with disseminated NTM disease in the ART era, but that these patients require specialised medical attention in the first months following discharge.
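A common way to run such a time-to-event comparison is a Kaplan-Meier analysis with a log-rank test. The Python sketch below uses the lifelines package for that purpose; the file and column names are hypothetical and the study's exact analysis may have differed.

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical columns: days to first rehospitalisation (or last follow-up),
# an event indicator (1 = rehospitalised), and the group label.
df = pd.read_csv("ntm_cohort.csv")
ntm = df[df["group"] == "NTM"]
ctrl = df[df["group"] == "control"]

# Kaplan-Meier curves for both groups.
kmf = KaplanMeierFitter()
kmf.fit(ntm["days_to_rehosp"], ntm["rehospitalised"], label="disseminated NTM")
ax = kmf.plot_survival_function()
kmf.fit(ctrl["days_to_rehosp"], ctrl["rehospitalised"], label="control")
kmf.plot_survival_function(ax=ax)

# Log-rank test comparing time to first rehospitalisation between groups.
result = logrank_test(ntm["days_to_rehosp"], ctrl["days_to_rehosp"],
                      event_observed_A=ntm["rehospitalised"],
                      event_observed_B=ctrl["rehospitalised"])
print(result.p_value)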
Although the global market for cigarillos is substantial, little is known about their particulate matter (PM) emissions. For an exposure risk assessment of cigarillos, the PM fractions PM10, PM2.5, and PM1 of eight cigarillo brands (four with filters) and a reference cigarette were measured. For this purpose, second-hand smoke was generated by an automatic smoke pump in a measuring chamber with a volume of 2.88 m³. The mean particle concentrations of the cigarillos ranged from 2783 μg/m³ to 6686 μg/m³ for PM10, from 2767 μg/m³ to 6585 μg/m³ for PM2.5, and from 2441 μg/m³ to 4680 μg/m³ for PM1. Mean concentrations of the reference cigarette for PM10, PM2.5, and PM1 were 4400 μg/m³, 4335 μg/m³, and 3289 μg/m³, respectively. Filter-tipped cigarillos showed between 5% and 38% lower PM10 and PM2.5 levels and between 4% and 30% lower PM1 levels. Our findings show generally high PM emissions for all investigated tobacco products. Therefore, the declaration of PM amounts to government authorities should be mandatory for all tobacco products. Policymakers should ensure that corresponding information will be provided in the future.
Background: Iron deficiency (ID) is one of the most common nutritional deficiencies in children worldwide and may result in iron deficiency anemia (IDA). The reticulocyte hemoglobin equivalent (Ret-He) provides information about the current availability of iron for erythropoiesis. This study aims to validate Ret-He as a screening marker for ID and IDA in children. Methods: Blood samples were retrospectively obtained from medical records. Anemia was defined according to the definition provided by the World Health Organization (WHO) for children. ID was defined by transferrin saturation (TSAT) < 20% and ferritin < 100 ng/mL. Children were classified into four groups: IDA, non-anemic iron deficiency (NAID), control, and others. Results: Out of 970 children, 332 (34.2%) had NAID and 278 (28.7%) presented with IDA. The analysis revealed that Ret-He correlates significantly with ferritin (rho = 0.41; p < 0.001), TSAT (rho = 0.66; p < 0.001), and soluble transferrin receptor (sTfR) (rho = −0.72; p < 0.001). In the ROC analysis, the area under the curve (AUC) was 0.771 for Ret-He detecting ID and 0.845 for detecting IDA. The cut-off value for Ret-He was 33.5 pg (sensitivity 90.7%; specificity 35.8%) to diagnose ID and 31.6 pg (sensitivity 90.6%; specificity 50.4%) to diagnose IDA. Conclusions: The present study demonstrates that Ret-He can serve as a screening marker for ID and IDA in children. Furthermore, Ret-He can be used as a single screening parameter for ID and IDA in children without considering other iron parameters. Economically, the use of Ret-He is highly relevant, as it can save one blood tube per patient and additional costs.
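For readers wanting to reproduce this kind of marker evaluation, the short Python sketch below computes an AUC and a cut-off with scikit-learn. The tiny data set and the Youden-index criterion are illustrative assumptions, not the study's data or necessarily its cut-off criterion.

import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Tiny made-up example: Ret-He values in pg and an iron-deficiency label
# (1 = ID, 0 = no ID). Low Ret-He indicates ID, so the negated value is
# used as the score (higher score = more likely ID).
ret_he = np.array([24.1, 27.8, 29.5, 30.2, 31.0, 32.4, 33.9, 35.1, 36.0, 37.2])
has_id = np.array([1,    1,    1,    1,    0,    1,    0,    0,    0,    0])

auc = roc_auc_score(has_id, -ret_he)
fpr, tpr, thresholds = roc_curve(has_id, -ret_he)

# Youden's J (sensitivity + specificity - 1) is one common way to choose a
# cut-off; a study may instead fix sensitivity near a target such as 90%.
j = tpr - fpr
cutoff = -thresholds[np.argmax(j)]
print(f"AUC = {auc:.2f}, Ret-He cut-off = {cutoff:.1f} pg")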
Although the human immune response to cancer is naturally potent, it can be severely disrupted as a result of an immunosuppressive tumor microenvironment. Infiltrating regulatory T lymphocytes contribute to this immunosuppression by inhibiting proliferation of cytotoxic CD8+ T lymphocytes, which are key to an effective anti-cancer immune response. Other important contributory factors are thought to include metabolic stress caused by the local nutrient deprivation common to many solid tumors. Interleukin-33 (IL-33), an alarmin released in reaction to cell damage, and sphingosine-1-phosphate (S1P) are known to control cell positioning and differentiation of T lymphocytes. In an in vitro model of nutrient deprivation, we investigated the influence of IL-33 and S1P receptor 4 (S1P4) on the differentiation and migration of human CD8+ T lymphocytes. Serum starvation of CD8+ T lymphocytes induced a subset of CD8Low and IL-33 receptor-positive (ST2L+) cells characterized by enhanced expression of the regulatory T cell markers CD38 and CD39. Both S1P1 and S1P4 were transcriptionally regulated after stimulation with IL-33. Moreover, expression of the chemokine receptor CXCR4 was increased in CD8+ T lymphocytes treated with the selective S1P4 receptor agonist CYM50308. We conclude that nutrient deprivation promotes CD8Low T lymphocytes, contributing to an immunosuppressive microenvironment and a poor anti-cancer immune response by limiting cytotoxic effector functions. Our results suggest that S1P4 signaling modulation may be a promising target for anti-CXCR4 cancer immunotherapy.
Influence of antibiotic management on microbial selection and infectious complications after trauma
(2021)
Background: The inflammatory response and post-traumatic complications such as infections play an important role in the pathophysiology of severe injuries. This study examines the microbiological aspects of anti-infective treatment in trauma patients and their inflammatory response in post-traumatic infectious complications. Patients and Methods: A retrospective analysis of prospectively collected data in trauma patients (ISS ≥ 16) over a 1-year period (01/2018 to 12/2018) is provided. The patient population was stratified into severely injured patients without post-traumatic infection (inf-PT) and severely injured patients who developed an infection (inf+PT). Results: Of 114 trauma patients, 45 suffered from post-traumatic infection during the first 10 days of hospitalization. Severely injured patients with concomitant traumatic brain injury (PT+TBI) showed the highest rate of post-traumatic infection. The pro-inflammatory reaction was tracked by levels of interleukin-6 (IL-6; day 3: inf+PT 190.8 ± 359.4 pg/mL > inf-PT 56.2 ± 57.7 pg/mL (mean ± SD); p = 0.008) and C-reactive protein (CRP; day 3: inf+PT 15.3 mg/dL > inf-PT 6.7 mg/dL, p = 0.001), which were significantly higher in trauma patients who developed an infectious complication and showed a significant positive correlation with the occurrence of infection. The leading entity of infection was pneumonia, followed by infections of the urinary tract, mainly caused by gram-negative Enterobacteriaceae. 67.5% of all trauma patients received single-shot antibiosis during initial care in the trauma bay. The development of secondary colonization was not relevantly correlated with single-shot antibiosis (r = 0.013, p = 0.895) or with prophylactically calculated antibiotic administration (r = 0.066, p = 0.500). Conclusion: Severely injured trauma patients have an increased risk of developing infectious complications, mainly pneumonia, followed by infections of the urinary tract, mainly caused by gram-negative Enterobacteriaceae. Based on the data in this study, the single-shot and prophylactically calculated use of antibiotics, such as cephalosporins, must be critically discussed with regard to their role in the development of post-traumatic infections and microbial selection.
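The reported correlation between a continuous inflammatory marker and the binary occurrence of infection is typically a point-biserial correlation. The following Python sketch shows that calculation on a tiny made-up data set, not the study data.

import numpy as np
from scipy.stats import pointbiserialr

# Made-up example: IL-6 on day 3 (pg/mL) and whether a post-traumatic
# infection occurred (1 = infection, 0 = none).
il6_day3 = np.array([35.0, 48.2, 60.1, 72.5, 110.3, 150.8, 210.4, 320.9])
infection = np.array([0,    0,    0,    1,    0,     1,     1,     1])

r, p = pointbiserialr(infection, il6_day3)
print(f"point-biserial r = {r:.2f}, p = {p:.3f}")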
Emil Sioli †
(1923)
Der Zahnarzt im Felde
(1916)
Objective: The aim of this study was to evaluate the feasibility of a software-assisted radiological evaluation of cage position and the quantification of possible cage migration and subsidence from CT-derived DICOM data during the healing process of patients with interbody fusion. In addition, a possible correlation with the fusion behavior of the cage and with the clinical outcome of the patients was analyzed.
Materials and methods: In the postoperative CT data sets of 67 patients after monosegmental, dorsally instrumented TLIF, the cage position was determined using the software VGStudio Max. A change in position of ≥ 1 mm or ≥ 3° during the postoperative course was rated as minimal migration/subsidence, and a change of ≥ 3 mm or ≥ 10° as marked migration/subsidence of the cage. To examine whether migration and subsidence behavior influences osteogenesis, the fusion status was evaluated in the CT scans obtained 12 months postoperatively, based on the fusion criteria published by Bridwell et al. For the clinical assessment, the Oswestry Disability Index, the visual analogue scale, analgesic consumption, and the patients' modified Pationnaire questionnaire were evaluated.
Results: Measuring the cage position with VGStudio Max is a precise and reliable method for quantifying cage migration and subsidence. Overall, cage migration was detectable in 85.1% of the patients (61.2% minimal, 23.9% marked) and cage subsidence in 58.2% of the patients (32.8% minimal, 25.4% marked). Radiological signs of pseudarthrosis were found in 5 patients (7.5%). The remaining 92.5% of the patients showed grade I or II fusion.
Cage migration and subsidence had no significant influence on fusion behavior or clinical outcome. There was also no correlation between fusion result and clinical outcome.
Conclusion: The incidence of cage migration, when even minor changes in cage position are taken into account, is considerably higher than previously described. However, no conclusions about the fusion result can be drawn from the migration or subsidence behavior of cages. The detection of cage migration or subsidence is therefore not as suitable a criterion for assessing fusion as previously assumed.
Über Rassenhygiene
(1913)