Purpose: A study of real-time adaptive radiotherapy systems was performed to test the hypothesis that, across delivery systems and institutions, the dosimetric accuracy is improved with adaptive treatments over non-adaptive radiotherapy in the presence of patient-measured tumor motion.
Methods and materials: Ten institutions with robotic (2), gimbaled (2), MLC (4), or couch-tracking (2) systems used common materials, including CT and structure sets, motion traces, and planning protocols, to create a lung and a prostate plan. For each motion trace, the plan was delivered twice to a moving dosimeter: once with and once without real-time adaptation. Each measurement was compared to a static measurement, and the percentage of failed points for γ-tests was recorded.
Results: For all lung traces, all measurement sets showed improved dose accuracy, with a mean 2%/2 mm γ-fail rate of 1.6% with adaptation and 15.2% without adaptation (p < 0.001). For all prostate traces, the mean 2%/2 mm γ-fail rate was 1.4% with adaptation and 17.3% without adaptation (p < 0.001). The differences between the four systems were small, with an average 2%/2 mm γ-fail rate of <3% for all systems with adaptation, for both lung and prostate.
Conclusions: The investigated systems all accounted for realistic tumor motion accurately and performed to a similar high standard, with real-time adaptation significantly outperforming non-adaptive delivery methods.
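The γ-test behind these fail rates combines a dose-difference criterion and a distance-to-agreement (DTA) criterion, here 2%/2 mm. A minimal 1-D sketch of a global γ fail-rate computation, assuming equally spaced dose points (the clinical analysis is multi-dimensional and interpolated; function and parameter names are illustrative, not the study's software):

```python
import math

def gamma_fail_rate(ref, meas, spacing_mm, dose_crit=0.02, dta_mm=2.0):
    """Percentage of measured points failing a global gamma test:
    for each measured point, take the minimum generalized distance
    to the reference profile; the point fails if gamma > 1."""
    d_max = max(ref)  # dose criterion is relative to the reference maximum
    failed = 0
    for i, dm in enumerate(meas):
        gamma = min(
            math.hypot((i - j) * spacing_mm / dta_mm,
                       (dm - dr) / (dose_crit * d_max))
            for j, dr in enumerate(ref)
        )
        failed += gamma > 1.0
    return 100.0 * failed / len(meas)
```

In the setup described above, the static measurement would serve as `ref` and each moving delivery as `meas`; identical profiles give a 0% fail rate.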
Background: Epileptic seizures are common clinical features in patients with acute subdural hematoma (aSDH); however, diagnostic feasibility and therapeutic monitoring remain limited. Surface electroencephalography (EEG) is the major diagnostic tool for the detection of seizures, but it might not be sensitive enough to detect all subclinical or nonconvulsive seizures or status epilepticus. Therefore, we have planned a clinical trial to evaluate a novel treatment modality in which subdural EEG electrodes are implanted perioperatively to diagnose seizures; we will then treat the seizures under therapeutic monitoring and analyze the clinical benefit.
Methods: In a prospective nonrandomized trial, we aim to include 110 patients with aSDH. Only patients undergoing surgical removal of aSDH will be included; one arm will be treated according to the guidelines of the Brain Trauma Foundation, while the other arm will additionally receive a subdural grid electrode. The study's primary outcome is the comparison of incidence of seizures and time-to-seizure between the interventional and control arms. Invasive therapeutic monitoring will guide treatment with antiseizure drugs (ASDs). The secondary outcome will be the functional outcome for both groups as assessed via the Glasgow Outcome Scale and modified Rankin Scale both at discharge and during 6 months of follow-up. The tertiary outcome will be the evaluation of chronic epilepsy within 2-4 years of follow-up.
Discussion: The implantation of a subdural EEG grid electrode in patients with aSDH is expected to be effective in diagnosing seizures in a timely manner, facilitating treatment with ASDs and monitoring of treatment success. Moreover, the occurrence of epileptiform discharges prior to the manifestation of seizure patterns could be evaluated in order to identify high-risk patients who might benefit from prophylactic treatment with ASDs.
Trial registration: ClinicalTrials.gov identifier no. NCT04211233.
Poster presentation:
Background: In the past years, once-daily (QD) dosing of antiretroviral combination therapy has become an increasingly available treatment option for HIV-1+ patients.
Methods: Open-label study in which HIV-1+ patients treated with SAQ/RTV (1000/100 mg BID) and two NRTIs, with HIV-RNA-PCR < 50 copies/ml, were switched to SAQ/RTV (2000/100 mg QD) with an unchanged NRTI backbone. CD4 cells, HIV-RNA-PCR, SAQ and RTV drug levels, and metabolic parameters were compared.
Summary of results: 17 patients (15 male, 42 years), median CD4 count 456 ± 139/µl, have been included so far. The median follow-up time is 4 months. The HIV-RNA-PCR remained <50 copies/ml for all patients. Fasting metabolic parameters remained unchanged. The SAQ AUC0–12h was significantly higher when given QD vs. BID (median 29,400 vs. 18,500 ng*h/ml; p = 0.009), whereas the Cmin, Cmax, and AUC were lower for RTV when given QD vs. BID (7,400 vs. 11,700 ng*h/ml; p = 0.02).
Conclusion: In this ongoing study, SAQ/RTV (2000/100 mg QD) was well tolerated and demonstrated higher SAQ and lower RTV drug levels compared with the BID dosing schedule. (Table 1 and Figure 1.)
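AUC0–12h values like those above are conventionally derived from serial plasma concentrations with the linear trapezoidal rule. A minimal sketch (the sample times and concentrations in the test are illustrative, not trial data):

```python
def auc_trapezoid(times_h, conc_ng_ml):
    """Area under the concentration-time curve (ng*h/ml) by the
    linear trapezoidal rule over successive sampling intervals."""
    pairs = list(zip(times_h, conc_ng_ml))
    return sum(0.5 * (c0 + c1) * (t1 - t0)
               for (t0, c0), (t1, c1) in zip(pairs, pairs[1:]))
```

For example, `auc_trapezoid([0, 1, 2], [0, 2000, 0])` gives 2000 ng*h/ml; denser sampling grids give correspondingly better approximations of the true exposure.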
Background: Research on chronic subdural hematoma (cSDH) management has primarily focused on potential recurrence after surgical evacuation. Herein, we present a novel postoperative/non-invasive treatment that includes a supervised Valsalva maneuver (SVM), which may serve to reduce SDH recurrence. Accordingly, the aims of the study were to investigate the effects of SVM on SDH recurrence rates and functional outcomes.
Methods: A prospective study was conducted from December 2016 until December 2019 at the Goethe University Hospital Frankfurt. Of the 204 adult patients with surgically treated cSDH who had subdural drains placed, 94 patients were assigned to the SVM group and 82 patients were assigned to the control group. The SVM was performed by having patients blow into a self-made SVM device at least two times/h for 12 h/day. The primary end-point was SDH recurrence rate, while secondary outcomes were morbidity and functional outcomes at 3 months of follow-up.
Results: SDH recurrence was observed in 16 of 94 patients (17%) in the SVM group, a significant reduction compared with the control group, in which 24 of 82 patients (29.3%) developed recurrent SDHs (p = 0.05). Further, the infection rate (e.g., pneumonia) was significantly lower in the SVM group (1.1%) than in the control group (13.4%; p < 0.001; odds ratio [OR] 0.1). At the 3-month follow-up, 85 of 94 patients (90.4%) in the SVM group achieved favorable outcomes compared with 62 of 82 patients (75.6%) in the control group (p = 0.008; OR 3.0). Independent predictors of favorable outcome at follow-up were age (OR 0.9) and infection (OR 0.2).
Conclusion: SVM appears to be safe and effective in the post-operative management of cSDHs, reducing both recurrence rates and infections after surgical evacuation, thereby resulting in favorable outcomes at follow-up.
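The reported OR of 3.0 for favorable outcome can be reproduced directly from the counts in the Results above; a minimal sketch of an unadjusted 2×2 odds ratio (the ORs for age and infection come from a regression model and are not reproducible this way):

```python
def odds_ratio(events_a, total_a, events_b, total_b):
    """Unadjusted odds ratio from a 2x2 table: odds of the event
    in group A divided by the odds in group B."""
    return (events_a / (total_a - events_a)) / (events_b / (total_b - events_b))

# Favorable outcomes: 85/94 (SVM group) vs. 62/82 (control group)
print(round(odds_ratio(85, 94, 62, 82), 1))  # 3.0, matching the reported OR
```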
Background: The extent of preoperative peritumoral edema in glioblastoma (GBM) has been negatively correlated with patient outcome. As several ongoing studies are investigating T-cell-based immunotherapy in GBM, we conducted this study to assess whether peritumoral edema, with potentially increased intracranial pressure, disrupted tissue homeostasis, and reduced local blood flow, influences immune infiltration and affects survival.
Methods: A volumetric analysis of preoperative imaging (gadolinium-enhanced T1-weighted MRI sequences for tumor size; T2-weighted sequences for extent of edema, including the infiltrative zone, gliosis, etc.) was conducted in 144 patients using the Brainlab® software. Immunohistochemical staining was analyzed for lymphocytic (CD3+) and myeloid (CD15+) tumor infiltration. A retrospective analysis of patient, surgical, and molecular characteristics was performed using medical records.
Results: The edema to tumor ratio was neither associated with progression-free nor overall survival (p=0.90, p=0.74). However, GBM patients displaying IDH-1 wildtype had significantly higher edema to tumor ratio than patients displaying an IDH-1 mutation (p=0.01). Immunohistopathological analysis did not show significant differences in lymphocytic or myeloid tumor infiltration (p=0.78, p=0.74) between these groups.
Conclusion: In our cohort, the edema to tumor ratio had no significant correlation with immune infiltration or outcome. However, patients with an IDH-1 wildtype GBM had a significantly higher edema to tumor ratio compared to their IDH-1 mutated peer group. Further studies are necessary to elucidate the underlying mechanisms.
Background: Glioblastoma (GBM) is a cancer type with high thrombogenic potential, and GBM patients are therefore at particularly high risk for thrombotic events. To date, only limited data on anticoagulation management after pulmonary embolism (PE) in GBM are available, and the sporadic use of DOACs remains off-label.
Methods: A retrospective cohort analysis of patients with GBM and postoperative, thoracic CT-scan confirmed, PE was performed. Clinical course, follow-up at 6 and 12 months and the overall survival (OS) were evaluated using medical charts and neuroradiological data.
Results: Of 584 GBM patients, 8% suffered from postoperative PE. Of these, 30% received direct oral anticoagulants (DOACs) and 70% low-molecular-weight heparin (LMWH) for therapeutic anticoagulation. There was no significant difference in major intracranial hemorrhage (ICH), re-thrombosis, or re-embolism between the two cohorts. Although statistically non-significant, a tendency toward lower mRS scores at 6 and 12 months was observed in the LMWH cohort. Furthermore, patients receiving DOACs had a statistical benefit in OS.
Conclusion: In our analysis, DOACs showed a satisfactory safety profile in terms of major ICH, re-thrombosis, and re-embolism compared with LMWH in GBM patients with postoperative PE. Prospective, randomized trials are urgently needed to evaluate DOACs for therapeutic anticoagulation in GBM patients with PE.
Background: Dysphagia is a common and severe symptom of traumatic brain injury (TBI) affecting up to 78% of patients. It is associated with pneumonia, increased morbidity and mortality. Although subdural hematoma (SDH) accounts for over 50% of TBI, the occurrence of dysphagia in this subtype has not been investigated. This study investigates the overall frequency, clinical predictors of dysphagia and functional outcome of patients with SDH associated dysphagia.
Methods: All patients presenting to the authors' institution with SDH between 2007 and 2020 were included in the study. Patients with SDH and clinical suspicion of dysphagia received a clinical swallowing assessment by a speech and language pathologist (SLP). Furthermore, the severity of dysphagia was rated according to a swallowing disorder scale. Functional outcome was evaluated using the Glasgow Outcome Scale (GOS).
Results: Of 545 patients with SDH, 71 patients had dysphagia (13%). The prevalence of dysphagia was significantly lower in the surgical arm than in the conservative arm (11.8% vs. 21.8%; OR 0.23; p = 0.02). Independent predictors of dysphagia were GCS < 13 at admission (p < 0.001; OR 4.17), cardiovascular disease (p = 0.002; OR 2.29), and pneumonia (p = 0.002; OR 2.88), whereas operation was a protective factor (p < 0.001; OR 0.2). All patients with dysphagia improved significantly under SLP treatment from initial diagnosis to hospital discharge (p < 0.01). However, patients with the most severe grade of dysphagia showed no significant improvement during the clinical course. Patients with dysphagia had significantly worse outcomes (GOS 1-3) compared with those without dysphagia (48.8% vs. 26.4%; p < 0.001).
Conclusion: Dysphagia is a frequent symptom in SDH, and early identification of dysphagia is crucial for the initiation of treatment and for functional outcome. Surgery is effective in preventing dysphagia and should be considered in high-risk patients.
Species’ functional traits set the blueprint for pair-wise interactions in ecological networks. Yet, it is unknown to what extent the functional diversity of plant and animal communities controls network assembly along environmental gradients in real-world ecosystems. Here we address this question with a unique dataset of mutualistic bird–fruit, bird–flower and insect–flower interaction networks and associated functional traits of 200 plant and 282 animal species sampled along broad climate and land-use gradients on Mt. Kilimanjaro. We show that plant functional diversity is mainly limited by precipitation, while animal functional diversity is primarily limited by temperature. Furthermore, shifts in plant and animal functional diversity along the elevational gradient control the niche breadth and partitioning of the respective other trophic level. These findings reveal that climatic constraints on the functional diversity of either plants or animals determine the relative importance of bottom-up and top-down control in plant–animal interaction networks.
Background: Since sorafenib has shown activity in different tumour types and gemcitabine regimens improved the outcome for biliary tract cancer (BTC) patients, we evaluated first-line gemcitabine plus sorafenib in a double-blind phase II study.
Patients and methods: 102 unresectable or metastatic BTC patients with histologically proven adenocarcinoma of the gallbladder or intrahepatic bile ducts and Eastern Cooperative Oncology Group (ECOG) performance status 0–2 were randomised to gemcitabine (1000 mg/m² once weekly; 7 weeks on + 1 week rest in the first cycle, then 3 weeks on + 1 week rest) plus sorafenib (400 mg twice daily) or placebo. Treatment continued until progression or unacceptable toxicity. Tumour samples were prospectively stained for sorafenib targets and potential biomarkers. Serum samples (first two cycles) were measured for vascular endothelial growth factors (VEGFs), vascular endothelial growth factor receptor 2 (VEGFR-2), and stromal cell-derived factor 1 (SDF1)α by enzyme-linked immunosorbent assay (ELISA).
Results: Gemcitabine plus sorafenib was generally well tolerated. Four and three patients achieved partial responses in the sorafenib and placebo groups, respectively. There was no difference in the primary end-point, median progression-free survival (PFS) for gemcitabine plus sorafenib versus gemcitabine plus placebo (3.0 versus 4.9 months, P = 0.859), and no difference for median overall survival (OS) (8.4 versus 11.2 months, P = 0.775). Patients with liver metastasis after resection of primary BTC survived longer with sorafenib (P = 0.019) compared to placebo. Patients who developed hand-foot syndrome (HFS) showed longer PFS and OS than patients without HFS. Two sorafenib targets, VEGFR-2 and c-kit, were not expressed in BTC samples. VEGFR-3 and Hif1α were associated with lymph node metastases and T stage. Absence of PDGFRβ expression correlated with longer PFS.
Conclusion: The addition of sorafenib to gemcitabine did not demonstrate improved efficacy in advanced BTC patients. Biomarker subgroup analysis suggested that some patients might benefit from combined treatment.
Background: Computed tomography (CT) allows estimation of coronary artery calcium (CAC) progression. We evaluated several progression algorithms in our unselected, population-based cohort for risk prediction of coronary and cardiovascular events.
Methods: In 3281 participants (45–74 years of age), free from cardiovascular disease until the second visit, risk factors, and CTs at baseline (b) and after a mean of 5.1 years (5y) were measured. Hard coronary and cardiovascular events, and total cardiovascular events including revascularization, as well, were recorded during a follow-up time of 7.8±2.2 years after the second CT. The added predictive value of 10 CAC progression algorithms on top of risk factors including baseline CAC was evaluated by using survival analysis, C-statistics, net reclassification improvement, and integrated discrimination index. A subgroup analysis of risk in CAC categories was performed.
Results: We observed 85 (2.6%) hard coronary, 161 (4.9%) hard cardiovascular, and 241 (7.3%) total cardiovascular events. Absolute CAC progression was higher with versus without subsequent coronary events (median, 115 [Q1–Q3, 23–360] versus 8 [0–83]; P<0.0001; similar for hard/total cardiovascular events). Some progression algorithms added to the predictive value of baseline CT and risk assessment in terms of the C-statistic or integrated discrimination index, especially for total cardiovascular events. However, CAC progression did not improve models that included CAC5y and 5-year risk factors. An excellent prognosis was found for the 921 participants with double zero (CACb=CAC5y=0; 10-year coronary and hard/total cardiovascular risk: 1.4%, 2.0%, and 2.8%), compared with 1.8%, 3.8%, and 6.6%, respectively, for participants with incident CAC. When a CACb of 1 to 399 progressed to CAC5y≥400, coronary and total cardiovascular risk were nearly 2-fold higher in comparison with subjects who remained below CAC5y=400. Participants with CACb≥400 had high rates of hard coronary and hard/total cardiovascular events (10-year risk: 12.0%, 13.5%, and 30.9%, respectively).
Conclusions: CAC progression is associated with coronary and cardiovascular event rates, but adds only weakly to risk prediction. What counts is the most recent CAC value and risk factor assessment. Therefore, a repeat scan >5 years after the first scan may be of additional value, except when a double-zero CT scan is present or when the subjects are already at high risk.
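The baseline/follow-up patterns compared in the Results (double zero, incident CAC, progression from 1–399 to ≥400, high baseline) can be sketched as a simple classifier over the two Agatston scores; the cut-points follow the categories named in the abstract, and the labels are illustrative rather than the study's actual coding:

```python
def classify_progression(cac_b, cac_5y):
    """Assign a participant to one of the baseline/follow-up CAC
    patterns compared in the Results (Agatston scores)."""
    if cac_b == 0 and cac_5y == 0:
        return "double zero"          # excellent-prognosis group
    if cac_b == 0:
        return "incident CAC"         # CAC appearing between the two scans
    if cac_b >= 400:
        return "high baseline"        # already at high risk at baseline
    if cac_5y >= 400:
        return "progressed to >=400"  # CACb 1-399 crossing 400 at 5 years
    return "below 400 at both scans"
```

Note that the event rates quoted per group come from the cohort's follow-up, not from the scores themselves; the classifier only reproduces the grouping.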