Psoriasis vulgaris is a common and chronic inflammatory skin disease which has the potential to significantly reduce the quality of life in severely affected patients. The incidence of psoriasis in Western industrialized countries ranges from 1.5% to 2%. Despite the large variety of treatment options available, patient surveys have revealed insufficient satisfaction with the efficacy of available treatments and a high rate of medication non-compliance. To optimize the treatment of psoriasis in Germany, the Deutsche Dermatologische Gesellschaft and the Berufsverband Deutscher Dermatologen (BVDD) have initiated a project to develop evidence-based guidelines for the management of psoriasis. The guidelines focus on induction therapy in cases of mild, moderate, and severe plaque-type psoriasis in adults. The short version of the guidelines reported here consists of a series of therapeutic recommendations that are based on a systematic literature search and subsequent discussion with experts in the field; they have been approved by a team of dermatology experts. In addition to the therapeutic recommendations provided in this short version, the full version of the guidelines includes information on contraindications, adverse events, drug interactions, practicality, and costs, as well as detailed information on how best to apply the treatments described (for the full version, please see Nast et al., JDDG, Suppl 2:S1–S126, 2006; or http://www.psoriasis-leitlinie.de).
The present publication comprises two basic components: on the one hand, the official Red List with the threat categories, and on the other, a revised systematic checklist of all mollusc species of Baden-Württemberg. The Red List serves for quickly determining the threat category of each individual species in Baden-Württemberg and is, as usual, arranged alphabetically by genus. Great emphasis was placed on a careful analysis of the results (Chapter 7). The complete species checklist provides the current systematic classification of all species, which is why the taxa are listed there within the scientific system of the Mollusca. The systematic species checklist is intended to present, in concise form, the current state of knowledge on the molluscs of Baden-Württemberg. It also lists the known subspecies and gives additional information on distribution type, distribution (occurrence in the third-order natural regions), and ecology (assignment of individual species to particular habitat types). With this additional information, Red Lists and species checklists become yardsticks of biodiversity research. In more than 130 'notes', the corresponding statements on systematics, distribution, and ecology are made more precise, with references to the underlying literature. All information contained in the Red List is also included in the detailed systematic species checklist. In both lists, the species are given with their running number, which allows easy cross-referencing from the Red List to the entries in the systematic checklist. The state of research is in many cases documented in the historical literature, which was therefore given careful and critical consideration (see notes and bibliography). Of inestimable value in this context are the numerous publications of David Geyer, which mark the beginning of modern regional faunistics in Baden-Württemberg. A separate chapter on the history of research would, however, have exceeded the scope of this work.
Large-scale monitoring programmes represent a two-stage sample: first a spatial sample is selected, and then a sample of observed individuals, occupied sites, or species. For the numbers obtained in monitoring programmes to remain interpretable, the spatial sample must be drawn in a "defined random" manner; otherwise bias can arise. In addition, it must be kept in mind that counts and occurrence observations ("presence-absence data") are binomial random variables, exactly analogous to the toss of a coin. The binomial distribution thus represents, so to speak, the "basic law of population surveys": it states, first, that counts (C) vary automatically even under identical conditions, and second, that on average they correspond to a proportion p of the true population size N, where p denotes the detection probability. Third, a comparison between two or more counts always implies a simultaneous comparison of the population sizes N and the detection probabilities p. This means that a temporal trend in counts can arise from a real population trend, from a trend in detection probability, or from a combination of both. A direct interpretation of counts therefore always implies the assumption that p = 1 or that p is constant. It is useful to think of bird counts as arising hierarchically, i.e. in several stages: in a first step the true population sizes arise, and in a second step the counts arise as a function of the population sizes and the detection probability p. Extra information is needed to estimate the true population sizes corrected for p. This extra information usually consists of distance information or of repeated observations, from which distance-sampling and capture-recapture methods can estimate the true population sizes or the true occurrence. In recent years we have tested several analysis methods of the capture-recapture type in the Swiss breeding bird monitoring programme MHB, and we briefly summarize these methods and our findings here. These methods correct for the binomial "observation error" that is inherent in all bird counts and occurrence observations. We believe that methods such as those illustrated here are essentially indispensable whenever monitoring programmes require absolute population sizes or when one wishes to correct for "dangerous patterns" in detection probability, e.g. temporal trends in p.
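The binomial count model described above can be illustrated with a short simulation. The following Python sketch is a minimal illustration only (the constant true population of N = 100 territories and the detection probability declining from 0.6 to 0.4 are invented numbers, not MHB data); it shows how a trend in detection probability p alone produces an apparent decline in the counts even though the true population N does not change.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical example: the true population N is constant over ten years,
# but the detection probability p declines from 0.6 to 0.4.
years = np.arange(10)
N = 100                                  # true number of territories (constant)
p = np.linspace(0.6, 0.4, len(years))    # declining detection probability

# Counts are binomial random variables: C_t ~ Binomial(N, p_t).
counts = rng.binomial(N, p)

# The expected count is N * p, so the trend in the counts mirrors the trend
# in p, not a trend in the true population.
for t in years:
    print(f"year {t}: count = {counts[t]:3d}, expected = {N * p[t]:5.1f}, true N = {N}")
```

Separating N from p requires the extra information mentioned above, e.g. repeated counts analysed with capture-recapture or N-mixture type estimators.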
Clonal hematopoiesis of indeterminate potential (CHIP) is caused by recurrent somatic mutations leading to clonal blood cell expansion. However, direct evidence of the fitness of CHIP-mutated human hematopoietic stem cells (HSCs) in blood reconstitution is lacking. Because myeloablative treatment and transplantation enforce stress on HSCs, we followed 81 patients with solid tumors or lymphoid diseases undergoing autologous stem cell transplantation (ASCT) for the development of CHIP. We found a high incidence of CHIP (22%) after ASCT with a high mean variant allele frequency (VAF) of 10.7%. Most mutations were already present in the graft, albeit at lower VAFs, demonstrating a selective reconstitution advantage of mutated HSCs after ASCT. However, patients with CHIP mutations in DNA-damage response genes showed delayed neutrophil reconstitution. Thus, CHIP-mutated stem and progenitor cells largely gain in clone size upon ASCT-related blood reconstitution, leading to an increased future risk of CHIP-associated complications.
Climate change and its impacts already pose considerable challenges for societies, and these will further increase with global warming (IPCC, 2014a, b). Uncertainties of the climatic response to greenhouse gas emissions include the potential passing of large-scale tipping points (e.g. Lenton et al., 2008; Levermann et al., 2012; Schellnhuber, 2010) and changes in extreme meteorological events (Field et al., 2012) with complex impacts on societies (Hallegatte et al., 2013). Thus, climate change mitigation is considered a necessary societal response for avoiding uncontrollable impacts (Conference of the Parties, 2010). On the other hand, large-scale climate change mitigation itself implies fundamental changes in, for example, the global energy system. The associated challenges come on top of others that derive from equally important ethical imperatives, such as the fulfilment of increasing food demand, that may draw on the same resources. For example, ensuring food security for a growing population may require an expansion of cropland, thereby reducing natural carbon sinks or the area available for bio-energy production. So far, available studies addressing this problem have relied on individual impact models, ignoring uncertainty in crop-model and biome-model projections. Here, we propose a probabilistic decision framework that allows for an evaluation of agricultural management and mitigation options in a multi-impact-model setting. Based on simulations generated within the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP), we outline how cross-sectorally consistent multi-model impact simulations could be used to generate the information required for robust decision making.
Using an illustrative future land use pattern, we discuss the trade-off between potential gains in crop production and associated losses in natural carbon sinks in the new multiple crop- and biome-model setting. In addition, crop and water model simulations are combined to explore irrigation increases as one possible measure of agricultural intensification that could limit the expansion of cropland required in response to climate change and growing food demand. This example shows that current impact model uncertainties pose an important challenge to long-term mitigation planning and must not be ignored in long-term strategic decision making.
In order to achieve climate change mitigation, long-term decisions are required that must be reconciled with other societal goals drawing on the same resources. For example, ensuring food security for a growing population may require an expansion of cropland, thereby reducing natural carbon sinks or the area available for bio-energy production. Here, we show that current impact-model uncertainties pose an important challenge to long-term mitigation planning and propose a new risk-assessment and decision framework that accounts for competing interests.
Based on cross-sectorally consistent simulations generated within the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP), we discuss potential gains and limitations of additional irrigation and the trade-offs of expanding agricultural land as two possible response measures to climate change and growing food demand. We describe an illustrative example in which the combination of both measures may close the supply-demand gap while leading to a loss of approximately half of all natural carbon sinks.
We highlight current limitations of available simulations and additional steps required for a comprehensive risk assessment.
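As a purely illustrative sketch of how such a multi-impact-model evaluation can be organised (the model names and all numbers below are invented placeholders, not ISI-MIP results), the following Python snippet evaluates a single hypothetical cropland-expansion scenario under every crop-model/biome-model combination and reports the ensemble spread of the resulting production gain and carbon-sink loss, which is the kind of information a robust decision framework would build on.

```python
import itertools
import statistics

# Hypothetical per-model projections for one fixed cropland-expansion scenario.
# Model names, units, and values are invented for illustration only.
crop_models = {          # extra crop production (Mt/yr) per crop model
    "crop_A": 120.0,
    "crop_B": 95.0,
    "crop_C": 140.0,
}
biome_models = {         # carbon-sink loss (GtC) per biome model
    "biome_X": 18.0,
    "biome_Y": 25.0,
}

# Evaluate every crop-model x biome-model combination of the ensemble.
results = []
for (cm, gain), (bm, loss) in itertools.product(crop_models.items(),
                                                biome_models.items()):
    results.append({"crop_model": cm, "biome_model": bm,
                    "production_gain": gain, "carbon_loss": loss})

gains = [r["production_gain"] for r in results]
losses = [r["carbon_loss"] for r in results]

print(f"production gain: {min(gains)}-{max(gains)} Mt/yr "
      f"(median {statistics.median(gains)})")
print(f"carbon-sink loss: {min(losses)}-{max(losses)} GtC "
      f"(median {statistics.median(losses)})")
```

A decision rule that is robust across the ensemble would then be chosen on the basis of these ranges rather than on any single model's projection.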
Background: Health-related and disease-specific quality of life (HRQoL) has been increasingly valued as a relevant clinical parameter in cystic fibrosis (CF) clinical care and clinical trials. HRQoL measures should assess, among other domains, daily functioning from a patient's perspective. However, validation studies for the most frequently used HRQoL questionnaire in CF, the Cystic Fibrosis Questionnaire (CFQ), have not included measures of physical activity or fitness. The objective of this study was, therefore, to determine the cross-sectional and longitudinal relationships between HRQoL, physical activity and fitness in patients with CF.
Methods: Baseline (n = 76) and 6-month follow-up data (n = 70) from patients with CF (age ≥12 years, FEV1 ≥35%) were analysed. Patients participated in two multi-centre exercise intervention studies with identical assessment methodology. Outcome variables included HRQoL (German revised multi-dimensional disease-specific CFQ (CFQ-R)), body composition, pulmonary function, physical activity, short-term muscle power, and aerobic fitness by peak oxygen uptake and aerobic power.
Results: Peak oxygen uptake was positively related to 7 of 13 HRQoL scales cross-sectionally (r = 0.30-0.46). Muscle power (r = 0.25-0.32) and peak aerobic power (r = 0.24-0.35) were positively related to 4 scales each, and reported physical activity to 1 scale (r = 0.29). Changes in HRQoL scores were directly and significantly related to changes in reported activity (r = 0.35-0.39), peak aerobic power (r = 0.31-0.34), and peak oxygen uptake (r = 0.26-0.37) in 3 scales each. Established correlates of HRQoL such as FEV1 or body mass index correlated positively with fewer scales (all 0.24 < r < 0.55).
Conclusions: HRQoL was associated with physical fitness, especially aerobic fitness, and to a lesser extent with reported physical activity. These findings underline the importance of physical fitness for HRQoL in CF and provide an additional rationale for exercise testing in this population.
Trial registration: ClinicalTrials.gov, NCT00231686
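For readers unfamiliar with the reported coefficients, the cross-sectional associations above are Pearson correlations between a fitness measure and a CFQ-R scale score. The following Python sketch uses invented data for 76 hypothetical patients (the variable names, means, and effect size are assumptions for illustration, not the study data):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Invented example: peak oxygen uptake (ml/min/kg) and one CFQ-R scale score
# (0-100) for 76 hypothetical patients.
n = 76
vo2_peak = rng.normal(32, 7, n)
cfqr_physical = np.clip(50 + 1.5 * (vo2_peak - 32) + rng.normal(0, 12, n), 0, 100)

# Cross-sectional association expressed as a Pearson correlation, analogous
# to the coefficients reported above (e.g. r = 0.30-0.46 for peak oxygen uptake).
r, p_value = pearsonr(vo2_peak, cfqr_physical)
print(f"r = {r:.2f}, p = {p_value:.3f}")
```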
A new technique developed for measuring nuclear reactions at low momentum transfer with stored beams in inverse kinematics was successfully used to study isoscalar giant resonances. The experiment was carried out at the experimental heavy-ion storage ring (ESR) at the GSI facility using a stored 58Ni beam at 100 MeV/u and an internal helium gas-jet target. In these measurements, inelastically scattered α-recoils at very forward center-of-mass angles (θcm ≤ 1.5°) were detected with a dedicated setup, including ultra-high vacuum compatible detectors. Experimental results indicate a dominant contribution of the isoscalar giant monopole resonance in this very forward angular range. It was found that the monopole contribution exhausts 79 (+12/−11)% of the energy-weighted sum rule (EWSR), which agrees with measurements performed in normal kinematics. This opens up the opportunity to investigate giant resonances in a large domain of unstable and exotic nuclei in the near future and is a fundamental milestone towards new nuclear reaction studies with stored ion beams.
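For context, the energy-weighted sum rule (EWSR) referred to above is the standard textbook limit for the isoscalar monopole transition operator; the following formula is general background, not a result of this experiment:

```latex
% Energy-weighted sum rule (EWSR) for the isoscalar giant monopole resonance.
% \hat{F} is the monopole transition operator, m the nucleon mass, A the mass
% number, and <r^2> the ground-state mean-square radius.
\[
  m_1 \;=\; \sum_f \left(E_f - E_0\right)\,
            \bigl|\langle f \,|\, \hat{F} \,|\, 0 \rangle\bigr|^2
      \;=\; \frac{2\hbar^2}{m}\, A\, \langle r^2 \rangle ,
  \qquad
  \hat{F} \;=\; \sum_{i=1}^{A} r_i^2 .
\]
```

The 79 (+12/−11)% quoted above is the fraction of this limit exhausted by the observed monopole strength.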
The nucleosynthesis of elements beyond iron is dominated by neutron captures in the s and r processes. However, 32 stable, proton-rich isotopes cannot be formed in those processes, because they are shielded from the s-process flow and the r-process β-decay chains. These nuclei are attributed to the p and rp processes.
For all of these processes, current research in nuclear astrophysics addresses the need for more precise reaction data involving radioactive isotopes. Depending on the particular reaction, direct or inverse kinematics and the forward or time-reversed reaction direction are investigated to determine, or at least to constrain, the desired reaction cross sections.
The Facility for Antiproton and Ion Research (FAIR) will offer unique, unprecedented opportunities to investigate many of the important reactions. The high yield of radioactive isotopes, even far away from the valley of stability, allows the investigation of isotopes involved in processes as exotic as the r or rp processes.
Importance: The entry of artificial intelligence into medicine is imminent. Several methods have been used for prediction from structured neuroimaging data, yet they have not been compared in this context.
Objective: Multi-class prediction is key for building computational aid systems for differential diagnosis. We compared support vector machine, random forest, gradient boosting, and deep feed-forward neural networks for the classification of different neurodegenerative syndromes based on structural magnetic resonance imaging.
Design, setting, and participants: Atlas-based volumetry was performed on multi-centric T1-weighted MRI data from 940 subjects, i.e., 124 healthy controls and 816 patients with ten different neurodegenerative diseases, leading to a multi-diagnostic multi-class classification task with eleven different classes.
Interventions: N.A.
Main outcomes and measures: Cohen's kappa, accuracy, and F1-score were used to assess model performance.
Results: Overall, the neural network produced both the best performance measures and the most robust results. The smaller classes, however, were better classified by either the ensemble learning methods or the support vector machine, while performance measures for small classes were comparatively low, as expected. Diseases with regionally specific and pronounced atrophy patterns were generally better classified than diseases with widespread and rather weak atrophy.
Conclusions and relevance: Our study underlines the necessity of larger data sets, but it also calls for careful consideration of which machine learning methods can best handle the given type of data and classification task.
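As a rough illustration of the comparison described above (not the authors' pipeline: the synthetic features, class balance, and hyperparameters below are assumptions), the following Python sketch trains the four classifier families on stand-in volumetric features for an eleven-class problem and reports Cohen's kappa, accuracy, and macro-F1:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import cohen_kappa_score, accuracy_score, f1_score

# Synthetic stand-in for atlas-based volumetry: 940 subjects, 60 regional
# volumes, 11 diagnostic classes (imbalanced in the real data; balanced here
# for simplicity).
X, y = make_classification(n_samples=940, n_features=60, n_informative=30,
                           n_classes=11, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y,
                                                    test_size=0.25, random_state=0)

models = {
    "SVM": SVC(kernel="rbf", C=1.0),
    "Random forest": RandomForestClassifier(n_estimators=300, random_state=0),
    "Gradient boosting": GradientBoostingClassifier(random_state=0),
    "Feed-forward NN": MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500,
                                     random_state=0),
}

# Fit each model and report the three performance measures on the hold-out set.
for name, model in models.items():
    y_pred = model.fit(X_train, y_train).predict(X_test)
    print(f"{name:18s} kappa={cohen_kappa_score(y_test, y_pred):.2f} "
          f"acc={accuracy_score(y_test, y_pred):.2f} "
          f"F1={f1_score(y_test, y_pred, average='macro'):.2f}")
```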