Background: Subarachnoid hemorrhage (SAH) is mainly caused by ruptured cerebral aneurysms, but in up to 15% of patients with SAH no bleeding source can be identified. Our objective was to analyze patient characteristics, clinical outcome, and prognostic factors in patients suffering from non-aneurysmal SAH.
Methods: From 1999 to 2009, data of 125 patients with non-aneurysmal SAH were prospectively entered into a database. All patients underwent repetitive cerebral angiography. Outcome was assessed according to the modified Rankin Scale (mRS) (mRS 0-2 favorable vs. 3-6 unfavorable). Also, patients were divided in two groups according to the distribution of blood in the CT scan (perimesencephalic and non-perimesencephalic SAH).
Results: 106 of the 125 patients were in good WFNS grade (I-III) at admission (85%). Overall, favorable outcome was achieved in 104 of 125 patients (83%). Favorable outcome was associated with younger age (P < 0.001), good admission status (P < 0.0001), and absence of hydrocephalus (P = 0.001). 73 of the 125 patients suffered from perimesencephalic SAH; most of these patients (90%) were in good grade at admission, and 64 achieved a favorable outcome. 52 of the 125 patients suffered from non-perimesencephalic SAH; 40 were in good grade at admission, and 40 achieved a favorable outcome.
Conclusions: Patients suffering from non-aneurysmal SAH have a better prognosis than patients with aneurysm-related SAH; poor admission status was the only independent predictor of unfavorable outcome in the multivariate analysis. Patients with a non-perimesencephalic SAH have an increased risk of a worse neurological outcome. These patients should be monitored attentively.
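The outcome dichotomization used in the Methods above (mRS 0-2 favorable vs. 3-6 unfavorable) can be sketched as a small helper; the function name is our own illustration, not code from the study:

```python
def classify_mrs(mrs_score):
    """Dichotomize the modified Rankin Scale as in the study:
    0-2 = favorable outcome, 3-6 = unfavorable outcome."""
    if not 0 <= mrs_score <= 6:
        raise ValueError("mRS scores range from 0 to 6")
    return "favorable" if mrs_score <= 2 else "unfavorable"
```

For example, `classify_mrs(2)` returns "favorable" and `classify_mrs(3)` returns "unfavorable", matching the cut-point used for the 104/125 favorable-outcome figure.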
Background: Hereditary angioedema (HAE) due to C1 inhibitor deficiency is a rare but serious and potentially life-threatening disease marked by spontaneous, recurrent attacks of swelling. The study objective was to characterize direct and indirect resource utilization associated with HAE from the patient perspective in Europe.
Methods: The study was conducted in Spain, Germany, and Denmark to assess the real-world experience of HAE via a cross-sectional survey of HAE patients, including direct and indirect resource utilization during and between attacks for patients and their caregivers over the past 6 months. A regression model examined predictors of medical resource utilization.
Results: Overall, 164 patients had an attack in the past 6 months and were included in the analysis. The most significant predictor of medical resource utilization was the severity of the last attack (OR 2.6; p < 0.001). Among patients who sought medical care during the last attack (23%), more than half utilized the emergency department. The last attack kept patients from their normal activities for an average of 4-12 hours. Patient and caregiver absenteeism increased with attack severity and frequency. Among patients who were working or in school (n = 120), 72 provided work/school absenteeism data, yielding an estimated average of 20 days missed from work/school per year; 51% (n = 84) indicated that HAE has hindered their career/educational advancement.
Conclusion: HAE poses a considerable burden on patients and their families in terms of direct medical costs and indirect costs related to lost productivity. This burden is substantial at the time of attacks and in between attacks.
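An odds ratio like the one reported for attack severity (OR 2.6) comes from a logistic regression model; for a simple two-group comparison it reduces to the cross-product ratio of a 2x2 table. The counts below are hypothetical, chosen only to illustrate the arithmetic, and are not the study's data:

```python
def odds_ratio(a, b, c, d):
    """Cross-product odds ratio for a 2x2 table, with the (hypothetical) layout:
    a = severe attack & sought care,  b = severe attack & no care,
    c = mild attack & sought care,    d = mild attack & no care."""
    if b * c == 0:
        raise ZeroDivisionError("table contains a zero cell")
    return (a * d) / (b * c)

# Hypothetical counts that happen to yield an OR of 2.6:
# odds_ratio(30, 20, 15, 26) -> 2.6
```

In the full regression model the OR is additionally adjusted for the other covariates, which the cross-product ratio does not capture.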
Background: Risk stratification, detection of minimal residual disease (MRD), and implementation of novel therapeutic agents have improved outcome in acute lymphoblastic leukemia (ALL), but survival of adult patients with T-cell acute lymphoblastic leukemia (T-ALL) remains unsatisfactory. Thus, novel molecular insights and therapeutic approaches are urgently needed.
Methods: We studied the impact of B-cell CLL/lymphoma 11b (BCL11b), a key regulator in normal T-cell development, in T-ALL patients enrolled into the German Multicenter Acute Lymphoblastic Leukemia Study Group trials (GMALL; n = 169). The mutational status (exon 4) of BCL11b was analyzed by Sanger sequencing, and mRNA expression levels were determined by quantitative real-time PCR. In addition, gene expression profiles generated on the Human Genome U133 Plus 2.0 Array (Affymetrix) were used to compare T-ALL patients with low and high BCL11b expression.
Results: We demonstrate that BCL11b is aberrantly expressed in T-ALL and gene expression profiles reveal an association of low BCL11b expression with up-regulation of immature markers. T-ALL patients characterized by low BCL11b expression exhibit an adverse prognosis [5-year overall survival (OS): low 35% (n = 40) vs. high 53% (n = 129), P = 0.02]. Within the standard risk group of thymic T-ALL (n = 102), low BCL11b expression identified patients with an unexpected poor outcome compared to those with high expression (5-year OS: 20%, n = 18 versus 62%, n = 84, P < 0.01). In addition, sequencing of exon 4 revealed a high mutation rate (14%) of BCL11b.
Conclusions: In summary, our data from a large adult T-ALL patient cohort show that low BCL11b expression was associated with poor prognosis, particularly in the standard risk group of thymic T-ALL. These findings can be utilized for improved risk prediction in a significant proportion of adult T-ALL patients who carry a high risk of standard therapy failure despite a favorable immunophenotype.
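Five-year overall survival figures like those above are typically estimated with the Kaplan-Meier product-limit method, which accounts for censored follow-up. A minimal sketch of the estimator (our own illustration, not the study's analysis code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up in years; events: 1 = death observed, 0 = censored.
    Returns (time, survival probability) pairs at each observed event time."""
    # Process subjects in time order; by convention, at tied times
    # events are handled before censorings.
    order = sorted(range(len(times)), key=lambda i: (times[i], -events[i]))
    at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if events[i]:
            surv *= (at_risk - 1) / at_risk  # step down at each death
            curve.append((times[i], surv))
        at_risk -= 1  # the subject leaves the risk set either way
    return curve
```

Applied to a toy cohort, `kaplan_meier([1, 2, 3, 5], [1, 0, 1, 1])` steps the survival curve down at years 1, 3, and 5, while the censored subject at year 2 only shrinks the risk set.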
Modeling the effects of neuronal morphology on dendritic chloride diffusion and GABAergic inhibition
(2014)
Poster presentation at the Twenty Third Annual Computational Neuroscience Meeting: CNS*2014, Québec City, Canada, 26-31 July 2014.
Gamma-aminobutyric acid type A receptors (GABAARs) are ligand-gated chloride (Cl−) channels that mediate the majority of inhibitory neurotransmission in the CNS. Spatiotemporal changes in the intracellular Cl− concentration alter the concentration gradient for Cl− across the neuronal membrane and thus affect the current flow through GABAARs and the efficacy of GABAergic inhibition. However, the impact of complex neuronal morphology on Cl− diffusion and the redistribution of intracellular Cl− is not well understood. Recently, computational models of Cl− diffusion and GABAAR-mediated inhibition in realistic neuronal morphologies became available [1-3]. Here we have used computational models of morphologically complex dendrites to test the effects of spines on Cl− diffusion. In all dendritic morphologies tested, spines slowed longitudinal Cl− diffusion along dendrites and decreased the amount and spatial spread of synaptically evoked Cl− changes. Spine densities of 2-10 spines/µm decreased the longitudinal diffusion coefficient of Cl− to 80-30% of its value in smooth dendrites, respectively. These results suggest that spines are able to limit short-term ionic plasticity [4] at dendritic GABAergic synapses.
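The reported effect of spines, reducing the apparent longitudinal diffusion coefficient to roughly 30-80% of its smooth-dendrite value, can be illustrated with a 1D finite-difference diffusion sketch. All parameter values (diffusion coefficient, grid, spine scaling factor) are illustrative choices of ours, not those of the cited models:

```python
import numpy as np

def diffuse_1d(D, n=201, dx=1.0, dt=0.2, steps=500):
    """Explicit finite-difference diffusion of a point-like Cl- load along
    a 1D cable with zero-flux (sealed-end) boundaries.
    Stability requires D * dt / dx**2 <= 0.5."""
    c = np.zeros(n)
    c[n // 2] = 1.0                       # unit Cl- load at the cable midpoint
    for _ in range(steps):
        lap = np.zeros(n)
        lap[1:-1] = c[2:] - 2 * c[1:-1] + c[:-2]
        lap[0] = c[1] - c[0]              # sealed ends: no flux out of the cable
        lap[-1] = c[-2] - c[-1]
        c = c + D * dt / dx**2 * lap
    return c

x = np.arange(201) - 100                  # position relative to the load site
D_free = 2.0                              # illustrative longitudinal D, um^2/ms
D_spiny = 0.3 * D_free                    # spines reduce apparent D to ~30%
spread_smooth = np.sqrt((diffuse_1d(D_free) * x**2).sum())
spread_spiny = np.sqrt((diffuse_1d(D_spiny) * x**2).sum())
# the spiny cable shows a narrower spatial spread of the Cl- transient
```

The total Cl− load is conserved by the sealed-end boundaries, so the standard deviation of the profile directly measures how far the transient has spread at a given time.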
Poster presentation at the Twenty Third Annual Computational Neuroscience Meeting: CNS*2014, Québec City, Canada, 26-31 July 2014.
We study random strongly heterogeneous recurrent networks of firing rate neurons, introducing the notion of cohorts: groups of co-active neurons that compete for firing with one another and whose presence depends sensitively on the structure of the input. The identities of neurons recruited to and dropped from an active cohort change smoothly with varying input features. We search for network parameter regimes in which the activation of cohorts is robust yet easily switchable by the external input and which exhibit large repertoires of different cohorts. We apply these networks to model the emergence of orientation and direction selectivity in visual cortex. We feed these random networks with a set of harmonic inputs that vary across neurons only in their temporal phase, mimicking the feedforward drive due to a moving grating stimulus. The relationship between the phases, which carries the information about the orientation of the stimulus, determines which cohort of neurons is activated. As a result the individual neurons acquire non-monotonic orientation tuning curves characterized by high orientation and direction selectivity. This mechanism for the emergence of direction selectivity differs from the classical motion detector scheme, which is based on the nonlinear summation of time-shifted inputs. In our model these two mechanisms coexist in the same network, but can be distinguished by their different frequency and contrast dependences. In general, the mechanism we study here converts a temporal phase sequence into population activity and could therefore also be used to extract and represent various other relevant stimulus features.
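The cohort mechanism described above can be sketched as a threshold-linear firing-rate network driven by harmonic inputs that differ only in temporal phase. The parameter values and the phase-assignment rule below are our own illustrative choices, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 60
W = rng.normal(0.0, 0.8 / np.sqrt(N), (N, N))   # heterogeneous recurrent weights
pos = rng.uniform(0.0, 2.0 * np.pi, N)          # hypothetical receptive-field positions

def mean_rates(theta, omega=1.0, dt=0.05, tau=1.0, steps=4000):
    """Euler-integrate tau*dr/dt = -r + [W r + I(t)]_+ and return time-averaged
    rates. Every neuron receives the same harmonic drive, shifted in phase;
    the phase pattern across neurons encodes the stimulus orientation theta."""
    r = np.zeros(N)
    acc = np.zeros(N)
    for k in range(steps):
        drive = np.cos(omega * k * dt - pos * np.cos(theta))
        r += (dt / tau) * (-r + np.maximum(W @ r + drive, 0.0))
        acc += r
    return acc / steps

def cohort(theta, thresh=0.2):
    """The 'cohort' recruited by a stimulus: the set of neurons whose
    time-averaged rate exceeds a threshold."""
    return set(np.flatnonzero(mean_rates(theta) > thresh))
```

Because only the phase relationships across neurons change with theta, any stimulus dependence of the mean rates, and hence of the recruited cohort, arises through the recurrent interactions, as in the mechanism described above.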
Understanding the diverging opinions of academic experts, stakeholders and the public is important for effective conservation management. This is especially so when a consensus is needed for action to minimize future risks but the knowledge upon which to base this action is uncertain or missing. How to manage non-native, invasive species (NIS) is an interesting case in point: the issue has long been controversial among stakeholders, but publicly visible, major disagreement among experts is recent. To characterize the breadth of experts' understanding and valuation of NIS, we performed structured qualitative interviews with 26 academic experts, 13 of whom were invasion biologists and 13 landscape experts. Within both groups, thinking varied widely, not only about basic concepts (e.g., non-native, invasive) but also about the valuation of the effects of NIS. The divergent opinions among experts, regarding both the overall severity of the problem in Europe and its importance for ecosystem services, contrasted strongly with the apparent consensus that emerges from scientific synthesis articles and policy documents. We postulate that the observed heterogeneity of expert judgments is related to three major factors: (1) diverging conceptual understandings, (2) lack of empirical information and high scientific uncertainties due to complexities and contingencies of invasion processes, and (3) missing deliberation of values. Based on theory from science studies, we interpret the notion of an NIS as a boundary object, i.e., a concept that has a similar but not identical meaning for different groups of experts and stakeholders. This interpretative flexibility can facilitate interaction across diverse groups but bears the risk of introducing misunderstandings.
An alternative to seeking consensus on exact definitions and risk assessments would be for invasive species experts to acknowledge uncertainties and engage transparently with stakeholders and the public in deliberations about conflicting opinions, taking the role of honest brokers of policy alternatives rather than of issue advocates.
Climate is frequently used to predict the outcome of species introductions based on the results from species distribution models (SDMs). However, despite the widespread use of SDMs for pre- and post-border risk assessments, data that can be used to validate predictions are often not available until after an invasion has occurred. Here we explore the potential for using historical forestry trials to assess the performance of climate-based SDMs. SDMs were parameterized based on the native range distribution of 36 Australian acacias, and predictions were compared against both the results of 150 years of government forestry trials and the current invasive distribution in southern Africa, using the true skill statistic, sensitivity, and specificity. Classification tree analysis was used to evaluate why some Australian acacias failed in trials while others were successful. Predicted suitability was significantly related to the invaded range (sensitivity = 0.87) and success in forestry trials (sensitivity = 0.80), but forestry trial failures were under-predicted (specificity = 0.35). Notably, success in the forestry trials was greater for species that are invasive somewhere in the world. SDM predictions also indicate a considerable invasion potential of eight species that are currently naturalized but not yet widespread. Forestry trial data clearly provide a useful additional source of data to validate and refine SDMs in the context of risk assessment. Our study identified the climatic factors required for successful invasion by acacias and underscores the importance of integrating a species' status elsewhere into risk assessment.
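The evaluation metrics above (true skill statistic, sensitivity, specificity) all derive from a 2x2 confusion matrix of predicted vs. observed outcomes. The counts below are hypothetical, chosen only to reproduce the reported sensitivity (0.80) and specificity (0.35) for the forestry-trial comparison:

```python
def confusion_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity and true skill statistic
    (TSS = sensitivity + specificity - 1) from a 2x2 confusion matrix
    of SDM predictions vs. observed outcomes."""
    sensitivity = tp / (tp + fn)   # fraction of observed successes predicted suitable
    specificity = tn / (tn + fp)   # fraction of observed failures predicted unsuitable
    return sensitivity, specificity, sensitivity + specificity - 1.0

# Hypothetical counts: 80 of 100 trial successes and 35 of 100 failures
# are classified correctly, matching the reported sensitivity and specificity.
sens, spec, tss = confusion_metrics(tp=80, fn=20, fp=65, tn=35)
# sens = 0.80, spec = 0.35, tss is about 0.15
```

The low TSS implied by these values reflects exactly the pattern reported: good detection of trial successes but systematic under-prediction of failures.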
A hybrid form of tilapia was introduced into Port Sulphur, Louisiana, and was subsequently managed by treatment with rotenone and stocking of native predatory fishes. Measurements of tilapia taken before this management event were compared to measurements of tilapia in the two years after the treatment. Post-management tilapia were consistently deeper-bodied and had greater weight per unit length (condition) than pre-management fish. Procrustes generalized least squares analysis supported this, finding post-management tilapia to be consistently deeper in body and head shape than pre-management fish. Although this could indicate the effectiveness of stocking native predators, several other factors, including two cold winters, seasonal effects, and reduced competition, may have contributed to this result.
NeoBiota, Volume 22 (2014)
NeoBiota, Volume 21 (2014)