The success of invasive species has been explained by two contrasting but non-exclusive views: (i) intrinsic factors make some species inherently good invaders; (ii) species become invasive as a result of extrinsic ecological and genetic influences such as release from natural enemies, hybridization or other novel ecological and evolutionary interactions. These viewpoints are rarely distinguished but hinge on distinct mechanisms leading to different management scenarios. To improve tests of these hypotheses of invasion success we introduce a simple mathematical framework to quantify the invasiveness of species along two axes: (i) interspecific differences in performance among native and introduced species within a region, and (ii) intraspecific differences between populations of a species in its native and introduced ranges. Applying these equations to a sample dataset of occurrences of 1,416 plant species across Europe, Argentina, and South Africa, we found that many species are common in their native range but become rare following introduction; only a few introduced species become more common. Biogeographical factors limiting spread (e.g. biotic resistance, time of invasion) therefore appear more common than those promoting invasion (e.g. enemy release). Invasiveness, as measured by occurrence data, is better explained by inter-specific variation in invasion potential than biogeographical changes in performance. We discuss how applying these comparisons to more detailed performance data would improve hypothesis testing in invasion biology and potentially lead to more efficient management strategies.
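The two comparison axes can be illustrated as log-ratios of a performance measure (here, occurrence counts): one comparing an introduced species against the natives of the same region, the other comparing its introduced-range populations against its native-range populations. This is a hedged sketch of the idea, not the paper's actual equations; the function name and inputs are hypothetical.

```python
import math

def invasiveness_axes(intro_occurrence, native_species_mean, home_range_occurrence):
    """Two illustrative log-ratio axes for a focal introduced species:
      inter: performance relative to the mean of native species in the region
      intra: performance in the introduced range vs. the species' native range
    Occurrence counts stand in for any performance measure (an assumption)."""
    inter = math.log(intro_occurrence / native_species_mean)
    intra = math.log(intro_occurrence / home_range_occurrence)
    return inter, intra
```

Positive values on the first axis would point toward intrinsic invasion potential, positive values on the second toward biogeographical changes in performance such as enemy release.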
Saccharum spontaneum L. is an invasive grass that has spread extensively in disturbed areas throughout the Panama Canal watershed (PCW), where it has created a fire hazard and inhibited reforestation efforts. Currently, physical removal of aboveground biomass is the primary means of controlling this weed; this approach is largely ineffective and does little to inhibit spread of the species. Little is known about reproduction of this species, although it both spreads rhizomatously and produces abundant seed. Here we report a series of studies examining some of the basic reproductive mechanisms and strategies utilised by S. spontaneum, to provide information supporting the development of better targeted management strategies. We found that seed produced between September and November was germinable both in the lab and in situ. Genetic diversity of mature stands was assessed using microsatellite markers and found to be high, even at small scales. Studies of vegetative reproduction showed that buds on stems that had been dried for up to six weeks were still capable of sprouting. Separate experiments showed that stem fragments could sprout when left on the surface or buried shallowly, and that larger pieces sprouted more readily than smaller pieces. Collectively, these results demonstrate that S. spontaneum in the PCW can produce many propagules that successfully recruit, and that seed dispersal likely drives the spread of the species. Timing management actions to reduce flowering would significantly reduce the seed load into the environment and help to prevent spread to new sites. Similarly, where biomass is cut, cutting stems into smaller pieces will allow the stems to dry out and reduce the ability of buds to sprout. Additionally, attention should be paid to preventing accidental transport to new sites on machinery.
Smut fungi are well-suited to investigating the ecology and evolution of plant pathogens, as they are strictly biotrophic yet cultivable on media. Here we report the genome sequence of Melanopsichium pennsylvanicum, which is closely related to Ustilago maydis and other Poaceae-infecting smuts but parasitizes a dicot plant. To explore the evolutionary patterns resulting from host adaptation after this huge host jump, the genome of M. pennsylvanicum was sequenced and compared to the genomes of Ustilago maydis, Sporisorium reilianum, and Ustilago hordei. While all four genomes showed similar completeness in CEGMA analyses, gene absence was highest in M. pennsylvanicum and most pronounced among putative secreted proteins, which are often considered effector candidates. In contrast, the number of private genes was similar among the species, highlighting that gene loss rather than gene gain is the hallmark of adaptation after the host jump to the dicot host. Our analyses revealed a trend for putative effectors to be located next to other putative effectors, but the majority are not organized in clusters, so a focus on pathogenicity clusters might not be appropriate for all smut genomes. Positive selection analyses revealed that M. pennsylvanicum has the highest number and proportion of genes under positive selection. In general, putative effectors showed a higher proportion of positively selected genes than non-effector candidates. The 248 putative secreted effectors found in all four smut genomes might constitute a core set needed for pathogenicity, while the 92 that are found in all grass-parasitic smuts but have no ortholog in M. pennsylvanicum might constitute a set of effectors important for successful colonization of grass hosts.
Obesity and associated lifestyle in a large sample of multi-morbid German primary care attendees
(2014)
Background: Obesity and the accompanying increased morbidity and mortality risk are highly prevalent among older adults. As obese elderly might benefit from intentional weight reduction, it is necessary to identify potentially modifiable factors associated with senior obesity. This cross-sectional study focuses on multi-morbid patients, who make up the majority in primary care. It reports on the prevalence of senior obesity and its associations with lifestyle behaviors.
Methods: A total of 3,189 non-demented, multi-morbid participants aged 65–85 years were recruited in primary care within the German MultiCare-study. Physical activity, smoking, alcohol consumption and quantity and quality of nutritional intake were classified as relevant lifestyle factors. Body Mass Index (BMI, general obesity) and waist circumference (WC, abdominal obesity) were used as outcome measures and regression analyses were conducted.
Results: About one third of all patients were classified as obese according to BMI. The prevalence of abdominal obesity was 73.5%. Adjusted for socio-demographic variables and objective and subjective disease burden, participants with low physical activity had a 1.6 kg/m² higher BMI as well as a higher WC (4.9 cm, p<0.001). Current smoking and high alcohol consumption were associated with lower BMI and WC. In multivariate logistic regression, using elevated WC and BMI as categorical outcomes, the same pattern of lifestyle factors was observed. For WC only, former rather than current smoking was associated with a higher probability of elevated WC. Dietary intake, in quantity and quality, was not associated with BMI or WC in either model.
Conclusions: Further research is needed to clarify whether the large prevalence discrepancy between BMI and WC also reflects a difference in obesity-related morbidity and mortality. Likewise, age-specific thresholds for the BMI are needed. Encouraging and promoting physical activity in older adults might be a starting point for weight reduction efforts.
Introduction: Multimorbidity is a major concern in primary care. Nevertheless, evidence on the prevalence, patterns and determinants of multimorbidity is scarce. The aim of this study is to systematically review studies of the prevalence, patterns and determinants of multimorbidity in primary care.
Methods: Systematic review of literature published between 1961 and 2013 and indexed in Ovid (CINAHL, PsycINFO, Medline and Embase) and Web of Knowledge. Studies addressing the prevalence, determinants or patterns of multimorbidity in primary care were selected according to predefined eligibility criteria, using a pretested proforma. Quality and risk of bias were assessed using STROBE criteria. Two researchers assessed the eligibility of studies for inclusion (Kappa = 0.86).
Results: We identified 39 eligible publications describing studies that included a total of 70,057,611 patients in 12 countries. The number of health conditions analysed per study ranged from 5 to 335, with multimorbidity prevalence ranging from 12.9% to 95.1%. All studies observed a significant positive association between multimorbidity and age (odds ratio [OR], 1.26 to 227.46), and lower socioeconomic status (OR, 1.20 to 1.91). Positive associations with female gender and mental disorders were also observed. The most frequent patterns of multimorbidity included osteoarthritis together with cardiovascular and/or metabolic conditions.
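The association strengths above are reported as odds ratios. As a reminder of what that statistic measures, a minimal sketch with hypothetical counts (not data from the review):

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio from a 2x2 table: the odds of the outcome among the exposed
    divided by the odds among the unexposed. All counts here are hypothetical."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
```

For example, comparing multimorbidity counts in an older versus a younger age group yields the kind of OR reported for age above.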
Conclusions: Well-established determinants of multimorbidity include age, lower socioeconomic status and gender. The most prevalent conditions shape the patterns of multimorbidity. However, the limitations of the current evidence base mean that further and better designed studies are needed to inform policy, research and clinical practice, with the goal of improving health-related quality of life for patients with multimorbidity. Standardization of the definition and assessment of multimorbidity is an essential and immediate step toward better understanding this phenomenon.
HIV vaccine preclinical testing is difficult because HIV’s only relevant hosts are humans and no correlates of protection are known. To this end, we are working on the humanization of different mouse strains with human peripheral blood mononuclear cells (PBMCs) as well as human hematopoietic stem cells (HSC) to generate a useful small animal model.
We generated immune-deficient mice (NOD Scid IL2gc-/- / NOD Rag1-/- IL2gc-/-) expressing human MHC class II (HLA-DQ8) on a mouse class II-deficient background (Ab-/-). The human HLA-DQ8 molecule should interact with the T cell receptors of transferred HLA-matched human PBMCs and could therefore support the functionality of the transferred human CD4+ cells in the mice.
Mice that were adoptively transferred with human HLA-DQ8 PBMCs only showed engraftment of CD3+ T cells. Surprisingly, the presence of HLA class II did not significantly change the repopulation rates in the mice. Also, the presence of HLA class II did not advance B cell engraftment, such that humoral immune responses were undetectable. However, the overall survival of DQ8-expressing mice was significantly prolonged, compared to mice expressing mouse MHC class II molecules, and correlated with an increased time span until onset of GvHD.
To avoid GvHD and to increase and maintain the level of human cell reconstitution over a long period of time, the same mouse strains were reconstituted with human HSC. Compared to PBMC-repopulated mice, HSC-reconstituted mice develop almost all subpopulations of the human immune system, detectable at week 12 after HSC transfer. These mice developed adaptive immune responses after tetanus toxoid (TT) immunizations. In addition, we are testing the susceptibility of these humanized mice to different HIV strains, with a detailed look at immune responses.
We develop a methodology to identify and rank “systemically important financial institutions” (SIFIs). Our approach is consistent with that followed by the Financial Stability Board (FSB) but, unlike the latter, it is free of judgment and based entirely on publicly available data, thus filling the gap between the official views of the regulator and those that market participants can form with their own information set. We apply the methodology to annual data on three samples of banks (global, EU and euro area) for the years 2007-2012. We examine the evolution of the SIFIs over time and document the shifts in the relative weights of the major geographic areas. We also discuss the implications of the 2013 update of the identification methodology proposed by the FSB.
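An indicator-based ranking of the kind described can be sketched as follows: each bank's share of the sample total is computed per indicator category, and the equal-weighted average of those shares gives the score in basis points. This is a simplified, hypothetical illustration of the FSB-style approach, not the paper's exact procedure; bank names and categories are made up.

```python
def sifi_scores(banks):
    """Indicator-based score: each bank's share of the sample total in every
    category, averaged across categories with equal weights, expressed in
    basis points. `banks` maps bank name -> {category: indicator value}."""
    cats = sorted({c for v in banks.values() for c in v})
    totals = {c: sum(b[c] for b in banks.values()) for c in cats}
    return {name: 10_000 * sum(b[c] / totals[c] for c in cats) / len(cats)
            for name, b in banks.items()}
```

With public balance-sheet data per category, ranking the resulting scores reproduces the kind of judgment-free ordering the abstract describes.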
We examine the effects of credit default swaps (CDS), a major type of over-the-counter derivative, on the corporate liquidity management of the reference firms. CDS help firms to access the credit market since the lenders can hedge their credit risk more easily using these contracts. However, CDS-protected creditors can be tougher in debt renegotiations and less willing to support distressed borrowers, causing some firms to become more cautious. Consequently, we find that firms hold significantly more cash after the inception of CDS trading on their debt. The increase in cash holdings by CDS firms is more pronounced for financially constrained firms and firms facing higher refinancing risk. Moreover, bank relationships and outstanding credit facilities intensify the CDS effect on cash holding. Finally, firms with greater financial expertise hold more cash when their debt is referenced by CDS. These findings suggest that CDS, which are primarily a risk management tool for lenders, induce firms to adopt more conservative liquidity policies.
We use a unique data set from the Trade Reporting and Compliance Engine (TRACE) to study liquidity effects in the US structured product market. Our main contribution is the analysis of the relation between the accuracy in measuring liquidity and the potential degree of disclosure. Having access to all relevant trading information, we provide evidence that transaction cost measures that use dealer specific information such as trader identity and trade direction can be efficiently proxied by measures that use less detailed information. This finding is important for all market participants in the context of OTC markets, as it fosters our understanding of the information contained in transaction data. Thus, our results provide guidance for improving transparency while maintaining trader confidentiality. In addition, we analyze liquidity in the structured product market in general and show that securities that are mainly institutionally traded, guaranteed by a federal authority, or have low credit risk, tend to be more liquid.
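One standard example of a transaction-cost measure that needs neither trader identity nor trade direction is Roll's (1984) estimator, which infers an implied spread from the serial covariance of transaction price changes alone. The sketch below is illustrative of such a low-information proxy; the paper's own measures are not reproduced here.

```python
import math
import statistics

def roll_spread(prices):
    """Roll (1984) implied spread: 2 * sqrt(-cov(dP_t, dP_{t-1})), computed
    from a sequence of transaction prices only (no trade signs required).
    Returns 0.0 when the serial covariance is non-negative."""
    d = [b - a for a, b in zip(prices, prices[1:])]
    pairs = list(zip(d[:-1], d[1:]))
    mx = statistics.mean(x for x, _ in pairs)
    my = statistics.mean(y for _, y in pairs)
    cov = sum((x - mx) * (y - my) for x, y in pairs) / len(pairs)
    return 2 * math.sqrt(-cov) if cov < 0 else 0.0
```

A pure bid-ask bounce produces negatively autocorrelated price changes and hence a positive spread estimate, while a trending price series with no bounce yields zero.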
Gene transfer vectors such as lentiviral vectors offer versatile possibilities to express transgenic antigens for vaccination purposes. However, viral vaccines leading to broad transduction and transgene expression in vivo are undesirable. Therefore, strategies capable of directing gene transfer only to professional antigen-presenting cells would increase the specific activity and safety of genetic vaccines. A lentiviral vector pseudotype specific for murine major histocompatibility complex class II (LV-MHCII) was recently developed, and the present study aims to characterize the in vivo biodistribution profile and immunization potential of this vector in mice. Whereas the systemic administration of a vector pseudotyped with a ubiquitously-interacting envelope led to prominent detection of vector copies in the liver of animals, the injection of an equivalent amount of LV-MHCII resulted in a more specific biodistribution of vector and transgene. Copies of LV-MHCII were found only in secondary lymphoid organs, essentially in CD11c+ dendritic cells expressing the transgene, whereas B cells were not efficiently targeted in vivo, contrary to expectations based on in vitro testing. Upon a single injection of LV-MHCII, naive mice mounted specific effector CD4 and CD8 T cell responses against the intracellular transgene product, with the generation of Th1 cytokines, development of in vivo cytotoxic activity and establishment of T cell immune memory. The targeting of dendritic cells by recombinant viral vaccines must therefore be assessed in vivo, but this strategy is feasible, effective for immunization and cross-presentation, and constitutes a potentially safe alternative to limit off-target gene expression in gene-based vaccination strategies with integrative vectors.
Background: Autotaxin (ATX) and its product lysophosphatidic acid (LPA) are considered to be involved in the development of liver fibrosis, and elevated levels of serum ATX have been found in patients with hepatitis C virus-associated liver fibrosis. However, the clinical role of systemic ATX across the stages of liver cirrhosis was unknown. Here we investigated the relation between ATX serum levels and the severity of cirrhosis, as well as the prognosis of cirrhotic patients.
Methods: Patients with liver cirrhosis were prospectively enrolled and followed until death, liver transplantation or last contact. Blood samples drawn at the day of inclusion in the study were assessed for ATX content by an enzyme-linked immunosorbent assay. ATX levels were correlated with the stage as well as complications of cirrhosis. The prognostic value of ATX was investigated by uni- and multivariate Cox regression analyses. LPA concentration was determined by liquid chromatography-tandem mass spectrometry.
Results: 270 patients were enrolled. Subjects with liver cirrhosis showed elevated serum levels of ATX as compared to healthy subjects (0.814±0.42 mg/l vs. 0.258±0.40 mg/l, P<0.001). Serum ATX levels correlated with the Child-Pugh stage and the MELD (model of end stage liver disease) score and LPA levels (r = 0.493, P = 0.027). Patients with hepatic encephalopathy (P = 0.006), esophageal varices (P = 0.002) and portal hypertensive gastropathy (P = 0.008) had higher ATX levels than patients without these complications. Low ATX levels were a parameter independently associated with longer overall survival (hazard ratio 0.575, 95% confidence interval 0.365–0.905, P = 0.017).
Conclusion: Serum ATX is an indicator for the severity of liver disease and the prognosis of cirrhotic patients.
Global-scale assessments of freshwater fluxes and storages by hydrological models under historic climate conditions are subject to a variety of uncertainties. Using the global hydrological model WaterGAP 2.2, we investigated the sensitivity of simulated freshwater fluxes and water storage variations to five major sources of uncertainty: climate forcing, land cover input, model structure, consideration of human water use and calibration (or no calibration). In a modelling experiment, five variants of the standard version of WaterGAP 2.2 were generated that differed from the standard version only regarding the investigated source of uncertainty. Sensitivity was analyzed by comparing water fluxes and water storage variations computed by the variants to those of the standard version, considering both global averages and grid cell values for the time period 1971–2000. The basin-specific calibration approach for WaterGAP, which forces simulated mean annual river discharge to be equal to observed values at 1319 gauging stations (representing 54% of global land area except Antarctica and Greenland), has the largest effect on modelled water fluxes and leads to the best fit of modelled to observed monthly and seasonal river discharge. Alternative state-of-the-art climate forcings rank second regarding the impact on grid-cell-specific fluxes and water storage variations, and their impact is ubiquitous and stronger than that of alternative land cover inputs. The diverse model refinements during the last decade led to an improved fit to observed discharge, and affect globally averaged fluxes and storage values (the latter mainly due to modelling of groundwater depletion) but only affect a relatively small number of grid cells. Considering human water use is important for the global water storage trend (in particular in the groundwater compartment), but impacts on water fluxes are rather local and only important where water use is high.
The best fit to observed time series of monthly river discharge (Nash–Sutcliffe criterion) or discharge seasonality is obtained with the standard WaterGAP 2.2 model version, which is calibrated and driven by a sequence of two time series of daily observation-based climate forcings, WFD/WFDEI. Discharge computed by a calibrated model version using monthly CRU 3.2 and GPCC v6 climate input showed a reduced fit to observed discharge for most stations. Taking into account the investigated uncertainties of climate and land cover data, we estimate that the global 1971–2000 discharge into oceans and inland sinks is between 40 000 and 42 000 km3 yr−1. The range is mainly due to differences in precipitation data that affect discharge in uncalibrated river basins. Actual evapotranspiration, at approximately 70 000 km3 yr−1, is rather unaffected by climate and land cover data in the global sum but differs spatially. Human water use is calculated to reduce river discharge by approximately 1000 km3 yr−1. Thus, global renewable water resources are estimated to range between 41 000 and 43 000 km3 yr−1. The climate data sets WFD (available until 2001) and WFDEI (starting in 1979) were found to be inconsistent with respect to shortwave radiation data, resulting in strongly different potential evapotranspiration. Global assessments of freshwater fluxes and storages would therefore benefit from the development of a global data set of consistent daily climate forcing from 1900 to the present.
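The global water-balance figures quoted above fit together by simple addition: renewable water resources equal the discharge to oceans and inland sinks plus the discharge reduction attributed to human water use. A one-line check, with all values in km3 yr−1 taken from the text:

```python
def renewable_resources(discharge_km3_yr, human_use_reduction_km3_yr=1000):
    """Renewable water resources = river discharge into oceans/inland sinks
    plus the ~1000 km3/yr by which human water use is calculated to reduce
    that discharge (figures from the abstract)."""
    return discharge_km3_yr + human_use_reduction_km3_yr
```

Applied to the 40 000–42 000 km3 yr−1 discharge range, this reproduces the stated 41 000–43 000 km3 yr−1 estimate of global renewable water resources.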
MicroRNAs (miRNAs, miRs) emerged as key regulators of gene expression. Germline hemizygous deletion of the gene that encodes the miR-17~92 miRNA cluster was associated with microcephaly, short stature and digital abnormalities in humans. Mice deficient for the miR-17~92 cluster phenocopy several features such as growth and skeletal development defects and exhibit impaired B cell development. However, the individual contribution of miR-17~92 cluster members to this phenotype is unknown. Here we show that germline deletion of miR-92a in mice does not affect heart development and does not reduce circulating or bone marrow-derived hematopoietic cells, but induces skeletal defects. MiR-92a−/− mice are born at a reduced Mendelian ratio, but surviving mice are viable and fertile. However, the body weight of miR-92a−/− mice was reduced during embryonic and postnatal development and in adulthood. A significantly reduced body and skull length was observed in miR-92a−/− mice compared to wild type littermates. µCT analysis revealed that the length of the 5th mesophalanx to 5th metacarpal bone of the forelimbs was significantly reduced, but bones of the hindlimbs were not altered. Bone density was not affected. These findings demonstrate that deletion of miR-92a is sufficient to induce a developmental skeletal defect.
Knowledge of factors influencing the timing of reproduction is important for animal conservation and management. Brown bears (Ursus arctos) are able to vary the birth date of their cubs in response to their fat stores, but little information is available about the timing of implantation and parturition in free-ranging brown bears. Body temperature and activity of pregnant brown bears are higher during the gestation period than during the rest of hibernation and drop at parturition. We compared mean daily body temperature and activity levels of pregnant and nonpregnant females during preimplantation, gestation, and lactation. Additionally, we tested whether age, litter size, primiparity, environmental conditions, and the start of hibernation influence the timing of parturition. The mean date of implantation was 1 December (SD = 12), the mean date of parturition was 26 January (SD = 12), and the mean duration of the gestation period was 56 days (SD = 2). The body temperature of pregnant females was higher during the gestation and lactation periods than that of nonpregnant bears, and decreased over the course of the gestation period. Activity recordings were also used to determine the date of parturition. The parturition dates calculated from activity and body temperature data did not differ significantly and were identical in 50% of the females. Older females started hibernation earlier, and the start of hibernation was earlier during years with favorable environmental conditions. Unexpectedly, dates of parturition were later during years with good environmental conditions. We suggest that free-ranging pregnant brown bears in areas with high levels of human activity at the beginning of the denning period, as in our study area, might prioritize investing energy in early denning rather than in early parturition during years with favorable environmental conditions, as a strategy to avoid disturbance by humans.
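Determining the parturition date from body-temperature records, as described above, amounts to locating the characteristic drop in the daily series. A minimal sketch; the threshold and the day-over-day rule are illustrative assumptions, not the study's actual algorithm:

```python
def detect_parturition(daily_mean_temp, drop_threshold=0.5):
    """Return the index of the first day whose mean body temperature falls
    more than `drop_threshold` degrees C below the previous day's mean,
    or None if no such drop occurs. Threshold is an illustrative assumption."""
    for i in range(1, len(daily_mean_temp)):
        if daily_mean_temp[i - 1] - daily_mean_temp[i] > drop_threshold:
            return i
    return None
```

The same change-point logic applied to daily activity counts would give the activity-based parturition date that the study compares against the temperature-based one.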
Sanctions placed upon airlines and other operators transporting persons without the required paperwork are called ‘carrier sanctions’. They constitute a key example of how border control mechanisms are currently being outsourced, privatized, delegated, and moved from the border itself to new physical locations. These practices can lead to a phenomenon referred to in this paper as ‘hidden coercion’. This paper argues that, while hidden coercion is commonplace in the reality of migration policy in most states, it is so far neglected in theoretical discussions of state coercion. Moreover, the discussion of carrier sanctions demonstrates that this neglect is problematic, since hidden coercion is not justifiable even within a framework that legitimizes state border coercion.
Cytochrome P450 (CYP) epoxygenases generate bioactive lipid epoxides which can be further metabolized to supposedly less active diols by the soluble epoxide hydrolase (sEH). As the role of epoxides and diols in angiogenesis is unclear, we compared retinal vasculature development in wild-type and sEH−/− mice. Deletion of the sEH significantly delayed angiogenesis, tip cell, and filopodia formation, a phenomenon associated with activation of the Notch signaling pathway. In the retina, sEH was localized in Müller glia cells, and Müller cell–specific sEH deletion reproduced the sEH−/− retinal phenotype. Lipid profiling revealed that sEH deletion decreased retinal and Müller cell levels of 19,20-dihydroxydocosapentaenoic acid (DHDP), a diol of docosahexaenoic acid (DHA). 19,20-DHDP suppressed endothelial Notch signaling in vitro via inhibition of the γ-secretase and the redistribution of presenilin 1 from lipid rafts. Moreover, 19,20-DHDP, but not the parent epoxide, was able to rescue the defective angiogenesis in sEH−/− mice as well as in animals lacking the Fbxw7 ubiquitin ligase, which demonstrate strong basal activity of the Notch signaling cascade. These studies demonstrate that retinal angiogenesis is regulated by a novel form of neuroretina–vascular interaction involving the sEH-dependent generation of a diol of DHA in Müller cells.
Alternative polyadenylation (APA) is a widespread mechanism that contributes to the sophisticated dynamics of gene regulation. Approximately 50% of all protein-coding human genes harbor multiple polyadenylation (PA) sites; their selective and combinatorial use gives rise to transcript variants with differing lengths of their 3' untranslated region (3'UTR). Shortened variants escape UTR-mediated regulation by microRNAs (miRNAs), especially in cancer, where global 3'UTR shortening accelerates disease progression, dedifferentiation and proliferation. Here we present APADB, a database of vertebrate PA sites determined by 3' end sequencing, using massive analysis of complementary DNA ends. APADB provides (A)PA sites for coding and non-coding transcripts of human, mouse and chicken genes. For human and mouse, several tissue types, including different cancer specimens, are available. APADB records the loss of predicted miRNA binding sites and visualizes next-generation sequencing reads that support each PA site in a genome browser. The database tables can either be browsed by organism and tissue or searched for a gene of interest. APADB is the largest database of APA in human, chicken and mouse. The stored information provides experimental evidence for thousands of PA sites and APA events. APADB combines 3' end sequencing data with prediction algorithms for miRNA binding sites, which may allow such algorithms to be further improved. Current databases lack accurate information about 3'UTR lengths, especially for chicken, and APADB provides the necessary information to close this gap. Database URL: http://tools.genxpro.net/apadb/
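The APA mechanism described above, where selecting among multiple PA sites yields transcript variants with different 3'UTR lengths, can be sketched in a few lines. The coordinates are hypothetical and APADB's actual schema is not reproduced here:

```python
def utr_lengths(stop_codon_end, pa_sites):
    """Possible 3'UTR lengths for a transcript, given the genomic coordinate
    just after the stop codon and the coordinates of downstream PA sites
    (same strand, positions increasing). Choosing a proximal site yields a
    shorter 3'UTR that loses more of its miRNA binding sites."""
    return sorted(site - stop_codon_end for site in pa_sites)
```

In an APADB-like resource, comparing the lengths observed in tumor versus normal tissue for the same gene exposes the global 3'UTR shortening mentioned above.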
Background: Subarachnoid hemorrhage (SAH) is mainly caused by ruptured cerebral aneurysms but in up to 15% of patients with SAH no bleeding source could be identified. Our objective was to analyze patient characteristics, clinical outcome and prognostic factors in patients suffering from non-aneurysmal SAH.
Methods: From 1999 to 2009, data of 125 patients with non-aneurysmal SAH were prospectively entered into a database. All patients underwent repetitive cerebral angiography. Outcome was assessed according to the modified Rankin Scale (mRS) (mRS 0-2 favorable vs. 3-6 unfavorable). Patients were also divided into two groups according to the distribution of blood in the CT scan (perimesencephalic and non-perimesencephalic SAH).
Results: 106 of the 125 patients were in good WFNS grade (I-III) at admission (85%). Overall, favorable outcome was achieved in 104 of 125 patients (83%). Favorable outcome was associated with younger age (P < 0.001), good admission status (P < 0.0001), and absence of hydrocephalus (P = 0.001). 73 of the 125 patients suffered from perimesencephalic SAH; most of these (90%) were in good grade at admission, and 64 achieved favorable outcome. 52 of the 125 patients suffered from non-perimesencephalic SAH; 40 were in good grade at admission, and 40 achieved favorable outcome.
Conclusions: Patients suffering from non-aneurysmal SAH have a better prognosis than those with aneurysm-related SAH, and poor admission status was the only independent predictor of unfavorable outcome in the multivariate analysis. Patients with a non-perimesencephalic SAH have an increased risk of a worse neurological outcome. These patients should be monitored attentively.
Background: Hereditary angioedema (HAE) due to C1 inhibitor deficiency is a rare but serious and potentially life-threatening disease marked by spontaneous, recurrent attacks of swelling. The study objective was to characterize direct and indirect resource utilization associated with HAE from the patient perspective in Europe.
Methods: The study was conducted in Spain, Germany, and Denmark to assess the real-world experience of HAE via a cross-sectional survey of HAE patients, including direct and indirect resource utilization during and between attacks for patients and their caregivers over the past 6 months. A regression model examined predictors of medical resource utilization.
Results: Overall, 164 patients had an attack in the past 6 months and were included in the analysis. The most significant predictor of medical resource utilization was the severity of the last attack (OR 2.6; p < 0.001). Among patients who sought medical care during the last attack (23%), more than half utilized the emergency department. The last attack prevented patients from pursuing their normal activities for an average of 4-12 hours. Patient and caregiver absenteeism increased with attack severity and frequency. Among patients who were working or in school (n = 120), 72 provided work/school absenteeism data, yielding an estimated average of 20 days missed from work/school per year; 51% (n = 84) indicated that HAE has hindered their career/educational advancement.
Conclusion: HAE poses a considerable burden on patients and their families in terms of direct medical costs and indirect costs related to lost productivity. This burden is substantial at the time of attacks and in between attacks.
Background: Risk stratification, detection of minimal residual disease (MRD), and implementation of novel therapeutic agents have improved outcome in acute lymphoblastic leukemia (ALL), but survival of adult patients with T-cell acute lymphoblastic leukemia (T-ALL) remains unsatisfactory. Thus, novel molecular insights and therapeutic approaches are urgently needed.
Methods: We studied the impact of B-cell CLL/lymphoma 11b (BCL11b), a key regulator of normal T-cell development, in T-ALL patients enrolled in the German Multicenter Acute Lymphoblastic Leukemia Study Group trials (GMALL; n = 169). The mutational status (exon 4) of BCL11b was analyzed by Sanger sequencing, and mRNA expression levels were determined by quantitative real-time PCR. In addition, gene expression profiles generated on the Human Genome U133 Plus 2.0 Array (Affymetrix) were used to compare BCL11b low- and high-expressing T-ALL patients.
Results: We demonstrate that BCL11b is aberrantly expressed in T-ALL and gene expression profiles reveal an association of low BCL11b expression with up-regulation of immature markers. T-ALL patients characterized by low BCL11b expression exhibit an adverse prognosis [5-year overall survival (OS): low 35% (n = 40) vs. high 53% (n = 129), P = 0.02]. Within the standard risk group of thymic T-ALL (n = 102), low BCL11b expression identified patients with an unexpected poor outcome compared to those with high expression (5-year OS: 20%, n = 18 versus 62%, n = 84, P < 0.01). In addition, sequencing of exon 4 revealed a high mutation rate (14%) of BCL11b.
Conclusions: In summary, our data from a large adult T-ALL patient cohort show that low BCL11b expression was associated with poor prognosis, particularly in the standard-risk group of thymic T-ALL. These findings can be utilized for improved risk prediction in a significant proportion of adult T-ALL patients who carry a high risk of standard therapy failure despite a favorable immunophenotype.