Nodular lymphocyte predominant Hodgkin lymphoma (NLPHL) is an indolent lymphoma but can transform into diffuse large B cell lymphoma (DLBCL), which shows a more aggressive clinical behavior. Little is known about these cases at the molecular level. The aim of the present study was therefore to characterize DLBCL transformed from NLPHL (LP-DLBCL) by gene expression profiling (GEP). GEP revealed an inflammatory signature pointing to a specific host response. In a coculture model resembling this host response, DEV tumor cells showed impaired growth. Mechanisms involved in the reduced tumor cell proliferation included a downregulation of MYC and its target genes. Lack of MYC expression was also confirmed in 12/16 LP-DLBCL by immunohistochemistry. Furthermore, CD274/PD-L1 was upregulated in DEV tumor cells after coculture with T cells or monocytes, and its expression was validated in 12/19 cases of LP-DLBCL. Our data thereby provide new insights into the pathogenesis of LP-DLBCL and an explanation for its relatively low tumor cell content. Moreover, the findings suggest that treatment with immune checkpoint inhibitors may enhance the already ongoing host response in these patients.
Purpose: Advanced Ewing sarcomas have a poor prognosis. They are defined by early relapse (<24 months after diagnosis) and/or by metastasis to multiple bones or bone marrow (BM). We analyzed risk factors, toxicity and survival in advanced Ewing sarcoma patients treated with the MetaEICESS protocols vs. the EICESS 1992 protocol.
Design: Of 44 patients, 18 were enrolled into two consecutive MetaEICESS protocols between 1992 and 2014; their outcomes were compared with those of 26 advanced Ewing sarcoma patients treated with EICESS 1992 between 1992 and 1996. MetaEICESS 1992 consisted of induction chemotherapy, whole-body imaging-directed radiotherapy to the primary tumor and metastases, and tandem high-dose chemotherapy with autologous rescue. In MetaEICESS 2007, this treatment was complemented by allogeneic stem cell transplantation. EICESS 1992 comprised induction chemotherapy and local therapy to the primary tumor only, followed by consolidation chemotherapy.
Results: In MetaEICESS, 8/18 patients survived in complete remission vs. 2/26 in EICESS 1992 (p<0.05). Survival did not differ between MetaEICESS 2007 and MetaEICESS 1992. Three MetaEICESS patients died of complications, all in MetaEICESS 1992. After exclusion of patients succumbing to treatment-related complications (n=3), 7/10 patients without BM involvement survived, in contrast to 0/5 patients with BM involvement. This was confirmed in a multivariate analysis. There was no correlation between BM involvement and the number of metastases at diagnosis.
Conclusion: The MetaEICESS protocols yield long-term disease-free survival in patients with advanced Ewing sarcoma. Allogeneic stem cell transplantation was not associated with increased death from complications. Bone marrow involvement is a risk factor distinct from multiple bone metastases.
Apoptosis is deregulated in most, if not all, cancers, including hematological malignancies. Smac mimetics that antagonize Inhibitor of Apoptosis (IAP) proteins have so far largely been investigated in acute myeloid leukemia (AML) cell lines; however, little is yet known about the therapeutic potential of Smac mimetics in primary AML samples. In this study, we therefore investigated the antileukemic activity of the Smac mimetic BV6 in diagnostic samples of 67 adult AML patients and correlated the response to clinical, cytogenetic and molecular markers and gene expression profiles. Treatment with cytarabine (ara-C) was used as a standard chemotherapeutic agent. Interestingly, about half (51%) of primary AML samples are sensitive to BV6, 21% show an intermediate response, and 28% are resistant. Notably, 69% of ara-C-resistant samples show a good to fair response to BV6. Furthermore, combination treatment with ara-C and BV6 exerts additive effects in most samples. Whole-genome gene expression profiling identifies cell death, TNFR1 and NF-κB signaling among the top pathways that are activated by BV6 in BV6-sensitive, but not in BV6-resistant cases. Furthermore, sensitivity of primary AML blasts to BV6 correlates with significantly elevated expression levels of TNF and lower levels of XIAP in diagnostic samples, as well as with NPM1 mutation. In a large set of primary AML samples, these data provide novel insights into factors regulating Smac mimetic response in AML and have important implications for the development of Smac mimetic-based therapies and related diagnostics in AML.
The use of phylogenies in ecology is increasingly common and has broadened our understanding of biological diversity. Ecological sub-disciplines, particularly conservation, community ecology and macroecology, all recognize the value of evolutionary relationships but the resulting development of phylogenetic approaches has led to a proliferation of phylogenetic diversity metrics. The use of many metrics across the sub-disciplines hampers potential meta-analyses, syntheses, and generalizations of existing results. Further, there is no guide for selecting the appropriate metric for a given question, and different metrics are frequently used to address similar questions. To improve the choice, application, and interpretation of phylo-diversity metrics, we organize existing metrics by expanding on a unifying framework for phylogenetic information.
Questions about phylogenetic relationships within or between assemblages generally take one of three forms: how much; how different; or how regular? We show that these questions reflect three dimensions of a phylogenetic tree: richness, divergence, and regularity. We classify 70 existing phylo-diversity metrics based on their mathematical form within these three dimensions and identify ‘anchor’ representatives: for α-diversity metrics these are PD (Faith's phylogenetic diversity), MPD (mean pairwise distance), and VPD (variation of pairwise distances). By analysing mathematical formulae and running simulations, we use this framework to identify metrics that mix dimensions, and we provide a guide to choosing and using the most appropriate metrics. We show that metric choice requires connecting the research question with the correct dimension of the framework and that there are logical approaches to selecting and interpreting metrics. The guide outlined herein will help researchers navigate the current jungle of indices.
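The two distance-based anchor metrics have simple closed forms: MPD is the mean, and VPD the variance, of all pairwise patristic distances among the taxa in an assemblage. A minimal illustrative sketch (the distance values and taxon names are toy assumptions, not data from the study):

```python
import itertools
import statistics

def mpd(dist, taxa):
    """Mean pairwise distance (MPD): average patristic distance over all taxon pairs."""
    return statistics.mean(dist[a][b] for a, b in itertools.combinations(taxa, 2))

def vpd(dist, taxa):
    """Variation of pairwise distances (VPD): population variance over the same pairs."""
    return statistics.pvariance(dist[a][b] for a, b in itertools.combinations(taxa, 2))

# Hypothetical pairwise patristic distances for a four-taxon tree ((a,b),(c,d));
# a/b and c/d are sister pairs, hence the short within-pair distances.
dist = {
    "a": {"b": 2.0, "c": 6.0, "d": 6.0},
    "b": {"a": 2.0, "c": 6.0, "d": 6.0},
    "c": {"a": 6.0, "b": 6.0, "d": 2.0},
    "d": {"a": 6.0, "b": 6.0, "c": 2.0},
}

taxa = ["a", "b", "c", "d"]
print(round(mpd(dist, taxa), 3))  # 4.667
print(round(vpd(dist, taxa), 3))  # 3.556
```

PD, the third anchor, additionally requires the tree itself rather than only pairwise distances, since it sums the branch lengths of the subtree spanning the assemblage.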
The multifunctional protein p21Cip1/CDKN1A (p21) is an important and universal Cdk-interacting protein. Recently, we have reported that p21 is involved in the regulation of the mitotic kinase Cdk1/cyclin B1 and critical for successful mitosis and cytokinesis. In the present work we show that S130 of p21 is phosphorylated by Cdk1/cyclin B1 during mitosis, which reduces p21’s stability and binding affinity to Cdk1/cyclin B1. Interfering with this phosphorylation results in extended mitotic duration and defective chromosome segregation, indicating that this regulation ensures proper mitotic progression. Given that p53, the major transcriptional activator of p21, is the most frequently mutated gene in human cancer and that deregulated Cdk1 associates with the development of different types of cancer, this work provides new insight into the understanding of how deregulated p21 contributes to chromosomal instability and oncogenesis.
Impact of Polo-like kinase 1 inhibitors on human adipose tissue-derived mesenchymal stem cells
(2016)
Polo-like kinase 1 (Plk1) has been established as one of the most promising targets for molecular anticancer intervention, and various Plk1 inhibitors have been identified and characterized. While the data from the bench are promising, the clinical outcomes are less encouraging, showing only modest efficacy. One explanation for this discrepancy could be the unintended targeting of non-malignant cells by Plk1 inhibitors. In this work, we have addressed the effect of Plk1 inhibition in adipose tissue-derived mesenchymal stem cells (ASCs). We show that both visceral and subcutaneous ASCs display monopolar spindles, reduced viability and strong apoptosis induction upon treatment with the Plk1 kinase domain inhibitors BI 2536 and BI 6727, and with Poloxin, an inhibitor of the regulatory Polo-box domain. While Poloxin rapidly triggers apoptosis, BI 2536 and BI 6727 cause mitotic arrest in ASCs. Importantly, surviving ASCs exhibit DNA damage and a pronounced senescent phenotype. In addition, Plk1 inhibition impairs ASCs’ motility and homing ability. These results show that Plk1 inhibitors target slowly proliferating ASCs, a cell population important for anti-inflammatory and immunomodulatory functions. Such toxic effects on primary cells like ASCs could be partially responsible for the moderate antitumor activity reported in patients treated with Plk1 inhibitors.
Although the mechanistic target of rapamycin (mTOR) inhibitor everolimus has improved the outcome of patients with renal cell carcinoma (RCC), the improvement is temporary due to the development of drug resistance. Since many patients encountering resistance turn to alternative/complementary treatment options, we investigated whether the natural compound sulforaphane (SFN) influences the growth and invasive activity of everolimus-resistant (RCCres) compared to everolimus-sensitive (RCCpar) RCC cell lines in vitro. RCC cells were exposed to different concentrations of SFN, and cell growth, cell proliferation, apoptosis, cell cycle distribution, cell cycle regulating proteins, the mTOR-akt signaling axis, adhesion to human vascular endothelium and immobilized collagen, chemotactic activity, and surface integrin receptor expression were investigated. SFN caused a significant reduction in both RCCres and RCCpar cell growth and proliferation, which correlated with an elevation in G2/M- and S-phase cells. SFN induced a marked decrease in the cell cycle activating proteins cdk1 and cyclin B, and siRNA knock-down of cdk1 and cyclin B resulted in significantly diminished RCC cell growth. SFN also modulated adhesion and chemotaxis, which was associated with reduced expression of the integrin subtypes α5, α6, and β4. Distinct differences were seen between RCCres, in which SFN diminished adhesion and chemotaxis, and RCCpar, in which SFN enhanced adhesion and did not influence chemotaxis. Functional blocking of integrin subtypes demonstrated divergent effects on RCC binding and invasion, depending on RCC cell sensitivity to everolimus. SFN administration could therefore hold potential for treating RCC patients with established resistance toward everolimus.
EUSOBI and 30 national breast radiology bodies support mammography for population-based screening, which has been demonstrated to reduce breast cancer (BC) mortality and treatment impact. According to the International Agency for Research on Cancer, the mortality reduction is 40 % for women aged 50–69 years who take up the invitation, while the probability of a false-positive needle biopsy is <1 % per round and overdiagnosis is only 1–10 % over a 20-year screening period. A mortality reduction was also observed for the age groups 40–49 years and 70–74 years, although with “limited evidence”. Thus, we recommend, as a first priority, biennial screening mammography for average-risk women aged 50–69 years; biennial extension up to 73 or 75 years is a second priority, and annual screening from 40–45 to 49 years a third priority. Screening with thermography or other optical tools as alternatives to mammography is discouraged. Preference should be given to population screening programmes on a territorial basis, with double reading. Adoption of digital mammography (not film-screen or phosphor-plate computed radiography) is a priority, as it also improves sensitivity in dense breasts. Radiologists qualified as screening readers should be involved in programmes. Digital breast tomosynthesis is also set to become “routine mammography” in the screening setting in the near future. Dedicated pathways for high-risk women offering breast MRI according to national or international guidelines and recommendations are encouraged.
We assessed the prognostic value of hypoxia (carbonic anhydrase 9; CA9), vessel density (CD31), macrophages (CD68) and B cells (CD20), markers that can interact and lead to immune suppression and disease progression, using scanning and histological mapping of whole-mount FFPE pancreatectomy tissue sections from 141 primarily resectable pancreatic ductal adenocarcinoma (PDAC) samples treated with surgery and adjuvant chemotherapy. Their expression was correlated with clinicopathological characteristics and with overall survival (OS), progression-free survival (PFS), local progression-free survival (LPFS) and distant metastasis-free survival (DMFS), also in the context of stroma density (haematoxylin-eosin) and activity (alpha-smooth muscle actin). The median OS was 21 months after a mean follow-up of 20 months (range, 2–69 months). The median tumor surface area positive for CA9 and CD31 was 7.8% and 8.1%, respectively. Although total expression of these markers lacked prognostic value in the entire cohort, high tumor compartment CD68 expression correlated with worse PFS (p = 0.033) and DMFS (p = 0.047). In addition, high CD31 expression predicted worse OS (p = 0.004), PFS (p = 0.008), LPFS (p = 0.014) and DMFS (p = 0.004) in patients with moderate-density stroma. High stromal and peripheral compartment CD68 expression predicted significantly worse outcome in patients with loose and moderate stroma density, respectively. Altogether, in contrast to the current notion, hypoxia levels in PDAC appear to be comparable to those of other malignancies. CD31 and CD68 constitute prognostic markers in patient subgroups that vary according to tumor compartment and stromal density. Our study provides important insight into the pathophysiology of PDAC and should be exploited for future treatments.
Background: Few studies have evaluated the impact of pre-treatment drug resistance (PDR) on response to combination antiretroviral treatment (cART) in children. The objective of this joint EuroCoord-CHAIN-EPPICC/PENTA project was to assess the prevalence of PDR mutations and their association with virological outcome in the first year of cART in children.
Methods: HIV-infected children <18 years initiating cART between 1998 and 2008 were included if they had at least one genotypic resistance test prior to cART initiation. We used the World Health Organization 2009 resistance mutation list and the Stanford algorithm to infer resistance to prescribed drugs. Time to virological failure (VF) was defined as the first of two consecutive HIV-RNA measurements >500 copies/mL after 6 months of cART and was assessed by Cox proportional hazards models. All models were adjusted for baseline demographic, clinical, immunological and virological characteristics, calendar period of cART start and initial cART regimen.
Results: Of 476 children, 88 % were vertically infected. At cART initiation, median (interquartile range) age was 6.6 years (2.1–10.1), CD4 cell count 297 cells/mm3 (98–639), and HIV-RNA 5.2 log10 copies/mL (4.7–5.7). Of 37 children (7.8 %, 95 % confidence interval (CI), 5.5–10.6) harboring a virus with ≥1 PDR mutation, 30 had a virus resistant to ≥1 of the prescribed drugs. Overall, the cumulative Kaplan-Meier estimate for virological failure was 19.8 % (95 % CI, 16.4–23.9). The cumulative risk of VF tended to be higher among children harboring a virus with PDR and resistance to ≥1 prescribed drug than among those receiving fully active cART: 32.1 % (17.2–54.8) versus 19.4 % (15.9–23.6) (P = 0.095). In multivariable analysis, older age at treatment initiation was associated with a lower risk of VF, with a 12 % risk reduction per additional year (HR 0.88; 95 % CI, 0.82–0.95; P < 0.001).
Conclusions: PDR was not significantly associated with a higher risk of VF in children in the first year of cART. The risk of VF decreased by 12 % per additional year at treatment initiation which may be due to fading of PDR mutations over time. Lack of appropriate formulations, in particular for the younger age group, may be an important determinant of virological failure.
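As an illustrative aside (not part of the study analysis), the reported hazard ratio translates directly into the quoted per-year risk reduction, and under the proportional-hazards assumption the effect compounds multiplicatively over larger age differences:

```python
hr_per_year = 0.88  # reported hazard ratio for VF per additional year of age

# A HR of 0.88 corresponds to a (1 - 0.88) = 12% lower hazard per extra year.
reduction_pct = (1 - hr_per_year) * 100
print(f"per-year hazard reduction: {reduction_pct:.0f}%")  # 12%

# Under proportional hazards, per-year effects multiply: a child 5 years
# older at cART start has roughly half the hazard of VF.
hr_5_years = hr_per_year ** 5
print(f"HR for a 5-year age difference: {hr_5_years:.2f}")  # 0.53
```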
Influence of the sFlt-1/PlGF ratio on clinical decision-making in women with suspected preeclampsia
(2016)
Objective: To evaluate the influence of the soluble fms-like tyrosine kinase 1/placental growth factor ratio on physicians’ decision making in pregnant women with signs and symptoms of preeclampsia in routine clinical practice.
Methods: A multicenter, prospective, open, non-interventional study enrolled pregnant women presenting with preeclampsia signs and symptoms in several European perinatal care centers. Before the soluble fms-like tyrosine kinase 1/placental growth factor ratio result was known, physicians documented intended clinical procedures using an iPad® application (data locked/time stamped). After the result was available, clinical decisions were confirmed or revised and documented. An independent adjudication committee evaluated the appropriateness of decisions based on maternal/fetal outcomes. Clinician decision making with regard to hospitalization was the primary outcome.
Results: In 16.9% of mothers (20/118) the hospitalization decision was changed after knowledge of the ratio. In 13 women (11.0%), the initial decision to hospitalize was changed to no hospitalization. In seven women (5.9%) the revised decision was hospitalization. All revised decisions were considered appropriate by the panel of adjudicators (McNemar test; p < 0.0001).
Conclusions: The use of the soluble fms-like tyrosine kinase 1/placental growth factor test influenced clinical decision making towards appropriate hospitalization in a considerable proportion of women with suspected preeclampsia. This is the first study to demonstrate the impact of angiogenic biomarkers on decision making in a routine clinical practice.
Background: Hemodynamic instability is frequent and outcome-relevant in critical illness. The understanding of complex hemodynamic disturbances and their monitoring and management plays an important role in the treatment of intensive care patients. An increasing number of treatment recommendations and guidelines in intensive care medicine emphasize hemodynamic goals that go beyond the measurement of blood pressures. Yet, it is not known to what extent the infrastructural prerequisites for extended hemodynamic monitoring are in place in intensive care units (ICUs), or how hemodynamic management is performed in clinical practice. Furthermore, it is still unclear which factors trigger the use of extended hemodynamic monitoring.
Methods: In this multicenter, 1-day (November 7, 2013, and the preceding 24 h) cross-sectional study, we retrieved data on patient monitoring from ICUs in Germany, Austria, and Switzerland by means of a web-based case report form. One hundred sixty-one intensive care units contributed detailed information on the availability of hemodynamic monitoring. In addition, detailed information on the hemodynamic monitoring of 1789 patients treated on the study day was collected, and independent factors triggering the use of extended hemodynamic monitoring were identified by multivariate analysis.
Results: Besides basic monitoring with electrocardiography (ECG), pulse oximetry, and blood pressure monitoring, the majority of patients received invasive arterial (77.9 %) and central venous (55.2 %) catheterization. Overall, additional extended hemodynamic monitoring for assessment of cardiac output was performed in only 12.3 % of patients, while echocardiographic examination was used in only 1.9 %. The strongest independent predictors for the use of extended hemodynamic monitoring of any kind were mechanical ventilation, the need for catecholamine therapy, and treatment backed by protocols. In 71.6 % of patients in whom extended hemodynamic monitoring was added during the study period, this extension led to changes in treatment.
Conclusions: Extended hemodynamic monitoring, which goes beyond the measurement of blood pressures, to date plays a minor role in the surveillance of critically ill patients in German, Austrian, and Swiss ICUs. This also includes diagnostic and monitoring applications recommended by consensus, such as echocardiography and cardiac output monitoring. Mechanical ventilation, the use of catecholamines, and treatment backed by protocols were identified as factors independently associated with higher use of extended hemodynamic monitoring.
Resistance formation after an initial therapy response (acquired resistance) is common in high-risk neuroblastoma patients. YM155 is a drug candidate that was introduced as a survivin suppressant. This mechanism was later challenged, and DNA damage induction and Mcl-1 depletion were suggested instead. Here we investigated the efficacy and mechanism of action of YM155 in neuroblastoma cells with acquired drug resistance. The efficacy of YM155 was determined in neuroblastoma cell lines and their sublines with acquired resistance to clinically relevant drugs. Survivin levels, Mcl-1 levels, and DNA damage formation were determined in response to YM155. RNAi-mediated depletion of survivin, Mcl-1, and p53 was performed to investigate their roles during YM155 treatment. Clinical YM155 concentrations reduced the viability of drug-resistant neuroblastoma cells through survivin depletion and p53 activation. MDM2 inhibitor-induced p53 activation further enhanced YM155 activity. Loss of p53 function generally compromised anti-neuroblastoma approaches targeting survivin. Upregulation of ABCB1 (which mediates YM155 efflux) and downregulation of SLC35F2 (which mediates YM155 uptake) conferred YM155-specific resistance. YM155-adapted cells displayed increased ABCB1 levels, decreased SLC35F2 levels, and a p53 mutation. YM155-adapted neuroblastoma cells were also characterized by decreased sensitivity to RNAi-mediated survivin depletion, further confirming survivin as a critical YM155 target in neuroblastoma. In conclusion, YM155 targets survivin in neuroblastoma. Furthermore, survivin is a promising therapeutic target for p53 wild-type neuroblastomas after resistance acquisition (neuroblastomas are rarely p53-mutated), potentially in combination with p53 activators. In addition, we show that the adaptation of cancer cells to molecular-targeted anticancer drugs is an effective strategy for elucidating a drug’s mechanism of action.
Background: Community-acquired respiratory viruses (CRVs) may cause severe disease in cancer patients. Thus, efforts should be made to diagnose CRV infections rapidly and to manage them accordingly.
Methods: A panel of 18 clinicians from the Infectious Diseases Working Party of the German Society for Haematology and Medical Oncology convened to assess the available literature and provide recommendations on the management of CRV infections, including influenza, respiratory syncytial virus, parainfluenza virus, human metapneumovirus and adenovirus.
Results: CRV infections in cancer patients may lead to pneumonia in approximately 30% of cases, with an associated mortality of around 25%. For diagnosis of a CRV infection, combined nasal/throat swabs or washes/aspirates give the best results, and nucleic acid amplification-based techniques (NAT) should be used to detect the pathogen. Hand hygiene, contact isolation and face masks have been shown to be of benefit for general infection management. Causal treatment can be given for influenza, using a neuraminidase inhibitor, and for respiratory syncytial virus, using ribavirin in addition to intravenous immunoglobulins. Ribavirin has also been used to treat parainfluenza virus and human metapneumovirus, but data are inconclusive in this setting. Cidofovir is used to treat adenovirus pneumonitis.
Conclusions: CRV infections may pose a vital threat to patients with underlying malignancy. This guideline provides information on diagnosis and treatment to improve the outcome.
Background: Shortening the duration of peginterferon-based HCV treatment reduces the associated burden for patients. The primary objectives of this study were to assess efficacy, defined as sustained virological response 12 weeks post-treatment (SVR12) measured against a minimally acceptable response rate, and safety of simeprevir plus peginterferon/ribavirin (PR) in treatment-naïve HCV genotype 1 (GT1) patients treated for 12 weeks. Additional objectives included the investigation of potential associations of rapid viral response and baseline factors with SVR12.
Methods: In this Phase III, open-label study in treatment-naïve HCV GT1 patients with F0–F2 fibrosis, patients with HCV-RNA <25 IU/mL (detectable/undetectable) at Week 2, and undetectable HCV-RNA at Weeks 4 and 8, stopped all treatment at Week 12. All other patients continued PR for a further 12 weeks. Baseline factors significantly associated with SVR12 were identified through logistic regression.
Results: Of 163 patients who participated in the study, 123 (75%) qualified for 12-week treatment; of these, 81 (66%) achieved SVR12. Baseline factors positively associated with SVR12 rates in patients receiving the 12-week regimen were IL28B CC genotype (94% SVR12), HCV RNA ≤800,000 IU/mL (82%), and F0–F1 fibrosis (74%). Among all 163 patients, 94% experienced ≥1 adverse event (AE), 4% a serious AE, and 2.5% discontinued due to an AE. Less impairment in patient-reported outcomes was observed with the 12-week than with the >12-week regimen.
Conclusions: Overall SVR12 rate (66%) was below the target of 80%, indicating that shortening of treatment with simeprevir plus PR to 12 weeks based on very early response is not effective. However, baseline factors associated with higher SVR12 rates were identified. Therefore, while Week 2 response alone is insufficient to predict efficacy, GT1 patients with favourable baseline factors may benefit from a shortened simeprevir plus PR regimen.
Trial Registration: ClinicalTrials.gov NCT01846832
Core Facilities (CFs) for advanced light microscopy (ALM) have become indispensable support units for research in the life sciences. Their organizational structures and technical characteristics are quite diverse, although the tasks they pursue and the services they offer are similar. Therefore, throughout Europe, scientists from ALM-CFs are forming networks to promote interactions and discuss best practice models. Here, we present recommendations for ALM-CF operations elaborated by the workgroups of the German network of ALM-CFs, German Bio-Imaging (GerBI). We address technical aspects of CF planning and instrument maintenance, give advice on the organization and management of an ALM-CF, propose a scheme for the training of CF users, and provide an overview of current resources for image processing and analysis. Further, we elaborate on the new challenges and opportunities for professional development and careers created by CFs. While some information refers specifically to the German academic system, most of the content of this article is of general interest to CFs in the life sciences.
In modern welfare states, family policies may resolve the tension between employment and care-focused demands. However, these policies sometimes have adverse consequences for distinct social groups. This study examined gender and educational differences in working parents’ perceived work–family conflict and used a comparative approach to test whether family policies, in particular support for child care and leave from paid work, are capable of reducing work–family conflict as well as the gender and educational gaps in it. We use data from the European Social Survey 2010 for 20 countries and 5296 respondents (parents), extended with information on national policies for maternity and parental leave and child care support from the OECD Family Database. Employing multilevel analysis, we find that mothers and the higher educated report the most work–family conflict. Policies supporting child care reduce the level of experienced work–family conflict, whereas family leave policy appears to have no alleviating impact on working parents’ work–family conflict. Our findings indicate that family policies are unable to reduce the gender gap in conflict perception and even widen the educational gap in work–family conflict.
Background: Colorectal cancer (CRC) is a leading cause of cancer-related death worldwide. Growing evidence indicates that tumor-initiating cells (TICs) are responsible for tumor growth and progression. Conventional chemotherapeutics do not sufficiently eliminate TICs, leading to tumor relapse. We aimed to gain insight into TIC biology by comparing the transcriptome of primary TIC cultures and their normal stem cell counterparts to uncover expression differences.
Methods: We established colonosphere cultures derived from the resection of paired specimens of primary tumor and normal mucosa in patients with CRC. These colonospheres, enriched for TICs, were used for differential transcriptome analyses to detect new targets for a TIC-directed therapy. Effects of target inhibition on CRC cells were studied in vitro and in vivo.
Results: Pathway analysis of the regulated genes showed enrichment of genes central to PI3K/AKT and Wnt signaling. We identified CD133 as a marker for a more aggressive, TIC-enriched CRC subpopulation in SW480 CRC cells in an in vivo cancer model. Treatment of CRC cells with the selective AKT inhibitor MK-2206 caused a decrease in cell proliferation, particularly in the TIC fraction, resulting in a significant reduction of the stemness capacity to form colonospheres in vitro and to initiate tumor formation in vivo. Consequently, MK-2206 treatment of mice with established xenograft tumors significantly decelerated tumor progression. Primary patient-derived tumorsphere growth was also significantly inhibited by MK-2206.
Conclusion: This study reveals that AKT signaling is critical for TIC proliferation and can be efficiently targeted by MK-2206 representing a preclinical therapeutic strategy to repress colorectal TICs.
Monitoring of minimal residual disease (MRD) or chimerism may help guide pre-emptive immunotherapy (IT) with a view to preventing relapse in childhood acute lymphoblastic leukemia (ALL) after transplantation. Patients with ALL who consecutively underwent transplantation in Frankfurt/Main, Germany between January 1, 2005 and July 1, 2014 were included in this retrospective study. Chimerism monitoring was performed in all 89 patients, and MRD assessment in 58 of them. IT was guided by mixed chimerism (MC) and MRD in 19 of 24 patients with MC, and by MRD alone in another 4 patients with complete chimerism (CC). The 3-year probabilities of event-free survival (EFS) were .69 ± .06 for the cohort without IT and .69 ± .10 for IT patients. Cumulative incidences of relapse (CIR) and of treatment-related mortality (CITRM) were equally distributed between both cohorts (without IT: 3-year CIR, .21 ± .05, 3-year CITRM, .10 ± .04; IT patients: 3-year CIR, .18 ± .09, 3-year CITRM, .13 ± .07). Accordingly, 3-year EFS and 3-year CIR were similar in CC and MC patients with IT, whereas MC patients without IT experienced relapse. IT was associated neither with an enhanced immune recovery nor with an increased risk for acute graft-versus-host disease. Relapse prevention by IT in patients at risk may lead to the same favorable outcome as found in CC and MRD-negative patients. This underlines the importance of excellent MRD and chimerism monitoring after transplantation as the basis for IT to improve survival in childhood ALL.
Infant acute leukemia still has a poor prognosis, and allogeneic hematopoietic stem cell transplantation is indicated in selected patients. Umbilical cord blood (UCB) is an attractive cell source for this population because of the low risk of chronic graft-versus-host disease (GVHD), the strong graft-versus-leukemia effect, and prompt donor availability. This retrospective, registry-based study reported UCB transplantation (UCBT) outcomes in 252 children with acute lymphoblastic leukemia (ALL; n = 157) or acute myelogenous leukemia (AML; n = 95) diagnosed before 1 year of age who received a single-unit UCBT after myeloablative conditioning between 1996 and 2012 in European Society for Blood and Marrow Transplantation centers. Median age at UCBT was 1.1 years, and median follow-up was 42 months. Most patients (57%) received a graft with 1 HLA disparity and were transplanted in first complete remission (CR; 55%). The cumulative incidence function (CIF) of grade II to IV acute GVHD at day 100 was 40% ± 3%, and the CIF of chronic GVHD at 4 years was 13% ± 2%. The CIF of transplant-related mortality at 1 year was 23% ± 3%, and the CIF of relapse at 4 years was 27% ± 3%. Leukemia-free survival (LFS) at 4 years was 50% ± 3%; it was 40% and 66% for those transplanted for ALL and AML, respectively (P = .001). LFS was better for patients transplanted in first CR, regardless of diagnosis. In the multivariate model, diagnosis of ALL (P = .001), advanced disease status at UCBT (P < .001), age at diagnosis younger than 3 months (P = .012), and date of transplant before 2004 were independently associated with worse LFS. UCBT is a suitable option for patients diagnosed with infant acute leukemia who achieve CR. In this cohort, patients with AML had better survival than those with ALL.
Natural killer (NK) cells are active against Aspergillus fumigatus, which in turn is able to impair the host defense. Unfortunately, little is known about the mutual interaction of NK cells and A. fumigatus. We coincubated human NK cells with A. fumigatus hyphae and assessed the gene expression and protein concentration of selected molecules. We found that A. fumigatus up-regulates the gene expression of pro-inflammatory molecules in NK cells but inhibits the release of these molecules, resulting in intracellular accumulation and limited extracellular availability. A. fumigatus down-regulated mRNA levels of perforin in NK cells but increased its intra- and extracellular protein concentration. The gene expression of stress-related molecules of A. fumigatus, such as heat shock protein hsp90, was up-regulated by human NK cells. Our data characterize for the first time the immunosuppressive effect of A. fumigatus on NK cells and may help to develop new therapeutic antifungal strategies.
We have recently shown that caspase-8 is a new substrate of Polo-like kinase 3 (Plk3), which phosphorylates the protein on residue T273, thereby promoting its pro-apoptotic function. In the present study we aimed to investigate the clinical relevance of Plk3 expression and phosphorylation of caspase-8 at T273 in patients with anal squamous cell carcinoma (SCC) treated with 5-fluorouracil and mitomycin C-based chemoradiotherapy (CRT). Immunohistochemical detection of the markers was performed in pretreatment biopsy specimens of 95 patients and was correlated with clinical/histopathologic characteristics, including HPV-16 DNA load/p16INK4a expression, cumulative incidence of local and distant failure, cancer-specific survival (CSS), and overall survival (OS). We observed significant positive correlations between Plk3 expression, pT273 caspase-8 signal, and levels of HPV-16 DNA load/p16INK4a detection. Patients with high scores of Plk3 and pT273 caspase-8 showed increased local control (p = 0.011; p = 0.001), increased CSS (p = 0.011; p = 0.013) and OS (p = 0.024; p = 0.001), while the levels of pT273 caspase-8 were significantly associated (p = 0.033) with distant metastases. In multivariate analyses, Plk3 expression remained significant for local failure (p = 0.018), CSS (p = 0.016) and OS (p = 0.023). Moreover, a variable combining HPV-16 DNA load with Plk3 or pT273 caspase-8 revealed a significant correlation with decreased local failure (p = 0.001; p = 0.009), increased CSS (p = 0.016; p = 0.023) and OS (p = 0.003; p = 0.003). In conclusion, these data indicate that elevated levels of Plk3 and pT273 caspase-8 are correlated with favorable clinical outcome in patients with anal SCC treated with concomitant CRT.
Renal cell carcinoma alters endothelial receptor expression responsible for leukocyte adhesion
(2016)
Renal cell carcinoma (RCC) escapes immune recognition. To elaborate the escape strategy, the influence of RCC cells on endothelial receptor expression and endothelial leukocyte adhesion was evaluated. Human umbilical vein endothelial cells (HUVECs) were co-cultured with the RCC cell line Caki-1, with and without tumor necrosis factor (TNF)-alpha. Intercellular cell adhesion molecule-1 (ICAM-1), vascular cell adhesion molecule-1 (VCAM-1), endothelial (E)-selectin, and standard and variant (V) forms of CD44 were then analysed in HUVECs using flow cytometry and Western blot analysis. To determine which components of the HUVEC-Caki-1 interaction cause receptor alteration, Caki-1 membrane fragments versus cell culture supernatant were applied to HUVECs. Adhesion of peripheral blood lymphocytes (PBL) and polymorphonuclear neutrophils (PMN) to endothelium was evaluated by co-culture adhesion assays. The relevance of endothelial receptor expression for adhesion to endothelium was determined by receptor blockage. Co-culture of RCC cells and HUVECs resulted in a significant increase in endothelial ICAM-1, VCAM-1, E-selectin, CD44 V3 and V7 expression. Prior stimulation of HUVECs with TNF-alpha followed by co-cultivation with Caki-1 resulted in further elevation of endothelial CD44 V3 and V7 expression, whereas ICAM-1, VCAM-1 and E-selectin expression were significantly diminished. Since Caki-1 membrane fragments also caused these alterations but cell culture supernatant did not, cell-cell contact may be responsible for this process. Blocking ICAM-1, VCAM-1, E-selectin or CD44 with the respective antibodies led to a significant decrease in PBL and PMN adhesion to endothelium. Thus, exposing HUVECs to Caki-1 results in significant alteration of endothelial receptor expression and subsequent endothelial attachment of PBL and PMN.
Purpose: Few individuals with latent M. tuberculosis infection (LTBI) progress to active disease. We investigated risk factors for LTBI and active pulmonary tuberculosis (PTB) in Germany.
Methods: Healthy household contacts (HHCs), health care workers (HCWs) exposed to M. tuberculosis and PTB patients were recruited at 18 German centres. Interferon-γ release assay (IGRA) testing was performed. LTBI risk factors were evaluated by comparing IGRA-positive with IGRA-negative contacts. Risk factors for tuberculosis were evaluated by comparing PTB patients with HHCs.
Results: From 2008 to 2014, 603 HHCs, 295 HCWs and 856 PTB patients were recruited. LTBI was found in 34.5% of HHCs and in 38.9% of HCWs. In HCWs, care for coughing patients (p = 0.02) and longstanding nursing occupation (p = 0.04) were associated with LTBI. In HHCs, predictors of LTBI were a diseased partner (odds ratio 4.39), sexual contact with a diseased partner, and substance dependency (all p < 0.001). PTB was associated with male sex, low body weight (p < 0.0001), alcoholism (15.0% vs. 5.9%; p < 0.0001), glucocorticoid therapy (7.2% vs. 2.0%; p = 0.004) and diabetes (7.8% vs. 4.0%; p = 0.04). No contact developed active tuberculosis within 2 years of follow-up.
Conclusions: Positive IGRA responses are frequent among exposed HHCs and HCWs in Germany and are poor predictors for the development of active tuberculosis.
Measuring NADPH oxidase (Nox)-derived reactive oxygen species (ROS) in living tissues and cells is a constant challenge. All available probes have limitations in sensitivity or specificity, or demand highly specialized detection techniques. In search of a presumably easy, versatile, sensitive and specific technique, numerous studies have used NADPH-stimulated assays in membrane fractions, which have been suggested to reflect Nox activity. However, we previously found unaltered activity with these assays in tissue and cells of triple Nox knockout mice (Nox1-Nox2-Nox4-/-) compared to wild type. Moreover, the high ROS production of intact cells overexpressing Nox enzymes could not be recapitulated in NADPH-stimulated membrane assays. Thus, the signal obtained in these assays must derive from a source other than NADPH oxidases. Using a combination of native protein electrophoresis, NADPH-stimulated assays and mass spectrometry, mitochondrial proteins and cytochrome P450 were identified as possible sources of the assay signal. Cells lacking functional mitochondrial complexes, however, displayed normal activity in NADPH-stimulated membrane assays, suggesting that mitochondrial oxidoreductases are unlikely sources of the signal. Microsomes overexpressing P450 reductase, cytochromes b5 and P450 generated an NADPH-dependent signal in assays utilizing lucigenin, L-012 and dihydroethidium (DHE). Knockout of the cytochrome P450 reductase by CRISPR/Cas9 technology (POR-/-) in HEK293 cells overexpressing Nox4 or Nox5 did not interfere with ROS production in intact cells. However, POR-/- abolished the signal in NADPH-stimulated assays using membrane fractions from the very same cells. Moreover, membranes of rat smooth muscle cells treated with angiotensin II showed an increased NADPH-dependent lucigenin signal, which was abolished by knockout of POR but not of p22phox.
In conclusion, the cytochrome P450 system accounts for the majority of the signal in chemiluminescence-based Nox activity assays.
Planted forests of alien tree species make significant contributions to the economy and provide multiple products and ecosystem services. On the other hand, non-native trees now feature prominently on the lists of invasive alien plants in many parts of the world, and in some areas non-native woody species are now among the most conspicuous, damaging and, in some cases, best-studied invasive species. Afforestation and reforestation policies, on both public and private land, need to include clearly stated objectives and principles to reduce the impacts of invasive trees outside areas set aside for forestry. With the intention of encouraging national authorities to implement general principles of prevention and mitigation of the risks posed by invasive alien tree species used in plantation forestry into national environmental policies, the Council of Europe facilitated the preparation of a Code of Conduct on Planted Forests and Invasive Alien Trees. This new voluntary Code, comprising 14 principles, complements existing codes of conduct dealing with horticulture and botanic gardens. The Code is addressed to all relevant stakeholders and decision makers in the 47 Member States of the Council of Europe. It aims to enlist the co-operation of the forest sector (trade and industry, national forest authorities, certification bodies and environmental organizations) and associated professionals in preventing new introductions and in reducing, controlling and mitigating negative impacts of tree invasions that arise, directly or indirectly, as a consequence of plantation forestry.
Cat’s claw creeper vine, Dolichandra unguis-cati (L.) Lohmann (syn. Macfadyena unguis-cati (L.) Gentry) (Bignoniaceae), is a major environmental weed in Australia. Two distinct forms of this weed (‘long’ and ‘short’ pod), with differences in leaf morphology and fruit size, occur in Australia. The long pod form has been reported in fewer than fifteen localities in the whole of south-east Queensland, while the short pod form is widely distributed in Queensland and New South Wales. This study sought to compare growth traits such as specific leaf area, relative growth rate, stem length, shoot/root ratio, tuber biomass and branching architecture between these forms. These traits were monitored under glasshouse conditions over a period of 18 months. Short pod exhibited higher values of relative growth rate, stem length, number of tubers and specific leaf area than long pod, but only after 10 months of plant growth. Prior to this, the long and short pod forms did not differ significantly. Higher values for these traits have been described as characteristics of successful colonizers. Results from this study could partly explain why the short pod form is widely distributed in Australia while the long pod form is confined to a few localities.
The 13th International Conference on Ecology and Management of Alien Plant Invasions (EMAPi) was held in Waikoloa Village, Hawaii, 20–24 September 2015. EMAPi is the only international conference that focuses exclusively on alien plants; its history and broad significance were outlined by Richardson et al. (2010). During EMAPi 2015, over 200 presentations were delivered by delegates hailing from 31 countries. The presentations covered a wide range of topics in invasion biology, addressing organizational levels ranging from the gene to global patterns. Connecting science with management emerged as a unifying theme across the conference program. Commonalities emerged through lively discussions, giving new insights into research needs, management strategies, and more effective implementation of biosecurity and control. A highlight was the mid-conference field trip, where researchers, land managers, and policy makers discussed collaboration and solutions in the stimulating backdrop of Hawaii Volcanoes National Park, Hakalau National Wildlife Refuge, and other conservation sites with evolving invasive plant management strategies.
Biological control of weeds in Vanuatu began in 1935, with the introduction of the tingid Teleonemia scrupulosa to control Lantana camara. To date, nine biological control agents have been intentionally introduced to control eight weed species. Seven of these agents have established on their respective hosts, while an eighth, Zygogramma bicolorata, an agent for Parthenium hysterophorus, has only recently been released and establishment is unlikely. The fate of a ninth agent, Heteropsylla spinulosa, released for the control of Mimosa diplotricha, is unclear. Six other biological control agents, including Epiblema strenuana, which was first detected in 2014 on P. hysterophorus on Efate, have spread into the country unintentionally. Control of the target weeds ranges from inadequate to very good. By far the most successful agent has been Calligrapha pantherina, which was introduced to control Sida acuta and Sida rhombifolia. The beetle was released on 14 islands and managed to spread to at least another 10 islands, where it has effectively controlled both Sida spp. Control of the two water weeds, Eichhornia crassipes by Neochetina bruchi and N. eichhorniae and Pistia stratiotes by Neohydronomus affinis, has also been fairly good in most areas. Two agents, T. scrupulosa and Uroplata girardi, were released on L. camara, and four other agents have been found on the weed, but L. camara is still not under adequate control. The rust Puccinia spegazzinii was first released on Mikania micrantha in 2012 and successfully established. Anecdotal evidence suggests that it is having an impact on M. micrantha, but detailed monitoring is required to determine its overall impact. Future prospects for weed biological control in Vanuatu are positive, with the expected greater spread of recently released agents and the introduction of new agents for P. hysterophorus, L. camara, Dolichandra unguis-cati and Spathodea campanulata.
Biological control of introduced weeds in the 22 Pacific island countries and territories (PICTs) began in 1911, with the lantana seed-feeding fly introduced into Fiji and New Caledonia from Hawaii. To date, a total of 62 agents have been deliberately introduced into the PICTs to control 21 weed species in 17 countries. A further two agents have spread naturally into the region. The general impact of the 36 biocontrol agents now established in the PICTs ranges from none to complete control of their target weed(s). Fiji has been most active in weed biocontrol, releasing 30 agents against 11 weed species. Papua New Guinea, Guam, and the Federated States of Micronesia have also been very active in weed biocontrol. For some weeds such as Lantana camara, agents have been released widely, and can now be found in 15 of the 21 PICTs in which the weed occurs. However, agents for other commonly found weeds, such as Sida acuta, have been released in only a few countries in which the weed is present. There are many safe and effective biocontrol agents already in the Pacific that could be utilised more widely, and highly effective agents that have been released elsewhere in the world that could be introduced following some additional host specificity testing. This paper discusses the current status of biological control efforts against introduced weeds in the 22 PICTs and reviews options that could be considered by countries wishing to initiate weed biological control programmes.
Successful invasion is often due to a combination of species characteristics (invasiveness) and habitat suitability (invasibility). Our objective was to identify preferred habitats and suitable environmental conditions for the African tulip tree Spathodea campanulata (Bignoniaceae), one of the most invasive alien trees on the tropical island of French Polynesia (South Pacific Ocean), in relation to its distribution and photosynthetic capacity. Spathodea abundance and the leaf chlorophyll fluorescence parameters Fo’, ETRmax, and Y(II)effective were examined in relation to topography and micro-climate along elevational transects between 140 m and 1,300 m. Results showed that Spathodea (1) is present up to 1,240 m, where the lowest maximum July–October (cool season) temperature is 9.4 °C and the average July–October temperature is 14.6 °C, (2) is able to colonize slopes steeper than 45°, and (3) is well represented in the elevational range of 140–540 m as well as in the native forests between 940 m and 1,040 m, suggesting a high threat to native and endemic plant species. Along one of the transects, in the elevational range of 541–940 m, Spathodea was under-represented; chlorophyll fluorescence Fo’ increased significantly while Y(II)effective decreased significantly, supporting the hypothesis that this range is a non-preferred environment, probably due to microclimatic conditions characterized by episodic air dryness. Among Spathodea plants surveyed along a wetter transect, Y(II)effective and ETRmax were comparable from low to mid-high elevation, indicating that the potential photosynthesis rate of Spathodea may be similar from sea level to mid-high elevation. Major infestations on the island of Tahiti were reported on the leeward (drier and urbanized) west coast, but Spathodea has also recently been found on the slopes of the windward (wetter) east coast. Chlorophyll fluorescence measurements indicate a high photosynthetic capacity of Spathodea in wet environments, suggesting that Spathodea will become invasive across most of the island of Tahiti.
The annual grass Bromus tectorum has invaded millions of hectares in western North America and has transformed former perennial grass- and shrub-dominated communities into annual grasslands. Fire plays a key role in the maintenance of B. tectorum on the landscape, but the type of disturbance responsible for initial invasion is less well understood. We conducted an experiment in a perennial shrub/grass/forb community in eastern Idaho, USA to examine the roles of plant community and soil disturbance in B. tectorum emergence and establishment prior to state-changing fires. Our experiment consisted of a plant community disturbance treatment in which we (1) removed the shrub component, (2) removed the grass/forb component, or (3) removed all shrubs, grasses, and forbs. We followed this treatment with seeding of B. tectorum onto a soil surface that was (1) intact or (2) disturbed. Each experimental plot had an associated control with no plant community disturbance but seeded in the same manner. The experiment was replicated 20 times at two sites (high and low aboveground biomass). We measured emergence by counting seedlings in late spring, and establishment by counting, removing, and weighing B. tectorum individuals in mid-summer. We also examined the influence of plant community disturbance on the soil environment by measuring extractable NH4+ and NO3− four times each summer. Soil disturbance greatly influenced the number of B. tectorum individuals that emerged each spring. Plant community disturbance, specifically disturbance of the grass/forb component, increased N availability in the late growing season and B. tectorum biomass the following summer. We conclude that soil disturbance and plant community disturbance interact to promote the initial invasion of B. tectorum in Intermountain West valley ecosystems.
Many recent studies in invasion science have identified species traits that determine either invasiveness or impact. Such analyses underpin risk assessments and attempts to prioritise management actions. However, the factors that mediate the capacity of an introduced species to establish and spread (i.e. its invasiveness) can differ from those that affect the nature and severity of impacts. Here we compare those traits correlated with invasiveness with those correlated with impact for Cactaceae (“cacti”) in South Africa. To assess impact magnitude, we scored 70 cacti (35 invasive and 35 non-invasive species) using the Generic Impact Scoring System (GISS) and identified traits correlated with impact using a decision tree approach. We then compared the traits correlated with impact with those identified in a recent study as correlated with invasiveness (i.e. native range size and growth form). We found that there is a significant correlation between native range size and both invasiveness and impact. Cacti with larger native ranges were more likely to become invasive (p=0.001) and cause substantial impacts (p=0.01). These results are important for prioritising efforts on the management of cactus species. Understanding when and why impact and invasiveness are correlated (as they appear to be for Cactaceae) is likely to be an important area of future research in risk assessment.
The risk of introducing weeds to new areas through grain (cereals, oilseeds and pulses) intended for processing or consumption is typically considered less than that from seed or plants for planting. However, within the range of end uses for grain, weed risk varies significantly and should not be ignored. In this paper, we discuss pathway risk analysis as a framework to examine the association of weed seeds with grain commodities throughout the production process from field to final end use, and present inspection sampling data for grain crops commonly imported to Canada. In the field, weed seed contamination of grain crops is affected by factors such as country of origin, climate, biogeography and production and harvesting practices. As it moves toward export, grain is typically cleaned at a series of elevators and the effectiveness and degree of cleaning are influenced by grain size, shape and density as well as by grade requirements. In cases where different grain lots are blended, uncertainty may be introduced with respect to the species and numbers of weed seed contaminants. During transport and storage, accidental spills and cross-contamination among conveyances may occur. At the point of import to Canada, inspection sampling data show that grain shipments contain a variety of contaminants including seeds of regulated weeds and species that represent new introductions. However, grain cleaning and processing methods tailored to end use at destination also affect the presence and viability of weed seeds. For example, grains that are milled or crushed for human use present a lower risk of introducing weed seeds to new environments than grains that undergo minimal or no processing for livestock feed, or screenings that are produced as a by-product of grain cleaning. 
Pathway risk analysis allows each of these stages to be evaluated in order to characterize the overall risk of introducing weeds with particular commodities, and guide regulatory decisions about trade and plant health.
Like most jurisdictions, Australia is managing a broad range of invasive alien species. Here, we provide the first holistic quantification of how much invasive species cost Australia’s economy, and how much Australia spends on their management. In the 01–02 financial year (July to June), the combined estimated cost (economic losses and control) of invasive species was $9.8 billion, rising to $13.6 billion in the 11–12 financial year. Approximately $726 million of grants funded through the Commonwealth of Australia (i.e. federal funding) was spent on invasive species management and research between 1996 and 2013. In 01–02, total national expenditure on invasive species was $2.31 billion, rising to $3.77 billion in 11–12. Agriculture accounted for more than 90% of the total cost. For 01–02 and 11–12, these expenditure figures equate to $123 and $197 per person per year, respectively, and to 0.32% and 0.29% of GDP, respectively. All values provided here are most likely underestimates of the real values, owing to the significant constraints of the data obtainable. Invasive species are clearly a significant economic burden in Australia. Given the extent of the invasive species issue globally, there is a clear need for better quantification of both economic losses and expenditure in more jurisdictions, as well as in Australia.
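The per-person and share-of-GDP figures reported above are simple ratios of total expenditure to population and GDP. A minimal sketch of that arithmetic, assuming illustrative population and GDP values that are not taken from the paper but are consistent with the reported ratios:

```python
# Hedged sketch: deriving per-person and %-of-GDP figures from a total cost.
# The population and GDP values below are illustrative assumptions, not
# figures from the study; only the 2.31e9 expenditure is from the abstract.

def per_capita(total_cost, population):
    """Cost per person per year."""
    return total_cost / population

def share_of_gdp(total_cost, gdp):
    """Cost as a percentage of GDP."""
    return 100.0 * total_cost / gdp

cost_0102 = 2.31e9      # total national expenditure, 01-02 (from the abstract)
population = 18.8e6     # assumed population for 01-02
gdp = 7.2e11            # assumed GDP for 01-02

print(round(per_capita(cost_0102, population)))   # -> 123 ($/person/year)
print(round(share_of_gdp(cost_0102, gdp), 2))     # -> 0.32 (% of GDP)
```

With these assumed denominators, the ratios reproduce the abstract’s $123 per person and 0.32% of GDP for 01–02.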
In our recent Discussion paper, we presented our view that the only real distinction between biological invasions and natural colonisations is the human element. We agree that invasion science is a very important science, not only for better understanding the role that human mediation plays in colonisation, but also for many other science fields. We agree with all invasion researchers that human influence can result in spectacular differences, including in rates of species movement, rates of successful colonisation, the particular species being moved, the biogeography of dispersal pathways, and rates of any resulting ecological disturbance and biodiversity loss. Our deeper point is that species dispersed by human mediation or natural colonisation are all subject to the same basic laws and rules of ecology, like many other phenomena that occur naturally and can be greatly influenced by people. The human dimension is merely a mechanistic distinction, albeit an important one, because it exposes insights about the colonisation process that cannot be gained from the study of natural colonisations alone. We provide 10 hypotheses that can be scientifically tested to determine whether biological invasions and natural colonisations are two separate processes or the same process influenced by different mechanisms.
The authors inserted an incorrect figure in Oswalt et al. (2015) that was printed as Fig. 2. The mapped species represented in Oswalt et al. (2015) is Triadica sebifera or Chinese tallow. The correct Fig. 2, representing Imperata cylindrica, is reproduced below. The correction does not alter the conclusions of Oswalt et al. (2015).
Since 1 January 2015, EU Regulation 1143/2014 has obligated all member states to conduct cost-benefit analyses in preparation of control programs for invasive alien species, in order to minimize and mitigate their impacts. In addition, with ratification of the Rio Declaration and the amended Federal Nature Conservation Act, Germany is committed to controlling any further spread of invasive species. This is the first cost-benefit analysis estimating the positive welfare effects and societal importance of controlling the H. mantegazzianum invasion in Germany. The paper analyses possible control options for limiting stands of giant hogweed (H. mantegazzianum) based on survey data from n = 287 German districts. We differentiate between several control options (e.g. root destruction, mechanical cutting or mowing, chemical treatment and grazing) depending on infested area size and protection status. The calculation of benefits is based on stated-preference results (choice experiment; n = 282). For the cost side, we calculate two different invasion scenarios: (i) no re-infestation after successfully conducted control measures (optimistic) and (ii) two re-infestations within ten years after control measures are conducted (pessimistic). Minimum costs of eradication measures over a time span of ten years, at a social discount rate of 1%, total 3,467,640 € for the optimistic scenario and 6,254,932 € for the pessimistic scenario, in which no success of the first eradication attempt is assumed. Benefits of invasion control in Germany total 238,063,641 € per year, or 59,515,910 € per year after correction by an over-assessment factor.
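The ten-year cost totals above rest on discounting future outlays to present value at the stated 1% social discount rate. A minimal sketch of that calculation, using a hypothetical flat annual cost rather than the study's actual cost data:

```python
# Hedged illustration of present-value aggregation over a ten-year horizon
# at a 1% social discount rate, as described in the abstract. The annual
# cost figure is a hypothetical placeholder, not the study's data.

def present_value(annual_costs, discount_rate=0.01):
    """Discount each year's cost to year 0 and sum: sum(c_t / (1 + r)^t)."""
    return sum(cost / (1.0 + discount_rate) ** t
               for t, cost in enumerate(annual_costs))

# Example: a flat 360,000 EUR/year control programme over ten years.
costs = [360_000.0] * 10
pv = present_value(costs)   # discounted total, slightly below the nominal 3.6 M EUR
```

Re-infestation scenarios such as the pessimistic case above can be modelled by appending further rounds of control costs to the list before discounting.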
The management of invasive alien species (IAS) in protected areas has become increasingly important in recent years. In this study, we analyse IAS management in the bilateral National Park Thayatal-Podyjí on the Austrian-Czech border. Based on two surveys from 2001 and 2010 and on annual management data from 2001 to 2010, we analyse changes in distribution and the efficiency of IAS management for three invasive alien plants (Fallopia × bohemica, Impatiens glandulifera, Robinia pseudoacacia). In 2010, the three study species had invaded 161 ha (2%) of the study area. Despite a decade of management, F. × bohemica has become widespread, whereas the distribution of I. glandulifera has decreased strongly. The most widespread species, R. pseudoacacia, has declined substantially in cover, but the area invaded has increased. From 2001 to 2010, annual management effort declined by about half. Management effort per hectare and decade was highest for F. × bohemica (2,657 hours), followed by R. pseudoacacia (1,473 hours) and I. glandulifera (270 hours). Management effort needed to achieve the same reduction in population size and cover was highest for R. pseudoacacia, followed by F. × bohemica and I. glandulifera. We conclude that substantial effort and resources are necessary to successfully manage the study species and must be provided over prolonged time periods; continued management of these species is therefore recommended. We highly recommend a systematic approach to monitoring the efficiency of IAS management projects in protected areas.
In a recent Discussion Paper, Hoffmann and Courchamp (2016) posed the question: are biological invasions and natural colonisations that different? This apparently simple question resonates at the core of the biological study of human-induced global change, and we strongly believe that the answer is yes: biological invasions and natural colonisations differ in processes and mechanisms in ways that are crucial for science, management, and policy. Invasion biology has, over time, developed into the broader transdisciplinary field of invasion science. At the heart of invasion science is the realisation that biological invasions are not just a biological phenomenon: the human dimension of invasions is a fundamental component in the social-ecological systems in which invasions need to be understood and managed.
The Anthropocene Epoch is characterized by novel and increasingly complex dependencies between the environment and human civilization, with many challenges of biodiversity management emerging as wicked problems. Problems arising from the management of biological invasions can be either tame (with simple or obvious solutions) or wicked, where difficulty in appropriately defining the problem can make complete solutions impossible to find. We review four case studies that reflect the main goals in the management of biological invasions – prevention, eradication, and impact reduction – assessing the drivers and extent of wickedness in each. We find that a disconnect between the perception and reality of how wicked a problem is can profoundly influence the likelihood of successful management. For example, managing species introductions can be wicked, but shifting from species-focused to vector-focused risk management can greatly reduce the complexity, making it a tame problem. The scope and scale of the overall management goal will also dictate the wickedness of the problem and the achievability of management solutions (cf. eradication and ecosystem restoration). Finally, managing species that have both positive and negative impacts requires engagement with all stakeholders and scenario-based planning. Effective management of invasions requires either recognizing unavoidable wickedness, or circumventing it by seeking alternative management perspectives.
Bilberry (Vaccinium myrtillus L.) is a dwarf shrub of high ecological relevance as habitat and food source for many animals in the mountain forests of central Europe. The species benefits from conifer forests and declines as the proportion of broadleaved tree species in the canopy increases. The ongoing large-scale conversion from conifer to broadleaved forests may therefore significantly alter the ground vegetation, especially the dominance of a key species such as bilberry. We used morphological indicators to investigate the vitality of bilberry. The first objective was to determine whether bilberry vitality is negatively affected by an increasing proportion of beech (Fagus sylvatica L.) in Norway spruce (Picea abies Karst.) forests. Vitality was measured by cover, height, biomass, shoot length and basal diameter. The second objective was to determine whether changes in bilberry vitality were related to light, canopy cover, soil pH, organic layer mass and tree species. The data were collected from three study areas in the southern and central Black Forest. The bedrock consisted of gneiss and granite, and the stands, located adjacent to each other, were either pure beech, a mixture of beech and spruce, or pure spruce. In all three areas, bilberry vitality was higher under spruce than under beech. Mixed effect models show that the occurrence of spruce is the most important variable explaining the increase in bilberry biomass. Light had a small positive effect, whereas soil properties had negligible, site-specific effects. These results strongly indicate a negative influence of beech on bilberry in conifer-dominated forests. This should be taken into account when developing silvicultural approaches and when planning the preservation of habitat for species such as capercaillie (Tetrao urogallus L.).
This is even more important today because the recent trend in central European forestry is to increase the proportion of beech.
Bioindicators are organisms that can provide direct or indirect information on the impact of pollutants in the environment. The content of heavy metals or other toxic compounds in these organisms is of great interest for assessing contamination levels. Leaves of the most common deciduous trees (Acer pseudoplatanus L., Betula pendula Roth, Carpinus betulus L., Cercis siliquastrum L., Ginkgo biloba L., Liquidambar styraciflua, Quercus robur L. and Tilia cordata Miller) and of two invasive tree species, Ailanthus altissima P. Mill. and Robinia pseudoacacia L., in the City of Bolzano (southern Alps, Northern Italy) were therefore studied to assess their suitability as bioindicators for the trace elements Cd, Cu, Mn, Pb and Zn, which are mainly considered traffic-related. Leaf and soil samples were collected both from high-density traffic roads and from control sites with minor traffic impact, such as parks. Our data reveal that Betula pendula has a considerable Zn accumulation potential compared to the other investigated tree species; the maximum value measured for Zn in a Betula specimen was 200 mg kg-1 dry weight. Based on the geoaccumulation index, most of the analyzed soils belong to the first class, i.e. uncontaminated (Igeo ≤ 0), for all analyzed elements. However, several samples collected in high-traffic areas show Cu and Zn values within 1 < Igeo ≤ 2 (moderately contaminated), suggesting a traffic-related origin for these elements. B. pendula can therefore be considered a potential heavy metal accumulator and thus a good bioindicator for these urban pollutants. Since B. pendula is widely distributed in urban areas across Central and Northern Europe, it is a suitable species for a systematic and comparative monitoring network.
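The geoaccumulation index used above is Müller's Igeo = log2(Cn / (1.5 · Bn)), where Cn is the measured concentration and Bn the geochemical background. A minimal sketch of the computation and class assignment follows; the background value used in the example is an assumed illustration, not a value from the study:

```python
import math

def igeo(measured_mg_kg: float, background_mg_kg: float) -> float:
    """Mueller geoaccumulation index: Igeo = log2(Cn / (1.5 * Bn)).
    The factor 1.5 buffers natural variation in the background."""
    return math.log2(measured_mg_kg / (1.5 * background_mg_kg))

def igeo_class(value: float) -> str:
    """Map an Igeo value onto Mueller's contamination classes."""
    bounds = [(0, "uncontaminated"),
              (1, "uncontaminated to moderately contaminated"),
              (2, "moderately contaminated"),
              (3, "moderately to heavily contaminated"),
              (4, "heavily contaminated"),
              (5, "heavily to extremely contaminated")]
    for upper, label in bounds:
        if value <= upper:
            return label
    return "extremely contaminated"

# Illustrative only: the study's maximum Zn leaf value (200 mg/kg)
# against a hypothetical soil background of 95 mg/kg.
value = igeo(200.0, 95.0)
print(round(value, 2), igeo_class(value))  # → 0.49 uncontaminated to moderately contaminated
```

Note that Igeo ≤ 0 corresponds to the "uncontaminated" first class and 1 < Igeo ≤ 2 to "moderately contaminated", matching the thresholds reported in the abstract.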
Infections of the central nervous system (CNS) are infrequently diagnosed in immunocompetent patients, but they do occur in a significant proportion of patients with hematological disorders. In particular, patients undergoing allogeneic hematopoietic stem-cell transplantation carry a high risk for CNS infections of up to 15%. Fungi and Toxoplasma gondii are the predominant causative agents. The diagnosis of CNS infections is based on neuroimaging, cerebrospinal fluid examination and, in selected patients, biopsy of suspicious lesions. However, identification of CNS infections in immunocompromised patients can be a major challenge, since metabolic disturbances, side-effects of antineoplastic or immunosuppressive drugs and CNS involvement of the underlying hematological disorder may mimic the symptoms of a CNS infection. The prognosis of CNS infections is generally poor in these patients, although the introduction of novel substances (e.g. voriconazole) has improved the outcome in distinct patient subgroups. This guideline has been developed by the Infectious Diseases Working Party (AGIHO) of the German Society of Hematology and Medical Oncology (DGHO) with the contribution of a panel of 14 experts certified in internal medicine, hematology/oncology, infectious diseases, intensive care, neurology and neuroradiology. Grades of recommendation and levels of evidence were categorized using novel criteria, as recently published by the European Society of Clinical Microbiology and Infectious Diseases.
IKZF1 deletion (ΔIKZF1) is an important predictor of relapse in childhood B-cell precursor acute lymphoblastic leukemia. Because of its clinical importance, we previously mapped breakpoints of intragenic deletions and developed a multiplex PCR assay to detect recurrent intragenic ΔIKZF1. Since the multiplex PCR was not able to detect complete deletions (IKZF1 Δ1-8), which account for ~30% of all ΔIKZF1, we aimed to investigate the genomic landscape of IKZF1 Δ1-8. Six samples from cases with IKZF1 Δ1-8 were analyzed by microarray assay, which identified monosomy 7, isochromosome 7q, and large interstitial deletions with breakpoints within the COBL gene. We then established a multiplex ligation-dependent probe amplification (MLPA) assay and screened copy number alterations within chromosome 7 in 43 diagnostic samples with IKZF1 Δ1-8. Our results revealed that monosomy and large interstitial deletions within chromosome 7 are the main causes of IKZF1 Δ1-8. Detailed analysis using long distance inverse PCR showed that six patients (16%) had large interstitial deletions starting within intronic regions of COBL at diagnosis, which is ~611 kb downstream of IKZF1, suggesting that COBL is a hotspot for ΔIKZF1. We also investigated a series of 25 intragenic deletions (Δ2–8, Δ3–8 or Δ4–8) and 24 relapsed samples, and found one IKZF1-COBL tail-to-tail fusion, further supporting COBL as a novel hotspot for ΔIKZF1. Finally, using RIC score methodology, we show that the breakpoint sequences of IKZF1 Δ1-8 are not analogous to RAG recognition sites, suggesting a mechanism of error promotion different from that suggested for intragenic ΔIKZF1.
Bone loss commonly occurs as a consequence of unloading and also in patients with chronic obstructive pulmonary disease (COPD). Although hypoxia has been implicated as an important driver of bone loss, its interaction with unloading remains unresolved. The objective was therefore to assess whether human bone loss caused by unloading could be aggravated by chronic hypoxia.
In a cross-over designed study, 14 healthy young men underwent 21-day interventions of bed rest in normoxia (NBR), bed rest in hypoxia (HBR), and hypoxic ambulatory confinement (HAmb). Hypoxic conditions were equivalent to 4000 m altitude. Bone metabolism (NTX, P1NP, sclerostin, DKK1) and phospho-calcic homeostasis (calcium and phosphate serum levels and urinary excretion, PTH) were assessed from regular blood samples and 24-hour urine collections, and tibia and femur bone mineral content was assessed by peripheral quantitative computed tomography (pQCT).
Urinary NTX excretion increased (P < 0.001) to a similar extent in NBR and HBR (P = 0.69), and P1NP serum levels decreased (P = 0.0035), likewise with no difference between NBR and HBR (P = 0.88). Serum total calcium was increased during bed rest by 0.059 mM (day D05, SE 0.05 mM) to 0.091 mM (day D21, P < 0.001), with no additional effect of hypoxia during bed rest (P = 0.199). HAmb led, at least temporarily, to increased total serum calcium, reduced serum phosphate, and reduced phosphate and calcium excretion.
In conclusion, hypoxia did not aggravate bed rest-induced bone resorption, but led to changes in phospho-calcic homeostasis likely caused by hyperventilation. Whether hyperventilation could have mitigated the effects of hypoxia in this study remains to be established.
Ambrosia artemisiifolia is an invasive annual herb infamous for the high allergenicity of its pollen, which is associated with rising medical costs. Additionally, it can cause serious yield losses as an agricultural weed. Common ragweed seeds accumulate in the soil and can remain viable there for decades, which poses a problem for the sustainable management of these populations. Long-term management should thus target a reduction of the soil seed bank. We investigated the influence of four different mowing regimes on the ragweed soil seed bank at six roadside populations in eastern Austria. The mowing regimes were based on methods from common roadside management practice and were specifically adapted to reduce seed production. After three years of application, three of the four mowing regimes tested had indeed reduced the soil seed bank by 45 to 80 percent. We therefore suggest that the most effective mowing regime for reducing the size of the soil seed bank consists of one cut just after the beginning of female flowering (around the 3rd week of August in Eastern Central Europe), followed by a second cut 2–3 weeks later.
As legislation, research and management of invasive alien species (IAS) are not fully coordinated across countries or stakeholder groups, one approach leading to more standardized activities is to produce lists of prominent IAS that attain a high level of concern and are subject to priority monitoring and management. These so-called Black, Grey and Watch (alert) Lists represent a convenient starting point for setting priorities in prevention, early warning and management systems. It is important that these lists be based on transparent and robust criteria, so as to accommodate the interests and impact perceptions of the authorities and stakeholders concerned, representing sectors as diverse as forestry, horticulture, aquaculture, hunting, and nature conservation, and to justify possible trade restrictions. The principles for blacklisting need to be general enough to accommodate differences among taxonomic groups (plants, invertebrates, vertebrates) and invaded environments (e.g. aquatic, terrestrial, urban, suburban, seminatural), and must take into account invasion dynamics, the impacts the IAS pose, and management strategies suitable for each particular invader. With these assumptions in mind, we synthesize available information to present Black, Grey and Watch Lists of alien species for the Czech Republic, with recommended categorized management measures for land managers, policy makers and other stakeholders. We took into account differences in the listed species’ distribution, invasion status, known or estimated environmental impact, as well as possible management options, and applied these criteria to both plants and animals. Species with lower impact, but for which some level of management and regulation is desirable, are included on the Grey List.
Some potentially dangerous species occurring in European countries with comparable climatic conditions, as well as species introduced in the past but without presently known wild populations in the Czech Republic, are placed on the Watch List. In total, there are 78 plant and 39 animal species on the Black List, 47 and 16 on the Grey List, and 25 and 27, respectively, on the Watch List. This multilayered approach to the classification of alien species, combining their impacts, population status and relevant management, can serve as a model for other countries that are in the process of developing their Black Lists.
Distribution and abundance of exotic earthworms within a boreal forest system in southcentral Alaska (2016)
Little is known about exotic earthworms (Oligochaeta: Lumbricidae) in Alaska outside its southeastern panhandle. This study documents the distribution of exotic earthworms in the relatively undisturbed Kenai National Wildlife Refuge (KNWR), a large, primarily wilderness refuge in southcentral Alaska. We sampled 69 sites near boat launches, along road corridors, and in low human impact areas > 5 km from the road, finding three species of earthworms (Dendrobaena octaedra, Dendrodrilus rubidus, and Lumbricus terrestris). Most road sites (90%) and boat launches (80%) contained earthworms; half (50%) of low human impact sites contained earthworms. Distance to roads was the only significant predictor of earthworm occurrence; soil pH, soil moisture, leaf litter depth, and vegetation cover were not. The disparate distributions of these three species suggest that, within the KNWR, road construction and vehicle traffic played a role in the dispersal of the widespread, abundant Dendrobaena octaedra and the uncommon Dendrodrilus rubidus, whereas bait abandonment appeared to be the primary method of introduction of Lumbricus terrestris. While the distribution of harmful anecic earthworms in the KNWR is currently limited, the prohibition of Lumbricus spp. as bait within conservation units in Alaska may be warranted.
Trichopsis vittata (Cuvier, 1831) is a small, freshwater gourami (Fam: Osphronemidae) native to southeast Asia. It was first detected in Florida in the 1970s and seems to have persisted for decades in a small area. In this study, we documented the ecophysiological tolerances (salinity and low temperature) of T. vittata and qualitatively compared them to published values for other sympatric non-native species that have successfully invaded much of the Florida peninsula. Trichopsis vittata survived acute salinity shifts to 16 psu and was able to survive up to 20 psu when salinity was raised more slowly (5 psu per week). In a cold-tolerance experiment, temperature was lowered from 24 °C at 1 °C per hour until fish died. The mean temperature at death (i.e., the lower lethal limit) was 7.2 °C. For the variables we examined, T. vittata appears as tolerant as, or more tolerant than, many other sympatric non-native fishes. However, T. vittata is the only one of these species that has not dispersed since its introduction; the others now occupy broad invaded ranges, many of which include the entire lower third of the Florida peninsula. It is possible that tolerance to environmental parameters serves as a filter for establishment, wherein candidate species must first possess the ability to survive abiotic extremes. However, a species’ ability to expand its geographic range may ultimately rely on a secondary set of criteria, including biotic interactions and life-history variables.