Document Type
- Article (19)
- Part of a Book (1)
- Conference Proceeding (1)
Has Fulltext
- yes (21)
Is part of the Bibliography
- no (21)
Keywords
- Acute myeloid leukaemia (1)
- Affect theory (1)
- Alzheimer’s dementia (1)
- Amyloid-beta 42 (1)
- Aneurysmal subarachnoid hemorrhage (1)
- Animal studies (1)
- Biomarkers (1)
- Blood-brain barrier (1)
- Brain tumor (1)
- COVID-19 (1)
Ecological speciation assumes reproductive isolation to be the product of ecologically based divergent selection. Besides natural selection, sexual selection via phenotype-assortative mating is thought to promote reproductive isolation. Using the neotropical fish Poecilia mexicana from a system described as undergoing incipient ecological speciation in adjacent but ecologically divergent habitats, characterized by the presence or absence of toxic H2S and of darkness in cave habitats, we demonstrate a gradual change in male body colouration along the gradient of light/darkness, including a reduction of ornaments that are under both inter- and intrasexual selection in surface populations. In dichotomous choice tests using video-animated stimuli, we found surface females to prefer males from their own population over the cave phenotype. However, female cave fish, observed on site via infrared techniques, preferred to associate with surface males rather than size-matched cave males, likely reflecting a female preference for better-nourished (in this case, surface) males. Hence, divergent selection on body colouration indeed translates into phenotype-assortative mating in the surface ecotype, by selecting against potential migrant males. Female cave fish, by contrast, do not have a preference for the resident male phenotype, identifying natural selection against migrants imposed by the cave environment as the major driver of the observed reproductive isolation.
Background: The progression of mild cognitive impairment (MCI) to Alzheimer’s disease (AD) dementia can be predicted by cognitive, neuroimaging, and cerebrospinal fluid (CSF) markers. Since most biomarkers reveal complementary information, a combination of biomarkers may increase the predictive power. We investigated which combination of the Mini-Mental State Examination (MMSE), Clinical Dementia Rating (CDR)-sum-of-boxes, the word list delayed free recall from the Consortium to Establish a Registry of Dementia (CERAD) test battery, hippocampal volume (HCV), amyloid-beta1–42 (Aβ42), amyloid-beta1–40 (Aβ40) levels, the ratio of Aβ42/Aβ40, phosphorylated tau, and total tau (t-Tau) levels in the CSF best predicted a short-term conversion from MCI to AD dementia.
Methods: We used 115 complete datasets from MCI patients of the "Dementia Competence Network", a German multicenter cohort study with annual follow-up up to 3 years. MCI was broadly defined to include amnestic and nonamnestic syndromes. Variables known to predict progression in MCI patients were selected a priori. Nine individual predictors were compared by receiver operating characteristic (ROC) curve analysis. ROC curves of the five best two-, three-, and four-parameter combinations were analyzed for significant superiority by a bootstrapping wrapper around a support vector machine with linear kernel. The incremental value of combinations was tested for statistical significance by comparing the specificities of the different classifiers at a given sensitivity of 85%.
Results: Out of 115 subjects, 28 (24.3%) with MCI progressed to AD dementia within a mean follow-up period of 25.5 months. At baseline, MCI-AD patients did not differ from patients with stable MCI in age or gender distribution, but had lower educational attainment. All single biomarkers differed significantly between the two groups at baseline. ROC curves of the individual predictors gave areas under the curve (AUC) between 0.66 and 0.77, and all single predictors were statistically superior to Aβ40. The AUC of the two-parameter combinations ranged from 0.77 to 0.81, that of the three-parameter combinations from 0.80 to 0.83, and that of the four-parameter combinations from 0.81 to 0.82. None of the predictor combinations was significantly superior to the two best single predictors (HCV and t-Tau). When maximizing the AUC differences by fixing sensitivity at 85%, the two- to four-parameter combinations were superior to HCV alone.
Conclusion: A combination of two biomarkers of neurodegeneration (e.g., HCV and t-Tau) is not superior to the single parameters in identifying patients with MCI who are most likely to progress to AD dementia, although there is a gradual increase in the statistical measures across increasing biomarker combinations. This may have implications for clinical diagnosis and for selecting subjects for participation in clinical trials.
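The comparison described above, ROC curves per predictor and specificities compared at a fixed sensitivity of 85%, can be sketched with a small self-contained helper. This is a minimal illustration on synthetic scores; the function and variable names are illustrative, not the study's actual code:

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC via the Mann-Whitney formulation: the probability that a
    randomly chosen converter scores higher than a non-converter
    (ties counted as 0.5)."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return float((greater + 0.5 * ties) / (len(pos) * len(neg)))

def specificity_at_sensitivity(scores, labels, target_sens=0.85):
    """Pick the score threshold that achieves ~target sensitivity in the
    positive (converter) group, then report specificity at that threshold."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    threshold = np.quantile(pos, 1.0 - target_sens)
    return float(np.mean(neg < threshold))

# Toy example: one well-separating marker, higher score = more likely to convert
scores = np.array([0.1, 0.2, 0.3, 0.8, 0.9, 1.0])
labels = np.array([0, 0, 0, 1, 1, 1])
print(roc_auc(scores, labels))                     # 1.0 for perfect separation
print(specificity_at_sensitivity(scores, labels))  # 1.0 here
```

Comparing a biomarker combination against a single marker then reduces to comparing these specificities at the fixed 85% sensitivity, as described in the Methods.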
Background: Disease progression and delayed neurological complications are common after aneurysmal subarachnoid hemorrhage (aSAH). We explored the potential of quantitative blood-brain barrier (BBB) imaging to predict disease progression and neurological outcome.
Methods: Data were collected as part of the Co-Operative Studies of Brain Injury Depolarizations (COSBID). We retrospectively analyzed, in a blinded and semi-automated fashion, magnetic resonance images from 124 aSAH patients scanned at 4 time points (24–48 h, 6–8 days, 12–15 days and 6–12 months) after the initial hemorrhage. Volumes of brain with apparent pathology and/or BBB dysfunction (BBBD), the subarachnoid space, and the lateral ventricles were measured. Neurological status on admission was assessed using the World Federation of Neurosurgical Societies and Rosen-Macdonald scores. Outcome at ≥6 months was assessed using the extended Glasgow outcome scale and the disease course (progressive or non-progressive, based on imaging-detected loss of normal brain tissue in consecutive scans). Logistic regression was used to define the biomarkers that best predict outcomes. Receiver operating characteristic analysis was performed to assess the accuracy of the outcome prediction models.
Findings: In the present cohort, 63% of patients had a progressive and 37% a non-progressive disease course. A progressive course was associated with worse outcome at ≥6 months (sensitivity of 98% and specificity of 97%). Brain volume with BBBD was significantly larger in patients with a progressive course as early as 24–48 h after admission (2.23 (1.23–3.17) fold, median with 95% CI), and remained so at all time points. The highest probability of a BBB-disrupted voxel becoming pathological was found at a distance of ≤1 cm from brain with apparent pathology (0.284 (0.122–0.594), p < 0.001, median with 95% CI). A multivariate logistic regression model showed that BBBD in combination with the Rosen-Macdonald score (RMS) at 24–48 h predicted outcome (ROC area under the curve = 0.829, p < 0.001).
Interpretation: We suggest that early identification of BBBD may serve as a key predictive biomarker for neurological outcome in aSAH.
Fund: Dr. Dreier was supported by grants from the Deutsche Forschungsgemeinschaft (DFG) (DFG DR 323/5-1 and DFG DR 323/10-1), the Bundesministerium für Bildung und Forschung (BMBF) Center for Stroke Research Berlin 01 EO 0801 and FP7 no 602150 CENTER-TBI.
Dr. Friedman was supported by grants from the Israel Science Foundation and the Canadian Institutes of Health Research (CIHR). Dr. Friedman was also supported by grants from the European Union's Seventh Framework Programme (FP7/2007–2013; grant #602102).
Only a few methyl-[11C]-L-methionine (MET) positron emission tomography (PET) studies have focused on children and young adults with brain neoplasms. Due to radiation exposure, long scan acquisition times, and the need for sedation in young children, MET-PET studies in this group of patients should be restricted to cases in which a decision for further therapy is not possible from routine diagnostic procedures alone, e.g., structural imaging. We investigated the diagnostic accuracy of MET-PET for the differentiation between tumorous and non-tumorous lesions in this group of patients. Forty-eight MET-PET scans from 39 patients aged 2 to 21 years (mean 15 ± 5.0 years) were analyzed. The MET tumor uptake relative to a corresponding control region was calculated. A receiver operating characteristic (ROC) analysis was performed to determine the MET-uptake value that best distinguishes tumorous from non-tumorous brain lesions. A differentiation between tumorous (n = 39) and non-tumorous brain lesions (n = 9) was possible at a threshold of 1.48 of relative MET uptake, with a sensitivity of 83% and a specificity of 92%. A differentiation between high-grade malignant lesions (mean MET uptake = 2.00 ± 0.46) and low-grade tumors (mean MET uptake = 1.84 ± 0.31) was not possible. There was a significant difference in MET uptake between the histologically homogeneous subgroups of astrocytoma WHO grade II and anaplastic astrocytoma WHO grade III (P = 0.02). MET-PET might be a useful tool to differentiate tumorous from non-tumorous lesions in children and young adults when a decision for further therapy is difficult or impossible from routine structural imaging procedures alone.
Keywords: Brain tumor, Children, PET, Methionine, Molecular imaging
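The ROC-based cut-off selection described above can be illustrated by scanning candidate thresholds on the tumor-to-control uptake ratio and maximizing Youden's J (sensitivity + specificity − 1). The data and names below are hypothetical; the published threshold of 1.48 comes from the study's own ROC analysis, not from this toy sketch:

```python
import numpy as np

def best_cutoff_youden(ratios, is_tumor):
    """Return the uptake-ratio cut-off maximizing Youden's J.

    ratios   -- lesion-to-control MET-uptake ratios, one per scan
    is_tumor -- 1 for histologically tumorous lesions, 0 otherwise
    """
    best_t, best_j = None, -1.0
    for t in np.unique(ratios):                     # candidate thresholds
        sens = np.mean(ratios[is_tumor == 1] >= t)  # tumors called tumorous
        spec = np.mean(ratios[is_tumor == 0] < t)   # non-tumors called benign
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_t = float(j), float(t)
    return best_t, best_j

# Hypothetical uptake ratios for six lesions
ratios = np.array([1.0, 1.2, 1.3, 1.6, 1.9, 2.1])
is_tumor = np.array([0, 0, 0, 1, 1, 1])
print(best_cutoff_youden(ratios, is_tumor))  # (1.6, 1.0)
```

With real, overlapping distributions the maximal J is below 1 and the chosen cut-off trades sensitivity against specificity, which is what yields figures like the 83%/92% reported above.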
Primate multisensory object perception involves distributed brain regions. To investigate the network character of these regions of the human brain, we applied data-driven group spatial independent component analysis (ICA) to a functional magnetic resonance imaging (fMRI) data set acquired during a passive audio-visual (AV) experiment with common object stimuli. We labeled three group-level independent component (IC) maps as auditory (A), visual (V), and AV, based on their spatial layouts and activation time courses. The overlap between these IC maps served as definition of a distributed network of multisensory candidate regions including superior temporal, ventral occipito-temporal, posterior parietal and prefrontal regions. During an independent second fMRI experiment, we explicitly tested their involvement in AV integration. Activations in nine out of these twelve regions met the max-criterion (A < AV > V) for multisensory integration. Comparison of this approach with a general linear model-based region-of-interest definition revealed its complementary value for multisensory neuroimaging. In conclusion, we estimated functional networks of uni- and multisensory functional connectivity from one dataset and validated their functional roles in an independent dataset. These findings demonstrate the particular value of ICA for multisensory neuroimaging research and using independent datasets to test hypotheses generated from a data-driven analysis.
Background: Liver fibrosis in human immunodeficiency virus (HIV)-infected individuals is mostly attributable to co-infection with hepatitis B or C. The impact of other risk factors, including prolonged exposure to combined antiretroviral therapy (cART) is poorly understood. Our aim was to determine the prevalence of liver fibrosis and associated risk factors in HIV-infected individuals based on non-invasive fibrosis assessment using transient elastography (TE) and serum biomarkers (Fibrotest [FT]).
Methods: In 202 consecutive HIV-infected individuals (159 men; mean age 47 ± 9 years; 35 with hepatitis-C-virus [HCV] co-infection), TE and FT were performed. Repeat TE examinations were conducted 1 and 2 years after study inclusion.
Results: Significant liver fibrosis was present in 16% and 29% of patients when assessed by TE (≥ 7.1 kPa) and FT (> 0.48), respectively. A combination of TE and FT predicted significant fibrosis in 8% of all patients (31% of HIV/HCV co-infected and 3% of HIV mono-infected individuals). Chronic ALT, AST, and γ-GT elevations were present in 29%, 20%, and 51% of all cART-exposed patients, and in 19%, 8%, and 45.5% of HIV mono-infected individuals. Overall, factors independently associated with significant fibrosis as assessed by TE (OR, 95% CI) were HCV co-infection (7.29, 1.95-27.34), chronic AST (6.58, 1.30-33.25) and γ-GT (5.17, 1.56-17.08) elevation, and time on dideoxynucleoside therapy (1.01, 1.00-1.02). In 68 HIV mono-infected individuals with repeat TE examinations, TE values did not differ significantly during a median follow-up time of 24 months (median intra-patient change at the last TE examination relative to baseline: -0.2 kPa, p = 0.20).
Conclusions: Chronic elevation of liver enzymes was observed in up to 45.5% of HIV mono-infected patients on cART. However, only a small subset had significant fibrosis as predicted by TE and FT. There was no evidence for fibrosis progression during follow-up TE examinations.
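If, as the 8% figure suggests, "a combination of TE and FT" means both tests above their cut-offs, the rule can be written as a one-line predicate. The AND-logic is an assumption on my part; the cut-offs ≥ 7.1 kPa and > 0.48 are those stated in the Results:

```python
def significant_fibrosis_combined(te_kpa: float, fibrotest: float) -> bool:
    """Combined non-invasive rule (assumed AND-logic): significant fibrosis
    is predicted only when both TE and Fibrotest exceed their cut-offs."""
    return te_kpa >= 7.1 and fibrotest > 0.48

print(significant_fibrosis_combined(8.2, 0.55))  # True
print(significant_fibrosis_combined(5.4, 0.55))  # False
```

Requiring agreement of both tests is what makes the combined rule more conservative (8%) than either test alone (16% and 29%).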
Microplastics (MP) are contaminants of emerging concern in aquatic ecosystems. While the number of studies is rapidly increasing, a comparison of the toxicity of MP and natural particulate matter is largely missing. In addition, research focusses on the impacts of hydrophobic chemicals sorbed to plastics. However, the interactive effects of MP and hydrophilic, dissolved chemicals remain largely unknown. Therefore, we conducted chronic toxicity studies with larvae of the freshwater dipteran Chironomus riparius exposed to unplasticised polyvinyl chloride MP (PVC-MP) as well as kaolin and diatomite as reference materials for 28 days. In addition, we investigated the effects of particles in combination with the neonicotinoid imidacloprid in a multiple-stressor experiment. High concentrations of kaolin positively affected the chironomids. In contrast, exposure to diatomite and PVC-MP reduced the emergence and mass of C. riparius. Likewise, the toxicity of imidacloprid was enhanced in the presence of PVC-MP and slightly decreased in the co-exposure with kaolin. Overall, parallel experiments and chemical analysis indicate that the toxicity of PVC-MP was not caused by leached or sorbed chemicals. Our study demonstrates that PVC-MP induce more severe effects than both natural particulate materials. However, the latter are not benign per se, as the case of diatomite highlights. Considering the high, environmentally irrelevant concentrations needed to induce adverse effects, C. riparius is insensitive to exposures to PVC-MP.
Background/aims: Hepatocellular carcinoma (HCC) is a leading indication for liver transplantation (LT) worldwide. Early identification of patients at risk for HCC recurrence is of paramount importance, since early treatment of recurrent HCC after LT may be associated with increased survival. We evaluated the incidence of and predictors for HCC recurrence, with a focus on the course of alpha-fetoprotein (AFP) levels.
Methods: We performed a retrospective, single-center study of 99 HCC patients who underwent LT between January 28th, 1997 and May 11th, 2016. A three-stage proportional hazards model was used to evaluate potential predictive markers, by both univariate and multivariable analysis, for influences on 1) recurrence after transplantation, 2) mortality without HCC recurrence, and 3) mortality after recurrence.
Results: Of 99 HCC patients, 19 showed recurrence after LT. Waiting time was not associated with overall HCC recurrence (HR = 1.00, p = 0.979). Similarly, waiting time did not affect mortality in LT recipients with (HR = 0.97, p = 0.282) or without (HR = 0.99, p = 0.685) HCC recurrence. Log10-transformed AFP values at the time of LT (HR 1.75, p = 0.023) as well as after LT (HR 2.07, p = 0.037) were significantly associated with recurrence. Median survival in patients with a ratio (AFP at recurrence divided by AFP 3 months before recurrence) of 0.5 was greater than 70 months, compared to a median of only 8 months in patients with a ratio of 5.
Conclusion: A rise in AFP levels rather than an absolute threshold could help to identify patients at short-term risk for HCC recurrence post LT, which may allow intensification of the surveillance strategy on an individualized basis.
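The AFP dynamics above, a ratio of the value at recurrence to the value three months earlier, plus the log10 transform on which the hazard ratios are reported, amount to two one-line computations (a sketch with hypothetical names and values):

```python
import math

def afp_ratio(afp_at_recurrence: float, afp_3_months_before: float) -> float:
    """Relative AFP change; the abstract contrasts ratios of ~0.5
    (falling AFP, median survival > 70 months) and ~5 (rising AFP,
    median survival 8 months)."""
    return afp_at_recurrence / afp_3_months_before

def log10_afp(afp: float) -> float:
    """Log10-transformed AFP, the scale on which the reported hazard
    ratios (1.75 at LT, 2.07 after LT) apply."""
    return math.log10(afp)

print(afp_ratio(100.0, 20.0))  # 5.0
print(log10_afp(100.0))        # 2.0
```

On the log10 scale each tenfold AFP increase multiplies the hazard by the reported HR, which is why a rising trend rather than a single absolute value carries the prognostic signal.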
Living on the edge: environmental variability of a shallow late Holocene cold-water coral mound
(2022)
Similar to their tropical counterparts, cold-water corals (CWCs) are able to build large three-dimensional reef structures. These unique ecosystems are at risk due to ongoing climate change. In particular, ocean warming, ocean acidification and changes in the hydrological cycle may jeopardize the existence of CWCs. One important strategy for predicting how CWCs and their reefs or mounds will develop in the near future is to study fossil CWC mounds, and especially shallow CWC ecosystems, as they experience greater environmental variability than other, deeper-water CWC ecosystems. We present results from a CWC mound off southern Norway. A sediment core drilled from this relatively shallow (~ 100 m) CWC mound records in full detail the hydrographical changes during the late Holocene that were crucial for mound build-up. We applied computed tomography, 230Th/U dating, and foraminiferal geochemical proxy reconstructions of bottom-water temperature (Mg/Ca-based BWT) and of δ18O for seawater density, and the combination of both to infer salinity changes. Our results demonstrate that the CWC mound formed in the late Holocene between 4 kiloannums (ka) and 1.5 ka, with an average aggradation rate of 104 cm/kiloyear (kyr), which is significantly lower than that of other Holocene Norwegian mounds. The reconstructed BWTMg/Ca and seawater density exhibit large variations throughout the entire period of mound formation, but are strikingly similar to modern in situ observations in the nearby Tisler Reef. We argue that BWT does not exert a primary control on CWC mound formation. Instead, the strong salinity and seawater density variation throughout the entire mound sequence appears to be controlled by the interplay between the Atlantic Water (AW) inflow and the overlying, outflowing Baltic Sea water.
CWC growth and mound formation in the NE Skagerrak was supported by strong current flow, oxygen replenishment, the presence of a strong boundary layer and larval dispersal through the AW, but possibly inhibited by the influence of fresh Baltic Water during the late Holocene. Our study therefore highlights that modern shallow Norwegian CWC reefs may be particularly endangered due to changes in water-column stratification associated with increasing net precipitation caused by climate change.