Sondersammelgebiets-Volltexte
Refine
Year of publication
Document Type
- Article (42)
- Conference Proceeding (3)
Language
- English (45)
Has Fulltext
- yes (45)
Is part of the Bibliography
- no (45)
Keywords
- visual cortex (2)
- BPTI (1)
- NACI (1)
- NMR spectroscopy (1)
- NMR spectrum (1)
- NMR structure determination (1)
- Naja naja atra (1)
- Non-negative matrix factorization (1)
- Peak overlap (1)
- Peak picking (1)
Institute
- Frankfurt Institute for Advanced Studies (FIAS) (45)
Summary
Wild relatives of crops thrive in habitats where environmental conditions can be restrictive for productivity and survival of cultivated species. The genetic basis of this variability, particularly for tolerance to high temperatures, is not well understood. We examined the capacity of wild and cultivated accessions to acclimate to rapid temperature elevations that cause heat stress (HS).
We investigated genotypic variation in thermotolerance of seedlings of wild and cultivated accessions. The contribution of polymorphisms associated with thermotolerance variation was examined regarding alterations in function of the identified gene.
We show that tomato germplasm underwent a progressive loss of acclimation to strong temperature elevations. Sensitivity is associated with intronic polymorphisms in the HS transcription factor HsfA2, which affect the splicing efficiency of its pre‐mRNA. Intron splicing in wild species results in increased synthesis of isoform HsfA2‐II, implicated in the early stress response, at the expense of HsfA2‐I, which is involved in establishing short‐term acclimation and thermotolerance.
We propose that the selection for modern HsfA2 haplotypes reduced the ability of cultivated tomatoes to rapidly acclimate to temperature elevations, but enhanced their short‐term acclimation capacity. Hence, we provide evidence that alternative splicing has a central role in the definition of plant fitness plasticity to stressful conditions.
We present the black hole accretion code (BHAC), a new multidimensional general-relativistic magnetohydrodynamics module for the MPI-AMRVAC framework. BHAC has been designed to solve the equations of ideal general-relativistic magnetohydrodynamics in arbitrary spacetimes and exploits adaptive mesh refinement techniques with an efficient block-based approach. Several spacetimes have already been implemented and tested. We demonstrate the validity of BHAC by means of various one-, two-, and three-dimensional test problems, as well as through a close comparison with the HARM3D code in the case of a torus accreting onto a black hole. The convergence of a turbulent accretion scenario is investigated with several diagnostics and we find accretion rates and horizon-penetrating fluxes to be convergent to within a few percent when the problem is run in three dimensions. Our analysis also involves the study of the corresponding thermal synchrotron emission, which is performed by means of a new general-relativistic radiative transfer code, BHOSS. The resulting synthetic intensity maps of accretion onto black holes are found to be convergent with increasing resolution and are anticipated to play a crucial role in the interpretation of horizon-scale images resulting from upcoming radio observations of the source at the Galactic Center.
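For context, the ideal general-relativistic magnetohydrodynamics system that codes of this type evolve can be written covariantly in the standard textbook form below; the notation is generic and not specific to BHAC:

```latex
% Ideal GRMHD in covariant form (generic notation, not BHAC-specific):
\nabla_\mu(\rho u^\mu) = 0, \qquad
\nabla_\mu T^{\mu\nu} = 0, \qquad
\nabla_\mu {}^{*}F^{\mu\nu} = 0,
% with the magnetised perfect-fluid stress-energy tensor
T^{\mu\nu} = (\rho h + b^2)\, u^\mu u^\nu
           + \Bigl(p + \tfrac{b^2}{2}\Bigr) g^{\mu\nu}
           - b^\mu b^\nu,
% where \rho is the rest-mass density, u^\mu the fluid four-velocity,
% h the specific enthalpy, p the pressure, {}^{*}F^{\mu\nu} the dual
% Faraday tensor, and b^\mu the magnetic field in the fluid frame.
```

These conservation laws (mass, energy-momentum, and the homogeneous Maxwell equations in the ideal-MHD limit) are what any spacetime-agnostic GRMHD solver must discretize.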
Abstract: Understanding the structure and dynamics of cortical connectivity is vital to understanding cortical function. Experimental data strongly suggest that local recurrent connectivity in the cortex is significantly non-random, exhibiting, for example, above-chance bidirectionality and an overrepresentation of certain triangular motifs. Additional evidence suggests a significant distance dependency to connectivity over a local scale of a few hundred microns, and particular patterns of synaptic turnover dynamics, including a heavy-tailed distribution of synaptic efficacies, a power law distribution of synaptic lifetimes, and a tendency for stronger synapses to be more stable over time. Understanding how many of these non-random features simultaneously arise would provide valuable insights into the development and function of the cortex. While previous work has modeled some of the individual features of local cortical wiring, there is no model that begins to comprehensively account for all of them. We present a spiking network model of a rodent Layer 5 cortical slice which, via the interactions of a few simple biologically motivated intrinsic, synaptic, and structural plasticity mechanisms, qualitatively reproduces these non-random effects when combined with simple topological constraints. Our model suggests that mechanisms of self-organization arising from a small number of plasticity rules provide a parsimonious explanation for numerous experimentally observed non-random features of recurrent cortical wiring. Interestingly, similar mechanisms have been shown to endow recurrent networks with powerful learning abilities, suggesting that these mechanisms are central to understanding both structure and function of cortical synaptic wiring.
Author Summary: The problem of how the brain wires itself up has important implications for the understanding of both brain development and cognition. The microscopic structure of the circuits of the adult neocortex, often considered the seat of our highest cognitive abilities, is still poorly understood. Recent experiments have provided a first set of findings on the structural features of these circuits, but it is unknown how these features come about and how they are maintained. Here we present a neural network model that shows how these features might come about. It gives rise to numerous connectivity features, which have been observed in experiments, but never before simultaneously produced by a single model. Our model explains the development of these structural features as the result of a process of self-organization. The results imply that only a few simple mechanisms and constraints are required to produce, at least to the first approximation, various characteristic features of a typical fragment of brain microcircuitry. In the absence of any of these mechanisms, simultaneous production of all desired features fails, suggesting a minimal set of necessary mechanisms for their production.
Dendritic morphology has been shown to have a dramatic impact on neuronal function. However, population features such as the inherent variability in dendritic morphology between cells belonging to the same neuronal type are often overlooked when studying computation in neural networks. While detailed models for morphology and electrophysiology exist for many types of single neurons, the role of detailed single cell morphology in the population has not been studied quantitatively or computationally. Here we use the structural context of the neural tissue in which dendritic trees exist to drive their generation in silico. We synthesize the entire population of dentate gyrus granule cells, the most numerous cell type in the hippocampus, by growing their dendritic trees within their characteristic dendritic fields bounded by the realistic structural context of (1) the granule cell layer that contains all somata and (2) the molecular layer that contains the dendritic forest. This process enables branching statistics to be linked to larger scale neuroanatomical features. We find large differences in dendritic total length and individual path length measures as a function of location in the dentate gyrus and of somatic depth in the granule cell layer. We also predict the number of unique granule cell dendrites invading a given volume in the molecular layer. This work enables the complete population-level study of morphological properties and provides a framework to develop complex and realistic neural network models.
Background: Simple peak-picking algorithms, such as those based on lineshape fitting, perform well when peaks are completely resolved in multidimensional NMR spectra, but often produce wrong intensities and frequencies for overlapping peak clusters. For example, NOESY-type spectra have considerable overlaps leading to significant peak-picking intensity errors, which can result in erroneous structural restraints. Precise frequencies are critical for unambiguous resonance assignments.
Results: To alleviate this problem, a more sophisticated peak decomposition algorithm, based on non-negative matrix factorization (NMF), was developed. Peak shapes are derived directly from Fourier-transformed NMR spectra. Apart from its main goal of deriving components from spectra and producing peak lists automatically, the NMF approach can also be applied if the positions of some peaks are known a priori, e.g. from consistently referenced spectral dimensions of other experiments.
Conclusions: Application of the NMF algorithm to a three-dimensional peak list of the 23 kDa bi-domain section of the RcsD protein (RcsD-ABL-HPt, residues 688-890), as well as to synthetic HSQC data, shows that peaks can be picked accurately even in spectral regions with strong overlap.
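The core idea of NMF-based peak decomposition can be sketched as follows. This is a hedged toy illustration of the general technique, not the authors' algorithm: the synthetic "spectral region" is built from two strongly overlapping rank-1 peaks, and a plain Lee-Seung multiplicative-update NMF recovers a low-rank non-negative factorization of it.

```python
import numpy as np

def gauss(x, mu, sigma):
    """1D Gaussian lineshape (illustrative stand-in for an NMR peak shape)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Synthetic 2D spectral region: two overlapping peaks, each an outer
# product of two 1D lineshapes (positions/widths chosen for illustration).
x = np.linspace(0, 1, 60)
y = np.linspace(0, 1, 50)
V = (np.outer(gauss(y, 0.45, 0.08), gauss(x, 0.50, 0.10))
     + 0.7 * np.outer(gauss(y, 0.55, 0.08), gauss(x, 0.60, 0.10)))

def nmf(V, k, n_iter=500, eps=1e-9, seed=0):
    """Basic NMF via Lee-Seung multiplicative updates (Frobenius error)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps   # non-negative component shapes
    H = rng.random((k, n)) + eps   # non-negative component weights
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

W, H = nmf(V, k=2)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because the factors are constrained to be non-negative, each recovered component tends to correspond to one peak's lineshape, which is what makes NMF attractive for disentangling overlapping peak clusters.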
The information processing abilities of neural circuits arise from their synaptic connection patterns. Understanding the laws governing these connectivity patterns is essential for understanding brain function. The overall distribution of synaptic strengths of local excitatory connections in cortex and hippocampus is long-tailed, exhibiting a small number of synaptic connections of very large efficacy. At the same time, new synaptic connections are constantly being created and individual synaptic connection strengths show substantial fluctuations across time. It remains unclear through what mechanisms these properties of neural circuits arise and how they contribute to learning and memory. In this study we show that fundamental characteristics of excitatory synaptic connections in cortex and hippocampus can be explained as a consequence of self-organization in a recurrent network combining spike-timing-dependent plasticity (STDP), structural plasticity and different forms of homeostatic plasticity. In the network, associative synaptic plasticity in the form of STDP induces a rich-get-richer dynamics among synapses, while homeostatic mechanisms induce competition. Under distinctly different initial conditions, the ensuing self-organization produces long-tailed synaptic strength distributions matching experimental findings. We show that this self-organization can take place with a purely additive STDP mechanism and that multiplicative weight dynamics emerge as a consequence of network interactions. The observed patterns of fluctuation of synaptic strengths, including elimination and generation of synaptic connections and long-term persistence of strong connections, are consistent with the dynamics of dendritic spines found in rat hippocampus. Beyond this, the model predicts an approximately power-law scaling of the lifetimes of newly established synaptic connection strengths during development. 
Our results suggest that the combined action of multiple forms of neuronal plasticity plays an essential role in the formation and maintenance of cortical circuits.
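The "rich-get-richer plus competition" dynamics described above can be caricatured in a few lines. This is a deliberately minimal toy, not the paper's spiking network: stronger synapses are potentiated more often (standing in for associative STDP), while normalizing the total weight (standing in for homeostasis) enforces competition; the result is a broad, skewed weight distribution emerging from uniform initial conditions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 100                      # number of synapses (illustrative)
w = np.full(n, 1.0 / n)      # uniform initial weights
eta = 0.01                   # additive potentiation step

for _ in range(5000):
    # Rich-get-richer: potentiation probability proportional to weight.
    i = rng.choice(n, p=w / w.sum())
    w[i] += eta              # purely additive update
    w /= w.sum()             # homeostatic normalization: total stays fixed

cv = w.std() / w.mean()      # spread of the resulting weight distribution
```

Even though each update is additive, the normalization couples all synapses, so effective multiplicative competition emerges from the interaction, mirroring the abstract's point that multiplicative weight dynamics can arise from network-level mechanisms rather than being built in.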
Background: After induction of DNA double-strand breaks (DSBs), the DNA damage response (DDR) is activated. One of the earliest events in the DDR is the phosphorylation of serine 139 on the histone variant H2AX (gH2AX), catalyzed by phosphatidylinositol 3-kinase-related kinases. Despite being extensively studied, H2AX distribution [1] across the genome and gH2AX spreading around DSB sites [2] in the context of different chromatin compaction states or transcription are yet to be fully elucidated.
Materials and methods: gH2AX was induced in human hepatocellular carcinoma cells (HepG2) by exposure to 10 Gy X-rays (250 kV, 16 mA). Samples were incubated 0.5, 3 or 24 hours post irradiation to investigate early, intermediate and late stages of DDR, respectively. Chromatin immunoprecipitation was performed to select H2AX, H3 and gH2AX-enriched chromatin fractions. Chromatin-associated DNA was then sequenced by Illumina ChIP-Seq platform. HepG2 gene expression and histone modification (H3K36me3, H3K9me3) ChIP-Seq profiles were retrieved from Gene Expression Omnibus (accession numbers GSE30240 and GSE26386, respectively).
Results: First, we combined G/C usage, gene content, gene expression or histone modification profiles (H3K36me3, H3K9me3) to define genomic compartments characterized by different chromatin compaction states or transcriptional activity. Next, we investigated H3, H2AX and gH2AX distributions in such defined compartments before and after exposure to ionizing radiation (IR) to study DNA repair kinetics during DDR. Our sequencing results indicate that H2AX distribution followed H3 occupancy and, thus, the nucleosome pattern. The highest H2AX and H3 enrichment was observed in transcriptionally active compartments (euchromatin) while the lowest was found in low G/C and gene-poor compartments (heterochromatin). Under physiological conditions, the body of highly and moderately transcribed genes was devoid of gH2AX, despite presenting high H2AX levels. gH2AX accumulation was observed in 5’ or 3’ flanking regions, instead. The same genes showed a prompt gH2AX accumulation during the early stage of DDR which then decreased over time as DDR proceeded.
Finally, during the late stage of DDR the residual gH2AX signal was entirely retained in heterochromatic compartments. At this stage, euchromatic compartments were completely devoid of gH2AX despite presenting high levels of non-phosphorylated H2AX.
Conclusions: We show that gH2AX distribution ultimately depends on H2AX occupancy, the latter following H3 occupancy and, thus, the nucleosome pattern. Both H2AX and H3 levels were higher in actively transcribed compartments. However, gH2AX levels were remarkably low over the body of actively transcribed genes, suggesting that transcription levels antagonize gH2AX spreading. Moreover, repair processes did not take place uniformly across the genome; rather, DNA repair was affected by genomic location and transcriptional activity. We propose that the higher H2AX density in euchromatic compartments results in a high relative gH2AX concentration soon after the activation of the DDR, thus favoring the recruitment of the DNA repair machinery to those compartments. When the damage is repaired and gH2AX is removed, its residual fraction is retained in the heterochromatic compartments, which are then targeted and repaired at later times.
When studying real-world complex networks, one rarely has full access to all their components. The human central nervous system, for example, consists of some 10^11 neurons, each connected to thousands of other neurons. Of these 100 billion neurons, at most a few hundred can be recorded in parallel. Observations are therefore hampered by immense subsampling. While subsampling does not affect the observables of single-neuron activity, it can heavily distort observables that characterize interactions between pairs or groups of neurons. Without a precise understanding of how subsampling affects these observables, inference on neural network dynamics from subsampled neural data remains limited.
We systematically studied subsampling effects in three self-organized critical (SOC) models, since this class of models can reproduce the spatio-temporal structure of spontaneous activity observed in vivo. The models differed in their topology and in their precise interaction rules. The first model consisted of locally connected integrate-and-fire units, thereby resembling cortical activity propagation mechanisms. The second model had the same interaction rules but random connectivity. The third model had local connectivity but different activity propagation rules. As a measure of network dynamics, we characterized the spatio-temporal waves of activity, called avalanches, which are characteristic of SOC models and of neural tissue. Avalanche measures A (e.g. size, duration, shape) were calculated for the fully sampled and the subsampled models. To mimic subsampling in the models, we considered the activity of a subset of units only, discarding the activity of all other units.
Under subsampling, the avalanche measures A depended on three main factors. First, A depended on the interaction rules of the model and its topology; thus each model showed its own characteristic subsampling effects on A. Second, A depended on the number of sampled sites n: with small and intermediate n, the true A could not be recovered in any of the models. Third, A depended on the distance d between sampled sites: with small d, A was overestimated, while with large d, A was underestimated.
Since the observables under subsampling depended on the model's topology and interaction mechanisms, we propose that systematic subsampling can be exploited to compare models with neural data: when the number of, and distance between, electrodes in neural tissue and sampled units in a model are varied analogously, the observables of a correct model should behave the same as those of the neural tissue, so incorrect models can easily be discarded. Systematic subsampling thus offers a promising and unique approach to model selection, even when brain activity is far from being fully sampled.
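The basic distortion that subsampling introduces can be demonstrated with a toy simulation. This is a hedged sketch of the general phenomenon, not the study's SOC models: avalanche sizes are drawn from a heavy-tailed distribution, and each avalanche is "recorded" only at a small, fixed subset of units, so small avalanches are frequently missed and every recorded size underestimates the true extent.

```python
import numpy as np

rng = np.random.default_rng(1)

n_units = 1000     # fully sampled system
n_sampled = 50     # "electrodes": 5% of all units (illustrative numbers)
sampled = set(rng.choice(n_units, size=n_sampled, replace=False).tolist())

true_sizes = []
observed = []      # (observed size, true size) for avalanches seen at all
for _ in range(2000):
    # Heavy-tailed avalanche size, capped at the system size.
    s = min(n_units, int(rng.pareto(1.5)) + 1)
    active = rng.choice(n_units, size=s, replace=False)
    obs = sum(1 for u in active.tolist() if u in sampled)
    true_sizes.append(s)
    if obs > 0:    # avalanches touching no sampled unit are missed entirely
        observed.append((obs, s))

n_missed = len(true_sizes) - len(observed)
```

Every recorded avalanche is no larger than its true extent, and a substantial fraction of avalanches goes entirely unobserved, so the measured size distribution is systematically distorted relative to the true one, which is the effect the study characterizes model by model.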
Neuronal dynamics differs between wakefulness and sleep stages, as does the cognitive state. In contrast, a single attractor state, termed self-organized criticality (SOC), has been proposed to govern human brain dynamics because of its optimal information coding and processing capabilities. Here we address two open questions. First, does the human brain always operate in this computationally optimal state, even during deep sleep? Second, previous evidence for SOC was based on activity within single brain areas; the interaction between brain areas, however, may be organized differently. We therefore asked whether the interaction between brain areas is SOC. ...
Tumour cells show a varying susceptibility to radiation damage as a function of the current cell cycle phase. While this sensitivity is averaged out in an unperturbed tumour due to unsynchronised cell cycle progression, external stimuli such as radiation or drug doses can induce a resynchronisation of the cell cycle and consequently induce a collective development of radiosensitivity in tumours. Although this effect has been regularly described in experiments it is currently not exploited in clinical practice and thus a large potential for optimisation is missed. We present an agent-based model for three-dimensional tumour spheroid growth which has been combined with an irradiation damage and kinetics model. We predict the dynamic response of the overall tumour radiosensitivity to delivered radiation doses and describe corresponding time windows of increased or decreased radiation sensitivity. The degree of cell cycle resynchronisation in response to radiation delivery was identified as a main determinant of the transient periods of low and high radiosensitivity enhancement. A range of selected clinical fractionation schemes is examined and new triggered schedules are tested which aim to maximise the effect of the radiation-induced sensitivity enhancement. We find that the cell cycle resynchronisation can yield a strong increase in therapy effectiveness, if employed correctly. While the individual timing of sensitive periods will depend on the exact cell and radiation types, enhancement is a universal effect which is present in every tumour and accordingly should be the target of experimental investigation. Experimental observables which can be assessed non-invasively and with high spatio-temporal resolution have to be connected to the radiosensitivity enhancement in order to allow for a possible tumour-specific design of highly efficient treatment schedules based on induced cell cycle synchronisation.
Author Summary: The sensitivity of a cell to a dose of radiation is largely affected by its current position within the cell cycle. While under normal circumstances progression through the cell cycle will be asynchronous in a tumour mass, external influences such as chemo- or radiotherapy can induce a synchronisation. Such a common progression of the inner clock of the cancer cells results in a critical dependence of the effectiveness of any drug or radiation dose on the timing of its administration. We analyse the exact evolution of the radiosensitivity of a sample tumour spheroid in a computer model, which enables us to predict time windows of decreased or increased radiosensitivity. Fractionated radiotherapy schedules can then be tailored to avoid periods of high resistance and to exploit the induced radiosensitivity for an increase in therapy efficiency. We show that cell cycle effects can drastically alter the outcome of fractionated irradiation schedules in a spheroid cell system. With the correct observables and continuous monitoring, these cell cycle sensitivity effects have the potential to be integrated into future treatment planning and thus to be employed for a better outcome in clinical cancer therapies.
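Why synchronisation matters can be seen already in the standard linear-quadratic (LQ) survival model, S(D) = exp(-αD - βD²), with phase-dependent radiosensitivity. The sketch below is not the paper's agent-based model; the α/β values per cycle phase are purely illustrative (late S phase is typically the most radioresistant, G2/M the most sensitive), and equal phase occupancy is assumed for the unsynchronised case.

```python
import math

def survival(dose, alpha, beta):
    """LQ-model surviving fraction S(D) = exp(-alpha*D - beta*D^2)."""
    return math.exp(-alpha * dose - beta * dose ** 2)

# Illustrative (alpha [1/Gy], beta [1/Gy^2]) per cell cycle phase.
phases = {
    "G1":   (0.30, 0.03),
    "S":    (0.15, 0.02),   # relatively radioresistant
    "G2/M": (0.50, 0.05),   # relatively radiosensitive
}

dose = 2.0  # Gy, a common fraction size

# Unsynchronised tumour: survival averages over phases (equal occupancy).
s_async = sum(survival(dose, a, b) for a, b in phases.values()) / len(phases)

# Population synchronised in the sensitive vs. the resistant phase.
s_sensitive = survival(dose, *phases["G2/M"])
s_resistant = survival(dose, *phases["S"])
```

A dose timed to hit the population while it sits in a sensitive phase kills markedly more cells than the same dose delivered to an unsynchronised (or resistant-phase) population, which is the window of opportunity the triggered schedules in the study aim to exploit.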