We review the representation problem based on factoring and show that this problem gives rise to alternative solutions for many cryptographic protocols in the literature. While previous solutions usually rely either on the RSA problem or on the intractability of factoring integers of a special form (e.g., Blum integers), the solutions presented here work under the most general factoring assumption. The protocols we discuss include identification schemes secure against parallel attacks, secure signatures, blind signatures and (non-malleable) commitments.
Single-particle electron cryo-microscopy (cryoEM) has undergone a “resolution revolution” that makes it possible to characterize megadalton (MDa) complexes at atomic resolution without crystals. To fully exploit the new opportunities in molecular microscopy, new procedures for the cloning, expression and purification of macromolecular complexes need to be explored. Macromolecular assemblies are often unstable, and invasive construct design or inadequate purification conditions or sample preparation methods can result in disassembly or denaturation. The structure of the 2.6 MDa yeast fatty acid synthase (FAS) has been studied by electron microscopy since the 1960s. We report a new, streamlined protocol for the rapid production of purified yeast FAS for structure determination by high-resolution cryoEM. Together with a companion protocol for preparing cryoEM specimens on a hydrophilized graphene layer, our new protocol has yielded a 3.1 Å map of yeast FAS from 15,000 automatically picked particles within a day. The high map quality enabled us to build a complete atomic model of an intact fungal FAS.
Neuroscience studies in non-human primates (NHP) often follow the rule of thumb that results observed in one animal must be replicated in at least one other. However, we lack a statistical justification for this rule of thumb, or an analysis of whether including three or more animals is better than including two. Yet, a formal statistical framework for experiments with few subjects would be crucial for experimental design, ethical justification, and data analysis. Also, including three or four animals in a study creates the possibility that the results observed in one animal will differ from those observed in the others: we need a statistically justified rule to resolve such situations. Here, I present a statistical framework to address these issues. This framework assumes that conducting an experiment will produce a similar result in a large proportion of the population (termed ‘representative’ animals), but spurious results in a substantial proportion of animals (termed ‘outliers’); the fractions of ‘representative’ and ‘outlier’ animals are defined by a prior distribution. I propose a procedure in which experimenters collect results from M animals and accept results that are observed in at least N of them (the ‘N-out-of-M’ procedure). I show how to compute the risks α (of reaching an incorrect conclusion) and β (of failing to reach a conclusion) for any prior distribution, as a function of N and M. Strikingly, I find that the N-out-of-M model leads to a similar conclusion across a wide range of prior distributions: recording from two animals lowers the risk α and therefore ensures a reliable result, but leaves a large risk β; recording from three animals and accepting results observed in two of them strikes an efficient balance between acceptable risks α and β.
This framework gives a formal justification for the rule of thumb of using at least two animals in NHP studies, suggests that recording from three animals when possible markedly improves statistical power, provides a statistical solution for situations where results are not consistent between all animals, and may apply to other types of studies involving few animals.
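The combinatorics behind comparing the 2-out-of-2 and 2-out-of-3 rules can be sketched in a few lines. This is a simplified illustration only, with a fixed per-animal probability p, not the paper's actual risk computation, which integrates over a prior distribution of 'representative' and 'outlier' fractions:

```python
from math import comb

def p_at_least(n, m, p):
    """Probability that at least n of m animals show the representative
    result, assuming each does so independently with probability p."""
    return sum(comb(m, k) * p**k * (1 - p)**(m - k) for k in range(n, m + 1))

# With 80% 'representative' animals, requiring agreement in 2 of 3
# animals succeeds far more often than demanding it in 2 of 2:
print(p_at_least(2, 2, 0.8))  # ≈ 0.64
print(p_at_least(2, 3, 0.8))  # ≈ 0.896
```

The toy numbers show why adding a third animal pays off: a single unlucky 'outlier' no longer blocks the conclusion.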
Strangeness enhancement is discussed as a feature specific to relativistic nuclear collisions which create a fireball of strongly interacting matter at high energy density. At very high energy this is suggested to be partonic matter, but at lower energy it should consist of yet unknown hadronic degrees of freedom. The freeze-out of this high density state to a hadron gas can tell us about properties of fireball matter. The hadron gas at the instant of its formation captures conditions directly at the QCD phase boundary at top SPS and RHIC energy, chiefly the critical temperature and energy density.
The goal of this paper is to re-examine the status of the condition in (1) proposed in Alexiadou and Anagnostopoulou (2001; henceforth A&A 2001), in view of recent developments in syntactic theory. (1) The subject-in-situ generalization (SSG) By Spell-Out, vP can contain only one argument with a structural Case feature. We argue that (1) is a more general condition than previously recognized, and that the domain of its application is parametrized. More specifically, based on a comparison between Indo-European (IE) and Khoisan languages, we argue that (1) supports an interpretation of the EPP as a general principle, and not as a property of T. Viewed this way, the SSG is a condition that forces dislocation of arguments as a consequence of a constraint on Case checking.
The thrombopoietin receptor agonist eltrombopag was successfully used against human cytomegalovirus (HCMV)-associated thrombocytopenia refractory to immunomodulatory and antiviral drugs. These effects were ascribed to effects of eltrombopag on megakaryocytes. Here, we tested whether eltrombopag may also exert direct antiviral effects. Therapeutic eltrombopag concentrations inhibited HCMV replication in human fibroblasts and adult mesenchymal stem cells infected with six different virus strains and drug-resistant clinical isolates. Eltrombopag also synergistically increased the anti-HCMV activity of the mainstay drug ganciclovir. Time-of-addition experiments suggested that eltrombopag interferes with HCMV replication after virus entry. Eltrombopag was effective in thrombopoietin receptor-negative cells, and addition of Fe3+ prevented the anti-HCMV effects, indicating that it inhibits HCMV replication via iron chelation. This may be of particular interest for the treatment of cytopenias after haematopoietic stem cell transplantation, as HCMV reactivation is a major reason for transplantation failure. Since therapeutic eltrombopag concentrations are effective against drug-resistant viruses and synergistically increase the effects of ganciclovir, eltrombopag is also a drug repurposing candidate for the treatment of therapy-refractory HCMV disease.
The successful elimination of bacteria such as Streptococcus pneumoniae from a host involves the coordination of different parts of the immune system. Previous studies have explored the effects of the initial pneumococcal load (bacterial dose) on different representations of innate immunity, finding that pathogenic outcomes can vary with the size of the bacterial dose. However, other studies lend support to the notion of dose-independent factors contributing to bacterial clearance. In this paper, we seek to provide a deeper understanding of the immune responses associated with the pneumococcus. To this end, we formulate a model that realizes an abstraction of the innate-regulatory immune host response. Stability and bifurcation analyses of the model reveal the following trichotomy of pneumococcal outcomes determined by the bifurcation parameters: (i) dose-independent clearance; (ii) dose-independent persistence; and (iii) dose-limited clearance. Bistability, where the bacteria-free equilibrium co-stabilizes with the most substantial steady-state bacterial load, is the specific mechanism behind dose-limited clearance. The trichotomy of pneumococcal outcomes described here integrates all previously observed bacterial fates into a unified framework.
This paper proposes a corpus encoding standard that meets the needs of linguistic research using a variety of linguistic data structures. The standard was developed in SFB 441, a research project at the University of Tuebingen. The principal concern of SFB 441 is the empirical data structures that feed into linguistic theory building. SFB 441 consists of several projects, most of which are building corpora to empirically investigate various linguistic phenomena in various languages (e.g. modal verbs in German, forms of address and politeness in Russian). These corpora will form the components of the "Tuebingen collection of reusable, empirical, linguistic data structures (TUSNELDA)". The TUSNELDA annotation standard aims at providing a uniform encoding scheme for all subcorpora and texts of TUSNELDA such that they can be processed with uniform standardized tools. To guarantee maximal reusability, we use XML for encoding. Previous SGML standards for text encoding were provided by the Text Encoding Initiative (TEI) and the Expert Advisory Group on Language Engineering Standards (Corpus Encoding Standard, CES). The TUSNELDA standard is based on TEI and XCES (the XML version of CES) but takes into account the specific needs of the SFB projects, i.e. the peculiarities of the examined languages and linguistic phenomena.
The purpose of this paper is to describe the TüBa-D/Z treebank of written German and to compare it to the independently developed TIGER treebank (Brants et al., 2002). Both treebanks, TIGER and TüBa-D/Z, use an annotation framework that is based on phrase structure grammar and that is enhanced by a level of predicate-argument structure. The comparison between the annotation schemes of the two treebanks focuses on the different treatments of free word order and discontinuous constituents in German as well as on differences in phrase-internal annotation.
VASP is a member of the Enabled/VASP protein family that is involved in cortical actin dynamics and may also contribute to the formation of gap junctions. In vessels, gap junctional coupling allows the transfer of signals along the vessel wall and coordinates vascular behavior. Moreover, VASP is reportedly a mediator of NO-induced inhibition of platelet aggregation. Therefore, we hypothesized that VASP also exerts important physiologic functions in arterioles. We examined the spread of vasodilations enabled by gap junctional coupling in endothelial cells as well as NO-induced arteriolar dilations in VASP-deficient mice by intravital microscopy of the microcirculation in a skeletal muscle in anesthetized mice. Conducted dilations were initiated by brief, locally confined stimulation of the arterioles with acetylcholine. The maximal diameters of the arterioles under study ranged from 30 to 40 μm. Brief stimulation with acetylcholine induced a short dilation at the local site that was also observed at remote, upstream sites without an attenuation of the amplitude up to a distance of 1.2 mm in control (wild-type) animals. In contrast, remote dilations were reduced in VASP-deficient mice despite a similar local dilation, indicating an impairment of conducted dilations. Superfusion of NO donors induced a concentration-dependent dilation in wild-type mice. However, these dilations were slightly reduced in VASP-deficient animals. In contrast, dilations induced by the endothelial stimulator acetylcholine were fully preserved in VASP-deficient mice. In summary, this study suggests that VASP exerts critical functions in arteriolar diameter control. It is crucial for the conduction of dilator signals along the endothelial cell layer. The impairment possibly reflects a perturbed formation of gap junctions in the endothelial cell membrane. VASP also participates in the full dilatory potential of NO donors, although the effect of its deficiency is only subtle.
In contrast, VASP is not required for dilations initiated by endothelial stimulation which are mediated in the murine microcirculation by an EDH-mechanism.
Twisted heterostructures of van der Waals materials have received much attention for their many remarkable properties. Here, we present a comprehensive theory of the long-range ordered magnetic phases of twisted bilayer α-RuCl3 via a combination of first-principles calculations and atomistic simulations. While a monolayer exhibits zigzag antiferromagnetic order with three possible ordering wave vectors, a rich phase diagram is obtained for moiré superlattices as a function of interlayer exchange and twist angle. For large twist angles, each layer spontaneously picks a single zigzag ordering wave vector, whereas, for small twist angles, the ground state involves a combination of all three wave vectors in a complex hexagonal domain structure. This multi-domain order minimizes the interlayer energy while enduring the energy cost due to the domain wall formation. Our results indicate that magnetic frustration due to stacking-dependent interlayer exchange in moiré superlattices can be used to tune the magnetic ground state and enhance quantum fluctuations in α-RuCl3.
The rapidity distribution of thermal photons produced in Pb+Pb collisions at CERN-SPS energies is calculated within scaling and three-fluid hydrodynamics. It is shown that these scenarios lead to very different rapidity spectra. A measurement of the rapidity dependence of photon radiation can give cleaner insight into the reaction dynamics than pion spectra, especially into the rapidity dependence of the temperature.
Off-central heavy-ion collisions are known to feature magnetic fields with magnitudes and characteristic gradients corresponding to the scale of the strong interactions. In this work, we employ equilibrium lattice simulations of the underlying theory, QCD, involving similar inhomogeneous magnetic field profiles to achieve a better understanding of this system. We simulate three flavors of dynamical staggered quarks with physical masses at a range of magnetic fields and temperatures, and extrapolate the results to the continuum limit. Analyzing the impact of the field on the quark condensate and the Polyakov loop, we find non-trivial spatial features that render the QCD medium qualitatively different from the homogeneous setup, especially at temperatures around the transition. In addition, we construct leading-order chiral perturbation theory for the inhomogeneous background and compare its prediction to our lattice results at low temperature. Our findings will be useful to benchmark effective theories and low-energy models of QCD for a better description of peripheral heavy-ion collisions.
Isothermal titration calorimetry (ITC) is a widely used technique for the characterization of protein-protein and protein-ligand interactions. It provides information on the stoichiometry, affinity, and the thermodynamic driving forces of interactions. This chapter exemplifies the use of ITC to investigate interactions between human autophagy modifiers (LC3/GABARAP proteins) and their interaction partners, the LIR motif containing sequences. The purpose of this report is to present a detailed protocol for the production of LC3/GABARAP-interacting LIR peptides using E. coli expression systems. In addition, we outline the design of ITC experiments using the LC3/GABARAP:peptide interactions as an example. Comprehensive troubleshooting notes are provided to facilitate the adaptation of these protocols to different ligand-receptor systems. The methodology outlined for studying protein-ligand interactions will help to avoid common errors and misinterpretations of experimental results.
Hadron lists based on experimental studies summarized by the Particle Data Group (PDG) are a crucial input for the equation of state and thermal models used in the study of strongly-interacting matter produced in heavy-ion collisions. Modeling of these strongly-interacting systems is carried out via hydrodynamical simulations, which are followed by hadronic transport codes that also require a hadronic list as input. To remain consistent throughout the different stages of modeling of a heavy-ion collision, the same hadron list with its corresponding decays must be used at each step. It has been shown that even the most uncertain states listed in the PDG from 2016 are required to reproduce partial pressures and susceptibilities from Lattice Quantum Chromodynamics with the hadronic list known as the PDG2016+. Here, we update the hadronic list for use in heavy-ion collision modeling by including the latest experimental information for all states listed in the Particle Data Booklet in 2021. We then compare our new list, called PDG2021+, to Lattice Quantum Chromodynamics results and find that it achieves even better agreement with the first principles calculations than the PDG2016+ list. Furthermore, we develop a novel scheme based on intermediate decay channels that allows for only binary decays, such that PDG2021+ will be compatible with the hadronic transport framework SMASH. Finally, we use these results to make comparisons to experimental data and discuss the impact on particle yields and spectra.
Cryo-electron tomography (cryo-ET) is a powerful method to elucidate subcellular architecture and to structurally analyse biomolecules in situ by subtomogram averaging (STA). Specimen thickness is a key factor affecting cryo-ET data quality. Cells that are too thick for transmission imaging can be thinned by cryo-focused-ion-beam (cryo-FIB) milling. However, optimal specimen thickness for cryo-ET on lamellae has not been systematically investigated. Furthermore, the ions used to ablate material can cause damage in the lamellae, thereby reducing STA resolution. Here, we systematically benchmark the resolution depending on lamella thickness and the depth of the particles within the sample. Up to ca. 180 nm, lamella thickness does not negatively impact resolution. This shows that there is no need to generate very thin lamellae and thickness can be chosen such that it captures major cellular features. Furthermore, we show that gallium-ion-induced damage extends to depths of up to 30 nm from either lamella surface.
A three-center phenomenological model able to explain, at least qualitatively, the difference between the observed yield of particle-accompanied fission and that of binary fission was developed. It is derived from the liquid drop model under the assumption that the aligned configuration, with the emitted particle between the light and heavy fragments, is obtained by continuously increasing the separation distance while the radii of the light fragment and of the light particle are kept constant. During the first stage of the deformation one has a two-center evolution until the neck radius becomes equal to the radius of the emitted particle. Then the third center starts developing, with the two tip distances decreasing by the same amount. In this way a second minimum, typical for a cluster molecule, appears in the deformation energy. Examples are presented for the $^{240}$Pu parent nucleus emitting $\alpha$-particles and $^{14}$C in a ternary process.
Dual coding theories of knowledge suggest that meaning is represented in the brain by a double code, which comprises language-derived representations in the Anterior Temporal Lobe and sensory-derived representations in perceptual and motor regions. This approach predicts that concrete semantic features should activate both codes, whereas abstract features rely exclusively on the linguistic code. Using magnetoencephalography (MEG), we adopted a temporally resolved multiple regression approach to identify the contribution of abstract and concrete semantic predictors to the underlying brain signal. Results evidenced early involvement of anterior-temporal and inferior-frontal brain areas in both abstract and concrete semantic information encoding. At later stages, occipito-temporal regions showed greater responses to concrete compared to abstract features. The present findings shed new light on the temporal dynamics of abstract and concrete semantic representations in the brain and suggest that the concreteness of words is processed first with a transmodal/linguistic code, housed in frontotemporal brain systems, and only afterwards with an imagistic/sensorimotor code in perceptual and motor regions.
This paper is one argument for a theory of grammatical relations in Chinese in which there are no grammatical relations beyond semantic roles, and no lexical relation-changing rules. As the passive rule is one of the most common relation changing rules cross-linguistically, in this paper I will address the question of whether or not Mandarin Chinese has lexical passives, that is, passives defined as in Relational Grammar (see for example Perlmutter and Postal 1977) and the early Lexical Functional Grammar (LFG) literature (e.g. Bresnan 1982), where a 2-arc (object) is promoted to a 1-arc (subject).
Abstract
Co-infections by multiple pathogens have important implications in many aspects of health, epidemiology and evolution. However, how to disentangle the contributing factors of the immune response when two infections take place at the same time is largely unexplored. Using data sets of the immune response during influenza-pneumococcal co-infection in mice, we employ here topological data analysis to simplify and visualise high dimensional data sets.
We identified persistent shapes of the simplicial complexes of the data in the three infection scenarios: single viral infection, single bacterial infection, and co-infection. The immune response was found to be distinct for each of the infection scenarios, and we uncovered that the immune response during the co-infection has three phases and two transition points. During the first phase, its dynamics are inherited from the response to the primary (viral) infection. The immune response then undergoes an early transition (a few hours post co-infection) and modulates its response to finally react against the secondary (bacterial) infection. Between 18 and 26 hours post co-infection the nature of the immune response changes again and no longer resembles either of the single-infection scenarios.
Author summary
The mapper algorithm is a topological data analysis technique used for the qualitative analysis, simplification and visualisation of high dimensional data sets. It generates a low-dimensional image that captures topological and geometric information of the data set in high dimensional space, which can highlight groups of data points of interest and can guide further analysis and quantification.
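To make the idea concrete, a minimal mapper sketch (an illustration under simplifying assumptions, not the authors' implementation) uses the first coordinate as the filter function, overlapping intervals as the cover, and naive single-linkage clustering within each preimage; clusters become nodes, and clusters sharing points are joined by edges:

```python
import numpy as np

def mapper_1d(points, n_intervals=4, overlap=0.1, eps=1.0):
    """Toy mapper: filter = first coordinate, cover = overlapping
    intervals, clustering = single-linkage with distance threshold eps."""
    f = points[:, 0]
    lo, hi = float(f.min()), float(f.max())
    width = (hi - lo) / n_intervals
    nodes = []  # each node is a set of point indices (one cluster)
    for i in range(n_intervals):
        a = lo + i * width - overlap * width
        b = lo + (i + 1) * width + overlap * width
        idx = [j for j, v in enumerate(f) if a <= v <= b]
        parent = {j: j for j in idx}  # union-find for clustering

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        for u in idx:
            for v in idx:
                if u < v and np.linalg.norm(points[u] - points[v]) <= eps:
                    parent[find(u)] = find(v)
        clusters = {}
        for j in idx:
            clusters.setdefault(find(j), set()).add(j)
        nodes.extend(clusters.values())
    # connect clusters that share points (they come from overlapping intervals)
    edges = {(i, j) for i in range(len(nodes))
             for j in range(i + 1, len(nodes)) if nodes[i] & nodes[j]}
    return nodes, edges

# Two well-separated groups on a line yield two nodes and no edge.
pts = np.array([[0.0, 0], [0.5, 0], [1.0, 0], [5.0, 0], [5.5, 0], [6.0, 0]])
nodes, edges = mapper_1d(pts, n_intervals=2)
```

In practice one would use a library implementation with a proper clusterer, but the output has the same shape: a small graph summarizing the high-dimensional point cloud.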
To understand how the immune system evolves during the co-infection between viruses and bacteria, and the role of specific cytokines as contributing factors for these severe infections, we use Topological Data Analysis (TDA) along with an extensive semi-unsupervised parameter value grid search, and k-nearest neighbour analysis.
We find persistent shapes of the data in the three infection scenarios: single viral infection, single bacterial infection, and co-infection. The immune response is shown to be distinct for each of the infection scenarios, and we uncover that the immune response during the co-infection has three phases and two transition points, a previously unknown property of the dynamics of the immune response during co-infection.
Motivated by recently reported magnetic-field induced topological phases in ultracold atoms and correlated Moiré materials, we investigate topological phase transitions in a minimal model consisting of interacting spinless fermions described by the Hofstadter model on a square lattice. For interacting lattice Hamiltonians in the presence of a commensurate magnetic flux it has been demonstrated that the quantized Hall conductivity is constrained by a Lieb-Schultz-Mattis (LSM)-type theorem due to magnetic translation symmetry. In this work, we revisit the validity of the theorem for such models and establish that a topological phase transition from a topological to a trivial insulating phase can be realized but must be accompanied by spontaneous magnetic translation symmetry breaking caused by charge ordering of the spinless fermions. To support our findings, the topological phase diagram for varying interaction strength is mapped out numerically with exact diagonalization for different flux quantum ratios and band fillings using symmetry indicators. We discuss our results in the context of the LSM-type theorem.
A commonly held view in the literature on Scrambling and Clitic Doubling is that both constructions are sensitive to Specificity. For this reason Sportiche (1992) proposes to unify the two, an approach which has become quite standard in the relevant literature ever since. However, the claim that clitic doubling is the counterpart of Germanic scrambling has never been substantiated. In this paper we present extensive evidence from Greek that Clitic Doubling shares formal properties with Germanic Scrambling/Object Shift. Our evidence consists mainly of binding facts observed when doubling takes place, which seem, at first sight, to be completely unexpected. On closer inspection, however, it turns out that these facts are strongly reminiscent of the effects showing up in Germanic scrambling. We propose that these properties can be derived under a theory of clitic constructions along the lines of Sportiche (1992), implemented in the framework of Chomsky (1995). Finally, we suggest that the crosslinguistic distribution of Scrambling as opposed to Clitic Doubling should be linked to a parameter relating to properties of Agr: Move/Merge XP vs. Move/Merge X° to Agr. We show that this parameter unifies the behaviour of subjects and objects within a language and across languages. The paper is organised as follows. In section 2 we present evidence from binding, interpretational and prosodic effects that doubling and scrambling display very similar properties. In section 3 we present Sportiche's account and point out some problems for it. In section 4 we present our proposal.
Quantitative evaluation of parsers has traditionally centered around the PARSEVAL measures of crossing brackets, (labeled) precision, and (labeled) recall. However, it is well known that these measures do not give an accurate picture of the quality of a parser's output. Furthermore, we will show that they are especially unsuited for partial parsers. In recent years, research has concentrated on dependency-based evaluation measures. We will show in this paper that such a dependency-based evaluation scheme is particularly suitable for partial parsers. TüBa-D, the treebank used here for evaluation, contains all the necessary dependency information, so the conversion of trees into a dependency structure does not have to rely on heuristics. Therefore, the dependency representations are not only reliable, they are also linguistically motivated and can be used for linguistic purposes.
This paper presents an approach to the question of whether it is possible to construct a parser based on ideas from case-based reasoning. Such a parser would employ a partial analysis of the input sentence to select a (nearly) complete syntax tree and then adapt this tree to the input sentence. The experiments performed on German data from the TüBa-D/Z treebank and the KaRoPars partial parser show that a wide range of levels of generality can be reached, depending on which types of information are used to determine the similarity between the input sentence and the training sentences. The results show that it is possible to construct a case-based parser. The optimal setting out of those presented here needs to be determined empirically.
Background: Marked sex differences in autism prevalence accentuate the need to understand the role of biological sex-related factors in autism. Efforts to unravel sex differences in the brain organization of autism have, however, been challenged by the limited availability of female data.
Methods: We addressed this gap by using a large sample of males and females with autism and neurotypical (NT) control individuals (ABIDE; Autism: 362 males, 82 females; NT: 409 males, 166 females; 7-18 years). Discovery analyses examined main effects of diagnosis, sex and their interaction across five resting-state fMRI (R-fMRI) metrics (voxel-level Z > 3.1, cluster-level P < 0.01, gaussian random field corrected). Secondary analyses assessed the robustness of the results to different pre-processing approaches and their replicability in two independent samples: the EU-AIMS Longitudinal European Autism Project (LEAP) and the Gender Explorations of Neurogenetics and Development to Advance Autism Research (GENDAAR).
Results: Discovery analyses in ABIDE revealed significant main effects across the intrinsic functional connectivity (iFC) of the posterior cingulate cortex, regional homogeneity and voxel-mirrored homotopic connectivity (VMHC) in several cortical regions, largely converging in the default network midline. Sex-by-diagnosis interactions were confined to the dorsolateral occipital cortex, with reduced VMHC in females with autism. All findings were robust to different pre-processing steps. Replicability in independent samples varied by R-fMRI measures and effects with the targeted sex-by-diagnosis interaction being replicated in the larger of the two replication samples – EU-AIMS LEAP.
Limitations: Given the lack of a priori harmonization among the discovery and replication datasets available to date, sample-related variation remained and may have affected replicability.
Conclusions: Atypical cross-hemispheric interactions are neurobiologically relevant to autism. They likely result from the combination of sex-dependent and sex-independent factors with a differential effect across functional cortical networks. Systematic assessments of the factors contributing to replicability are needed and necessitate coordinated large-scale data collection across studies.
Competing Interest Statement: ADM receives royalties from the publication of the Italian version of the Social Responsiveness Scale Child Version by Organization Speciali, Italy. JKB has been a consultant to, advisory board member of, and a speaker for Takeda/Shire, Medice, Roche, and Servier. He is not an employee of any of these companies and not a stock shareholder of any of these companies. He has no other financial or material support, including expert testimony, patents, or royalties. CFB is director and shareholder in SBGneuro Ltd. TC has received consultancy fees from Roche and Servier and book royalties from Guildford Press and Sage. DM has been a consultant to, and advisory board member for, Roche and Servier. He is not an employee of any of these companies, and not a stock shareholder of any of these companies. TB served in an advisory or consultancy role for Lundbeck, Medice, Neurim Pharmaceuticals, Oberberg GmbH, Shire, and Infectopharm. He received conference support or speaker's fees from Lilly, Medice, and Shire. He received royalties from Hogrefe, Kohlhammer, CIP Medien, Oxford University Press; the present work is unrelated to these relationships. JT is a consultant to Roche. The remaining authors declare no competing interests.
Three-body nuclear forces play an important role in the structure of nuclei and hypernuclei and are also incorporated in models to describe the dynamics of dense baryonic matter, such as in neutron stars. So far, only indirect measurements anchored to the binding energies of nuclei can be used to constrain the three-nucleon force, and if hyperons are considered, the scarce data on hypernuclei impose only weak constraints on the three-body forces. In this work, we present the first direct measurement of the p−p−p and p−p−Λ systems in terms of three-particle correlation functions carried out for pp collisions at √s = 13 TeV. Three-particle cumulants are extracted from the correlation functions by applying the Kubo formalism, where the three-particle interaction contribution to these correlations can be isolated after subtracting the known two-body interaction terms. A negative cumulant is found for the p−p−p system, hinting at the presence of a residual three-body effect, while for p−p−Λ the cumulant is consistent with zero. This measurement demonstrates the accessibility of three-baryon correlations at the LHC.
Three-body nuclear forces play an important role in the structure of nuclei and hypernuclei and are also incorporated in models to describe the dynamics of dense baryonic matter, such as in neutron stars. So far, only indirect measurements anchored to the binding energies of nuclei can be used to constrain the three-nucleon force, and if hyperons are considered, the scarce data on hypernuclei impose only weak constraints on the three-body forces. In this work, we present the first direct measurement of the p−p−p and p−p−Λ systems in terms of three-particle mixed moments carried out for pp collisions at √s = 13 TeV. Three-particle cumulants are extracted from the normalised mixed moments by applying the Kubo formalism, where the three-particle interaction contribution to these moments can be isolated after subtracting the known two-body interaction terms. A negative cumulant is found for the p−p−p system, hinting at the presence of a residual three-body effect, while for p−p−Λ the cumulant is consistent with zero. This measurement demonstrates the accessibility of three-baryon correlations at the LHC.
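The cumulant extraction described above can be sketched schematically. For normalised correlation functions (unity in the absence of correlations), the standard Kubo-type expansion isolates the genuine three-body term by subtracting the three pairwise contributions; the numerical values below are purely illustrative, not measured data.

```python
def three_particle_cumulant(c3, c2_12, c2_13, c2_23):
    """Genuine three-body contribution left after subtracting the two-body
    terms from the normalised three-particle correlation (Kubo expansion)."""
    return c3 - c2_12 - c2_13 - c2_23 + 2.0

# If the three-particle correlation is built from pairwise effects alone,
# c3 = c2_12 + c2_13 + c2_23 - 2, so the cumulant vanishes by construction.
c2_12, c2_13, c2_23 = 1.20, 1.10, 1.05
c3_pairwise_only = c2_12 + c2_13 + c2_23 - 2.0
residual = three_particle_cumulant(c3_pairwise_only, c2_12, c2_13, c2_23)

# Any excess of the measured c3 over this pairwise baseline shows up as a
# nonzero cumulant, e.g. a genuine three-body enhancement of 0.05:
genuine = three_particle_cumulant(c3_pairwise_only + 0.05, c2_12, c2_13, c2_23)
```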
Fungi play pivotal roles in ecosystem functioning, but little is known about their global patterns of diversity, endemicity, vulnerability to global change drivers and conservation priority areas. We applied the high-resolution PacBio sequencing technique to identify fungi based on a long DNA marker, which revealed a high proportion of hitherto unknown fungal taxa. We used a Global Soil Mycobiome consortium dataset to test the relative performance of various sequencing-depth standardization methods (calculation of residuals, exclusion of singletons, traditional and SRS rarefaction, use of the Shannon index of diversity) to find optimal protocols for statistical analyses. Altogether, we used six global surveys to infer these patterns for soil-inhabiting fungi and their functional groups. We found that residuals of log-transformed richness (including singletons) against log-transformed sequencing depth yield significantly better model estimates compared with most other standardization methods. With respect to global patterns, fungal functional groups differed in the patterns of diversity, endemicity and vulnerability to the main global change predictors. Unlike α-diversity, endemicity and global-change vulnerability of fungi and most functional groups were greatest in the tropics. Fungi are vulnerable mostly to drought, heat, and land cover change. Fungal conservation areas of highest priority include wetlands and moist tropical ecosystems.
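The best-performing standardization, residuals of log-transformed richness regressed on log-transformed sequencing depth, can be sketched on simulated data. The depths, richness values and power-law exponent below are synthetic stand-ins, not the Global Soil Mycobiome data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated survey: observed richness grows with sequencing depth
# (power law plus multiplicative noise).
depth = rng.integers(1_000, 100_000, size=n)
richness = (depth ** 0.4 * np.exp(rng.normal(0.0, 0.2, size=n))).astype(int) + 1

# Depth-standardised diversity: residuals of log richness on log depth.
x = np.log(depth)
y = np.log(richness)
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)
```

The residuals are, by construction, uncorrelated with sequencing depth, so they can be compared across samples that were sequenced to very different depths.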
Background: MDM2 inhibitors are under investigation for the treatment of acute myeloid leukaemia (AML) patients in phase III clinical trials. To study resistance formation to MDM2 inhibitors in AML cells, we here established 45 sub-lines of the AML TP53 wild-type cell lines MV4-11 (15 sub-lines), OCI-AML-2 (10 sub-lines), OCI-AML-3 (12 sub-lines), and SIG-M5 (8 sub-lines) with resistance to the MDM2 inhibitor nutlin-3.
Methods: Nutlin-3-resistant sub-lines were established by continuous exposure to stepwise increasing drug concentrations. The TP53 status was determined by next generation sequencing, cell viability was measured by MTT assay, and p53 was depleted using lentiviral vectors encoding shRNA.
Results: All MV4-11 sub-lines harboured the same R248W mutation and all OCI-AML-2 sub-lines the same Y220C mutation, indicating the selection of pre-existing TP53-mutant subpopulations. In concordance, rare alleles harbouring the respective mutations could be detected in the parental MV4-11 and OCI-AML-2 cell lines. The OCI-AML-3 and SIG-M5 sub-lines were characterised by varying TP53 mutations or wild-type TP53, indicating the induction of de novo TP53 mutations. Doxorubicin, etoposide, gemcitabine, cytarabine, and fludarabine resistance profiles revealed noticeable heterogeneity even among sub-lines of the same parental cell line. Loss of p53 function was not generally associated with decreased sensitivity to cytotoxic drugs.
Conclusion: We introduce a substantial set of models of acquired MDM2 inhibitor resistance in AML. MDM2 inhibitors select, depending on the nature of a given AML cell population, pre-existing TP53-mutant subpopulations or induce de novo TP53 mutations. Although loss of p53 function has been associated with chemoresistance in AML, the nutlin-3-adapted sub-lines displayed, in the majority of experiments, similar or increased drug sensitivity compared to the respective parental cells. Hence, chemotherapy may remain an option for AML patients after MDM2 inhibitor therapy failure. Even sub-lines of the same parental cancer cell line displayed considerable heterogeneity in their response to other anti-cancer drugs, indicating the need for detailed understanding and monitoring of the evolutionary processes in cancer cell populations in response to therapy as part of future individualised treatment protocols.
How is semantic information stored in the human mind and brain? Some philosophers and cognitive scientists argue for vectorial representations of concepts, where the meaning of a word is represented as its position in a high-dimensional neural state space. At the intersection of natural language processing and artificial intelligence, a class of very successful distributional word vector models has developed that can account for classic EEG findings of language, i.e., the ease vs. difficulty of integrating a word with its sentence context. However, models of semantics have to account not only for context-based word processing, but should also describe how word meaning is represented. Here, we investigate whether distributional vector representations of word meaning can model brain activity induced by words presented without context. Using EEG activity (event-related brain potentials) collected while participants in two experiments (English, German) read isolated words, we encode and decode word vectors taken from the family of prediction-based word2vec algorithms. We find that, first, the position of a word in vector space allows the prediction of the pattern of corresponding neural activity over time, in particular during a time window of 300 to 500 ms after word onset. Second, distributional models perform better than a human-created taxonomic baseline model (WordNet), and this holds for several distinct vector-based models. Third, multiple latent semantic dimensions of word meaning can be decoded from brain activity. Combined, these results suggest that empiricist, prediction-based vectorial representations of meaning are a viable candidate for the representational architecture of human semantic knowledge.
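An encoding analysis of the kind described, mapping word vectors to channel-wise brain activity, is commonly implemented as regularized linear regression. The sketch below uses random synthetic stand-ins for word2vec vectors and EEG patterns; the dimensions, noise level and ridge penalty are illustrative, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_words, dim, n_channels = 300, 50, 32

# Synthetic stand-ins: word vectors and a latent linear word->EEG mapping.
W = rng.normal(size=(n_words, dim))            # word2vec-style vectors
B_true = rng.normal(size=(dim, n_channels))    # hidden mapping to channels
eeg = W @ B_true + rng.normal(scale=0.5, size=(n_words, n_channels))

# Encoding model: ridge regression from vector space to channel amplitudes,
# solved in closed form as (W'W + lambda*I)^-1 W' eeg.
lam = 1.0
B_hat = np.linalg.solve(W.T @ W + lam * np.eye(dim), W.T @ eeg)

# Predicted activity pattern; correlation with the observed pattern measures
# how well the position in vector space accounts for the neural response.
pred = W @ B_hat
r = np.corrcoef(pred.ravel(), eeg.ravel())[0, 1]
```

Decoding reverses the direction of the regression (EEG channels as predictors, vector dimensions as targets); in practice both are evaluated with cross-validation over words rather than in-sample as here.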
Tracking influenza A virus infection in the lung from hematological data with machine learning
(2022)
The tracking of pathogen burden and host responses with minimally invasive methods during respiratory infections is central for monitoring disease development and guiding treatment decisions. Utilizing a standardized murine model of respiratory Influenza A virus (IAV) infection, we developed and tested different supervised machine learning models to predict viral burden and immune response markers, i.e. cytokines and leukocytes in the lung, from hematological data. We performed independent in vivo infection experiments to acquire extensive data for training and testing of the models. We show here that lung viral load, neutrophil counts, cytokines like IFN-γ and IL-6, and other lung infection markers can be predicted from hematological data. Furthermore, feature analysis of the models shows that blood granulocytes and platelets play a crucial role in prediction and are highly involved in the immune response against IAV. The proposed in silico tools pave the way towards improved tracking and monitoring of influenza infections, and possibly other respiratory infections, based on minimally invasively obtained hematological parameters.
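The prediction task can be illustrated with a simple linear model on synthetic blood counts. The features, coefficients and noise below are made up for the sketch; the study itself used more elaborate supervised models and real hematological measurements.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150

# Synthetic hematological features (illustrative units and distributions).
neutrophils = rng.normal(5.0, 1.5, n)    # 10^3 cells/uL
lymphocytes = rng.normal(2.0, 0.5, n)    # 10^3 cells/uL
platelets = rng.normal(300.0, 50.0, n)   # 10^3 cells/uL

# Assumed ground truth for the sketch: log viral load rises with neutrophils
# and falls slightly with platelets, plus measurement noise.
log_viral_load = 2.0 + 0.6 * neutrophils - 0.01 * platelets + rng.normal(0, 0.3, n)

# Ordinary least squares: predict lung viral load from blood counts.
X = np.column_stack([np.ones(n), neutrophils, lymphocytes, platelets])
coef, *_ = np.linalg.lstsq(X, log_viral_load, rcond=None)
pred = X @ coef
r = np.corrcoef(pred, log_viral_load)[0, 1]
```

The fitted coefficients play the role of the feature analysis mentioned above: their magnitudes indicate which blood parameters carry predictive information about the lung compartment.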
The multistep PROTAC (PROteolysis TArgeting Chimeras) degradation process poses challenges for their rational development, as the rate-limiting steps determining PROTAC efficiency remain largely unknown. Moreover, the slow throughput of currently used endpoint assays does not allow the comprehensive analysis of larger series of PROTACs. Here we developed cell-based assays using NanoLuciferase and HaloTags that allow measuring PROTAC-induced degradation as well as ternary complex formation kinetics and stability in cells. Using PROTACs developed for degradation of WDR5, characterization of the mode of action of these PROTACs in the early degradation cascade revealed a key role of ternary complex formation and stability. Comparing a series of ternary complex crystal structures highlighted the importance of an efficient E3-target interface for ternary complex stability. The developed assays outline a strategy for the rational optimization of PROTACs using a series of live-cell assays monitoring key steps of the early PROTAC-induced degradation pathway.
Significance: The multistep PROTAC-induced degradation process of a POI poses a significant challenge for the rational design of these bifunctional small molecules, as critical steps that limit PROTAC efficacy cannot be easily assayed at the required throughput. In addition, the cellular location of the POI may pose further challenges, as some cellular compartments, such as the nucleus, may not be easily reached by PROTAC molecules, and the targeted E3 ligases may not be present in the relevant compartment. We therefore propose a comprehensive assay panel for PROTAC evaluation in cellular environments using a sensor system that allows continuous monitoring of the protein levels of the endogenous POI. We developed a cell line expressing WDR5 from its endogenous locus in fusion with a small sequence tag (HiBiT) that can be reconstituted to functional NanoLuciferase (NLuc). This system allowed continuous monitoring of endogenous WDR5 levels in cells and, together with the HaloTag system, also continuous monitoring of ternary complex (E3, WDR5 and PROTAC) formation. As this assay can be run at high throughput, we used this versatile system to monitor three diverse chemical series of WDR5 PROTACs that markedly differ in their degradation properties. Monitoring cell penetration, binary complex formation (PROTAC-WDR5 and PROTAC-VHL) as well as ternary complex formation, we found that PROTAC efficiency correlated strongly with the synergy of ternary complex formation in cells. This study represents a first dataset examining this property in cellulo for diverse PROTACs, and it outlines a strategy for their rational optimization. It also provides kinetic data on ternary complex assembly and dissociation that may serve as a benchmark for future studies that also utilize kinetic properties for PROTAC development.
Comparative structural studies revealed larger PROTAC mediated interaction surfaces for PROTACs that efficiently formed ternary complexes highlighting the utility of structure based optimization of PROTAC induced ternary complexes in the development process.
Transfer RNA fragments replace microRNA regulators of the cholinergic post-stroke immune blockade
(2020)
Stroke is a leading cause of death and disability. Recovery depends on balance between inflammatory response and immune suppression, which can be CNS-protective but may worsen prognosis by increasing patients’ susceptibility to infections. Peripheral cholinergic blockade of immune reactions fine-tunes this immune response, but its molecular regulators are unknown. Therefore, we sought small RNA balancers of the cholinergic anti-inflammatory pathway in peripheral blood from ischemic stroke patients. Using RNA-sequencing and RT-qPCR, we discovered in patients’ blood on day 2 after stroke a “change of guards” reflected in massive decreases in microRNAs (miRs) and increases in transfer RNA fragments (tRFs) targeting cholinergic transcripts. Electrophoresis-based size-selection followed by RT-qPCR validated the top 6 upregulated tRFs in a separate cohort of stroke patients, and independent small RNA-sequencing datasets presented post-stroke enriched tRFs as originating from lymphocytes and monocytes. In these immune compartments, we found CD14+ monocytes to express the highest amounts of cholinergic transcripts. In-depth analysis of CD14+ regulatory circuits revealed minimally overlapping subsets of transcription factors carrying complementary motifs to miRs or tRFs, indicating different roles for the stroke-perturbed members of these small RNA species. Furthermore, LPS-stimulated murine RAW264.7 cells presented dexamethasone-suppressible upregulation of the top 6 tRFs identified in human patients, indicating an evolutionarily conserved and pharmaceutically treatable tRF response to inflammatory cues. Our findings identify tRF/miR subgroups which may co-modulate the homeostatic response to stroke in patients’ blood and open novel venues for establishing RNA-targeted concepts for post-stroke diagnosis and therapeutics.
In recent decades, the resource of "knowledge" has moved ever more strongly into the centre of interest as a source of scientific innovation. This focus has led to a self-reflection of science and of the scientific disciplines: the main topics are the way in which knowledge is gained, as well as the related question of how scientificity is constructed, whereby attention is simultaneously directed to the increasingly dissolving boundaries between the disciplines, and between the three principal scientific cultures of the natural sciences, the humanities and cultural studies, and the social sciences. Within and outside the universities, "trading zones" (Galison 1997) that cannot always be clearly located have formed and continue to form, in which new forms and techniques of knowledge production and knowledge transfer are tested, practised and in part also institutionalised. ...
The equilibration of hot and dense nuclear matter produced in the central region of central Au+Au collisions at √s = 200A GeV is studied within the microscopic transport model UrQMD. The pressure becomes isotropic at t ≈ 5 fm/c. Within the next 15 fm/c the expansion of the matter proceeds almost isentropically with an entropy per baryon of S/A ≈ 150. During this period the equation of state in the (P, ε)-plane has a very simple form, P = 0.15 ε. Comparison with the statistical model (SM) of an ideal hadron gas reveals that a time of ≈ 20 fm/c may be too short to attain the fully equilibrated state. In particular, the fractions of resonances are overpopulated in comparison with the SM values. The creation of such a long-lived resonance-rich state slows down the relaxation to chemical equilibrium and can be detected experimentally.
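For a linear equation of state of the form quoted above, P = 0.15 ε, the constant slope directly gives the squared speed of sound, c_s² = dP/dε = 0.15, i.e. c_s ≈ 0.39 in units of c. A trivial sketch:

```python
# Linear equation of state extracted from the UrQMD evolution: P = 0.15 * epsilon
# (P and epsilon in the same units, e.g. GeV/fm^3).
def pressure(epsilon):
    return 0.15 * epsilon

# For P = cs2 * epsilon, the squared speed of sound is the constant slope dP/d(epsilon).
cs2 = pressure(1.0) - pressure(0.0)
cs = cs2 ** 0.5  # speed of sound in units of c
```

This is well below the ideal ultrarelativistic-gas value c_s² = 1/3, reflecting the resonance-rich hadronic composition of the matter during this stage.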
Ribosomes catalyze protein synthesis by cycling through various functional states. These states have been extensively characterized in vitro, yet their distribution in actively translating human cells remains elusive. Here, we optimized a cryo-electron tomography-based approach and resolved ribosome structures inside human cells with a local resolution of up to 2.5 angstroms. These structures revealed the distribution of functional states of the elongation cycle, a Z tRNA binding site and the dynamics of ribosome expansion segments. In addition, we visualized structures of Homoharringtonine, a drug for chronic myeloid leukemia treatment, within the active site of the ribosome and found that its binding reshaped the landscape of translation. Overall, our work demonstrates that structural dynamics and drug effects can be assessed at near-atomic detail within human cells.
Mitochondrial matrix peptidase CLPP is crucial during cell stress. Its loss causes Perrault syndrome type 3 (PRLTS3) with infertility, neurodegeneration and growth deficit. Its target proteins are disaggregated by CLPX, which also regulates heme biosynthesis via unfolding ALAS enzyme, providing access of pyridoxal-5’-phosphate (PLP). Despite efforts in diverse organisms with multiple techniques, CLPXP substrates remain controversial. Here, avoiding recombinant overexpression, we employed complexomics in mitochondria from three mouse tissues to identify endogenous targets. CLPP absence caused accumulation and dispersion of CLPX-VWA8 as AAA+ unfoldases, and of PLPBP. Similar changes and CLPX-VWA8 comigration were evident for mitoribosomal central protuberance clusters, translation factors like GFM1-HARS2, RNA granule components LRPPRC-SLIRP, and enzymes OAT-ALDH18A1. Mitochondrially translated proteins in testis showed reductions to <30% for MTCO1-3, misassembly of complex-IV supercomplex, and accumulated metal-binding assembly factors COX15-SFXN4. Indeed, heavy metal levels were increased for iron, molybdenum, cobalt and manganese. RT-qPCR showed compensatory downregulation only for Clpx mRNA, most accumulated proteins appeared transcriptionally upregulated. Immunoblots validated VWA8, MRPL38, MRPL18, GFM1 and OAT accumulation. Coimmunoprecipitation confirmed CLPX binding to MRPL38, GFM1 and OAT, so excess CLPX and PLP may affect their activity. Our data elucidate mechanistically the mitochondrial translation fidelity deficits, which underlie progressive hearing impairment in PRLTS3.
We propose to use hadron number fluctuations in limited momentum regions to study the evolution of initial flows in high energy nuclear collisions. In this method, by a proper preparation of the collision sample, the projectile and target initial flows are marked by fluctuations in the number of colliding nucleons. We discuss three limiting cases of the evolution of the flows, transparency, mixing and reflection, and present quantitative predictions for them obtained within several models. Finally, we apply the method to the NA49 results on fluctuations of the negatively charged hadron multiplicity in Pb+Pb interactions at 158A GeV and conclude that the data favor a hydrodynamical model with a significant degree of mixing of the initial flows at the early stage of the collision.
Dilepton spectra are calculated within the microscopic transport model UrQMD and compared to data from the CERES experiment. The invariant mass spectra in the region between 300 MeV and 600 MeV depend strongly on the mass dependence of the rho meson decay width, which is not sufficiently determined by the Vector Meson Dominance model. A consistent explanation of both the recent Pb+Au data and the proton-induced data can be given without additional medium effects.
The pion source as seen through HBT correlations at RHIC energies is investigated within the UrQMD approach. We find that the calculated transverse momentum, centrality, and system size dependence of the Pratt-HBT radii R_L and R_S agree reasonably well with experimental data. The predicted R_O values in central heavy ion collisions are larger than the experimental data. The corresponding quantity √(R_O² − R_S²) of the pion emission source is somewhat larger than experimental estimates.
Based on the UrQMD transport model, the transverse momentum and rapidity dependence of the Hanbury-Brown-Twiss (HBT) radii R_L, R_O, R_S as well as the cross term R_OL at SPS energies are investigated and compared with the experimental NA49 and CERES data. The rapidity dependence of R_L, R_O and R_S is weak, while R_OL increases significantly at large rapidities and small transverse momenta. The HBT "life-time" issue (the phenomenon that the calculated √(R_O² − R_S²) value is larger than the correspondingly extracted experimental data) is also present at SPS energies.
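In the standard Pratt-Bertsch interpretation, the quantity √(R_O² − R_S²) is related to the emission duration of the pion source via Δτ ≈ √(R_O² − R_S²)/β_T, where β_T is the transverse pair velocity; this is why the difference between out and side radii is read as a "life-time" signal. A small sketch with purely illustrative radii (not fitted values from the data):

```python
import math

def emission_duration(r_out, r_side, beta_t):
    """Pratt-Bertsch estimate of the source emission duration (in fm/c):
    Delta-tau ~ sqrt(R_O^2 - R_S^2) / beta_T, valid for R_O >= R_S."""
    return math.sqrt(r_out**2 - r_side**2) / beta_t

# Illustrative inputs: R_O = 6 fm, R_S = 5 fm, beta_T = 0.5.
dtau = emission_duration(6.0, 5.0, 0.5)
```

A model that predicts a larger √(R_O² − R_S²) than measured thus overestimates the duration of pion emission, which is the puzzle referred to above.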
We compare multiplicities as well as rapidity and transverse momentum distributions of protons, pions and kaons calculated within presently available transport approaches for heavy ion collisions around 1 AGeV. For this purpose, three reactions have been selected: Au+Au at 1 and 1.48 AGeV and Ni+Ni at 1.93 AGeV.
Background Transposable elements (TEs) are an important source of genome plasticity across the tree of life. Accumulating evidence suggests that TEs may not be randomly distributed in the genome. Drift and natural selection are important forces shaping TE distribution and accumulation, acting directly on the TE element or indirectly on the host species. Fungi, with their multifaceted phenotypic diversity and relatively small genome size, are ideal models to study the role of TEs in genome evolution and their impact on the host’s ecological and life history traits. Here we present an account of all TEs found in a high-quality reference genome of the lichen-forming fungus Umbilicaria pustulata, a macrolichen species comprising two climatic ecotypes: Mediterranean and cold-temperate. We trace the occurrence of the newly identified TEs in populations along three replicated elevation gradients using a Pool-Seq approach, to identify TE insertions of potential adaptive significance.
Results We found that TEs cover 21.26% of the 32.9 Mbp genome, with the LTR Gypsy and Copia clades being the most common TEs. Out of a total of 182 TE copies, we identified 28 insertions displaying consistent insertion frequency differences between the two host ecotypes across the elevation gradients. Most of the highly differentiated insertions were located near genes, indicating a putative functional role.
Conclusions This pioneering study into the content and climate niche-specific distribution of TEs in a lichen-forming fungus contributes to understanding the roles of TEs in fungal evolution. Particularly, it may serve as a foundation for assessing the impact of TE dynamics on fungal adaptation to the abiotic environment, and the impact of TE activity on the evolution and maintenance of a symbiotic lifestyle.
Transverse activity of kaons and the deconfinement phase transition in nucleus-nucleus collisions
(2003)
We found that the experimental results on transverse mass spectra of kaons produced in central Pb+Pb (Au+Au) interactions show an anomalous dependence on the collision energy. The inverse slopes of the spectra increase with energy in the low (AGS) and high (RHIC) energy domains, whereas they are constant in the intermediate (SPS) energy range. We argue that this anomaly is probably caused by a modification of the equation of state in the transition region between confined and deconfined matter. This observation may be considered as a new signal, in addition to the previously reported anomalies in the pion and strangeness production, of the onset of deconfinement located in the low SPS energy domain.
The production of prompt charmed mesons D0, D+ and D∗+, and their antiparticles, was measured with the ALICE detector in Pb-Pb collisions at the centre-of-mass energy per nucleon pair, √sNN, of 2.76 TeV. The production yields for rapidity |y|<0.5 are presented as a function of transverse momentum, pT, in the interval 1-36 GeV/c for the centrality class 0-10% and in the interval 1-16 GeV/c for the centrality class 30-50%. The nuclear modification factor RAA was computed using a proton-proton reference at √s = 2.76 TeV, based on measurements at √s = 7 TeV and on theoretical calculations. A maximum suppression by a factor of 5-6 with respect to binary-scaled pp yields is observed for the most central collisions at pT of about 10 GeV/c. A suppression by a factor of about 2-3 persists at the highest pT covered by the measurements. At low pT (1-3 GeV/c), the RAA has large uncertainties that span the range 0.35 (factor of about 3 suppression) to 1 (no suppression). In all pT intervals, the RAA is larger in the 30-50% centrality class compared to central collisions. The D-meson RAA is also compared with that of charged pions and, at large pT, charged hadrons, and with model calculations.
Results are presented on event-by-event fluctuations in transverse momentum of charged particles, produced at forward rapidities in p+p, C+C, Si+Si and Pb+Pb collisions at 158 AGeV. Three different characteristics are discussed: the average transverse momentum of the event, the Phi_pT fluctuation measure and two-particle transverse momentum correlations. In the kinematic region explored, the dynamical fluctuations are found to be small. However, a significant system size dependence of Phi_pT is observed, with the largest value measured in peripheral Pb+Pb interactions. The data are compared with predictions of several models. PACS numbers: 14.20.Jn, 13.75.Cs, 12.39.-x