One of the key challenges for nuclear physics today is to understand from first principles the effective interaction between hadrons with different quark content. First successes have been achieved using techniques that solve the dynamics of quarks and gluons on discrete space-time lattices. Experimentally, the dynamics of the strong interaction have been studied by scattering hadrons off each other. Such scattering experiments are difficult or impossible for unstable hadrons and so high-quality measurements exist only for hadrons containing up and down quarks. Here we demonstrate that measuring correlations in the momentum space between hadron pairs produced in ultrarelativistic proton-proton collisions at the CERN Large Hadron Collider (LHC) provides a precise method with which to obtain the missing information on the interaction dynamics between any pair of unstable hadrons. Specifically, we discuss the case of the interaction of baryons containing strange quarks (hyperons). We demonstrate how, using precision measurements of p-omega baryon correlations, the effect of the strong interaction for this hadron-hadron pair can be studied with precision similar to, and compared with, predictions from lattice calculations. The large number of hyperons identified in proton-proton collisions at the LHC, together with an accurate modelling of the small (approximately one femtometre) inter-particle distance and exact predictions for the correlation functions, enables a detailed determination of the short-range part of the nucleon-hyperon interaction.
Results are presented from a search for the decays D0 → K−π+ and D̄0 → K+π− in a sample of 3.8×10^6 central Pb-Pb events collected at a beam energy of 158A GeV by NA49 at the CERN SPS. No signal is observed. An upper limit on D0 production is derived and compared to predictions from several models.
UrQMD at RHIC energies
(1999)
We adopt the approach of Markert and Nissim (2005) of using the World Wide Web to resolve cases of coreferent bridging for German, and discuss the strengths and weaknesses of this approach. As the general approach of using surface patterns to obtain information on ontological relations between lexical items has so far only been tried on English, it is also interesting to see whether the approach works as well for German as it does for English, and which differences between these languages need to be accounted for. We also present a novel approach for combining several patterns that yields an ensemble which outperforms the best-performing single patterns in terms of both precision and recall.
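The pattern-combination idea can be sketched as follows; the patterns, hit counts, and lookup table below are hypothetical stand-ins for real web queries, not the authors' implementation:

```python
import math

# Hypothetical surface patterns ("X wie Y" = "X such as Y", etc.) and
# hit counts standing in for real web search queries; illustrative only.
FAKE_INDEX = {
    ("Stadt", "Berlin"): {"X wie Y": 120, "Y und andere X": 45, "X, insbesondere Y": 12},
    ("Stadt", "Tisch"): {"X wie Y": 2, "Y und andere X": 0, "X, insbesondere Y": 0},
}

def pattern_counts(head: str, candidate: str) -> dict:
    """Look up hit counts for each pattern instantiated with (head, candidate)."""
    return FAKE_INDEX.get((head, candidate), {})

def ensemble_score(head: str, candidate: str, weights: dict) -> float:
    """Combine log-scaled pattern evidence via a weighted vote."""
    counts = pattern_counts(head, candidate)
    return sum(w * math.log1p(counts.get(p, 0)) for p, w in weights.items())

weights = {"X wie Y": 1.0, "Y und andere X": 0.8, "X, insbesondere Y": 0.6}
# "Berlin" is a plausible instance of "Stadt" (city); "Tisch" (table) is not.
print(ensemble_score("Stadt", "Berlin", weights) > ensemble_score("Stadt", "Tisch", weights))  # → True
```

A weighted vote over several patterns smooths over the failure modes of any single pattern, which is how an ensemble can beat the best single pattern on both precision and recall.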
Oncogenic transformation of lung epithelial cells is a multi-step process, frequently starting with the inactivation of tumor suppressors and subsequent activating mutations in proto-oncogenes, such as members of the PI3K or MAPK family. Cells undergoing transformation have to adjust to changes such as altered metabolic requirements. This is achieved, in part, by modulating the protein abundance of transcription factors, which manifest these adjustments. Here, we report that the deubiquitylase USP28 enables oncogenic reprogramming by regulating the protein abundance of proto-oncogenes, such as c-JUN, c-MYC, NOTCH and ΔNP63, at early stages of malignant transformation. USP28 levels are increased in cancer cells compared to normal cells due to a feed-forward loop driven by increased amounts of oncogenic transcription factors such as c-MYC and c-JUN. Irrespective of the oncogenic driver, interference with USP28 abundance or activity suppresses growth and survival of transformed lung cells. Furthermore, inhibition of USP28 via a small-molecule inhibitor resets the proteome of transformed cells towards a 'pre-malignant' state, and USP28 inhibition cooperates with clinically established compounds used to target EGFR(L858R)-, BRAF(V600E)- or PI3K(H1047R)-driven tumor cells. Targeting USP28 abundance at an early stage via inhibition of its activity is therefore a feasible strategy for the treatment of early-stage lung tumors, and the observed synergism with current standard-of-care inhibitors holds potential for improved targeting of established tumors.
The establishment and maintenance of protected areas (PAs) is viewed as a key action in delivering post-2020 biodiversity targets. PAs often need to meet multiple objectives, ranging from biodiversity protection to ecosystem service provision and climate change mitigation, but available land and conservation funding is limited. Therefore, optimizing resources by selecting the most beneficial PAs is vital. Here, we advocate for a flexible and transparent approach to selecting protected areas based on multiple objectives, and illustrate this with a decision support tool on a global scale. The tool allows weighting and prioritization of different conservation objectives according to user-specified preferences, as well as real-time comparison of the selected areas that result from such different priorities. We apply the tool across 1347 terrestrial PAs and highlight frequent trade-offs among different objectives, e.g., between species protection and ecosystem integrity. Outputs indicate that decision makers frequently face trade-offs among conflicting objectives. Nevertheless, we show that transparent decision-support tools can reveal synergies and trade-offs associated with PA selection, thereby helping to illuminate and resolve land-use conflicts embedded in divergent societal and political demands and values.
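The weighting-and-ranking core of such a decision-support tool can be illustrated with a toy sketch; the PAs, objective scores, and weights below are hypothetical (the real tool operates on 1347 PAs and much richer data):

```python
# Toy sketch of multi-objective prioritization of protected areas (PAs):
# each PA carries normalized scores per conservation objective, a user
# supplies weights, and PAs are ranked by the weighted sum. Data are
# illustrative only.
pas = {
    "PA_A": {"species": 0.9, "integrity": 0.2, "carbon": 0.5},
    "PA_B": {"species": 0.4, "integrity": 0.8, "carbon": 0.6},
    "PA_C": {"species": 0.6, "integrity": 0.6, "carbon": 0.3},
}

def rank(pas, weights):
    total = sum(weights.values())
    norm = {k: v / total for k, v in weights.items()}  # normalize weights to sum to 1
    scored = {name: sum(norm[obj] * s for obj, s in scores.items())
              for name, scores in pas.items()}
    return sorted(scored, key=scored.get, reverse=True)

# Prioritizing species protection vs. ecosystem integrity flips the ranking,
# illustrating the trade-offs among objectives described in the abstract.
print(rank(pas, {"species": 3, "integrity": 1, "carbon": 1}))  # → ['PA_A', 'PA_C', 'PA_B']
print(rank(pas, {"species": 1, "integrity": 3, "carbon": 1}))  # → ['PA_B', 'PA_C', 'PA_A']
```

Exposing the weights directly to the user, and recomputing the ranking in real time, is what makes the trade-offs between objectives transparent rather than hidden inside a fixed score.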
In this paper, we argue that difficulties in the definition of coreference itself contribute to lower inter-annotator agreement in certain cases. Data from a large referentially annotated corpus serves to corroborate this point, using a quantitative investigation to assess which effects or problems are likely to be the most prominent. Several examples where such problems occur are discussed in more detail. We then propose a generalisation of Poesio, Reyle and Stevenson's Justified Sloppiness Hypothesis to provide a unified model for these cases of disagreement, and argue that a deeper understanding of the phenomena involved makes it possible to tackle problematic cases in a more principled fashion than would be possible using only pre-theoretic intuitions.
We present results of hard X-ray angle-resolved photoemission spectroscopy and photoemission diffraction measurements performed on high-quality single crystals of the valence transition compound EuPd2Si2 for temperatures 25 K ≤ T ≤ 300 K. At low temperatures we observe a Eu 4f valence v = 2.5 (occupation number n = 6.5), which decreases to v = 2.1 for temperatures above the valence transition around TV ≈ 160 K. The experimental valence numbers, resulting from an evaluation of the Eu(III)/Eu(II) 3d core levels, are used for calculating band structures using density functional theory. The valence transition significantly changes the band structure as determined by angle-resolved photoemission spectroscopy. In particular, the Eu 5d valence bands are shifted to lower binding energies with increasing Eu 4f occupancy. To a lesser extent, bands derived from the Si 3p and Pd 4d orbitals are also affected. This observation suggests a partial charge transfer between Eu and Pd/Si sites. Comparison with ab initio theory shows good agreement with experiment, in particular concerning the unequal band shift with increasing Eu 4f occupancy.
Conventional cluster and virial expansions are generalized to momentum-dependent interparticle potentials. The model with Lorentz-contracted hard-core potentials is considered, e.g., as a hadron gas model. A Van der Waals-type model with a temperature-dependent excluded volume is derived. Lorentz contraction effects at a given temperature are stronger for light particles and make their effective excluded volume smaller than that of heavy ones.
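The structure of the resulting equation of state can be sketched in the standard excluded-volume Van der Waals form (a schematic sketch, not the paper's exact expressions):

```latex
% Standard excluded-volume (Van der Waals-type) pressure of a one-component gas
% with number density n and constant excluded volume v:
p(T,n) \;=\; \frac{n\,T}{1 - v\,n}
% The generalization described above replaces the constant v by a
% temperature-dependent excluded volume v(T) of Lorentz-contracted hard cores;
% at a given T, v(T) is smaller for light particles than for heavy ones:
p(T,n) \;=\; \frac{n\,T}{1 - v(T)\,n}
```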
In German, language choice and language perception are indelibly shaped by knowledge of a standard language. For most speakers, this knowledge rests on the experience that in school some linguistic forms are judged correct and others wrong, and also on the fact that the rules of the standard are codified in dictionaries and grammars. Knowledge and acceptance of this standard persist regardless of the fact that none of these codifications is uncontested, that many speakers do not know the rules precisely, and that the people recognized as models (news anchors, journalists of certain periodicals, teachers, writers, among others) by no means follow uniform rules. The standard is firmly associated with the experience of a legitimate regularity, that is, with order. The use of nonstandard forms is perceived in relation to this order and as distinct from it. This relational view of things is both subjective and intersubjective.
In German-speaking Switzerland, spoken dialects and the written standard language stand in opposition. Except in formal situations, dialect is spoken, and until recently dialect was rarely written; instead, the High German written language was used. Chat communication, on the one hand, exhibits essential features of orality through its non-delayed, quasi-direct communication, which, together with the informality of chat, promotes dialect use. On the other hand, the medium is still writing, the domain of the standard language. Dialect and standard language thus compete directly in chat rooms. The following contribution analyzes, quantitatively and qualitatively, the coexistence and interplay of the two varieties in Swiss chat rooms and examines the occurrence and conditions of code alternation and code switching.
Background Vasoplegic syndrome is frequently observed during cardiac surgery and resembles a complication of high mortality and morbidity. There is a clinical need for therapy and prevention of vasoplegic syndrome during complex cardiac surgical procedures. Therefore, we investigated different strategies in a porcine model of vasoplegia.
Methods We evaluated new medical therapies and prophylaxis to avoid vasoplegic syndrome in a porcine model. After induction of anesthesia, cardiopulmonary bypass was established through median sternotomy and central cannulation. Prolonged aortic cross-clamping (120 min) simulated a complex surgical procedure. The influence of sevoflurane-guided anesthesia (sevoflurane group) and the administration of glibenclamide (glibenclamide group) were compared to a control group, which received standard anesthesia using propofol. Online hemodynamic assessment was performed using PiCCO® measurements. In addition, blood and tissue samples were taken to evaluate hemodynamic effects and the degree of inflammatory response.
Results Glibenclamide interrupted early vasoplegic syndrome, raising blood pressure and systemic vascular resistance and reducing the required norepinephrine doses. Sevoflurane reduced the occurrence of vasoplegic syndrome, reflected in more stable blood pressure and a lower need for norepinephrine.
Conclusion Glibenclamide could serve as a potent drug to reduce the effects of vasoplegic syndrome. Sevoflurane anesthesia during cardiopulmonary bypass is associated with a lower occurrence of vasoplegic syndrome and could therefore be used to prevent it in high-risk patients.
Clinical Perspective: What is new?
* To our knowledge, this is the first randomized in vivo study evaluating the hemodynamic effects of glibenclamide after the onset of vasoplegic syndrome.
* Furthermore, according to a literature search, there is no study showing the effect of sevoflurane-guided anesthesia on the occurrence of vasoplegic syndrome.
Clinical Perspective: Clinical implications?
To achieve better outcomes after complex cardiac surgery, there is a need for optimized drug therapy and prevention of vasoplegic syndrome.
Within the federal-state program "Qualitätspakt Lehre", Goethe University Frankfurt successfully obtained funding for the program "Starker Start ins Studium". The Institute of Psychology thus now has the staffing capacity to improve the academic and social integration of new psychology students in the six-semester bachelor's program in psychology. To this end, two obligatory teaching modules, each spanning two semesters, were developed. This contribution describes the overarching teaching concept and illustrates its implementation in psychology as a practical example.
Venturing into, and then deep into, the work of Thomas Bernhard is not exactly like taking a walk, yet the walk is a recurring motif in Bernhard's œuvre (together with that of Handke, Sebald, and Walser, to name only a few walkers in twentieth-century German-language literature). Bernhard's figures walk, march, and run, but in a "direction opposite" to the one indicated by Stifter. Sometimes their paths wind through nature, as when they enter a forest never to return (Gelo, Al limite boschivo, La partita a carte); sometimes they march within the confines of their "house-prison", following the labyrinthine and endless paths of their own minds (La Fornace, Cemento); at other times they move in an urban, metropolitan setting, in Rome in Estinzione (where the walk with the pupil Gambetti retains an Aristotelian, peripatetic aura) or, more often, in Vienna.
Understanding lecturers have less expert knowledge: effects of linguistic accommodation to laypersons
(2012)
Written online communication has become an important working medium for every lecturer interacting with students. In forming judgments about each other, the interaction partners have available only the written text with its lexical and grammatical features. The degree of lexical accommodation to a student's word choice can therefore influence students' evaluation of their lecturers with respect to different personality traits. In the present study, students each rated two lecturers on understanding, conscientiousness, and intellect (IPIP; Goldberg, Johnson, Eber et al., 2006) on the basis of an email exchange. The lecturers' degree of lexical accommodation was varied. Students rated lecturers with colloquial word choice as more understanding and more conscientious, but tended to rate them as less knowledgeable.
Viewpoint effects on object recognition interact with object-scene consistency effects. While recognition of objects seen from "accidental" viewpoints (e.g., a cup from below) is typically impeded compared to processing of objects seen from canonical viewpoints (e.g., the string side of a guitar), this effect is reduced by meaningful scene context information. In the present study we investigated whether these findings, established using photographic images, generalize to 3D models of objects. Using 3D models further allowed us to probe a broad range of viewpoints and to establish accidental and canonical viewpoints empirically. In Experiment 1, we presented 3D models of objects from six different viewpoints (0°, 60°, 120°, 180°, 240°, 300°) in colour (1a) and in grayscale (1b) in a sequential matching task. Viewpoint had a significant effect on accuracy and response times. Based on performance in Experiments 1a and 1b, we determined canonical (0° rotation) and non-canonical (120° rotation) viewpoints for the stimuli. In Experiment 2, participants again performed a sequential matching task, but now the objects were paired with scene backgrounds that were either consistent (e.g., a cup in the kitchen) or inconsistent (e.g., a guitar in the bathroom) with the object. Viewpoint interacted significantly with scene consistency: object recognition was less affected by viewpoint when consistent scene information was provided than when inconsistent information was provided. Our results show that viewpoint dependence and scene context effects generalize to depth-rotated 3D objects. This supports the important role object-scene processing plays in object constancy.
Virtual learning in groups: role-plays and online discussions and the significance of learner types
(2000)
The aim of this contribution is to evaluate experiences from a virtual tutorial in which group work and role-plays in chats played a special role. First, the learning environment is presented; it was realized on the basis of standardized internet services and focused above all on cooperative learning through application and practice. This evaluation of the courses, whose analysis finally relates learner types to media-use preferences, is intended to contribute to the further development and design of internet-based courses.
The brain adapts to the sensory environment. For example, simple sensory exposure can modify the response properties of early sensory neurons. How these changes affect the overall encoding and maintenance of stimulus information across neuronal populations remains unclear. We perform parallel recordings in the primary visual cortex of anesthetized cats and find that brief, repetitive exposure to structured visual stimuli enhances stimulus encoding by decreasing the selectivity and increasing the range of the neuronal responses that persist after stimulus presentation. Low-dimensional projection methods and simple classifiers demonstrate that visual exposure increases the segregation of persistent neuronal population responses into stimulus-specific clusters. These observed refinements preserve the representational details required for stimulus reconstruction and are detectable in post-exposure spontaneous activity. Assuming response facilitation and recurrent network interactions as the core mechanisms underlying stimulus persistence, we show that the exposure-driven segregation of stimulus responses can arise through strictly local plasticity mechanisms, also in the absence of firing rate changes. Our findings provide evidence for the existence of an automatic, unguided optimization process that enhances the encoding power of neuronal populations in early visual cortex, thus potentially benefiting simple readouts at higher stages of visual processing.
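A generic version of the "low-dimensional projection plus simple classifier" analysis reads roughly as follows; the data here are synthetic and the pipeline is simplified relative to the study's:

```python
# Sketch: project population response vectors to a low-dimensional space via
# PCA and test stimulus separability with a nearest-centroid classifier.
# Synthetic data stand in for the recorded population activity.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_neurons, n_stim = 40, 50, 3
centers = rng.normal(0, 1, (n_stim, n_neurons))            # one "template" per stimulus
X = np.vstack([c + 0.3 * rng.normal(0, 1, (n_trials, n_neurons)) for c in centers])
y = np.repeat(np.arange(n_stim), n_trials)                 # stimulus label per trial

# PCA via SVD of the mean-centered data, keeping the top two components
Xc = X - X.mean(0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                                          # 2-D projection

# Nearest-centroid classification in the projected space
centroids = np.stack([Z[y == k].mean(0) for k in range(n_stim)])
pred = np.argmin(((Z[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
accuracy = (pred == y).mean()
print(round(accuracy, 2))  # high accuracy indicates stimulus-specific clustering
```

Comparing this accuracy before and after exposure would quantify the segregation of population responses into stimulus-specific clusters that the abstract describes.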
The development of super-resolution microscopy (SRM) has widened our understanding of biomolecular structure and function in biological materials. Imaging multiple targets within a single area would elucidate their spatial localization relative to the cell matrix and neighboring biomolecules, revealing multi-protein macromolecular structures and their functional co-dependencies. SRM methods are, however, limited by the number of suitable fluorophores that can be imaged during a single acquisition, as well as by the loss of antigens during antibody washing and restaining for organic dye multiplexing. We report the visualization of multiple protein targets within the pre- and postsynapse in 350-400 nm thick neuronal tissue sections using DNA-assisted single-molecule localization microscopy. Using antibodies labeled with short DNA oligonucleotides, multiple targets are visualized successively by sequential exchange of fluorophore-labeled complementary oligonucleotides present in the imaging buffer. The structural integrity of the tissue is maintained owing to the single labeling step during sample preparation. Multiple targets are imaged using a single laser wavelength, minimizing chromatic aberration. This method proved robust for multi-target imaging in semi-thin tissue sections, paving the way towards structural cell biology with single-molecule super-resolution microscopy.
As Rolf Parr makes clear in his essay 'Liminale und andere Übergänge. Theoretische Modellierung von Grenzzonen, Normalitätsaspekten, Schwellen, Übergängen und Zwischenräumen in Literatur- und Kulturwissenschaft', the theory of intertextuality and intermediality that he advocates, following the work of Michel Foucault and Jürgen Link, is essentially shaped by a moment of border-crossing. Clearly contoured borders give way to thresholds as "spatial-topographical zones of indecision" that simultaneously function as temporal thresholds of memory. Drawing on Foucault, Parr directs his attention not only to discursive limits of the sayable established through mechanisms of exclusion, prohibitions, and the like. He also calls attention to Foucault's early concept of heterotopia, in which Foucault relates the drawing of boundaries to particular spatial structures. Parr's own interest here lies in transferring Foucault's discourse-theoretical work into a theory of interdiscourse, which would have to cross precisely the thresholds between individual discourses. I would like to set a different accent here and work out the significance of threshold experiences in Foucault himself. I concentrate first on the concept of the historical a priori from 'Die Ordnung der Dinge', and then turn to the concept of heterotopia, which in a certain way accompanies and complements the genesis of 'Die Ordnung der Dinge' in the 1960s. Comparing Foucault's thinking on thresholds with Walter Benjamin's will at the same time make it possible to identify the theme of the liminal, in Parr's sense, as a fundamental motif of Foucault's thought.
The W and Z boson production was measured via the muonic decay channel in proton-lead collisions at √sNN = 5.02 TeV at the Large Hadron Collider with the ALICE detector. The measurement covers the backward (−4.46 < ycms < −2.96) and forward (2.03 < ycms < 3.53) rapidity regions, corresponding to the Pb-going and p-going directions, respectively. The Z-boson production cross section, with dimuon invariant mass 60 < mμμ < 120 GeV/c² and muon transverse momentum (pTμ) larger than 20 GeV/c, is measured. The production cross section and charge asymmetry of muons from W-boson decays with pTμ > 10 GeV/c are determined. The results are compared to theoretical calculations both with and without the nuclear modification of the parton distribution functions. The W-boson production is also studied as a function of the collision centrality: the cross section of muons from W-boson decays is found to scale with the average number of binary nucleon-nucleon collisions within uncertainties.
The question of what literature is appears to be not only the most fundamental question facing literary studies, but also its most abysmal. It is fundamental because it asks about the essence of literature and thereby invokes what seems self-evident in any engagement with literature. It is abysmal because even the seemingly most self-evident definitions of literature have so far failed to produce a unified conception of literature's essence. Thus literary studies faces, with the very first question posed to it, an apparently irresolvable dilemma. Asked about the object that belongs to it, and which accordingly ought to account for its legitimacy as a discipline, it remains in the dark.
Weak function word shift
(2004)
The fact that object shift only affects weak pronouns in mainland Scandinavian is seen as an instance of a more general observation that can be made in all Germanic languages: weak function words tend to avoid the edges of larger prosodic domains. This generalisation has been formulated within Optimality Theory in terms of alignment constraints on prosodic structure by Selkirk (1996) in explaining the distribution of prosodically strong and weak forms of English function words, especially modal verbs, prepositions and pronouns. But a purely phonological account fails to integrate the syntactic licensing conditions for object shift in an appropriate way. The standard semantico-syntactic accounts of object shift, on the other hand, fail to explain why it is only weak pronouns that undergo object shift. This paper develops an Optimality-theoretic model of the syntax-phonology interface which is based on the interaction of syntactic and prosodic factors. The account can successfully be applied to further related phenomena in English and German.
The significance of John McDowell's philosophical program, which already undertakes a revolutionary reorientation in theoretical philosophy, can only be fully recognized once its consequences for practical philosophy are also taken into view. Mind and World, it is true, proceeds primarily from dilemmas of epistemology. But McDowell's proposal to abandon the equation of external nature with the meaning-free realm of natural laws, in favor of a conception of reasons in the world, opens up such a novel perspective on the nature of moral judgments that it almost seems as if McDowell's theoretical program had been designed with this gain for practical philosophy in mind.
Abstract Trial-to-trial variability and spontaneous activity of cortical recordings have been suggested to reflect intrinsic noise. This view is currently challenged by mounting evidence for structure in these phenomena: trial-to-trial variability decreases following stimulus onset and can be predicted by previous spontaneous activity. This spontaneous activity is similar in magnitude and structure to evoked activity and can predict decisions. All of the observed neuronal properties described above can be accounted for, at an abstract computational level, by the sampling hypothesis, according to which response variability reflects stimulus uncertainty. However, a mechanistic explanation at the level of neural circuit dynamics is still missing.
In this study, we demonstrate that all of these phenomena can be accounted for by a noise-free self-organizing recurrent neural network model (SORN). It combines spike-timing-dependent plasticity (STDP) and homeostatic mechanisms in a deterministic network of excitatory and inhibitory McCulloch-Pitts neurons. The network self-organizes in response to spatio-temporally varying input sequences.
We find that the key properties of neural variability mentioned above develop in this model as the network learns to perform sampling-like inference. Importantly, the model shows high trial-to-trial variability although it is fully deterministic. This suggests that the trial-to-trial variability in neural recordings may not reflect intrinsic noise. Rather, it may reflect a deterministic approximation of sampling-like learning and inference. The simplicity of the model suggests that these correlates of the sampling theory are canonical properties of recurrent networks that learn with a combination of STDP and homeostatic plasticity mechanisms.
Author Summary Neural recordings seem very noisy. If the exact same stimulus is shown to an animal multiple times, the neural response will vary. In fact, the activity of a single neuron shows many features of a stochastic process. Furthermore, in the absence of a sensory stimulus, cortical spontaneous activity has a magnitude comparable to the activity observed during stimulus presentation. These findings have led to a widespread belief that neural activity is indeed very noisy. However, recent evidence indicates that individual neurons can operate very reliably and that the spontaneous activity in the brain is highly structured, suggesting that much of the noise may in fact be signal. One hypothesis regarding this putative signal is that it reflects a form of probabilistic inference through sampling. Here we show that the key features of neural variability can be accounted for in a completely deterministic network model through self-organization. As the network learns a model of its sensory inputs, the deterministic dynamics give rise to sampling-like inference. Our findings show that the notorious variability in neural recordings does not need to be seen as evidence for a noisy brain. Instead it may reflect sampling-like inference emerging from a self-organized learning process.
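A deterministic SORN-style network of the kind described above can be sketched in a few lines. This toy version uses excitatory threshold units only (the full model also includes inhibitory units) and is an illustration of the model class, not the published implementation:

```python
# Minimal deterministic SORN-like sketch: binary threshold (McCulloch-Pitts)
# units with STDP, synaptic normalization, and intrinsic plasticity
# (threshold homeostasis). Randomness is used only for initialization;
# the dynamics themselves are fully deterministic.
import numpy as np

rng = np.random.default_rng(1)
N = 50                                       # number of excitatory units
W = rng.uniform(0, 1, (N, N)) * (rng.random((N, N)) < 0.1)  # sparse weights
np.fill_diagonal(W, 0)
W /= W.sum(1, keepdims=True) + 1e-12         # synaptic normalization: rows sum to 1
theta = rng.uniform(0, 0.5, N)               # firing thresholds
x = (rng.random(N) < 0.1).astype(float)      # initial binary state

eta_stdp, eta_ip, target_rate = 0.01, 0.01, 0.1

for t in range(200):
    u = np.zeros(N); u[t % 5] = 1.0          # simple repeating input sequence
    x_new = ((W @ x + u - theta) > 0).astype(float)
    # STDP: strengthen pre(t) -> post(t+1) pairs, weaken the reverse order
    W += eta_stdp * (np.outer(x_new, x) - np.outer(x, x_new))
    W = np.clip(W, 0, None); np.fill_diagonal(W, 0)
    W /= W.sum(1, keepdims=True) + 1e-12     # re-normalize after plasticity
    theta += eta_ip * (x_new - target_rate)  # intrinsic plasticity toward target rate
    x = x_new

print(round(float(x.mean()), 2))  # population rate, pulled toward the target by homeostasis
```

Although every step above is deterministic, the interacting plasticity rules produce trajectories that look variable from trial to trial, which is the point the abstract makes.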
White matter abnormalities across different epilepsy syndromes in adults: an ENIGMA Epilepsy study
(2019)
The epilepsies are commonly accompanied by widespread abnormalities in cerebral white matter. ENIGMA-Epilepsy is a large quantitative brain imaging consortium, aggregating data to investigate patterns of neuroimaging abnormalities in common epilepsy syndromes, including temporal lobe epilepsy, extratemporal epilepsy, and genetic generalized epilepsy. Our goal was to rank the most robust white matter microstructural differences across and within syndromes in a multicentre sample of adult epilepsy patients. Diffusion-weighted MRI data were analyzed from 1,069 non-epileptic controls and 1,249 patients: temporal lobe epilepsy with hippocampal sclerosis (N=599), temporal lobe epilepsy with normal MRI (N=275), genetic generalized epilepsy (N=182) and nonlesional extratemporal epilepsy (N=193). A harmonized protocol using tract-based spatial statistics was used to derive skeletonized maps of fractional anisotropy and mean diffusivity for each participant, and fiber tracts were segmented using a diffusion MRI atlas. Data were harmonized to correct for scanner-specific variations in diffusion measures using a batch-effect correction tool (ComBat). Analyses of covariance, adjusting for age and sex, examined differences between each epilepsy syndrome and controls for each white matter tract (Bonferroni corrected at p<0.001). Across “all epilepsies” lower fractional anisotropy was observed in most fiber tracts with small to medium effect sizes, especially in the corpus callosum, cingulum and external capsule. Less robust effects were seen with mean diffusivity. Syndrome-specific fractional anisotropy and mean diffusivity differences were most pronounced in patients with hippocampal sclerosis in the ipsilateral parahippocampal cingulum and external capsule, with smaller effects across most other tracts. 
Those with temporal lobe epilepsy and normal MRI showed a similar pattern of greater ipsilateral than contralateral abnormalities, but less marked than those in patients with hippocampal sclerosis. Patients with generalized and extratemporal epilepsies had pronounced differences in fractional anisotropy in the corpus callosum, corona radiata and external capsule, and in mean diffusivity of the anterior corona radiata. Earlier age of seizure onset and longer disease duration were associated with a greater extent of microstructural abnormalities in patients with hippocampal sclerosis. We demonstrate microstructural abnormalities across major association, commissural, and projection fibers in a large multicentre study of epilepsy. Overall, epilepsy patients showed white matter abnormalities in the corpus callosum, cingulum and external capsule, with differing severity across epilepsy syndromes. These data further define the spectrum of white matter abnormalities in common epilepsy syndromes, yielding new insights into pathological substrates that may be used to guide future therapeutic and genetic studies.
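The per-tract group comparison, an analysis of covariance adjusting for age and sex, can be sketched generically with ordinary least squares on synthetic data (the actual pipeline also includes ComBat harmonization across scanners and Bonferroni correction across tracts):

```python
# Sketch: test a patient-vs-control difference in a diffusion measure
# (e.g. fractional anisotropy, FA) for one tract, adjusting for age and sex,
# via the linear model  FA ~ intercept + group + age + sex.  Synthetic data.
import numpy as np

rng = np.random.default_rng(2)
n = 400
group = rng.integers(0, 2, n)          # 0 = control, 1 = patient
age = rng.uniform(18, 65, n)
sex = rng.integers(0, 2, n)
# Simulated FA with a true adjusted group effect of -0.02
fa = 0.50 - 0.02 * group - 0.001 * age + 0.005 * sex + rng.normal(0, 0.02, n)

X = np.column_stack([np.ones(n), group, age, sex])
beta, res, *_ = np.linalg.lstsq(X, fa, rcond=None)

# Cohen's-d-style effect size for the adjusted group difference
resid = fa - X @ beta
sigma = resid.std(ddof=X.shape[1])
d = beta[1] / sigma
print(round(beta[1], 3), round(d, 2))  # negative values: lower FA in patients
```

Running such a model per tract and ranking tracts by the adjusted effect size is, in essence, how "the most robust white matter differences" are ordered across syndromes.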
In recent years, research in parsing has extended in several new directions. One of these directions is concerned with parsing languages other than English. Treebanks have become available not only for many European languages but also for Arabic, Chinese, and Japanese. However, it has been shown that parsing results on these treebanks depend on the types of treebank annotation used. Another direction in parsing research is the development of dependency parsers. Dependency parsing profits from the non-hierarchical nature of dependency relations, so lexical information can be included in the parsing process in a much more natural way. Machine-learning-based approaches in particular have been very successful. The results achieved by these dependency parsers are very competitive, although comparisons are difficult because of the differences in annotation. For English, the Penn Treebank has been converted to dependencies. For this version, Nivre et al. report an accuracy of 86.3%, as compared to an F-score of 92.1 for Charniak's parser. The Penn Chinese Treebank is also available in both constituent and dependency representations. The best results reported for parsing experiments with this treebank give an F-score of 81.8 for the constituent version and an accuracy of 79.8% for the dependency version. The general trend in comparisons between constituent and dependency parsers is that the dependency parser performs slightly worse than the constituent parser. The only exception occurs for German, where F-scores for constituent parses including grammatical functions range between 51.4 and 75.3, depending on the treebank, NEGRA or TüBa-D/Z. The dependency parser based on a converted version of TüBa-D/Z, in contrast, reached an accuracy of 83.4%, i.e. 12 percentage points better than the best constituent analysis including grammatical functions.
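The difficulty of comparing the scores above comes from the two evaluation regimes: constituent parsers are scored with bracketing F-score, dependency parsers with attachment accuracy. A minimal sketch on a toy sentence (the span and head representations are illustrative, not a standard evaluation tool):

```python
# Why constituent and dependency results are hard to compare:
# bracketing F1 vs. attachment accuracy, on toy gold/predicted analyses.

def f_score(gold_spans, pred_spans):
    """Bracketing F1 over labelled constituent spans."""
    gold, pred = set(gold_spans), set(pred_spans)
    tp = len(gold & pred)
    precision = tp / len(pred)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

def attachment_accuracy(gold_heads, pred_heads):
    """Fraction of tokens whose predicted head matches the gold head."""
    correct = sum(g == p for g, p in zip(gold_heads, pred_heads))
    return correct / len(gold_heads)

# Toy sentence of 5 tokens: spans are (label, start, end)
gold_spans = {("NP", 0, 2), ("VP", 2, 5), ("S", 0, 5)}
pred_spans = {("NP", 0, 2), ("VP", 3, 5), ("S", 0, 5)}
gold_heads = [2, 0, -1, 2, 2]   # -1 marks the root
pred_heads = [2, 0, -1, 2, 4]

print(f"bracketing F1: {f_score(gold_spans, pred_spans):.3f}")        # 0.667
print(f"attachment accuracy: "
      f"{attachment_accuracy(gold_heads, pred_heads):.3f}")           # 0.800
```

One wrong span and one wrong head give numerically different penalties under the two metrics, which is why a percentage-point gap between an F-score and an accuracy figure is only indicative.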
Scanning tunneling microscopy (STM) is perhaps the most promising way to detect the superconducting gap size and structure in the canonical unconventional superconductor Sr2RuO4 directly. However, in many cases, researchers have reported being unable to detect the gap at all in simple STM conductance measurements. Recently, an investigation of this issue on various local topographic structures on a Sr-terminated surface found that superconducting spectra appeared only in the region of small nanoscale canyons, corresponding to the removal of one RuO surface layer. Here, we analyze the electronic structure of various possible surface structures using first principles methods, and argue that bulk conditions favorable for superconductivity can be achieved when removal of the RuO layer suppresses the RuO4 octahedral rotation locally. We further propose alternative terminations, beyond the most frequently reported Sr termination, at which surface superconductivity should be observable.
Several studies suggested that transcription factor (TF) binding to DNA may be impaired or enhanced by DNA methylation. We present MeDeMo, a toolbox for TF motif analysis that combines information about DNA methylation with models capturing intra-motif dependencies. In a large-scale study using ChIP-seq data for 335 TFs, we identify novel TFs that are affected by DNA methylation. Overall, we find that CpG methylation decreases the likelihood of binding for the majority of TFs. For a considerable subset of TFs, we show that intra-motif dependencies are pivotal for accurately modelling the impact of DNA methylation on TF binding.
Wikis in Higher Education Teaching
(2012)
This article gives an overview of usage scenarios for wikis in learning and teaching processes and of their suitability for collaborative knowledge production, while also addressing limitations, prerequisites, and design recommendations. In addition, it documents experiences with various wiki applications at the University of Frankfurt, ranging from accompanying use in seminars to the student-initiated provision of course materials. The aspects elaborated beforehand are taken up again through these examples, and their practical relevance is illustrated.
Chern numbers can be calculated within a frame of vortex fields related to phase conventions of a wave function. In a band protected by gaps the Chern number is equivalent to the total number of flux carrying vortices. In the presence of topological defects like Dirac cones this method becomes problematic, in particular if they lack a well-defined winding number. We develop a scheme to include topological defects into the vortex field frame. A winding number is determined by the behavior of the phase in reciprocal space when encircling the defect's contact point. To address the possible lack of a winding number we utilize a more general concept of winding vectors. We demonstrate the usefulness of this ansatz on Dirac cones generated from bands of the Hofstadter model.
The workshop "Nationale Spezifika und internationale Aspekte in der Wissenschaftsentwicklung – unter besonderer Berücksichtigung der Narratologie" is intended, according to the organizers' invitation, "to offer an opportunity to discuss the conditions and possibilities of integrative approaches to the study of scholarly processes, and to identify and critically examine important factors in the development of scholarship." The production, distribution, and reception of knowledge systems, the organizers write, take place "in different national and international social spaces, which sometimes strongly co-structure both the form and the cognitive content of theories and favour or hinder their acceptance. This becomes particularly clear when one traces processes of theory transfer." In my contribution I want to bring terminological clarity to the concept of knowledge transfer invoked here. To this end, I will first offer some terminological reflections on the status of the component concepts from which the term is composed (I.), then observe how the term is used in various disciplinary contexts (II.), and finally make a proposal for a differentiated use of the term as an analytical category for the development of scholarship (III.).
The drifts of words in public spaces are manifold. New word developments attest to different interests: "chillen" and "dissen" to different ones than "share-holder-value", which first appeared in the conservative Züricher Zeitung. In what follows, a meaning-oriented generalization is undertaken that connects the actions of speakers with the structural level. Changes in usage are to be related to structural social and linguistic conditions. How are innovations and changes in the conditions of use of words perceived against the background of knowledge of the traditional standard language and its social function? What functions do neologisms have in contrast to this standard?
W±-boson production in p–Pb collisions at √sNN = 8.16 TeV and Pb–Pb collisions at √sNN = 5.02 TeV
(2022)
The production of the W± bosons measured in p−Pb collisions at a centre-of-mass energy per nucleon−nucleon collision of √sNN = 8.16 TeV and Pb−Pb collisions at √sNN = 5.02 TeV with ALICE at the LHC is presented. The W± bosons are measured via their muonic decay channel, with the muon reconstructed in the pseudorapidity region −4 < η^μ_lab < −2.5 with transverse momentum p^μ_T > 10 GeV/c. While in Pb−Pb collisions the measurements are performed in the forward (2.5 < y^μ_cms < 4) rapidity region, in p−Pb collisions, where the centre-of-mass frame is boosted with respect to the laboratory frame, the measurements are performed in the backward (−4.46 < y^μ_cms < −2.96) and forward (2.03 < y^μ_cms < 3.53) rapidity regions. The W− and W+ production cross sections, lepton-charge asymmetry, and nuclear modification factors are evaluated as a function of the muon rapidity. In order to study the production as a function of the p−Pb collision centrality, the production cross sections of the W− and W+ bosons are combined and normalised to the average number of binary nucleon−nucleon collisions ⟨Ncoll⟩. In Pb−Pb collisions, the same measurements are presented as a function of the collision centrality. A study of the binary scaling of the W±-boson cross sections in p−Pb and Pb−Pb collisions is also reported. The results are compared with perturbative QCD (pQCD) calculations, with and without nuclear modifications of the parton distribution functions (PDFs), as well as with available data at the LHC. Significant deviations from the theory expectations are found in the two collision systems, indicating that the measurements can provide additional constraints for the determination of nuclear PDFs (nPDFs), and in particular of the light-quark distributions.
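The lepton-charge asymmetry evaluated above is built from the W+ and W− cross sections. A minimal sketch with uncorrelated-uncertainty propagation; the input numbers below are placeholders, not ALICE results.

```python
# Charge asymmetry A = (s+ - s-) / (s+ + s-) from two cross sections,
# with uncorrelated error propagation. Input values are illustrative.
from math import sqrt

def charge_asymmetry(sig_plus, err_plus, sig_minus, err_minus):
    """Return (A, uncertainty) assuming uncorrelated errors."""
    s = sig_plus + sig_minus
    a = (sig_plus - sig_minus) / s
    # dA/ds+ = 2 s- / s^2 ;  dA/ds- = -2 s+ / s^2
    err = 2.0 / s**2 * sqrt((sig_minus * err_plus) ** 2
                            + (sig_plus * err_minus) ** 2)
    return a, err

a, ea = charge_asymmetry(10.0, 0.5, 6.0, 0.4)
print(f"A = {a:.3f} +/- {ea:.3f}")
```

In practice correlated systematic uncertainties partially cancel in such a ratio, which is one reason the asymmetry is a useful observable for constraining PDFs.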
W±-boson production in p–Pb collisions at √sNN = 8.16 TeV and Pb–Pb collisions at √sNN = 5.02 TeV
(2023)
The production of the W± bosons measured in p−Pb collisions at a centre-of-mass energy per nucleon−nucleon collision of √sNN = 8.16 TeV and Pb−Pb collisions at √sNN = 5.02 TeV with ALICE at the LHC is presented. The W± bosons are measured via their muonic decay channel, with the muon reconstructed in the pseudorapidity region −4 < η^μ_lab < −2.5 with transverse momentum p^μ_T > 10 GeV/c. While in Pb−Pb collisions the measurements are performed in the forward (2.5 < y^μ_cms < 4) rapidity region, in p−Pb collisions, where the centre-of-mass frame is boosted with respect to the laboratory frame, the measurements are performed in the backward (−4.46 < y^μ_cms < −2.96) and forward (2.03 < y^μ_cms < 3.53) rapidity regions. The W− and W+ production cross sections, lepton-charge asymmetry, and nuclear modification factors are evaluated as a function of the muon rapidity. In order to study the production as a function of the p−Pb collision centrality, the production cross sections of the W− and W+ bosons are combined and normalised to the average number of binary nucleon−nucleon collisions ⟨Ncoll⟩. In Pb−Pb collisions, the same measurements are presented as a function of the collision centrality. A study of the binary scaling of the W±-boson cross sections in p−Pb and Pb−Pb collisions is also reported. The results are compared with perturbative QCD (pQCD) calculations, with and without nuclear modifications of the parton distribution functions (PDFs), as well as with available data at the LHC. Significant deviations from the theory expectations are found in the two collision systems, indicating that the measurements can provide additional constraints for the determination of nuclear PDFs (nPDFs), and in particular of the light-quark distributions.
We report measurements of Xi and Xi-bar hyperon absolute yields as a function of rapidity in 158 GeV/c Pb+Pb collisions. At midrapidity, dN/dy = 2.29 +/- 0.12 for Xi, and 0.52 +/- 0.05 for Xi-bar, leading to the ratio of Xi-bar/Xi = 0.23 +/- 0.03. Inverse slope parameters fitted to the measured transverse mass spectra are of the order of 300 MeV near mid-rapidity. The estimated total yield of Xi particles in Pb+Pb central interactions amounts to 7.4 +/- 1.0 per collision. Comparison to Xi production in properly scaled p+p reactions at the same energy reveals a dramatic enhancement (about one order of magnitude) of Xi production in Pb+Pb central collisions over elementary hadron interactions.
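The quoted Xi-bar/Xi ratio follows from the midrapidity yields given above (dN/dy = 2.29 +/- 0.12 for Xi and 0.52 +/- 0.05 for Xi-bar). A quick statistical-only cross-check with standard uncorrelated error propagation (the published uncertainty may also include systematic contributions):

```python
# Cross-check of the Xi-bar/Xi ratio from the quoted dN/dy values,
# propagating the statistical uncertainties in quadrature.
from math import sqrt

def yield_ratio(num, num_err, den, den_err):
    """r = num/den with relative errors added in quadrature."""
    r = num / den
    err = r * sqrt((num_err / num) ** 2 + (den_err / den) ** 2)
    return r, err

r, er = yield_ratio(0.52, 0.05, 2.29, 0.12)
print(f"Xi-bar/Xi = {r:.3f} +/- {er:.3f}")
```

The central value reproduces the quoted 0.23, with a propagated uncertainty consistent with the quoted +/- 0.03.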
Results of the production of Xi and Xi-bar hyperons in central Pb+Pb interactions at 158 GeV/c per nucleon are presented. This analysis utilises a global reconstruction procedure, which allows a measurement of 4pi integrated yields to be made for the first time. Inverse slope parameters, which are determined from an exponential fit to the transverse mass spectra, are shown. Central rapidity densities are found to be 1.49 +/- 0.08 and 0.33 +/- 0.04 per event per unit of rapidity for Xi and Xi-bar respectively. Yields integrated to full phase space are 4.12 +/- 0.02 and 0.77 +/- 0.04 for Xi and Xi-bar. The ratio of Xi-bar/Xi at mid-rapidity is 0.22 +/- 0.03.
In this paper, we introduce an extension of the XMG system (eXtensible MetaGrammar) in order to allow for the description of Multi-Component Tree Adjoining Grammars. In particular, we introduce the XMG formalism and its implementation, and show how the latter makes it possible to extend the system relatively easily to different target formalisms, thus opening the way towards multi-formalism.
Z-boson production in p–Pb collisions at √sNN = 8.16 TeV and Pb–Pb collisions at √sNN = 5.02 TeV
(2021)
Measurement of Z-boson production in p-Pb collisions at √sNN = 8.16 TeV and Pb-Pb collisions at √sNN = 5.02 TeV is reported. It is performed in the dimuon decay channel, through the detection of muons with pseudorapidity −4 < η^μ < −2.5 and transverse momentum p^μ_T > 20 GeV/c in the laboratory frame. The invariant yield and nuclear modification factor are measured for opposite-sign dimuons with invariant mass 60 < mμμ < 120 GeV/c² and rapidity 2.5 < y^μμ_cms < 4. They are presented as a function of rapidity and, for the Pb-Pb collisions, of centrality as well. The results are compared with theoretical calculations, both with and without nuclear modifications to the parton distribution functions (PDFs). In p-Pb collisions the center-of-mass frame is boosted with respect to the laboratory frame, and the measurements cover the backward (−4.46 < y^μμ_cms < −2.96) and forward (2.03 < y^μμ_cms < 3.53) rapidity regions. For the p-Pb collisions, the results are consistent within experimental and theoretical uncertainties with calculations that include both free-nucleon and nuclear-modified PDFs. For the Pb-Pb collisions, a 3.4σ deviation is seen in the integrated yield between the data and calculations based on the free-nucleon PDFs, while good agreement is found once nuclear modifications are considered.
Z-boson production in p–Pb collisions at √sNN = 8.16 TeV and Pb–Pb collisions at √sNN = 5.02 TeV
(2020)
Measurement of Z-boson production in p-Pb collisions at √sNN = 8.16 TeV and Pb-Pb collisions at √sNN = 5.02 TeV is reported. It is performed in the dimuon decay channel, through the detection of muons with pseudorapidity −4 < η^μ < −2.5 and transverse momentum p^μ_T > 20 GeV/c in the laboratory frame. The invariant yield and nuclear modification factor are measured for opposite-sign dimuons with invariant mass 60 < mμμ < 120 GeV/c² and rapidity 2.5 < y^μμ_cms < 4. They are presented as a function of rapidity and, for the Pb-Pb collisions, of centrality as well. The results are compared with theoretical calculations, both with and without nuclear modifications to the parton distribution functions (PDFs). In p-Pb collisions the center-of-mass frame is boosted with respect to the laboratory frame, and the measurements cover the backward (−4.46 < y^μμ_cms < −2.96) and forward (2.03 < y^μμ_cms < 3.53) rapidity regions. For the p-Pb collisions, the results are consistent within experimental and theoretical uncertainties with calculations that include both free-nucleon and nuclear-modified PDFs. For the Pb-Pb collisions, a 3.4σ deviation is seen in the integrated yield between the data and calculations based on the free-nucleon PDFs, while good agreement is found once nuclear modifications are considered.
Interest rate risks and long-term fixed-interest borrowing in light of the Hessian interest rate swaps
(2019)
Johannes Kasinger, Lukas Nöh, and Alfons Weichenrieder take the current low-interest phase and the debate about the use of interest rate swaps in Hesse as an occasion to discuss the maturity structure of government debt and the use of long-term interest rate swaps. The authors emphasize that, in contrast to a private builder, the state does not operate on its own behalf but should act as a trustee of the taxpayers. The state's risks from rising interest rates are mirrored by taxpayers' opportunities from rising interest rates in their role as creditors. The latter weakens the argument for long-term borrowing, whether through the issuance of long-term bonds or through the use of financial derivatives. In principle, however, smoothing the interest burden can help to smooth the taxes necessary for debt service and to reduce the excess burden of taxation.
The statement below was offered to the editor of the Zeitschrift für deutsches Altertum und deutsche Literatur in order to clear up a series of serious misunderstandings that a reviewer (Jürgen Schulz-Grobert) revealed to the scholarly community in his review of the second volume of the Sämtliche Werke of Johann Fischart. The editor of the journal refused to engage in a discussion and declined to print our reply. This is all the more regrettable since the reviewer accused us of a "willingness to discuss [...] [that is] decidedly limited in other crucial questions as well", whatever he may mean by that.
The central thesis of the present essay is that there is no Adam Smith problem in the traditional sense, but that there is indeed a self-contradiction in Adam Smith's economic theory.
The essay first addresses the close systematic connection between Smith's economic and ethical theory. This connection rests on the assumption of a supreme being and of a pre-established harmony inferred from it. The religious trust in a natural order corresponds to the belief in the justice of the market. Smith's further political analysis, however, produces a self-contradiction. Smith shows that the self-interests of entrepreneurs contradict the general interest of society, and that entrepreneurs moreover act more adroitly and more successfully in pursuing their own interests than other market actors. Nevertheless, Smith holds on to the assumption that the market has a harmonizing effect that promotes prosperity for all. In the hands of his epigones, this assumption mutates into an ontological certainty.