Phase transitions in a non-perturbative regime can be studied by ab initio Lattice Field Theory methods. The status and future research directions for LFT investigations of Quantum Chromodynamics under extreme conditions are reviewed, including properties of hadrons and of the hypothesized QCD axion as inferred from QCD topology in different phases. We discuss phase transitions in strong interactions in an extended parameter space, and the possibility of model building for Dark Matter and Electroweak Symmetry Breaking. Methodological challenges are addressed as well, including new developments in Artificial Intelligence geared towards the identification of different phases and transitions.
The annotation of texts and other material in the fields of digital humanities and Natural Language Processing (NLP) is a common task in research projects. At the same time, the annotation of corpora is certainly the most time- and cost-intensive component of such projects and often requires a high level of expertise, depending on the research interest. A wide range of tools is available for the annotation of texts, for both automatic and manual annotation. Since automatic pre-processing methods are not error-free, and since the demand for training data is growing, not least for machine learning, suitable annotation tools are required. This paper defines criteria of flexibility and efficiency for complex annotations in order to assess existing annotation tools. To extend this list of tools, the paper describes TextAnnotator, a browser-based multi-annotation system developed to perform platform-independent multimodal annotations and to annotate complex textual structures. The paper illustrates the current state of development of TextAnnotator and demonstrates its ability to evaluate annotation quality (inter-annotator agreement) at runtime. In addition, it shows how annotations by different users can be performed simultaneously and collaboratively on the same document from different platforms, using UIMA as the basis for annotation.
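A common runtime measure of inter-annotator agreement is Cohen's kappa. The following is a minimal sketch of that computation; the label sequences are invented for illustration and this is not TextAnnotator's actual API:

```python
# Sketch: Cohen's kappa for two annotators' label sequences.
# Labels ("PER", "LOC", "O") are made up for illustration.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two equally long label sequences."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # observed agreement: fraction of positions with identical labels
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # chance agreement: product of the annotators' label frequencies
    counts_a = Counter(labels_a)
    counts_b = Counter(labels_b)
    expected = sum(counts_a[l] * counts_b[l] for l in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["PER", "LOC", "O", "O", "PER", "LOC"]
b = ["PER", "LOC", "O", "PER", "PER", "LOC"]
print(round(cohens_kappa(a, b), 3))  # → 0.75
```

Kappa corrects the raw agreement rate (5/6 here) for the agreement expected by chance, which is why it is preferred over plain accuracy as a quality signal.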
Over the past two decades the “one drug – one target – one disease” concept became the prevalent paradigm in drug discovery. The main idea of this approach is the identification of a single protein target whose inhibition leads to a successful treatment of the disease under examination. The underlying assumption is that highly selective ligands avoid unwanted side effects caused by binding to secondary, non-therapeutic targets. In recent years, results from post-genomic and network biology have shown that proteins rarely act in isolated systems but rather as part of a highly connected network [1]. This connectivity leads to more robust systems that cannot be disrupted by the inhibition of a single target in the network, so that such inhibition might not produce the desired therapeutic effect [2]. Furthermore, studies show that robust systems are affected more by weak inhibition of several parts than by complete inhibition of a single selected element [3]. There is therefore an increasing interest in developing drugs that act on multiple targets simultaneously, which is at the same time a great challenge for medicinal chemists: sufficient activity on each target has to be combined with an adequate pharmacokinetic profile [4]. Early design strategies tried to link the pharmacophores of known inhibitors; however, these methods often led to high molecular weight and low ligand efficiency. We present a new rational approach based on a retrosynthetic combinatorial analysis procedure (RECAP) [5] applied to approved ligands of multiple targets. The RECAP fragments are used to design a large combinatorial library of molecules featuring the chemical properties of each ligand class. The molecules are further validated by machine learning models, such as random forests and self-organizing maps, regarding their activity on the targets of interest.
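As a hedged illustration of the final validation step, a random forest can rank combinatorial candidates by predicted activity. The binary fragment fingerprints and activity labels below are random placeholders, not the RECAP data used in the work above:

```python
# Illustrative sketch only: scoring combinatorial candidates with a
# random forest trained on binary fragment fingerprints.
# Fingerprints and labels are random placeholders, not real assay data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train = rng.integers(0, 2, size=(200, 64))   # 200 ligands, 64-bit fingerprints
y_train = rng.integers(0, 2, size=200)         # active (1) / inactive (0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

candidates = rng.integers(0, 2, size=(10, 64))  # library members to rank
scores = clf.predict_proba(candidates)[:, 1]    # predicted activity probability
ranked = np.argsort(scores)[::-1]               # best candidates first
print(ranked[:3])
```

In practice the same ranking loop would be run once per target of interest, keeping only molecules scored active on all of them.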
Introduction: aims and points of departure. 1. The problem of the knowledge of law: whether previous general rules may support a casuistic decision. 2. The problem of legal ethics: whether there are autonomous rights, which do not depend on positive law. 3. The ways of modern dogmatics to deal with these problems. 4. The question remains the same.
In this paper, an analysis of Robert Frost’s poem Mending Wall is presented as a hermeneutical key to investigate and criticize two examples of the oblivion of the reasonable distinction and the reasonable relationship between ethics and law proposed by a new Brazilian private law movement called Escola do Direito Civil-Constitucional (The Private-Constitutional School of Thought). Those examples of unreasonable relationship between ethics and law are: 1) the right to be loved and 2) the right to get a private education without paying for it.
In his works, Hans Kelsen elaborates several objections to the so-called “doctrine of natural law”, especially in his essay The Natural-Law Doctrine Before the Tribunal of Science. Kelsen argues that natural law theorists, searching for an absolute criterion of justice, try to deduce the rules of human behavior from nature. Robert P. George, in the essay Kelsen and Aquinas on the ‘Natural Law Doctrine’, examines this criticism and concludes that what Kelsen understands as the natural-law doctrine does not include the natural law theory elaborated by Thomas Aquinas. In this paper, we try to corroborate George’s theses and to show how Aquinas’ natural law theory can be vindicated against Kelsen’s criticisms.
This article considers the Brazilian legal system and the requirements of an act performed by the public administration. To do so, it presents six main chapters. The first considers the Brazilian Constitution as regards the form of the State and the legal and judicial systems. The second chapter presents the public administration as established in the Constitution. The requirements of a public administration act are presented in the third chapter. The improbity law, which determines how public administration acts should be performed, is presented in the fourth chapter. How one of the main judicial courts of Brazil has understood this law is the topic of the fifth chapter. The sixth chapter presents a proposal of how Phronesis could be used to solve misunderstandings about improbity in the Brazilian legal system.
The Specialized Information Service Biodiversity Research (BIOfid) has been launched to mobilize valuable biological data from printed literature hidden in German libraries over the past 250 years. In this project, we annotate German texts converted by OCR from historical scientific literature on the biodiversity of plants, birds, moths and butterflies. Our work enables the automatic extraction of biological information previously buried in the mass of papers and volumes. For this purpose, we generated training data for the tasks of Named Entity Recognition (NER) and Taxa Recognition (TR) in biological documents. We use this data to train a number of leading machine learning tools and create a gold standard for TR in biodiversity literature. More specifically, we perform a practical analysis of our newly generated BIOfid dataset through various downstream-task evaluations and establish a new state of the art for TR with an F-score of 80.23%. In this sense, our paper lays the foundations for future work in the field of information extraction from biology texts.
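The F-score behind figures such as the 80.23% reported above is the harmonic mean of precision and recall over predicted versus gold entity spans. A minimal sketch, with invented spans represented as (start, end, type) tuples:

```python
# Sketch: micro precision, recall and F-score over entity spans.
# A span counts as correct only if start, end AND type all match.
def f_score(gold, predicted):
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)                          # exact-match true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = [(0, 2, "TAXON"), (5, 6, "PER"), (9, 12, "TAXON")]
pred = [(0, 2, "TAXON"), (5, 6, "LOC"), (9, 12, "TAXON")]
print(round(f_score(gold, pred), 3))  # second span has the wrong type
```

Note the strict exact-match convention: the mistyped span counts against both precision and recall, which is the usual setup in NER evaluation.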
There is little doubt that Quantum Chromodynamics (QCD) is the theory which describes strong interaction physics. Lattice gauge simulations of QCD predict that in the (μ, T) plane there is a line where a transition from confined hadronic matter to deconfined quarks takes place. The transition is either a crossover (at low μ) or of first order (at high μ). It is the goal of present and future heavy-ion experiments at RHIC and FAIR to study this phase transition at different locations in the (μ, T) plane and to explore the properties of the deconfined phase. It is the purpose of this contribution to discuss some of the observables which are considered useful for this purpose.
Targeting rare observables, the CBM experiment will operate at high interaction rates of up to 10 MHz, unprecedented in heavy-ion experiments so far. This requires a novel free-streaming readout system and a new concept of data processing. The huge data rates of the CBM experiment will be reduced online to a recordable rate before the data are saved to mass storage. Full collision reconstruction and selection will be performed online in a dedicated processor farm. In order to make an efficient event selection online, a clean sample of particles has to be provided by the reconstruction package, called First Level Event Selection (FLES).
The FLES reconstruction and selection package consists of several modules: track finding, track fitting, event building, short-lived particle finding, and event selection. Since detector measurements also contain time information, event building is done at all stages of the reconstruction process. The input data are distributed within the FLES farm in the form of time-slices. A time-slice is reconstructed in parallel across processor cores. After all tracks of the whole time-slice are found and fitted, they are collected into clusters of tracks originating from common primary vertices, which are then fitted, thus identifying the interaction points. Secondary tracks are associated with primary vertices according to their estimated production time. After that, short-lived particles are found and the full event building process is finished. The last stage of the FLES package is the selection of events according to the requested trigger signatures. The event reconstruction procedure and the results of its application to simulated collisions in the CBM detector setup are presented and discussed in detail.
Chromatic, geometric and space charge effects on laser accelerated protons focused by a solenoid
(2011)
We studied numerically the emittance and transmission effects of chromatic and geometric aberrations, with and without space charge, for a proton beam behind a solenoid in the laser proton experiment LIGHT at GSI. The TraceWin code was employed, using a field map for the solenoid and an initial distribution with an exponential energy dependence close to the experiment. The results show a strong effect of chromatic aberrations and a relatively weak one of geometric aberrations, as well as a dependence of proton transmission on the distance from the solenoid. The chromatic effect has an energy-filtering property due to the finite radius of the beam pipe. Furthermore, a relatively modest dependence of transmission on space charge is found for proton production intensities below 10¹¹.
The standard implementation of the HRG model has been shown to be unable to describe all the available data on QCD matter. Here we show the influence of the balance between repulsive and attractive hadronic interactions on QCD thermodynamics through observables both calculated in lattice simulations and measured in experiment. Attractive interactions are mediated by resonance formation, implemented here through extra states predicted by the Quark Model, while repulsive interactions are modelled by means of Excluded Volume (EV) effects. Information on flavour-dependent effective sizes is extracted. It is found that EV effects are present in lattice QCD thermodynamics and are essential for a comprehensive description of higher-order fluctuations of conserved charges.
[Conference report] Making finance sustainable: Ten years equator principles – success or letdown?
(2013)
In 2003, a number of banks adopted the Equator Principles (EPs), a voluntary code of conduct based on the International Finance Corporation’s (IFC) performance standards, to ensure the ecological and social sustainability of project finance. These so-called Equator Principles Financial Institutions (EPFIs) commit to requiring their borrowers to adopt sustainable management plans for the environmental and social risks associated with their projects. The Principles apply to the project finance business segment of the banks and cover projects with a total cost of US $10 million or more. While developing countries long relied on the World Bank and other public assistance to finance infrastructure projects, recent years have seen a shift to private funding. NGOs were frustrated by this shift, as they had spent their resources exerting pressure on public financial institutions to incorporate environmental and social standards into their project finance activities. However, after NGO pressure shifted to private financial institutions, the latter adopted the EPs for fear of reputational risks. NGOs had laid down their own, more ambitious ideas about sustainable finance in the Collevecchio Declaration on Financial Institutions and Sustainability. Legally speaking, the EPs are a self-regulatory soft law instrument. However, they have a hard law dimension, as the Equator Banks require their borrowers to comply with the EPs through covenants in the loan contracts that may trigger a default in case of violation. ...
We show the first results for parton distribution functions within the proton at the physical pion mass, employing the method of quasi-distributions. In particular, we present the matrix elements for the iso-vector combination of the unpolarized, helicity and transversity quasi-distributions, obtained with Nf = 2 twisted mass clover-improved fermions and a proton boosted with a momentum of 0.83 GeV. The momentum smearing technique has been applied to improve the overlap with the boosted proton state. Moreover, we present the renormalized helicity matrix elements in the RI’ scheme, following the non-perturbative renormalization prescription recently developed by our group.
We discuss the current developments by the European Twisted Mass Collaboration in extracting parton distribution functions from the quasi-PDF approach. We concentrate on the non-perturbative renormalization prescription recently developed by us, using the RI′ scheme. We show results for the renormalization functions of matrix elements needed for the computation of quasi-PDFs, including the conversion to the MS scheme, and for renormalized matrix elements. We discuss the systematic effects present in the Z-factors and the possible ways of addressing them in the future.
The paper is structured as follows. Section 2.1 introduces the basic classes of adjectives that constitute the factual core of the paper. Section 2.2 summarizes in greater detail the X° and the XP movement approaches to word order variation within the DP. Section 3 briefly discusses problems for both approaches. Sections 4.1, 5.1, and 5.2 draw from Alexiadou (2001) and contain a discussion of Greek DS and its relevance for a re-analysis of the word order variation in the Romance DP. Section 4.2 introduces refinements to Alexiadou & Wilder (1998) and Alexiadou (2001). Section 5.3 discusses certain issues that arise from the analysis of postnominal adjectives in Romance as involving raising of XPs. Section 6 discusses phenomena found in other languages, which at first sight seem similar to DS. However, I show that double definiteness in e.g. Hebrew, Scandinavian or other Balkan languages constitutes a different type of phenomenon from Greek DS, thus making a distinction between determiners that introduce CPs (Greek) and those that are merely morphological/agreement markers (Hebrew, Scandinavian, Albanian).
Word formation in Distributed Morphology (see Arad 2005, Marantz 2001, Embick 2008): 1. Language has atomic, non-decomposable, elements = roots. 2. Roots combine with the functional vocabulary and build larger elements. 3. Roots are category neutral. They are then categorized by combining with category defining functional heads.
Experimental results and theoretical predictions in laser acceleration of protons have achieved energies of ten to several tens of MeV. The LIGHT project (Laser Ion Generation, Handling and Transport) proposes to use the PHELIX laser-accelerated protons and to provide their transport, focusing and injection into a conventional accelerator. This study demonstrates transport and focusing of laser-accelerated 10 MeV protons by a pulsed 18 T magnetic solenoid. The effect of co-moving electrons on the beam dynamics is investigated. The unique features of the proton distribution, such as small emittances and a high yield of the order of 10¹³ protons per shot, open a new research area. The possibility of creating laser-based injectors for ion accelerators is addressed. With respect to transit energies, direct matching into DTLs seems adequate. The bunch injection into a proposed CH structure is under investigation at IAP Frankfurt. Options and simulation tools are presented.
This paper traces the development of National Socialist cultural and legal policy towards the arts. It examines the role of censure in this development, starting with Hitler's first attempts at power in the Weimar Republic. It then looks more closely into aspects of the development of new policies in and after 1933 and their implementation in the institutions of the totalitarian state. As the paper shows, policies were carried out within a legal framework that included parliament and constitutional law, but they were often also accompanied by aggressive political actions. Racial and nationalistic ideologies were at the heart of the National Socialist discourse about culture. This discourse quickly established modernity as its principal enemy and saw modernist culture (in the broad sense of the word), and especially art criticism, as being under Jewish domination. True German Kultur was set against this; Hitler himself promoted German art both through exhibitions and through policies which included the removal of un-German art and the exclusion of writers and artists who did not conform to the cultural ideal. As Jewish artists and intellectuals in modernist culture posed the greatest threat to the establishment of a new German culture, Nazi policies towards the arts embarked on a process of censure, exclusion and annihilation. The purpose of these policies was nothing less than the elimination of all modernist (Jewish and ‘degenerate’) culture and any memory of it.
The superconducting CH-structure (Crossbar H-mode) is a multi-cell drift tube cavity for the low and medium energy range, operated in the H21-mode, which has been developed at the Institute for Applied Physics (IAP) of Frankfurt University. With respect to different high power applications, two types of superconducting CH-structures (f = 325 MHz, β = 0.16, seven cells; and f = 217 MHz, β = 0.059, 15 cells) are presently under development and construction. The structural mechanical simulation is a very important aspect of the cavity design. Several simulations with ANSYS Workbench have been performed to predict the deformation of the cavity walls due to cavity cool-down, pressure effects and mechanical vibrations. To compensate the fast frequency changes resulting from the deformation of the cavity shape, a new concept for dynamic frequency tuning has been investigated, including a novel type of bellow tuner.
The revolution will be tweeted : how the internet can stimulate the public exercise of freedoms
(2012)
This article discusses how new communication technologies, especially the Internet and, more specifically, social network services, can interfere in social interactions and political relations. The main objective is to problematize the concept of public liberty and to examine how the new technologies can promote the reoccupation of public spaces and the recovery of public life, in opposition to the tendency to valorize the private sphere observed in the second half of the twentieth century. The theoretical benchmark adopted for the investigation is Hannah Arendt's theory about the exercise of fundamental political capacities in order to establish a public space of freedom, as presented in “On Revolution”. The “Praia da Estação” (“Station Beach”) case is chosen to test the hypothesis. In 2010, in the Brazilian city of Belo Horizonte, different individuals articulated a movement through blogs, Twitter and Facebook in order to protest against the Mayor’s act that banned the assembling of cultural events in one of the main public places of the city, the “Praça da Estação” (Station Square). By applying Arendt's concepts to the selected case, it is possible to demonstrate that the Internet can assume an important role against governmental arbitrariness and abuse of power, as it can stimulate the public exercise of fundamental freedoms, such as the freedoms of assembly and manifestation.
The present study focuses on the optimization of the beam line from the heavy-ion synchrotron SIS18 to the HADES experiment. BOBYQA (Bound Optimization BY Quadratic Approximation) solves bound-constrained optimization problems without using derivatives of the objective function. Bayesian optimization is another strategy for the global optimization of costly, noisy functions without using derivatives. A Python programming interface to MADX allows the use of Python implementations of BOBYQA and the Bayesian method. This makes it possible to use tracking simulations with MADX to determine the loss budget for each lattice setting during the optimization and to compare both optimization methods.
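A hedged sketch of this kind of derivative-free, bound-constrained search: SciPy's Powell method stands in for BOBYQA below (both are derivative-free and respect simple bounds), and the "beam loss" objective is an invented stand-in for a MADX tracking run, not the study's actual interface:

```python
# Sketch: derivative-free optimization of two hypothetical quadrupole
# strengths under box bounds. The objective is a toy placeholder for a
# tracking-based loss budget.
import numpy as np
from scipy.optimize import minimize

def beam_loss(k):
    """Placeholder objective: pretend loss as a function of two settings."""
    return (k[0] - 0.3) ** 2 + (k[1] + 0.1) ** 2

result = minimize(beam_loss, x0=[0.0, 0.0], method="Powell",
                  bounds=[(-1.0, 1.0), (-1.0, 1.0)])
print(np.round(result.x, 3))  # converges near (0.3, -0.1)
```

In the real setup each objective evaluation would launch a tracking simulation, which is exactly why derivative-free methods with few function calls (BOBYQA, Bayesian optimization) are attractive.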
Due to the massive parallel operation modes at the GSI accelerators, a lot of accelerator setup and re-adjustment has to be done by operators during a beam time. This is typically done manually using potentiometers and is very time-consuming. With the FAIR project, the complexity of the accelerator facility increases further, and for efficiency reasons it is recommended to establish a high level of automation for future operation. Modern accelerator control systems allow fast access to both accelerator settings and beam diagnostics data. This provides the opportunity to implement algorithms for the automated adjustment of e.g. magnet settings, in order to maximize transmission and optimize required beam parameters. The fast-switching magnets in the GSI beamlines are an optimal basis for an automatic exploration of the parameter space. The optimization of the parameters for the SIS18 multi-turn injection using a genetic algorithm has already been simulated*. The first results of our automated online parameter optimization at the CRYRING@ESR injector are presented here.
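A toy sketch of the genetic-algorithm idea referred to above; the transmission objective and parameter ranges are invented for illustration, whereas the real optimization evaluates live accelerator settings against beam diagnostics:

```python
# Toy genetic algorithm maximizing a transmission-like objective over
# two hypothetical magnet settings in [-1, 1].
import numpy as np

rng = np.random.default_rng(1)

def transmission(x):
    # Invented smooth objective with its optimum at (0.5, -0.2).
    return np.exp(-((x[:, 0] - 0.5) ** 2 + (x[:, 1] + 0.2) ** 2))

pop = rng.uniform(-1, 1, size=(40, 2))
for generation in range(60):
    fitness = transmission(pop)
    parents = pop[np.argsort(fitness)[-20:]]          # selection: keep best half
    children = parents[rng.integers(0, 20, size=20)]  # clone random parents
    children += rng.normal(0, 0.05, size=children.shape)  # Gaussian mutation
    pop = np.vstack([parents, np.clip(children, -1, 1)])  # elitist replacement

best = pop[np.argmax(transmission(pop))]
print(np.round(best, 2))  # near (0.5, -0.2)
```

Keeping the parents unmutated (elitism) guarantees the best setting found so far is never lost between generations, a useful property when each evaluation costs real beam time.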
In this article I advance an account of human rights as individual claims that can be justified within the conceptual framework of social contract theories. The contractarian approach at issue here aims, initially, at a justification of morality at large, and then at the specific domain of morality which contains human rights concepts. The contractarian approach to human rights has to deal with the problem of universality, i.e. how can human rights be ‘universal’? I deal with this problem by examining the relationship between moral dispositions and what I call ‘diffuse legal structure’.
This paper proposes a new approach for encoding images by only a few important components. Classically, this is done by Principal Component Analysis (PCA). Recently, Independent Component Analysis (ICA) has found strong interest in the neural network community. Applied to images, we aim for the most important source patterns with the highest occurrence probability or highest information, called principal independent components (PICs). For the example of a synthetic image composed of characters, this idea selects the salient ones. For natural images it does not lead to an acceptable reproduction error, since no a-priori probabilities can be computed. Combining the traditional principal component criteria of PCA with the independence property of ICA, we obtain a better encoding. It turns out that this definition of PIC implements the classical demand of Shannon’s rate-distortion theory.
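For reference, the classical PCA encoding that the paper starts from can be sketched in a few lines of numpy. Random patches stand in for real image blocks here; the point is only the encode/decode mechanics:

```python
# Sketch: PCA-based encoding of image patches. Project onto the top-k
# principal components, then reconstruct from the k coefficients.
import numpy as np

rng = np.random.default_rng(0)
patches = rng.normal(size=(500, 64))          # 500 flattened 8x8 patches
mean = patches.mean(axis=0)
centered = patches - mean

# Principal components = right singular vectors of the centered data.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 8
codes = centered @ vt[:k].T                   # encode: 64 values -> 8 coefficients
reconstruction = codes @ vt[:k] + mean        # decode

error = np.mean((patches - reconstruction) ** 2)
print(error < np.var(patches))                # error below the raw data variance
```

ICA replaces the orthogonal basis `vt` with statistically independent source directions; the PIC idea in the paper keeps this encode/decode structure but reorders the components by PCA-style importance criteria.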
While the existence of a strongly interacting state of matter, known as “quark-gluon plasma” (QGP), has been established in heavy ion collision experiments in the past decade, the task remains to map out the transition from the hadronic matter to the QGP. This is done by measuring the dependence of key observables (such as particle suppression and elliptic flow) on the collision energy of the heavy ions. This procedure, known as "beam energy scan", has been most recently performed at the Relativistic Heavy Ion Collider (RHIC).
Utilizing a Boltzmann+hydrodynamics hybrid model, we study the collision energy dependence of initial state eccentricities and the final state elliptic and triangular flow. This approach is well suited to investigate the relative importance of hydrodynamics and hadron transport at different collision energies.
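The initial-state eccentricities mentioned above are commonly defined as eps_n = |sum r^2 exp(i n phi)| / sum r^2 over participant positions. A back-of-the-envelope sketch, with a deformed Gaussian standing in for a real initial-state model:

```python
# Sketch: spatial eccentricities eps_2 and eps_3 of a toy overlap zone.
# Participant positions are sampled from an elliptic Gaussian; a real
# calculation would use a Glauber or transport initial state instead.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 3.0, 2000)   # fm, wider in x ...
y = rng.normal(0.0, 2.0, 2000)   # ... than in y: an elliptic overlap zone

r2 = x**2 + y**2
phi = np.arctan2(y, x)

def eccentricity(n):
    return np.abs(np.sum(r2 * np.exp(1j * n * phi))) / np.sum(r2)

print(eccentricity(2), eccentricity(3))
```

For these widths the geometric expectation is eps_2 ≈ (σx² − σy²)/(σx² + σy²) ≈ 0.38, while eps_3 is purely fluctuation-driven and therefore small, mirroring the distinction between elliptic and triangular flow drivers discussed above.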
Challenges of FAIR phase 0
(2018)
After a two-year shutdown, the GSI accelerators, plus the latest addition of the storage ring CRYRING, will be back in operation in 2018 as FAIR phase 0, with the goal of meeting the needs of the scientific community and of FAIR accelerator and detector development. Even though GSI is well known for its operation of a variety of ion beams, ranging from protons up to uranium, for research areas such as nuclear physics, astrophysics, biophysics and materials science, the upcoming beam time faces a number of challenges: re-commissioning the existing circular accelerators with a brand-new control system and upgraded beam instrumentation, as well as the rising failure rate of dated components and systems. The cycling synchrotron SIS18 has been undergoing a set of upgrade measures for future FAIR operation, many of which will also be commissioned during the upcoming beam time. This paper presents highlights of the challenges, such as re-establishing high-intensity heavy-ion operation as well as the parallel operation mode for serving multiple users. The status of the preparations, including commissioning results, is also reported.
In keeping with the views of its guru, Stevan Harnad, the open access movement is only prepared to discuss the two models of the "green road" and the "golden road" as the sole alternatives for the future of scientific publishing. The "golden road" is put forward as the royal road to solving the journals crisis. However, no one has drawn attention to the fact that the golden road represents a purely socialist solution to a free-market problem and thus continues the "samizdat" tradition of underground literature in the former Eastern bloc. The present paper reveals the alarmingly low level at which the open access movement intends to publish top-class results from science and research, and the low degree of professionalism with which they are satisfied.
Akrasia, or weakness of will, is a term denoting the phenomenon of acting freely and intentionally contrary to one's better judgment. The discussion of akrasia originates in Plato's Protagoras, where he states that “No one who either knows or believes that there is another possible course of action, better than the one he is following, will ever continue on his present course”. However, in his influential article from 1970, Donald Davidson argued that akrasia is theoretically possible yet irrational. Other critics of Plato's stance point out that the phenomenon of akrasia is common in our everyday experience, and therefore it must be possible.
These two arguments in favor of the existence of akrasia, theoretical and empirical, will be discussed from both philosophical and psychological points of view. In particular, George Ainslie's argument that akrasia results from hyperbolic discounting will be taken into consideration, to show how it affects traditional thinking about weak-willed actions.
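Ainslie's point can be illustrated numerically: hyperbolic discounting, V = A / (1 + kD) for amount A at delay D, produces a preference reversal between a smaller-sooner and a larger-later reward, which exponential discounting cannot. All amounts, delays and the constant k below are invented for illustration:

```python
# Illustrative numbers only: hyperbolic discounting and the preference
# reversal Ainslie links to akrasia.
def hyperbolic(amount, delay, k=1.0):
    return amount / (1 + k * delay)

small_soon = (50, 1)    # 50 units, 1 day away
large_late = (100, 4)   # 100 units, 4 days away

# Viewed far in advance (add 10 days to both delays): larger reward wins.
far = (hyperbolic(50, 11), hyperbolic(100, 14))
# Viewed at the last moment: the smaller, sooner reward wins.
near = (hyperbolic(50, 1), hyperbolic(100, 4))

print(far[1] > far[0], near[0] > near[1])  # → True True
```

The agent who plans to wait for the larger reward but grabs the smaller one when it is imminent is acting against a judgment they themselves endorsed, which is exactly the structure of a weak-willed action.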
Finally, the paper will discuss how the contemporary notion of akrasia may affect the ideas of responsibility and free will. Implications for the philosophy of law will be shown, among others whether it is possible to claim that a given example of a weak-willed action was indeed free and intentional and that one should be held responsible for its results.
The increase in the volume of litigation observed since the 1990s, in the context of Brazilian society, made the judiciary open itself to new technologies which facilitate access to justice as well as a faster resolution of demands. However, the intense insertion of technical rationalization into the process and decision-making operations of the judiciary in recent years has led to a legalization supported by presuppositions of technical-instrumental regulation. According to the goal policy established by the CNJ, the pressure of instrumental rationality is present "with respect to purposes", demanding, more and more, a mere fulfillment of previously instituted goals from law practitioners. The question is whether the implementation of new technologies to resolve the growing litigation arising from the complexity of societies is enough to adjust the law to a post-conventional platform. If social complexity calls for resources from new technologies, it is not certain that such technologies, on their own, satisfactorily produce a judicial model which, seen through the lens of post-conventional legitimacy and regulation, is adequate for complex societies. This illustrates that a judicial model able to deal with social plurality must take into account not only the rules of instrumental rationality but also the fundamental issues of communicative rationality. The present work intends to evaluate whether the applicability of instrumental rationality in the judiciary equally allows the law to extend the useful conditions of communicative rationality to the consensual formation of will and opinion in the Democratic State of Law.
Delayed-onset muscle soreness (DOMS) is a common symptom in people participating in exercise, sport, or recreational physical activities. Several remedies have been proposed to prevent and alleviate DOMS. In 2008 and 2015, two studies were conducted to investigate the effects of acupuncture on symptoms and muscle function in eccentric-exercise-induced DOMS of the biceps brachii muscle. In 2008, a prospective, randomized, controlled, observer- and subject-blinded trial was undertaken with 22 healthy subjects (22–30 years; 12 females) randomly assigned to three treatment groups: real acupuncture (deep needling at classic acupuncture points and tender points; n = 7), sham acupuncture (superficial needling at non-acupuncture points; n = 8), and control (n = 7). In 2015, a five-arm randomized controlled study was conducted with 60 subjects (22 females, 23.6 ± 2.8 years). Participants were randomly allocated to needle, laser, sham needle, sham laser acupuncture, and no intervention.
In both cases treatment was applied immediately, 24 and 48 hours after DOMS induction.
The outcome measures included pain perception (visual analogue scale; VAS), mechanical pain threshold (MPT), maximum isometric voluntary force (MIVF) and pressure pain threshold (PPT).
Results: In 2008, nonparametric testing showed no significant differences between groups in the outcome measures at baseline. After 72 hours, pain perception (VAS) was significantly lower in the acupuncture group than in the sham acupuncture and control groups. However, the mean MPT and MIVF scores did not differ significantly between groups. This led to the conclusion that acupuncture seemed to have no effect on MPT and muscle function, but reduced perceived pain arising from exercise-induced DOMS.
The more recent results from 2015 indicated that neither verum nor sham interventions significantly improved outcomes within 72 hours when compared with the no treatment control (P > 0.05).
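The between-group comparisons in both studies rely on nonparametric testing. As a hedged illustration only (not the authors' analysis code, and with invented placeholder data), the Kruskal-Wallis H statistic for k independent groups can be computed directly:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic for k independent samples.

    A minimal sketch: ranks are assigned 1..N over the pooled sample and
    no tie correction is applied (real VAS data would contain ties).
    """
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    n_total = len(pooled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] += rank
    h = 0.0
    for g, rank_sum in zip(groups, rank_sums):
        mean_rank = rank_sum / len(g)
        h += len(g) * (mean_rank - (n_total + 1) / 2) ** 2
    return 12.0 / (n_total * (n_total + 1)) * h
```

The resulting H is then compared against a chi-square distribution with k − 1 degrees of freedom; library implementations additionally correct for ties.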
Based on a non-rigorous formalism called the "cavity method", physicists have made intriguing predictions on phase transitions in discrete structures. One of the most remarkable ones is that in problems such as random k-SAT or random graph k-coloring, very shortly before the threshold for the existence of solutions there occurs another phase transition called condensation [Krzakala et al., PNAS 2007]. The existence of this phase transition seems to be intimately related to the difficulty of proving precise results on, e.g., the k-colorability threshold, as well as to the performance of message-passing algorithms. In random graph k-coloring, there is a precise conjecture as to the location of the condensation phase transition in terms of a distributional fixed point problem. In this paper we prove this conjecture, provided that k exceeds a certain constant k0.
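The condensation analysis itself rests on the cavity method and a distributional fixed point, which is far beyond a snippet. As a toy illustration of the underlying objects (all names hypothetical), one can sample an Erdős–Rényi-style random graph and probe k-colorability with a greedy heuristic:

```python
import random

def random_graph(n, p, seed=0):
    """Adjacency sets of a G(n, p) random graph (fixed seed for reproducibility)."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def greedy_k_color(adj, k):
    """Try to k-color the graph greedily, highest-degree vertices first.

    Returns a color map on success, None if the heuristic gets stuck.
    """
    color = {}
    for v in sorted(adj, key=lambda v: -len(adj[v])):
        used = {color[u] for u in adj[v] if u in color}
        free = [c for c in range(k) if c not in used]
        if not free:
            return None
        color[v] = free[0]
    return color
```

Greedy coloring only upper-bounds the chromatic number; locating the actual thresholds requires the far more delicate arguments of the paper.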
This paper aims to re-elaborate questions and discuss them rather than present answers. It starts with the dialogue concerning specific contributions of the philosophy of language to law, followed by the re-elaboration of some as yet unanswered problems, as well as a discussion of possible paths for this issue.
We present results of lattice QCD simulations with mass-degenerate up and down and mass-split strange and charm (Nf = 2+1+1) dynamical quarks using Wilson twisted mass fermions at maximal twist. The tuning of the strange and charm quark masses is performed at three values of the lattice spacing, a ≈ 0.06 fm, a ≈ 0.08 fm and a ≈ 0.09 fm, with lattice sizes ranging from L ≈ 1.9 fm to L ≈ 3.9 fm. We perform a preliminary study of SU(2) chiral perturbation theory by combining our lattice data from these three values of the lattice spacing.
We present first results from runs performed with Nf = 2+1+1 flavours of dynamical twisted mass fermions at maximal twist: a degenerate light doublet and a mass-split heavy doublet. An overview of the input parameters and tuning status of our ensembles is given, together with a comparison with results obtained with Nf = 2 flavours. The problem of extracting the masses of the K and D mesons is discussed, and the tuning of the strange and charm quark masses examined. Finally we compare two methods of extracting the lattice spacing to check the consistency of our data, and we present some first results of chiral perturbation theory (χPT) fits in the light meson sector.
We present the status of runs performed in the twisted mass formalism with Nf = 2+1+1 flavours of dynamical fermions: a degenerate light doublet and a mass-split heavy doublet. The procedure for tuning to maximal twist will be described, as well as the current status of the runs using both thin and stout links. Preliminary results for a few observables obtained on ensembles at maximal twist will be given. Finally, a reweighting procedure to tune to maximal twist will be described.
A CW RFQ prototype
(2011)
A short RFQ prototype was built for RF tests of high-power RFQ structures. We will study thermal effects and determine critical points of the design. RF simulations with CST Microwave Studio and measurements were done. The CW tests with 20 kW/m RF power and simulations of thermal effects with ALGOR were finished successfully. The optimization of some details of the RF design is now in focus. First results and the status of the project will be presented.
Beam measurements with the new RFQ beam matching section at the Frankfurt Funneling Experiment
(2011)
Funneling is a method to increase low-energy beam currents in multiple stages. The Frankfurt Funneling Experiment is a model of such a stage. The experiment is built up of two ion sources with electrostatic lens systems, a Two-Beam RFQ accelerator, a funneling deflector and a beam diagnostic system. The two beams are bunched and accelerated in the Two-Beam RFQ, and the funneling deflector combines the bunches onto a common beam axis. A new beam transport system between the RFQ accelerator and the deflector has been constructed and mounted. With these extended RFQ electrodes the drift between the Two-Beam RFQ and the RF deflector is minimized and unwanted emittance growth thereby reduced. After first RF measurements, the current work focuses on beam tests with the improved Two-Beam RFQ. First results will be presented.
We analyze the reaction dynamics of central Pb+Pb collisions at 160 GeV/nucleon. First we estimate the energy density pile-up at mid-rapidity and calculate its excitation function: the energy density is decomposed into hadronic and partonic contributions. A detailed analysis of the collision dynamics in the framework of a microscopic transport model shows the importance of partonic degrees of freedom and rescattering of leading (di)quarks in the early phase of the reaction for E ≥ 30 GeV/nucleon. The energy density reaches up to 4 GeV/fm³, 95% of which is contained in partonic degrees of freedom. It is shown that cells of hadronic matter, after the early reaction phase, can be viewed as nearly chemically equilibrated. This matter never exceeds energy densities of 0.4 GeV/fm³, i.e. the density above which the notion of separated hadrons loses its meaning. The final reaction stage is analyzed in terms of hadron ratios, freeze-out distributions and a source analysis for final-state pions.
Stabilized Wilson fermions are a reformulation of Wilson clover fermions that incorporates several numerical stabilizing techniques, but also a local change of the fermion action - the original clover term being replaced with an exponentiated version of it. We intend to apply the stabilized Wilson fermions toolbox to the thermodynamics of QCD, starting on the Nf=3 symmetric line on the Columbia plot, and to compare the results with those obtained with other fermion discretizations.
The debates about the interrelations between reason and law underwent a change after the eighteenth century. References to the recta ratio of the jusnaturalistic tradition have not disappeared, but other understandings of legal reason have developed. The European debate over legal positivist science has contributed to this as a manifestation of the rationality of law. This transformation may be considered the basis for the development of true "legal technologies" throughout the twentieth century. On the other hand, in the context of theories of positive law which have taken the relation between ethics and legal reason as a problem, the formation of discourses on coercion (Austin and Holmes), on validity (Kelsen and Hart) and on justification (Alexy and Dworkin) has also contributed to the emergence of new models of legal rationality. This paper highlights that the construction of these models is linked to the "points of view" which theories have proposed as legitimate for the interpretation of the legal phenomenon. It suggests that the discussion over points of view (defined as "focuses", a term close to the notions of "attitude", "stance" or "place of speech") may aid the debate on the normativity of law.
Background: The most frequent therapy of hydrocephalus is the implantation of ventriculoperitoneal shunts for diverting cerebrospinal fluid from the ventricles into the peritoneum. We compared two adjustable valves, the proGAV and proGAV 2.0, for complications which resulted in revision operations.
Methods: Four hundred patients who underwent primary shunt implantation between 2014 and 2020 were analyzed for overall revision rate, one-year revision rate, revision-free survival and overall survival, with respect to patient age group, gender, etiology of hydrocephalus, implantation site, prior diversion of cerebrospinal fluid and cause of revision.
Results: Data were available for all 400 patients (female/male 208/192). Overall, 99 patients underwent revision surgery after primary implantation. The proGAV valve was implanted in 283 patients, the proGAV 2.0 in 117 patients. There was no significant difference between the two shunt valves concerning revision rate (p=0.8069), one-year revision rate (p=0.9077), revision-free survival (p=0.6921) or overall survival (p=0.3232). Furthermore, regarding the one-year revision rate, we observed no significant difference between the two shunt valves in pediatric patients (40.7% vs 27.6%; p=0.2247). Revision operations had to be performed more frequently in pediatric patients (46.6% vs 24.8%; p=0.0093), with a significantly higher number of total revisions with the proGAV than the proGAV 2.0 (55.9% vs 27.6%; p=0.0110), most likely due to the longer follow-up in the proGAV group.
Conclusion: According to the target variables we analyzed, aside from lifetime revision rate in pediatric patients there is no significant difference between the two shunt valves. From our subjective point of view, implantation of the newer proGAV 2.0 valve is preferable due to higher adjustment comfort for both patients and physicians.
This paper traces the military role of Tibnīn and its rulers in the Latin East against the Muslims until 1187/583. Tibnīn played a key role in overcoming the Muslims in Tyre and controlling it in 1124. It also played a vital role in the conflict between Damascus and the Kingdom of Jerusalem. Tibnīn participated several times in defending Antioch, Banyas, Hebron and Transjordan. Furthermore, its soldiers and knights joined the army of the Kingdom of Jerusalem to capture Ascalon in 1153, and joined the campaigns of Amaury I, King of Jerusalem, against Egypt from 1164 to 1169. The military situation of Tibnīn under the rule of the royal house until its fall to the Muslims in 1187/583 will be studied as well.
In this increasingly complex world of learned information delivery and discovery - is it possible that the "free lunch" the Publishing world worries about could come true? Although Open Access and Institutional Repositories have not (yet) created the "scorched earth" effect many were predicting, they are slowly and inevitably gaining momentum. Broader access to top-level information via Google (and others) does indeed appear to be "good enough" for many in their search for content. But you rarely get food for free in a good quality restaurant. You pay for the selection, preparation, speed and expertise of the delivery. At the soup kitchen the food can often be filling - but the queue will be long, the wait even longer and there is no chance of silver service or à la carte. If you are unfortunate enough to have little choice then this may be a great solution. Others will be willing to pay for a more satisfactory meal. As in all aspects of life, diversification and specialisation are fundamental forces. The publishing community in the years to come will continue to develop its offerings for a variety of needs that require more than just broth. To stretch the analogy, the ongoing presence of tap water in our lives has done little to halt the extraordinary rise of bottled water as part of our staple diet. Business reality will continue to settle these types of debate; my bet is that the commercial publishers see a role as providing information that commands an intrinsic value proposition to enough customers to remain economically viable for some time to come. Inspired by the comments and ideas expounded by Dr. James O'Donnell of Georgetown University on the liblicense listserv on 20th July this year, this paper will look to expand on the analogy and identify the good, the bad - but importantly the difference in information quality and access that will result in the radically changed (but still co-existent) information landscape of tomorrow.
Recent results on baryon production in relativistic heavy-ion collisions show that a revision of the chemical freeze-out conditions is necessary. In particular, there is evidence that chemical freeze-out does not occur at full chemical equilibrium. We present a method to reconstruct the original hadronization conditions and show that the newly found points in the T–μB plane are in very good agreement with extrapolations of the lattice QCD critical line.
Background: There is growing interest in using polymeric nanoparticles (NPs), especially those functionalized with surface-active substances, as carriers across the blood-brain barrier (BBB) for potentially effective drugs in traumatic brain injury (TBI). However, the organ distribution of intravenously administered biodegradable and non-biodegradable NPs coated with different surfactants, how much of the administered dose reaches the brain parenchyma in areas with intact and opened BBB after trauma, and whether the NPs elicit an inflammatory response remain to be clarified.
Methods: The organ distribution, brain penetration and eventual inflammatory activation of polysorbate-80 (Tw80) and sodium-lauryl-sulfate (SDS) coated poly l-lactide (PLLA) and perfluorodecyl acrylate (PFDL) nanoparticles were evaluated after intravenous administration in rats prior and after undergoing controlled cortical impact (CCI).
Results: NP uptake at 4 and 24 h was significantly highest in the liver and spleen, followed by the brain and kidney, with minimal concentrations in the lungs and heart for all NPs. After CCI, a significant increase of NP uptake at 4 h and 24 h was observed within the traumatized hemisphere, especially in the perilesional area, although NPs were still found in areas away from the CCI site and in the contralateral hemisphere at concentrations similar to those in non-CCI subjects. NPs were localized in neurons, glial cells and endovascular cells. Immunohistochemical staining against GFAP, Iba1, TNFα and IL1β demonstrated no glial activation or neuroinflammatory changes.
Conclusions: Tw80- and SDS-coated biodegradable (PLLA) and non-biodegradable (PFDL) NPs reach the brain parenchyma in both traumatized and undamaged brain areas, with disrupted and intact BBB, even though a large fraction of them is retained in the liver and the spleen. No inflammatory reaction is elicited by these NPs within 24 h after application. These promising preliminary results suggest the effectiveness and safety of these NPs as drug carriers for the treatment of TBI.
The Specialised Information Service Performing Arts (SIS PA) is part of a funding programme by the German Research Foundation that enables libraries to develop tailor-made services for individual disciplines in order to provide researchers direct access to relevant materials and resources from their field. For the field of performing arts, the SIS PA aggregates metadata about theater and dance resources from currently mostly German-speaking cultural heritage institutions in a VuFind-based search portal.
In this article, we focus on metadata quality and its impact on the aggregation workflow by describing the different, possibly data provider-specific, process stages of improving data quality in order to achieve a searchable, interlinked knowledge base. We also describe lessons learned and limitations of the process.
Background: The rationale for gathering information from plants that procure nitrogen through symbiotic interactions controlled by a common genetic program is the high energy demand of applying synthetic nitrogen fertilizers in sustainable biofuel production. We curated sequence information publicly available for the biofuel plant sugarcane, performed an analysis of the common SYM pathway known to control symbiosis in other plants, and provide results, sequences and literature links as an online database.
Methods: Sugarcane sequences and information were downloaded from the nucEST database, cleaned and trimmed with seqclean, assembled with TGICL plus a translating mapping method, and annotated. The annotation is based on BLAST searches against a locally formatted plant Uniprot90 generated with CD-HIT for functional assignment, rpsBLAST against the CDD database for conserved domain analysis, and a BLAST search against sorghum sequences for Gene Ontology (GO) assignment. Gene expression was normalized according to the Unigene standard and presented as ESTs/100 kb. Protein sequences known in the SYM pathway were used as queries to search the SymGRASS sequence database. Additionally, antimicrobial peptides described in the PhytAMP database served as queries to retrieve these defense genes and generate their expression profiles, comparing the libraries obtained under symbiotic interactions with the remaining libraries.
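As a sketch of the normalization step, assuming a simple per-library scaling (the helper name and the scale factor are illustrative assumptions, not taken from the paper):

```python
def normalized_expression(gene_est_count, library_total, scale=100_000):
    """Scale a raw EST count by library size (hypothetical helper;
    the exact Unigene scaling used in the paper is assumed, not verified)."""
    if library_total <= 0:
        raise ValueError("library_total must be positive")
    return gene_est_count / library_total * scale
```

Such a scaling makes EST counts comparable across libraries of different sizes, which is the point of the normalization described above.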
Results: We describe SymGRASS, a database of sugarcane orthologous genes involved in arbuscular mycorrhiza (AM) and root nodule (RN) symbiosis. The database aggregates knowledge about sequences, tissues, organs, developmental stages and experimental conditions, and provides annotation and levels of gene expression for sugarcane transcripts and SYM orthologous genes through a web interface. Several candidate genes were found for all nodes in the pathway and, interestingly, a set of symbiosis-specific genes was found.
Conclusions: The knowledge integrated in SymGRASS may guide studies on molecular, cellular and physiological mechanisms by which sugarcane controls the establishment and efficiency of endophytic associations. We believe that the candidate sequences for the SYM pathway together with the pool of exclusively expressed tentative consensus (TC) sequences are crucial for the design of molecular studies to unravel the mechanisms controlling the establishment of symbioses in sugarcane, ultimately serving as a basis for the improvement of grass crops.
While the sortal constraints associated with Japanese numeral classifiers are well-studied, less attention has been paid to the details of their syntax. We describe an analysis implemented within a broad-coverage HPSG that handles an intricate set of numeral classifier construction types and compositionally relates each to an appropriate semantic representation, using Minimal Recursion Semantics.
The Frankfurt University Library possesses one of the outstanding Africana collections in continental Europe; its regional and disciplinary scope is unique in Germany. With about 5,000 new acquisitions a year, it has accumulated over 200,000 items on Africa south of the Sahara. Some 50,000 historical and rare photographs are fully digitized and freely accessible. Together with a collection of around 18,000 books stemming from the collections of the German Colonial Society at the end of the 19th and the beginning of the 20th century, they constitute the historical foundations of the collection. Recently the University Library Frankfurt and the library of the GIGA Institute of African Affairs, Hamburg, started the project ilissAfrica (internet library sub-Saharan Africa), a central subject gateway for online resources and a powerful tool for bibliographic research. These new services will be indispensable for researchers and librarians of African Studies and will promote African studies worldwide.
The paper presents an overview of some internationally relevant digital resource projects in Germany. Online presentations of primary sources, e.g. photographic material, and bibliographic tools supporting research, such as cross-searching, will be presented as potential partners for resource sharing with North America. Not only will the possibility of cooperation be sketched; the necessary preliminary work and some obstacles will also be outlined. This report is accompanied by a short characterization of African studies in Germany and the status quo of Open Access initiatives.
[Abstract] Occurrence of hepatitis B virus (HBV) reactivation following kidney transplantation
(2004)
We present an effort for the development of multilingual named entity grammars in a unification-based finite-state formalism (SProUT). Following an extended version of the MUC7 standard, we have developed Named Entity Recognition grammars for German, Chinese, Japanese, French, Spanish, English, and Czech. The grammars recognize person names, organizations, geographical locations, currency, time and date expressions. Subgrammars and gazetteers are shared as much as possible for the grammars of the different languages. Multilingual corpora from the business domain are used for grammar development and evaluation. The annotation format (named entity and other linguistic information) is described. We present an evaluation tool which provides detailed statistics and diagnostics, allows for partial matching of annotations, and supports user-defined mappings between different annotation and grammar output formats.
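The evaluation tool described above supports partial matching of annotations. A minimal sketch of span-based scoring with partial credit (the 0.5 partial weight and all names are assumptions, not the tool's actual scheme):

```python
def overlap(a, b):
    """Number of overlapping character positions between two (start, end, ...) spans."""
    return max(0, min(a[1], b[1]) - max(a[0], b[0]))

def score(gold, pred, partial=True):
    """Precision/recall over (start, end, type) spans.

    Exact matches count 1.0; same-type overlapping spans count 0.5
    when partial matching is enabled. Each gold span matches at most once.
    """
    tp = 0.0
    matched = set()
    for p in pred:
        best, best_g = 0.0, None
        for gi, g in enumerate(gold):
            if gi in matched or g[2] != p[2]:
                continue
            if g[:2] == p[:2]:
                credit = 1.0
            elif partial and overlap(g, p) > 0:
                credit = 0.5
            else:
                continue
            if credit > best:
                best, best_g = credit, gi
        if best_g is not None:
            matched.add(best_g)
            tp += best
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall
```

A real scorer would additionally handle type mappings between annotation formats, as the abstract describes.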
poster presentation at the 31st International Symposium on Lattice Field Theory LATTICE 2013:
We explore and compare three mixed action setups with Wilson twisted mass sea quarks and different valence quark actions: (1) Wilson twisted mass, (2) Wilson twisted mass + clover and (3) Wilson + clover. Our main goal is to reduce lattice discretization errors in mesonic spectral quantities, in particular to reduce twisted mass parity and isospin breaking.
We compare away-side hadron correlations with respect to tagged heavy-quark jets computed within a weakly coupled pQCD and a strongly coupled AdS/CFT model. While both models feature similar far-zone Mach and diffusion wakes, the far-zone stress features are shown to be too weak to survive thermal broadening at hadron freeze-out. Observable away-side conical correlations are dominated by the jet-induced transverse flow in the near-zone "Neck" region, which differs significantly between the models. Unlike in AdS/CFT, the induced transverse flow in the Neck zone is too weak in pQCD to produce conical correlations after Cooper-Frye freeze-out. The observation of conical correlations violating Mach's law would favor the strongly coupled AdS/CFT string-drag dynamics, while their absence would favor weakly coupled pQCD-based hydrodynamics.
We study tetraquark resonances with lattice QCD potentials computed for two static quarks and two dynamical quarks, the Born-Oppenheimer approximation and the emergent wave method of scattering theory. As a proof of concept we focus on systems with isospin I = 0, but consider different relative angular momenta l of the heavy b quarks. We compute the phase shifts and search for S and T matrix poles in the second Riemann sheet. We predict a new tetraquark resonance for l = 1, decaying into two B mesons, with quantum numbers I(JP) = 0(1−), and determine its mass and decay width in MeV.
Study of I = 0 bottomonium bound states and resonances based on lattice QCD static potentials
(2022)
We investigate I = 0 bottomonium bound states and resonances in S, P, D and F waves using lattice QCD static-static-light-light potentials. We consider five coupled channels, one confined quarkonium channel and four open B(*)B̄(*) and Bs(*)B̄s(*) meson-meson channels, and use the Born-Oppenheimer approximation and the emergent wave method to compute poles of the T matrix. We discuss results for masses and decay widths and compare them to existing experimental results. Moreover, we determine the quarkonium and meson-meson composition of these states to clarify whether they are ordinary quarkonium or should rather be interpreted as tetraquarks.
This work analyzes the philosophy of history and discusses the consequences of the proclaimed end of history for Critical Theory. The concept of reason and the devices of democracy and human rights are discussed in a revision of the historical debate about how the end of history operates on life within modern society, especially regarding the intellectual condition in the information society.
A 5-gap timing RPC equipped with patterned electrodes coupled to both charge-sensitive and timing circuits yields a time accuracy of 77 ps along with a position accuracy of 38 μm. These results were obtained by calculating the straight-line fit residuals to the positions provided by a 3-layer telescope made out of identical detectors, detecting almost perpendicular cosmic-ray muons. The device may be useful for particle identification by time-of-flight, where simultaneous measurements of trajectory and time are necessary.
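The quoted accuracies come from straight-line fit residuals against the positions measured by the three telescope layers. A minimal least-squares sketch of such a fit (illustrative only, not the analysis code):

```python
def fit_line(z, x):
    """Closed-form least-squares fit of x = a*z + b (z: layer positions, x: hits)."""
    n = len(z)
    zbar = sum(z) / n
    xbar = sum(x) / n
    szz = sum((zi - zbar) ** 2 for zi in z)
    a = sum((zi - zbar) * (xi - xbar) for zi, xi in zip(z, x)) / szz
    b = xbar - a * zbar
    return a, b

def residuals(z, x):
    """Per-layer deviation of the measured hits from the fitted track."""
    a, b = fit_line(z, x)
    return [xi - (a * zi + b) for zi, xi in zip(z, x)]
```

The spread of these residuals over many cosmic-ray tracks is what yields position-accuracy figures of the kind quoted above.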
We perform a two-flavor dynamical lattice computation of the Isgur-Wise functions τ1/2 and τ3/2 at zero recoil in the static limit. We find τ1/2(1) = 0.297(26) and τ3/2(1) = 0.528(23), fulfilling Uraltsev's sum rule by around 80%. We also comment on a persistent conflict between theory and experiment regarding semileptonic decays of B mesons into orbitally excited P-wave D mesons, the so-called "1/2 versus 3/2 puzzle", and we discuss the relevance of lattice results in this context.
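The quoted saturation can be checked directly. Assuming dominance of the lowest states, Uraltsev's sum rule and the values above give:

```latex
\sum_n \left( |\tau_{3/2}^{(n)}(1)|^2 - |\tau_{1/2}^{(n)}(1)|^2 \right) = \frac{1}{4},
\qquad
0.528^2 - 0.297^2 \approx 0.191 \approx 0.76 \cdot \frac{1}{4},
```

i.e. the ground-state doublets alone saturate roughly three quarters of the sum rule, consistent with the "around 80%" quoted above.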
The requalification of Habermas's discussions on political philosophy and legal theory after the publication of Zwischen Naturalismus und Religion (2005), and his most recent texts and debates on religion and the public sphere, suggest a revision of the Habermasian theory of rationalization as first presented in Theorie des Kommunikativen Handelns (1982), especially concerning the processes of desacralization and the linguistification of religious authority. To contribute to this revision, this paper focuses on the problem of a supposedly "lost" aesthetic-expressive understanding of religious authority in Habermas's theory of rationalization, which may have contributed to a theory of law in Faktizität und Geltung (1992) that does not give a satisfactory account of the aesthetic-expressive character of the modern understanding of legal authority. A better understanding of this special character may contribute not only to the avoidance of fundamentalisms and new attempts at an "aestheticization of politics", but also to a rational strengthening of the solidarity among citizens of democratic constitutional states.
This paper discusses in what sense public hearings in supreme courts of democratic states under the rule of law can be seen as a proceduralization of popular-sovereignty policies. These policies are expressions of a normative claim for a wider "publicization of law" by the institutional powers and organs of democratic states, a claim that becomes evident when one undertakes an intersubjective interpretation of law. This theoretical argument is presented in the first section of the paper through a new articulation of Jürgen Habermas's discursive theory of law and his most recent studies on the concept of the political public sphere. The theoretical section provides normative and procedural criteria for the second section, which consists in a critical analysis of the procedures and practical cases of public hearings held at the Brazilian Supreme Court, constituting the first scientific study to date on the Court's use of this legal instrument.
Although their applications have not yet spread widely owing to their incipient state, nanotechnologies and nanomedicines may be presumed to be at the origin of the next great technological revolution, foreseeably contributing to a new stage in the evolution of mankind's progress. Their possibilities are truly immense in enormously varied spheres, but the risks and uncertainties they engender are enormous too, because access to and use of the unceasingly increasing quantity of information they generate will place further strain on the protection of personal life, privacy and the exercise of freedom, as well as on the safeguarding of other fundamental principles and rights.
Universities of the 21st century heavily depend on an efficient IT infrastructure for teaching, research and administration. E-learning environments, blended learning and all sorts of multimedia and cooperative environments are important requirements for teaching at universities and for further education. Many organizational structures, such as continuous examinations, interdisciplinary studies and the ECTS system, require efficient examination administration systems as well as room and personnel management. Research is based on Internet inquiries, eScience, eLibrary and other IT-supported media. Research results must be documented and archived in digital form, and results must be distributed and marketed through the Internet. The efficient administration of all kinds of university resources must be planned using management support systems. Decisions of university heads must be prepared from well-documented statistics and analysis software. In the past, many of the applications named above for teaching, research and administration have been performed by separate software applications run in distributed environments. Powerful server structures and networking features, as well as new software technology like service-oriented architectures, make it necessary to recentralize the IT services of the university after a long period of decentralization. Based on metadirectories and unified access procedures, all of the software components must be integrated into a seamless IT infrastructure. To guarantee consistency, data must not be stored redundantly. Project IntegraTUM of Technische Universität München, started in 2003, is an umbrella project to define such a seamless IT infrastructure for a university with 22,000 students and approximately 10,000 staff.
The talk describes the project, which besides the definition of new technology is based on a fundamental process analysis of the university and many changes in the organizational structure.
We review our knowledge of the phase diagram of QCD as a function of temperature, chemical potential and quark masses. The presence of tricritical lines at imaginary chemical potential μ = iπT/3, with known scaling behaviour in their vicinity, puts constraints on this phase diagram, especially in the case of two light flavors. We show first results in our project to determine the finite-temperature behaviour in the Nf = 2 chiral limit.
The Deep Linguistic Processing with HPSG Initiative (DELPH-IN) provides the infrastructure needed to produce open-source semantic-transfer-based machine translation systems. We have made available a prototype Japanese-English machine translation system built from existing resources, including parsers, generators, bidirectional grammars and a transfer engine.
We consider a dual representation of an effective three-dimensional Polyakov loop model for the SU(3) theory at nonzero real chemical potential. This representation is free of the sign problem and can be used for numerical Monte Carlo simulations. These simulations allow us to locate the line of second-order phase transitions that separates the region of first-order phase transitions from the crossover region. The behavior of local observables in the different phases of the model is studied numerically and compared with predictions of the mean-field analysis. Our dual formulation also allows us to study Polyakov loop correlation functions. From these results, we extract the screening masses and compare them with large-N predictions.
The broad class of U(N) and SU(N) Polyakov loop models on the lattice is solved exactly in the combined large-N, large-Nf limit, where N is the number of colors and Nf the number of quark flavors, in any dimension. In this 't Hooft-Veneziano limit the ratio N/Nf is kept fixed. We calculate both the free energy and various correlation functions. The critical behavior of the models is described in detail at finite temperature and non-zero baryon chemical potential. Furthermore, we prove that the calculation of the N-point (baryon) correlation function reduces to the geometric median problem in the confinement phase. In the deconfinement phase we establish the existence of complex masses and an oscillating decay of correlations in a certain region of parameters.
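Since the confinement-phase baryon correlator reduces to a geometric median problem, a standard Weiszfeld iteration (a generic sketch of that classical problem, unrelated to the authors' actual computation) illustrates the object in two dimensions:

```python
def geometric_median(points, iters=200, eps=1e-12):
    """Weiszfeld iteration for the 2D geometric median.

    The median minimizes the sum of Euclidean distances to the given
    points; the update is a distance-weighted average, started at the
    centroid, with a small eps guard against division by zero.
    """
    x = [sum(p[i] for p in points) / len(points) for i in (0, 1)]
    for _ in range(iters):
        num = [0.0, 0.0]
        den = 0.0
        for p in points:
            d = ((x[0] - p[0]) ** 2 + (x[1] - p[1]) ** 2) ** 0.5
            w = 1.0 / max(d, eps)
            num[0] += w * p[0]
            num[1] += w * p[1]
            den += w
        x = [num[0] / den, num[1] / den]
    return x
```

For symmetric configurations (e.g. the corners of a square) the median coincides with the centroid; in general the two differ, which is what makes the reduction in the abstract non-trivial.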
Using a partonic transport model we investigate the evolution of conical structures in ultrarelativistic matter. Employing two different source terms and varying the transport properties of the matter, we study the formation of Mach cones. In an additional study we extract the two-particle correlations from the numerical calculations and compare them to an analytical approximation. The influence of the viscosity on the shape of Mach cones and the corresponding two-particle correlations is studied by adjusting the cross section of the medium.
We discuss recent applications of the partonic pQCD-based cascade model BAMPS, with a focus on heavy-ion phenomenology in the hard and soft momentum range. The nuclear modification factor as well as the elliptic flow are calculated in BAMPS for RHIC and LHC energies. These observables are also discussed within the same framework for charm and bottom quarks. Contributing to recent jet-quenching investigations, we present first preliminary results on the application of jet reconstruction algorithms in BAMPS. Finally, collective effects induced by jets are investigated: we demonstrate the development of Mach cones in ideal matter as well as in the highly viscous regime.
We discuss recent applications of the partonic perturbative-QCD-based cascade model BAMPS, with a focus on heavy-ion phenomenology in the hard and soft momentum range. First, the elliptic flow and suppression of charm and bottom quarks are studied at LHC energies. Thereafter, in a detailed study we compare the standard Gunion-Bertsch approximation of the matrix elements for inelastic processes to the exact results in leading-order perturbative QCD. Since a disagreement is found, we propose an improved Gunion-Bertsch matrix element, which agrees with the exact result in all phase space regions.
To investigate the formation and propagation of relativistic shock waves in viscous gluon matter we solve the relativistic Riemann problem using a microscopic parton cascade. We demonstrate the transition from ideal to viscous shock waves by varying the shear viscosity to entropy density ratio η/s. Furthermore, we compare our results with those obtained by solving the relativistic causal dissipative fluid equations of Israel and Stewart (IS), in order to assess the validity of IS hydrodynamics. Employing the parton cascade we also investigate the formation of Mach shocks induced by a high-energy gluon traversing viscous gluon matter. For η/s = 0.08 a Mach cone structure is observed, whereas the signal smears out for η/s ≥ 0.32.
Information supply is the genuine task of academic institutions as well as of publishers. Publishers profit from copyright provisions which give them exclusive rights in their products. The same copyright provisions are often the limiting factor when academic institutions try to improve their service to the academic community. This is the case in particular when it comes to digital access to information. In the so-called "Second Basket", the German copyright act has just been revised, introducing explicit legal exemptions for document delivery and on-the-spot consultation of works contained in public libraries' collections. At the same time, unresolved issues remain with respect to existing legal exemptions as well as the new ones. What will the legal parameters look like for academic institutions once the "Second Basket" has been put into force? How can libraries work with these provisions in practice?
In QCD at large enough isospin chemical potential, Bose-Einstein condensation (BEC) takes place, separated from the normal phase by a phase transition. From previous studies the location of the BEC line at the physical point is known. According to chiral perturbation theory, in the chiral limit condensation sets in at infinitesimally small isospin chemical potential at zero temperature. The thermal chiral transition at zero density might then be affected by the proximity of the BEC boundary, depending on its shape. As a first step towards the chiral limit, we perform simulations of 2+1-flavor QCD at half the physical quark masses. The position of the BEC transition is then extracted and compared with the results at physical masses.
We discuss results for the Roberge-Weiss (RW) phase transition at nonzero imaginary baryon and isospin chemical potentials, in the plane of temperature and quark masses. Our study focuses on the light tricritical endpoint, which has already been used as a starting point for extrapolations towards the chiral limit at vanishing chemical potentials. In particular, we are interested in determining how an imaginary isospin chemical potential shifts the tricritical mass with respect to earlier studies at zero imaginary isospin chemical potential. A positive shift would allow the chiral extrapolations to be performed from larger quark mass values, making them computationally less expensive. We also present results for the dynamics of Polyakov loop clusters across the RW phase transition.
We compute the equation of state of isospin asymmetric QCD at zero and non-zero temperatures using direct simulations of lattice QCD with three dynamical flavors at physical quark masses. In addition to the pressure and the trace anomaly and their behavior towards the continuum limit, we will particularly discuss the extraction of the speed of sound. Furthermore, we discuss first steps towards the extension of the EoS to small non-zero baryon chemical potentials via Taylor expansion.
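The Taylor expansion referred to here is conventionally written as follows (the standard textbook form of the expansion of the pressure in the baryon chemical potential, not an equation quoted from this work):

```latex
\frac{p(T,\mu_B)}{T^4} \;=\; \sum_{n=0}^{\infty} c_{2n}(T)
\left(\frac{\mu_B}{T}\right)^{2n},
\qquad
c_{2n}(T) \;=\; \frac{1}{(2n)!}
\left.\frac{\partial^{2n}\,(p/T^4)}{\partial(\mu_B/T)^{2n}}\right|_{\mu_B=0}
```

Only even powers appear because of charge-conjugation symmetry, and the coefficients c_{2n}(T) are computed at vanishing chemical potential, where direct simulations are possible.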
According to perturbation theory, in the zero-temperature, high-density limit of QCD at nonzero isospin chemical potential, matter is expected to be in a superfluid Bardeen-Cooper-Schrieffer (BCS) phase of u and d̄ Cooper pairs. It is also expected, on symmetry grounds, that such a phase connects via an analytical crossover to the phase with Bose-Einstein condensation (BEC) of charged pions at μ_I ≥ m_π/2. With lattice results showing indications that the deconfinement crossover also smoothly penetrates the BEC phase, it was conjectured that the former connects continuously to the BEC-BCS crossover. We compute the spectrum of the Dirac operator and use generalized Banks-Casher relations to test this conjecture and identify signatures of the superfluid BCS phase.
The interrelation between quantum anomalies and electromagnetic fields leads to a series of non-dissipative transport effects in QCD. In this work we study anomalous transport phenomena with lattice QCD simulations using improved staggered quarks in the presence of a background magnetic field. In particular, we calculate the conductivities both in the free and in the interacting case, analysing the dependence of these coefficients on several parameters, such as the temperature and the quark mass.
The magnetic fields generated in non-central heavy-ion collisions are among the strongest fields produced in the Universe, reaching magnitudes comparable to the scale of the strong interactions. Backed by model simulations, the resulting field is expected to be spatially modulated, deviating significantly from the commonly considered uniform profile. To improve our understanding of the physics of quarks and gluons under such extreme conditions, we use lattice QCD simulations with 2+1 staggered fermion flavors at physical quark masses and an inhomogeneous magnetic background, for a range of temperatures covering the QCD phase transition. We assume a 1/cosh^2 function to model the field profile and vary its strength to analyze the impact on the computed observables and on the transition. We calculate local chiral condensates and local Polyakov loops and estimate the size of lattice artifacts. We find that both observables show non-trivial spatial features due to the interplay between sea and valence effects.
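The 1/cosh^2 profile described can be parametrized as follows (a generic form of such an ansatz; the amplitude B_0, center x_0 and width ε are free parameters, not values taken from this work):

```latex
B(x) \;=\; B_0 \,\cosh^{-2}\!\left(\frac{x - x_0}{\epsilon}\right)
```

This profile decays exponentially away from x_0, so the field is localized in a slab of thickness of order ε, in contrast to the uniform-field setups commonly simulated.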
We investigate the properties of QCD at finite isospin chemical potential at zero and non-zero temperature. This theory is not affected by the sign problem and can be simulated using Monte Carlo techniques. With increasing isospin chemical potential, at temperatures below the deconfinement transition, the system enters a phase where charged pions condense, accompanied by an accumulation of low modes of the Dirac operator. The simulations are enabled by the introduction of a pionic source into the action, acting as an infrared regulator for the theory; physical results are obtained by removing the regulator via an extrapolation. We present an update of our study of the associated phase diagram using 2+1 flavours of staggered fermions with physical quark masses, together with a comparison to Taylor expansion. We also present first results for our determination of the equation of state at finite isospin chemical potential and give an example of a cosmological application. The results can also be used to gain information about QCD at small baryon chemical potentials via reweighting with respect to the pionic source parameter and the chemical potential, and we present first steps in this direction.
We explore the phase diagram of two-flavour QCD at vanishing chemical potential using dynamical O(a)-improved Wilson quarks. In the approach to the chiral limit we use lattices with a temporal extent of Nt = 16 and spatial extents L = 32, 48 and 64 to enable the extrapolation to the thermodynamic limit with small discretisation effects. In addition to an update on the scans at constant k reported earlier, we present first results from scans along lines of constant physics at a pion mass of 290 MeV. We probe the transition using the Polyakov loop and the chiral condensate, as well as spectroscopic observables such as screening masses.
A lot of effort in lattice simulations over recent years has been devoted to studies of the QCD deconfinement transition. Most state-of-the-art simulations use rooted staggered fermions, while studies with Wilson fermions are affected by large systematic uncertainties, such as coarse lattices or heavy sea quarks. Here we report on an ongoing study of the transition, using two degenerate flavours of nonperturbatively O(a)-improved Wilson fermions. We start with Nt = 12 and 16 lattices and pion masses of 600 down to 450 MeV, aiming at the chiral and continuum limits with light quarks.
We present an overview of the resonance dynamics within the microscopic parton-hadron-string dynamics (PHSD) approach, which incorporates explicit partonic degrees of freedom in terms of strongly interacting quasiparticles (quarks and gluons), in line with an equation of state from lattice QCD, as well as dynamical hadronization and hadronic collision dynamics in the final reaction phase. We discuss how the vector meson resonances can be used as a probe of in-medium effects and demonstrate that the low-mass dilepton spectra show visible in-medium effects from dynamical vector-meson spectral functions from SIS to SPS energies, whereas at RHIC and LHC energies such medium effects become more moderate. We also show that the intermediate-mass spectra are dominated by radiation from the partonic degrees of freedom at RHIC and LHC energies.
The early prediction of mortality is one of the unresolved tasks in intensive care medicine. This contribution models medical symptoms as observations caused by transitions between hidden Markov states. Learning the underlying state transition probabilities results in a prediction success probability of about 91%. The results are discussed and put in relation to the model used. Finally, the rationale for using the model is reflected upon: are there states in the septic shock data?
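The underlying idea, symptoms as observations emitted while the patient's hidden state evolves as a Markov chain, can be sketched with the standard forward algorithm for hidden Markov models. All states, observations and probabilities below are toy values, not the septic-shock model of the paper:

```python
import numpy as np

# Toy two-state HMM: hidden states {stable, critical},
# observations {normal, abnormal}. All numbers are illustrative.
pi = np.array([0.8, 0.2])           # initial state distribution
A = np.array([[0.9, 0.1],           # state transition probabilities
              [0.3, 0.7]])
B = np.array([[0.85, 0.15],         # emission probabilities per state
              [0.20, 0.80]])

def forward(obs):
    """Forward algorithm: probability of the observation sequence."""
    alpha = pi * B[:, obs[0]]          # initialize with the first symptom
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate through A, then emit
    return alpha.sum()

p_seq = forward([0, 1, 1])  # normal, abnormal, abnormal
```

In a real application the matrices A and B would be learned from patient data, e.g. with the Baum-Welch algorithm, and the inferred hidden-state trajectory would feed the mortality prediction.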
In its first part, this contribution briefly reviews the application of neural network methods to medical problems and characterizes their advantages and problems in the context of the medical background. Successful application examples show that human diagnostic capabilities are significantly worse than those of the neural diagnostic systems. Then the paradigm of neural networks is briefly introduced, and the main problems of medical data bases and the basic approaches for training and testing a network with medical data are described. Additionally, the problem of interfacing the network and its results is addressed and the neuro-fuzzy approach is presented. Finally, as a case study of neural rule-based diagnosis, septic shock diagnosis is described, on the one hand by a growing neural network and on the other hand by a rule-based system. Keywords: Statistical Classification, Adaptive Prediction, Neural Networks, Neurofuzzy, Medical Systems
Thermodynamical variables and their time evolution are studied for central relativistic heavy-ion collisions from 10.7 to 160 AGeV in the microscopic Ultrarelativistic Quantum Molecular Dynamics model (UrQMD). The UrQMD model exhibits drastic deviations from equilibrium during the early high-density phase of the collision. Local thermal and chemical equilibration of the hadronic matter seems to be established only at later stages of the quasi-isentropic expansion, in a central reaction cell of volume 125 fm^3. Baryon energy spectra in this cell are reproduced by Boltzmann distributions at all collision energies for t > 10 fm/c with a unique, rapidly dropping temperature. At these times the equation of state has a simple form: P = (0.12 - 0.15) ε. At SPS energies a strong deviation from chemical equilibrium is found for mesons, especially for pions, even at the late stage of the reaction. The final enhancement of pions is supported by experimental data.
Poster presentation: Calcium plays a pivotal role in relaying electrical signals of the cell to subcellular compartments, such as the nucleus. Since this one ion type is used by the cell for many processes, a neuron needs to establish finely tuned calcium pathways in order to be able to differentiate multiple tasks [1-3].
While it is known that neurons can actively change their shape upon neuronal activity [4-7], we here present novel findings of activity-regulated nuclear morphology [8,9]. With the help of an experimental and computational modeling approach, we show that hippocampal neurons can change the previously spherical shape of their nuclei to complex and infolded morphologies. This change in morphology is shown to be regulated by NMDA-receptor-gated calcium, while synaptic and extra-synaptic NMDA receptors elicit opposing effects on nuclear morphology [8].
The structural alterations of the cell nucleus have significant effects on nuclear calcium dynamics. Compartmentalization of the nucleus, due to membrane infoldings, changes calcium frequencies, amplitudes and spatial distributions [8,10]. Since these parameters have been shown to control downstream events towards gene transcription [11,12], the results elucidate the cellular control of nuclear function with the help of morphology modulation. With respect to processes downstream of calcium, we show that histone H3 phosphorylation is closely linked to nuclear morphology. Investigating the nuclear morphologies of hippocampal neurons, we identified two major classes [9,10]. One class contains non-infolded nuclei that function as calcium signal integrators, while the other class contains highly infolded nuclei, which function as frequency detectors of nuclear calcium [10].
Extending this interdisciplinary approach of investigating structure/function relationships in neurons, the effects of cellular morphology, as well as the morphology of the endoplasmic reticulum and other organelles, on neuronal calcium signals are currently being investigated. This endeavor makes use of highly detailed, three-dimensional models of neuronal calcium dynamics, including the three-dimensional morphology of the cell and its organelles.
Diffusion of e-learning as an innovation and economic aspects of e-learning support structures (2012)
By now, many universities and educational institutions have implemented an e-learning center or similar, often smaller, institutional units in order to support the use of new media in teaching and learning processes [1]. This paper addresses questions around the installation of such e-learning support structures at different levels of an institution and also looks at the diffusion of e-learning as an innovation in educational institutions.
The paper presents a study based on the hypothesis that wikis initiated bottom-up by students might be used more deliberately than wikis introduced top-down by teachers. It therefore examines the specific effects observed in nine different wiki projects at the University of Frankfurt, ranging from student wiki projects to wikis used in seminars and as an information tool for institutions.
Since 2007 the concept of open online courses has come up, leading to many discussions of this new format in blog posts and articles, especially in the US and Canada. In 2011, the first German open online course was launched, addressing the Future of Learning.
The article discusses the concept of open online courses and the experiences with the first German course, and gives some perspectives on further developments, which were partly implemented in a new course that was just started in 2012.
AKUE, developed by studiumdigitale, the e-learning centre of the University of Frankfurt, is a procedure to assure high quality levels of e-learning course design and digital media production. The name AKUE stands for the German words for analysis, concept, implementation and evaluation, which describe the four phases of the process. The background to AKUE is that the costs and benefits of e-learning courses are difficult to quantify. Therefore, so-called procedure (or process) models are applied in order to improve the quality and effectiveness of e-learning courses. This paper presents the process and steps of AKUE and gives examples of its application.
In: Conference proceedings, EDULEARN 2010, Barcelona, 5-7 July 2010.
In order to design and tutor online and blended learning courses, trainers and teachers need to obtain appropriate qualifications. In this paper, different competency models for online teaching that were developed in Germany between 2005 and 2008 are addressed, as well as different settings to qualify teachers and trainers appropriately. Finally, the results of an evaluation of two different training settings are presented in order to compare an in-house versus a trans-organisational training program.