Phase transitions in a non-perturbative regime can be studied by ab initio Lattice Field Theory (LFT) methods. The status and future research directions for LFT investigations of Quantum Chromodynamics (QCD) under extreme conditions are reviewed, including properties of hadrons and of the hypothesized QCD axion as inferred from QCD topology in different phases. We discuss phase transitions in strong interactions in an extended parameter space, and the possibility of model building for Dark Matter and Electro-Weak Symmetry Breaking. Methodological challenges are addressed as well, including new developments in Artificial Intelligence geared towards the identification of different phases and transitions.
The annotation of texts and other material is a common task in digital humanities and Natural Language Processing (NLP) research projects. At the same time, corpus annotation is often the most time- and cost-intensive component of such projects and, depending on the research interest, may require a high level of expertise. A wide range of tools is available for both automatic and manual text annotation. Since automatic pre-processing methods are not error-free and the demand for training data is growing, not least for machine learning, suitable annotation tools are required. This paper defines criteria of flexibility and efficiency for complex annotations in order to assess existing annotation tools. To extend this list of tools, the paper describes TextAnnotator, a browser-based multi-annotation system developed to perform platform-independent multimodal annotations and to annotate complex textual structures. The paper illustrates the current state of development of TextAnnotator and demonstrates its ability to evaluate annotation quality (inter-annotator agreement) at runtime. In addition, it shows how annotations by different users can be performed simultaneously and collaboratively on the same document from different platforms, using UIMA as the basis for annotation.
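A runtime agreement check of the kind TextAnnotator performs can be illustrated with a minimal sketch of Cohen's kappa for two annotators labelling the same items. The function name and label values below are illustrative, not TextAnnotator's actual API:

```python
from collections import Counter

def cohens_kappa(ann_a, ann_b):
    """Cohen's kappa: chance-corrected agreement between two annotators
    who assigned one label to each of the same items."""
    assert len(ann_a) == len(ann_b)
    n = len(ann_a)
    # Observed agreement: fraction of items with identical labels.
    observed = sum(a == b for a, b in zip(ann_a, ann_b)) / n
    # Expected agreement under independent labelling with the
    # annotators' empirical label frequencies.
    freq_a, freq_b = Counter(ann_a), Counter(ann_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two annotators disagree on one of four tokens:
kappa = cohens_kappa(["N", "V", "N", "N"], ["N", "V", "N", "V"])
```

Recomputing such a score whenever an annotation changes is what makes agreement visible at runtime rather than only after an annotation campaign ends.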
Over the past two decades, the “one drug – one target – one disease” concept became the prevalent paradigm in drug discovery. The main idea of this approach is the identification of a single protein target whose inhibition leads to a successful treatment of the examined disease. The predominant assumption is that highly selective ligands avoid unwanted side effects caused by binding to secondary, non-therapeutic targets. In recent years, results from post-genomics and network biology have shown that proteins rarely act in isolation but rather as part of a highly connected network [1]. This connectivity leads to more robust systems that cannot be disrupted by the inhibition of a single target of that network, so that such inhibition might not produce the desired therapeutic effect [2]. Furthermore, studies show that robust systems are affected more by weak inhibition of several parts than by complete inhibition of a single selected element of that system [3]. There is therefore increasing interest in developing drugs that act on multiple targets simultaneously, which is at the same time a great challenge for medicinal chemists: sufficient activity on each target has to be combined with an adequate pharmacokinetic profile [4]. Early design strategies tried to link the pharmacophores of known inhibitors; however, these methods often lead to high molecular weight and low ligand efficiency. We present a new rational approach based on a retrosynthetic combinatorial analysis procedure (RECAP) [5] applied to approved ligands of multiple targets. The resulting RECAP fragments are used to design a large combinatorial library containing molecules that feature chemical properties of each ligand class. The molecules are further validated by machine learning models, such as random forests and self-organizing maps, with regard to their activity on the targets of interest.
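The combinatorial step can be sketched as the enumeration of all assemblies of fragment pools obtained from a RECAP-style decomposition. The fragment names and the three-slot assembly scheme below are purely hypothetical placeholders, not the fragments or chemistry of the actual study:

```python
from itertools import product

# Hypothetical fragment pools, standing in for RECAP fragments derived
# from approved ligands of two different target classes.
cores = ["piperazine", "benzimidazole"]
linkers = ["amide", "ether"]
caps = ["phenyl", "cyclohexyl"]

# Combinatorial enumeration: every core-linker-cap assembly is one
# candidate molecule carrying features of each ligand class.
library = ["-".join(parts) for parts in product(cores, linkers, caps)]
```

Even with a handful of fragments per pool the library grows multiplicatively, which is why a subsequent machine-learning filter (random forests, self-organizing maps) is needed to prioritize candidates.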
Introduction: aims and points of departure. 1. The problem of the knowledge of law: whether previously given general rules may support a casuistic decision. 2. The problem of legal ethics: whether there are autonomous rights that do not depend on positive law. 3. How modern dogmatics deals with these problems. 4. The question remains the same.
In this paper, an analysis of Robert Frost’s poem Mending Wall is presented as a hermeneutical key to investigate and criticize two examples of the oblivion of the reasonable distinction and the reasonable relationship between ethics and law proposed by a new Brazilian private law movement called Escola do Direito Civil-Constitucional (The Private-Constitutional School of Thought). Those examples of an unreasonable relationship between ethics and law are: 1) the right to be loved and 2) the right to get a private education without paying for it.
In his works, Hans Kelsen elaborates several objections to the so-called “doctrine of natural law”, especially in his essay The Natural-Law Doctrine Before the Tribunal of Science. Kelsen argues that natural law theorists, searching for an absolute criterion for justice, try to deduce the rules of human behavior from nature. Robert P. George, in the essay Kelsen and Aquinas on the ‘Natural Law Doctrine’, examines this criticism and concludes that what Kelsen understands as the natural-law doctrine does not include the natural law theory elaborated by Thomas Aquinas. In this paper, we try to corroborate George’s theses and to show how Aquinas’ natural law theory can be vindicated against Kelsen’s criticisms.
This article considers the Brazilian Legal System and the requirements of an act performed by public administration. To do so, it is organized in six main chapters. The first one considers the Brazilian Constitution as regards the State form and the legal and judicial systems. The second chapter presents the public administration as stated in the Constitution. The requirements of a public administration act are presented in the third chapter. The improbity law, which determines how public administration acts should be performed, is presented in the fourth chapter. How one of the main judicial courts of Brazil has understood this law is the topic of the fifth chapter. The sixth chapter presents a proposal of how phronesis could be used to solve misunderstandings about improbity in the Brazilian Legal System.
The Specialized Information Service Biodiversity Research (BIOfid) has been launched to mobilize valuable biological data that has lain hidden in printed literature in German libraries over the past 250 years. In this project, we annotate German texts converted by OCR from historical scientific literature on the biodiversity of plants, birds, moths and butterflies. Our work enables the automatic extraction of biological information previously buried in the mass of papers and volumes. For this purpose, we generated training data for the tasks of Named Entity Recognition (NER) and Taxa Recognition (TR) in biological documents. We use this data to train a number of leading machine learning tools and create a gold standard for TR in biodiversity literature. More specifically, we perform a practical analysis of our newly generated BIOfid dataset through various downstream-task evaluations and establish a new state of the art for TR with an F-score of 80.23%. In this sense, our paper lays the foundations for future work in the field of information extraction in biology texts.
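The reported 80.23% F-score is the standard harmonic mean of precision and recall over predicted taxon mentions. A minimal sketch of that metric (the function name is illustrative; the paper's evaluation tooling is not specified here):

```python
def f_score(tp, fp, fn):
    """F1 score from counts of true positives, false positives and
    false negatives over predicted entity mentions."""
    precision = tp / (tp + fp)  # fraction of predicted mentions that are correct
    recall = tp / (tp + fn)    # fraction of gold mentions that were found
    return 2 * precision * recall / (precision + recall)

# E.g. 8 correct predictions, 2 spurious, 2 missed gold mentions:
score = f_score(8, 2, 2)
```

For NER/TR the counts are usually taken over exact entity spans, so a partially overlapping prediction counts as both a false positive and a false negative.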
There is little doubt that Quantum Chromodynamics (QCD) is the theory which describes strong interaction physics. Lattice gauge simulations of QCD predict that in the μ,T plane there is a line where a transition from confined hadronic matter to deconfined quarks takes place. The transition is either a crossover (at low μ) or of first order (at high μ). It is the goal of present and future heavy-ion experiments at RHIC and FAIR to study this phase transition at different locations in the μ,T plane and to explore the properties of the deconfined phase. It is the purpose of this contribution to discuss some of the observables which are considered useful for this purpose.
Targeting rare observables, the CBM experiment will operate at high interaction rates of up to 10 MHz, unprecedented in heavy-ion experiments so far. This requires a novel free-streaming readout system and a new concept of data processing. The huge data rates of the CBM experiment will be reduced online to the recordable rate before the data are saved to mass storage. Full collision reconstruction and selection will be performed online in a dedicated processor farm. In order to make an efficient event selection online, a clean sample of particles has to be provided by the reconstruction package called First Level Event Selection (FLES).
The FLES reconstruction and selection package consists of several modules: track finding, track fitting, event building, short-lived particle finding, and event selection. Since detector measurements also contain time information, event building is done at all stages of the reconstruction process. The input data are distributed within the FLES farm in the form of time-slices. A time-slice is reconstructed in parallel across processor cores. After all tracks of the whole time-slice are found and fitted, they are collected into clusters of tracks originating from common primary vertices, which are then fitted, thus identifying the interaction points. Secondary tracks are associated with primary vertices according to their estimated production time. After that, short-lived particles are found and the full event building process is finished. The last stage of the FLES package is the selection of events according to the requested trigger signatures. The event reconstruction procedure and the results of its application to simulated collisions in the CBM detector setup are presented and discussed in detail.
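The time-based event building step can be sketched as grouping reconstructed tracks by their estimated production times within a time-slice. This toy version uses a simple time-gap criterion; the actual FLES performs full four-dimensional track and vertex fits, so all names and the threshold here are illustrative only:

```python
def build_events(track_times, gap_ns=50.0):
    """Toy event builder: tracks whose estimated production times are
    closer than gap_ns are assumed to originate from the same primary
    vertex and are grouped into one event candidate."""
    events, current = [], []
    for t in sorted(track_times):
        # A large gap in production time separates two collisions
        # within the free-streaming time-slice.
        if current and t - current[-1] > gap_ns:
            events.append(current)
            current = []
        current.append(t)
    if current:
        events.append(current)
    return events

# Three well-separated collisions inside one time-slice:
events = build_events([0.0, 1.0, 2.0, 100.0, 101.0, 300.0])
```

Because events only emerge from this grouping step, any trigger decision in a free-streaming readout necessarily happens after reconstruction, not before it.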
Chromatic, geometric and space charge effects on laser accelerated protons focused by a solenoid
(2011)
We numerically studied emittance and transmission effects caused by chromatic and geometric aberrations, with and without space charge, for a proton beam behind a solenoid in the laser proton experiment LIGHT at GSI. The TraceWin code was employed, using a field map for the solenoid and an initial distribution with an exponential energy dependence close to the experiment. The results show a strong effect of chromatic aberrations and a relatively weak one of geometric aberrations, as well as a dependence of proton transmission on the distance from the solenoid. The chromatic effect has an energy-filtering property due to the finite-radius beam pipe. Furthermore, a relatively modest dependence of transmission on space charge is found for proton production intensities below 10^11.
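The chromatic energy filtering follows from the momentum dependence of the solenoid focus: in the thin-lens approximation 1/f = (qB/2p)^2 L, so f scales with p^2 and off-momentum protons focus at different distances and are clipped by the beam pipe. A minimal numeric sketch of this scaling (the reference focal length and function names are illustrative, not the LIGHT beamline values):

```python
import math

MP = 938.272  # proton rest mass [MeV]

def momentum(T):
    """Relativistic momentum p*c in MeV for kinetic energy T in MeV."""
    return math.sqrt(T * (T + 2 * MP))

def focal_length(T, f_ref, T_ref):
    """Thin-lens solenoid focal length scales as f ~ p^2, so protons
    away from the reference kinetic energy T_ref focus at a different
    distance than f_ref (the chromatic aberration)."""
    return f_ref * (momentum(T) / momentum(T_ref)) ** 2

# An 11 MeV proton focuses noticeably beyond the 10 MeV design focus:
shift = focal_length(11.0, 1.0, 10.0)
```

With a finite-radius pipe downstream, only protons in a narrow momentum band reach the focus, which is the filtering behaviour observed in the simulations.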
The standard implementation of the Hadron Resonance Gas (HRG) model has been shown to be unable to describe all the available data on QCD matter. Here we show how the balance of repulsive and attractive hadronic interactions affects QCD thermodynamics, through observables both calculated in lattice simulations and measured in experiment. Attractive interactions are mediated by resonance formation, implemented here through extra states predicted by the Quark Model, while repulsive interactions are modelled by means of Excluded Volume (EV) effects. Information on flavour-dependent effective sizes is extracted. It is found that EV effects are present in lattice QCD thermodynamics and are essential for a comprehensive description of higher-order fluctuations of conserved charges.
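The excluded-volume construction can be summarized in a standard textbook form (this is the generic EV-HRG equation, not necessarily the exact parametrization used in the study):

```latex
% EV-HRG: the total pressure solves a transcendental equation in which
% each hadron species i sees its chemical potential shifted by its
% eigenvolume v_i times the full pressure.
p(T,\mu) \;=\; \sum_i p_i^{\mathrm{id}}\!\left(T,\; \mu_i - v_i\, p(T,\mu)\right),
\qquad v_i = \tfrac{16}{3}\,\pi r_i^{3},
```

where p_i^{id} is the ideal-gas partial pressure of species i and r_i its effective hard-core radius; flavour-dependent sizes enter through distinct r_i per flavour sector.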
[Conference report] Making finance sustainable: Ten years Equator Principles – success or letdown?
(2013)
In 2003, a number of banks adopted the Equator Principles (EPs), a voluntary code of conduct based on the International Finance Corporation’s (IFC) performance standards, to ensure the ecological and social sustainability of project finance. These so-called Equator Principles Financial Institutions (EPFIs) commit to requiring their borrowers to adopt sustainable management plans for the environmental and social risks associated with their projects. The Principles apply to the project finance business segment of the banks and cover projects with a total cost of US $10 million or more. While developing countries long relied on the World Bank and other public assistance to finance infrastructure projects, there has been a shift in recent years to private funding. NGOs were frustrated by this shift in project finance, as they had spent their resources exercising pressure on the public financial institutions to incorporate environmental and social standards into their project finance activities. However, after NGO pressure shifted to private financial institutions, the latter adopted the EPs for fear of reputational risks. NGOs had laid down their own more ambitious ideas about sustainable finance in the Collevecchio Declaration on Financial Institutions and Sustainability. Legally speaking, the EPs are a self-regulatory soft law instrument. However, they have a hard law dimension, as the Equator Banks require their borrowers to comply with the EPs through covenants in the loan contracts that may trigger a default in the case of violation. ...
We show the first results for parton distribution functions within the proton at the physical pion mass, employing the method of quasi-distributions. In particular, we present the matrix elements for the iso-vector combination of the unpolarized, helicity and transversity quasi-distributions, obtained with Nf = 2 twisted mass clover-improved fermions and a proton boosted with momentum 0.83 GeV. The momentum smearing technique has been applied to improve the overlap with the boosted proton state. Moreover, we present the renormalized helicity matrix elements in the RI′ scheme, following the non-perturbative renormalization prescription recently developed by our group.
We discuss the current developments by the European Twisted Mass Collaboration in extracting parton distribution functions from the quasi-PDF approach. We concentrate on the non-perturbative renormalization prescription recently developed by us, using the RI′ scheme. We show results for the renormalization functions of matrix elements needed for the computation of quasi-PDFs, including the conversion to the MS-bar scheme, and for renormalized matrix elements. We discuss the systematic effects present in the Z-factors and possible ways of addressing them in the future.
The paper is structured as follows. Section 2.1 introduces the basic classes of adjectives that constitute the factual core of the paper. Section 2.2 summarizes in greater detail the X° and the XP movement approaches to word order variation within the DP. Section 3 briefly discusses problems for both approaches. Sections 4.1, 5.1, and 5.2 draw from Alexiadou (2001) and contain a discussion of Greek DS and its relevance for a re-analysis of the word order variation in the Romance DP. Section 4.2 introduces refinements to Alexiadou & Wilder (1998) and Alexiadou (2001). Section 5.3 discusses certain issues that arise from the analysis of postnominal adjectives in Romance as involving raising of XPs. Section 6 discusses phenomena found in other languages which at first sight seem similar to DS. However, I show that double definiteness in e.g. Hebrew, Scandinavian or other Balkan languages constitutes a different type of phenomenon from Greek DS, thus motivating a distinction between determiners that introduce CPs (Greek) and those that are merely morphological/agreement markers (Hebrew, Scandinavian, Albanian).
Word formation in Distributed Morphology (see Arad 2005, Marantz 2001, Embick 2008): 1. Language has atomic, non-decomposable elements, namely roots. 2. Roots combine with the functional vocabulary and build larger elements. 3. Roots are category-neutral. They are then categorized by combining with category-defining functional heads.
Experimental results and theoretical predictions in laser acceleration of protons have achieved energies of ten to several tens of MeV. The LIGHT project (Laser Ion Generation, Handling and Transport) proposes to use PHELIX laser-accelerated protons and to provide transport, focusing and injection into a conventional accelerator. This study demonstrates transport and focusing of laser-accelerated 10 MeV protons by a pulsed 18 T magnetic solenoid. The effect of co-moving electrons on the beam dynamics is investigated. The unique features of the proton distribution, such as small emittances and a high yield of the order of 10^13 protons per shot, open a new research area. The possibility of creating laser-based injectors for ion accelerators is addressed. With respect to transit energies, direct matching into DTLs seems adequate. The bunch injection into a proposed CH structure is under investigation at IAP Frankfurt. Options and simulation tools are presented.
This paper traces the development of National Socialist cultural and legal policy towards the arts. It examines the role of censorship in this development, starting with Hitler's first attempts at power in the Weimar Republic. It then looks more closely into aspects of the development of new policies in and after 1933 and their implementation in institutions of the totalitarian state. As the paper shows, policies were carried out within a legal framework that included parliament and constitutional law, but they were often also accompanied by aggressive political actions. Racial and nationalistic ideologies were at the heart of the National Socialist discourse about culture. This discourse quickly established modernity as its principal enemy and saw modernist culture (in the broad sense of the word), and especially art criticism, as being under Jewish domination. True German Kultur was set against this; Hitler himself promoted German art both through exhibitions and through policies which included the removal of un-German art and the exclusion of writers and artists who did not conform to the cultural ideal. As Jewish artists and intellectuals in modernist culture posed the greatest threat to the establishment of a new German culture, Nazi policies towards the arts embarked on a process of censorship, exclusion and annihilation. The purpose of these policies was nothing less than the elimination of all modernist (Jewish and ‘degenerate’) culture and any memory of it.