Phase transitions in a non-perturbative regime can be studied by ab initio Lattice Field Theory methods. The status and future research directions for LFT investigations of Quantum Chromodynamics under extreme conditions are reviewed, including properties of hadrons and of the hypothesized QCD axion as inferred from QCD topology in different phases. We discuss phase transitions in strong interactions in an extended parameter space, and the possibility of model building for Dark Matter and Electroweak Symmetry Breaking. Methodological challenges are addressed as well, including new developments in Artificial Intelligence geared towards the identification of different phases and transitions.
The annotation of texts and other material is a common task in digital humanities and Natural Language Processing (NLP) research projects. At the same time, the annotation of corpora is certainly the most time- and cost-intensive component of such projects and often requires a high level of expertise, depending on the research interest. However, a wide range of tools is available for the annotation of texts, both automatic and manual. Since automatic pre-processing methods are not error-free and there is an increasing demand for the generation of training data, also with regard to machine learning, suitable annotation tools are required. This paper defines criteria of flexibility and efficiency of complex annotations for the assessment of existing annotation tools. To extend this list of tools, the paper describes TextAnnotator, a browser-based multi-annotation system developed to perform platform-independent multimodal annotations and to annotate complex textual structures. The paper illustrates the current state of development of TextAnnotator and demonstrates its ability to evaluate annotation quality (inter-annotator agreement) at runtime. In addition, it is shown how annotations by different users can be performed simultaneously and collaboratively on the same document from different platforms, using UIMA as the basis for annotation.
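As an illustration of the kind of runtime agreement check mentioned above, inter-annotator agreement between two label sequences can be quantified with Cohen's kappa. The sketch below is a minimal stand-alone implementation, not TextAnnotator's actual code, and the label sequences are invented:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two annotators' label sequences."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed fraction of items on which both annotators agree
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement if both labelled independently at their own rates
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented annotations of six tokens by two annotators
a = ["PER", "LOC", "LOC", "ORG", "PER", "PER"]
b = ["PER", "LOC", "ORG", "ORG", "PER", "LOC"]
print(round(cohens_kappa(a, b), 3))  # prints 0.5
```

Values near 1 indicate strong agreement beyond chance; values near 0 indicate agreement no better than chance.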
Over the past two decades, the “one drug – one target – one disease” concept has become the prevalent paradigm in drug discovery. The main idea of this approach is the identification of a single protein target whose inhibition leads to a successful treatment of the examined disease. The predominant assumption is that highly selective ligands avoid unwanted side effects caused by binding to secondary, non-therapeutic targets. In recent years, results from post-genomics and network biology have shown that proteins rarely act in isolation but rather as part of a highly connected network [1]. This connectivity leads to more robust systems, in which the inhibition of a single target of the network can be compensated and consequently might not produce the desired therapeutic effect [2]. Furthermore, studies show that robust systems are more strongly affected by weak inhibition of several parts than by complete inhibition of a single selected element [3]. There is therefore increasing interest in developing drugs that act on multiple targets simultaneously, which is at the same time a great challenge for medicinal chemists: sufficient activity on each target must be combined with an adequate pharmacokinetic profile [4]. Early design strategies tried to link the pharmacophores of known inhibitors; however, these methods often lead to high molecular weight and low ligand efficiency. We present a new rational approach based on a retrosynthetic combinatorial analysis procedure (RECAP) [5] applied to approved ligands of multiple targets. The resulting RECAP fragments are used to design a large combinatorial library containing molecules that feature the chemical properties of each ligand class. The molecules are further validated by machine learning models, such as random forests and self-organizing maps, regarding their activity on the targets of interest.
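A minimal sketch of the random forest validation step, assuming nothing about the actual descriptors or training data used in the study: here, random feature vectors stand in for RECAP-fragment-derived molecular descriptors, and the activity labels are synthetic.

```python
# Illustrative only: synthetic descriptors and labels stand in for real
# RECAP-fragment-derived features and measured activities.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((200, 16))                    # 16 hypothetical descriptors
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)    # synthetic active/inactive label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```

In practice, candidate molecules from the combinatorial library would be scored by such a model per target, keeping only those predicted active on all targets of interest.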
Introduction: aims and points of departure. 1. The problem of the knowledge of law: whether previous general rules may support a casuistic decision. 2. The problem of legal ethics: whether there are autonomous rights, which do not depend on positive law. 3. The ways of modern dogmatics to deal with these problems. 4. The question remains the same.
In this paper, an analysis of Robert Frost’s poem Mending Wall is presented as a hermeneutical key to investigate and criticize two examples of the oblivion of the reasonable distinction and the reasonable relationship between ethics and law proposed by a new Brazilian private law movement called Escola do Direito Civil-Constitucional (The Private-Constitutional School of Thought). Those examples of unreasonable relationship between ethics and law are: 1) the right to be loved and 2) the right to get a private education without paying for it.
In his works, Hans Kelsen elaborates several objections to the so-called “doctrine of natural law”, especially in his essay The Natural-Law Doctrine Before the Tribunal of Science. Kelsen argues that natural law theorists, searching for an absolute criterion of justice, try to deduce the rules of human behavior from nature. Robert P. George, in the essay Kelsen and Aquinas on the ‘Natural Law Doctrine’, examines this criticism and concludes that what Kelsen understands as the natural-law doctrine does not include the natural law theory elaborated by Thomas Aquinas. In this paper, we will try to corroborate George’s theses and to show how Aquinas’ natural law theory can be vindicated against Kelsen’s criticisms.
This article considers the Brazilian Legal System and the requirements of an act performed by the public administration. To do so, it presents six main chapters. The first considers the Brazilian Constitution as regards the form of the State and the legal and judicial systems. The second chapter presents the public administration as stated in the Constitution. The requirements of a public administration act are presented in the third chapter. The improbity law, which determines how public administration acts should be performed, is presented in the fourth chapter. How one of the main judicial courts of Brazil has understood this law is the topic of the fifth chapter. The sixth chapter presents a proposal of how Phronesis could be used to solve misunderstandings about improbity in the Brazilian Legal System.
The Specialized Information Service Biodiversity Research (BIOfid) has been launched to mobilize valuable biological data from printed literature hidden in German libraries over the past 250 years. In this project, we annotate German texts converted by OCR from historical scientific literature on the biodiversity of plants, birds, moths and butterflies. Our work enables the automatic extraction of biological information previously buried in the mass of papers and volumes. For this purpose, we generated training data for the tasks of Named Entity Recognition (NER) and Taxa Recognition (TR) in biological documents. We use this data to train a number of leading machine learning tools and create a gold standard for TR in biodiversity literature. More specifically, we perform a practical analysis of our newly generated BIOfid dataset through various downstream-task evaluations and establish a new state of the art for TR with an F-score of 80.23%. In this sense, our paper lays the foundations for future work in the field of information extraction from biology texts.
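For context, a span-level F-score of the kind reported above is the harmonic mean of precision and recall over predicted versus gold entity spans. The sketch below uses invented spans, not BIOfid data:

```python
def f_score(predicted, gold):
    """F1 over predicted vs. gold entity spans (start, end, type)."""
    predicted, gold = set(predicted), set(gold)
    tp = len(predicted & gold)                    # exact span+type matches
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Invented example: three gold entities, one mislabelled by the system
gold = {(0, 2, "TAXON"), (5, 6, "TAXON"), (9, 11, "PER")}
pred = {(0, 2, "TAXON"), (5, 6, "PER"), (9, 11, "PER")}
print(f"F1 = {f_score(pred, gold):.4f}")  # prints F1 = 0.6667
```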
There is little doubt that Quantum Chromodynamics (QCD) is the theory which describes strong interaction physics. Lattice gauge simulations of QCD predict that in the (μ, T) plane there is a line where a transition from confined hadronic matter to deconfined quarks takes place. The transition is either a crossover (at low μ) or of first order (at high μ). It is the goal of present and future heavy ion experiments at RHIC and FAIR to study this phase transition at different locations in the (μ, T) plane and to explore the properties of the deconfined phase. It is the purpose of this contribution to discuss some of the observables which are considered useful for this purpose.
Targeting rare observables, the CBM experiment will operate at high interaction rates of up to 10 MHz, unprecedented in heavy-ion experiments so far. This requires a novel free-streaming readout system and a new concept of data processing. The huge data rates of the CBM experiment will be reduced online to the recordable rate before the data are saved to mass storage. Full collision reconstruction and selection will be performed online in a dedicated processor farm. In order to make an efficient event selection online, a clean sample of particles has to be provided by the reconstruction package, called First Level Event Selection (FLES).
The FLES reconstruction and selection package consists of several modules: track finding, track fitting, event building, short-lived particle finding, and event selection. Since detector measurements also contain time information, event building is done at all stages of the reconstruction process. The input data are distributed within the FLES farm in the form of time-slices. A time-slice is reconstructed in parallel across processor cores. After all tracks of the whole time-slice are found and fitted, they are collected into clusters of tracks originating from common primary vertices, which are then fitted, thus identifying the interaction points. Secondary tracks are associated with primary vertices according to their estimated production time. After that, short-lived particles are found and the full event building process is finished. The last stage of the FLES package is the selection of events according to the requested trigger signatures. The event reconstruction procedure and the results of its application to simulated collisions in the CBM detector setup are presented and discussed in detail.
Chromatic, geometric and space charge effects on laser accelerated protons focused by a solenoid
(2011)
We numerically studied emittance and transmission effects due to chromatic and geometric aberrations, with and without space charge, for a proton beam behind a solenoid in the laser proton experiment LIGHT at GSI. The TraceWin code was employed, using a field map for the solenoid and an initial distribution with an exponential energy dependence close to the experiment. The results show a strong effect of chromatic aberrations and a relatively weak one of geometric aberrations, as well as a dependence of proton transmission on the distance from the solenoid. The chromatic effect has an energy-filtering property due to the finite radius of the beam pipe. Furthermore, a relatively modest dependence of transmission on space charge is found for proton production intensities below 10^11.
The standard implementation of the Hadron Resonance Gas (HRG) model has been shown to be unable to describe all the available data on QCD matter. Here we show the influence of the balance of repulsive and attractive hadronic interactions on QCD thermodynamics through observables both calculated by lattice simulations and measured in experiment. Attractive interactions are mediated by resonance formation, implemented here through extra states predicted by the Quark Model, while repulsive interactions are modelled by means of Excluded Volume (EV) effects. Information on flavour-dependent effective sizes is extracted. It is found that EV effects are present in lattice QCD thermodynamics and are essential for a comprehensive description of higher-order fluctuations of conserved charges.
[Conference report] Making finance sustainable: Ten years equator principles – success or letdown?
(2013)
In 2003, a number of banks adopted the Equator Principles (EPs), a voluntary code of conduct based on the International Finance Corporation’s (IFC) performance standards, to ensure the ecological and social sustainability of project finance. These so-called Equator Principles Financial Institutions (EPFIs) commit to requiring their borrowers to adopt sustainable management plans for the environmental and social risks associated with their projects. The Principles apply to the project finance business segment of the banks and cover projects with a total cost of US $10 million or more. While developing countries long relied on the World Bank and other public assistance to finance infrastructure projects, a shift to private funding has occurred in recent years. NGOs were frustrated by this shift in project finance, as they had spent their resources to exert pressure on the public financial institutions to incorporate environmental and social standards into their project finance activities. However, after a shift of NGO pressure to private financial institutions, the latter adopted the EPs for fear of reputational risks. NGOs had laid down their own more ambitious ideas about sustainable finance in the Collevecchio Declaration on Financial Institutions and Sustainability. Legally speaking, the EPs are a self-regulatory soft law instrument. However, they have a hard law dimension, as the Equator Banks require their borrowers to comply with the EPs through covenants in the loan contracts that may trigger a default in case of violation. ...
We show the first results for parton distribution functions within the proton at the physical pion mass, employing the method of quasi-distributions. In particular, we present the matrix elements for the iso-vector combination of the unpolarized, helicity and transversity quasi-distributions, obtained with Nf = 2 twisted mass clover-improved fermions and a proton boosted with a momentum of 0.83 GeV. The momentum smearing technique has been applied to improve the overlap with the boosted proton state. Moreover, we present the renormalized helicity matrix elements in the RI’ scheme, following the non-perturbative renormalization prescription recently developed by our group.
We discuss the current developments by the European Twisted Mass Collaboration in extracting parton distribution functions from the quasi-PDF approach. We concentrate on the non-perturbative renormalization prescription recently developed by us, using the RI′ scheme. We show results for the renormalization functions of matrix elements needed for the computation of quasi-PDFs, including the conversion to the MS scheme, and for renormalized matrix elements. We discuss the systematic effects present in the Z-factors and the possible ways of addressing them in the future.
The paper is structured as follows. Section 2.1 introduces the basic classes of adjectives that constitute the factual core of the paper. Section 2.2 summarizes in greater detail the X° and the XP movement approaches to word order variation within the DP. Section 3 briefly discusses problems for both approaches. Sections 4.1, 5.1, and 5.2 draw from Alexiadou (2001) and contain a discussion of Greek DS and its relevance for a re-analysis of the word order variation in the Romance DP. Section 4.2 introduces refinements to Alexiadou & Wilder (1998) and Alexiadou (2001). Section 5.3 discusses certain issues that arise from the analysis of postnominal adjectives in Romance as involving raising of XPs. Section 6 discusses phenomena found in other languages, which at first sight seem similar to DS. However, I show that double definiteness in e.g. Hebrew, Scandinavian or other Balkan languages constitutes a different type of phenomenon from Greek DS, thus making a distinction between determiners that introduce CPs (Greek) and those that are merely morphological/agreement markers (Hebrew, Scandinavian, Albanian).
Word formation in Distributed Morphology (see Arad 2005, Marantz 2001, Embick 2008): 1. Language has atomic, non-decomposable, elements = roots. 2. Roots combine with the functional vocabulary and build larger elements. 3. Roots are category neutral. They are then categorized by combining with category defining functional heads.
Experimental results and theoretical predictions in laser acceleration of protons have achieved energies of ten to several tens of MeV. The LIGHT project (Laser Ion Generation, Handling and Transport) proposes to use the PHELIX laser-accelerated protons and to provide their transport, focusing and injection into a conventional accelerator. This study demonstrates transport and focusing of laser-accelerated 10 MeV protons by a pulsed 18 T magnetic solenoid. The effect of co-moving electrons on the beam dynamics is investigated. The unique features of the proton distribution, such as small emittances and a high yield of the order of 10^13 protons per shot, open a new research area. The possibility of creating laser-based injectors for ion accelerators is addressed. With respect to transit energies, direct matching into DTLs seems adequate. The bunch injection into a proposed CH-structure is under investigation at IAP Frankfurt. Options and simulation tools are presented.
This paper traces the development of National Socialist cultural and legal policy towards the arts. It examines the role of censure in this development, starting with Hitler's first attempts at power in the Weimar Republic. It then looks more closely into aspects of the development of new policies in and after 1933 and their implementation in institutions of the totalitarian state. As the paper shows, policies were carried out within a legal framework that included parliament and constitutional law, but they were often also accompanied by aggressive political actions. Racial and nationalistic ideologies were at the heart of the National Socialist discourse about culture. This discourse quickly established modernity as its principal enemy and saw modernist culture (in the broad sense of the word), and especially art criticism, as being under Jewish domination. True German Kultur was set against this; Hitler himself promoted German art both through exhibitions and through policies which included the removal of un-German art and the exclusion of writers and artists who did not conform to the cultural ideal. As Jewish artists and intellectuals in modernist culture posed the greatest threat to the establishment of a new German culture, Nazi policies towards the arts embarked on a process of censure, exclusion and annihilation. The purpose of these policies was nothing less than the elimination of all modernist (Jewish and ‘degenerate’) culture and any memory of it.
The superconducting CH-structure (Crossbar H-mode) is a multi-cell drift tube cavity for the low and medium energy range, operated in the H21-mode, which has been developed at the Institute for Applied Physics (IAP) of Frankfurt University. With respect to different high power applications, two types of superconducting CH-structures (f = 325 MHz, β = 0.16, seven cells; and f = 217 MHz, β = 0.059, 15 cells) are presently under development and construction. The structural mechanical simulation is a very important aspect of the cavity design. Several simulations with ANSYS Workbench have been performed to predict the deformation of the cavity walls due to cavity cool-down, pressure effects and mechanical vibrations. To compensate the fast frequency changes resulting from the deformation of the cavity shape, a new concept for dynamic frequency tuning has been investigated, including a novel type of bellow tuner.
The revolution will be tweeted : how the internet can stimulate the public exercise of freedoms
(2012)
This article discusses how new technologies of communication, especially the Internet and, more specifically, social network services, can interfere in social interactions and in political relations. The main objective is to problematize the concept of public liberty and verify how the new technologies can promote the reoccupation of public spaces and the recovery of public life, in opposition to the tendency to valorize the private sphere observed in the second half of the twentieth century. The theoretical benchmark adopted for the investigation is Hannah Arendt's theory about the exercise of fundamental political capacities in order to establish a public space of freedom, as presented in “On Revolution”. The “Praia da Estação” (“Station Beach”) case is chosen to test the hypothesis. In 2010, in the Brazilian city of Belo Horizonte, different individuals articulated a movement through blogs, Twitter and Facebook in order to protest against the Mayor’s act that banned the assembling of cultural events in one of the main public places of the city, the “Praça da Estação” (Station Square). By applying Arendt's concepts to the selected case, it is possible to demonstrate that the Internet can assume an important role against governmental arbitrariness and abuse of power, as it can stimulate the public exercise of fundamental freedoms, such as freedom of assembly and manifestation.
The present study focuses on the optimization of the beam line from the heavy-ion synchrotron SIS18 to the HADES experiment. BOBYQA (Bound Optimization BY Quadratic Approximation) solves bound-constrained optimization problems without using derivatives of the objective function. Bayesian optimization is another strategy for the global optimization of costly, noisy functions without using derivatives. A Python programming interface to MADX allows the use of Python implementations of BOBYQA and the Bayesian method. This made it possible to use tracking simulations with MADX to determine the loss budget for each lattice setting during the optimization and to compare both optimization methods.
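The idea of derivative-free optimization against a simulated loss budget can be sketched as follows. Here scipy's Nelder-Mead simplex method stands in for BOBYQA (scipy does not ship BOBYQA itself), and the two-parameter objective is a hypothetical stand-in for a MADX tracking run:

```python
# Sketch only: a smooth toy "loss budget" replaces the actual MADX
# tracking simulation, and Nelder-Mead replaces BOBYQA.
from scipy.optimize import minimize

def beam_loss(k):
    """Hypothetical loss as a function of two magnet strengths,
    minimal at k = (0.3, -0.2)."""
    return (k[0] - 0.3) ** 2 + (k[1] + 0.2) ** 2

result = minimize(beam_loss, x0=[0.0, 0.0], method="Nelder-Mead",
                  options={"xatol": 1e-6, "fatol": 1e-9})
print(result.x)  # close to [0.3, -0.2]
```

The key property shared with BOBYQA and Bayesian optimization is that only objective evaluations are needed, which matches a setting where each evaluation is a full tracking simulation.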
Due to the massively parallel operation modes at the GSI accelerators, a lot of accelerator setup and re-adjustment has to be done by operators during a beam time. This is typically done manually using potentiometers and is very time-consuming. With the FAIR project, the complexity of the accelerator facility increases further, and for efficiency reasons it is recommended to establish a high level of automation for future operation. Modern accelerator control systems allow fast access to both accelerator settings and beam diagnostics data. This provides the opportunity to implement algorithms for the automated adjustment of e.g. magnet settings to maximize transmission and optimize required beam parameters. The fast-switching magnets in the GSI beamlines are an optimal basis for an automatic exploration of the parameter space. The optimization of the parameters for the SIS18 multi-turn injection using a genetic algorithm has already been simulated*. The first results of our automated online parameter optimization at the CRYRING@ESR injector are presented here.
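A genetic algorithm of the kind mentioned above can be sketched in a few lines. The transmission function here is a toy stand-in for an actual beam-dynamics simulation, and the selection and mutation scheme is illustrative, not the one used at GSI:

```python
import random

random.seed(1)

def transmission(params):
    """Toy stand-in for simulated beam transmission, peaked at (0.5, 0.5)."""
    x, y = params
    return 1.0 - (x - 0.5) ** 2 - (y - 0.5) ** 2

def evolve(pop_size=30, generations=40, sigma=0.05):
    # Random initial settings within the allowed parameter range [0, 1]
    pop = [(random.random(), random.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=transmission, reverse=True)
        parents = pop[: pop_size // 4]          # truncation selection
        # Keep the parents (elitism) and refill with mutated offspring
        pop = parents + [
            (p[0] + random.gauss(0, sigma), p[1] + random.gauss(0, sigma))
            for p in random.choices(parents, k=pop_size - len(parents))
        ]
    return max(pop, key=transmission)

best = evolve()
print(best, transmission(best))
```

In the real setting, each fitness evaluation would be one machine setting tried online with the measured transmission as feedback, so keeping the number of generations small matters.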
In this article I advance an account of human rights as individual claims that can be justified within the conceptual framework of social contract theories. The contractarian approach at issue here aims, initially, at a justification of morality at large, and then at the specific domain of morality which contains human rights concepts. The contractarian approach to human rights has to deal with the problem of universality, i.e. how can human rights be ‘universal’? I deal with this problem by examining the relationship between moral dispositions and what I call ‘diffuse legal structure’.
This paper proposes a new approach for encoding images by only a few important components. Classically, this is done by Principal Component Analysis (PCA). Recently, Independent Component Analysis (ICA) has attracted strong interest in the neural network community. Applied to images, we aim for the most important source patterns with the highest occurrence probability or highest information, called principal independent components (PIC). For the example of a synthetic image composed of characters, this idea selects the salient ones. For natural images it does not lead to an acceptable reproduction error, since no a-priori probabilities can be computed. Combining the traditional principal component criteria of PCA with the independence property of ICA, we obtain a better encoding. It turns out that this definition of PIC implements the classical demand of Shannon’s rate-distortion theory.
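The two-stage idea — variance-based reduction followed by an independence-seeking rotation — can be sketched with off-the-shelf components. This is not the authors' PIC algorithm, and the data below are random synthetic patches rather than natural images:

```python
# Sketch: PCA keeps high-variance components, ICA then rotates them
# towards statistical independence. Data are random synthetic patches.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
patches = rng.random((500, 64))        # 500 flattened 8x8 "image" patches

# Step 1: variance criterion (PCA) reduces 64 dimensions to 16
pca = PCA(n_components=16)
reduced = pca.fit_transform(patches)

# Step 2: independence criterion (ICA) on the reduced representation
ica = FastICA(n_components=16, random_state=0, max_iter=500)
sources = ica.fit_transform(reduced)

print(sources.shape)  # (500, 16)
```

The combination means the retained components both carry most of the signal energy (PCA) and are as statistically independent as possible (ICA), which is the intuition behind selecting "principal independent" components.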
While the existence of a strongly interacting state of matter, known as “quark-gluon plasma” (QGP), has been established in heavy ion collision experiments in the past decade, the task remains to map out the transition from the hadronic matter to the QGP. This is done by measuring the dependence of key observables (such as particle suppression and elliptic flow) on the collision energy of the heavy ions. This procedure, known as "beam energy scan", has been most recently performed at the Relativistic Heavy Ion Collider (RHIC).
Utilizing a Boltzmann+hydrodynamics hybrid model, we study the collision energy dependence of initial state eccentricities and the final state elliptic and triangular flow. This approach is well suited to investigate the relative importance of hydrodynamics and hadron transport at different collision energies.
Challenges of FAIR phase 0
(2018)
After a two-year shutdown, the GSI accelerators, plus the latest addition of the storage ring CRYRING, will be back in operation in 2018 as FAIR phase 0, with the goal of fulfilling the needs of the scientific community and of FAIR accelerator and detector development. Even though GSI has been well known for its operation of a variety of ion beams, ranging from protons up to uranium, for multiple research areas such as nuclear physics, astrophysics, biophysics and materials science, the upcoming beam time faces a number of challenges in re-commissioning its existing circular accelerators with a brand new control system and upgraded beam instrumentation, as well as in coping with rising failures of dated components and systems. The cycling synchrotron SIS18 has been undergoing a set of upgrade measures for future FAIR operation, many of which will also be commissioned during the upcoming beam time. This paper presents the highlights of these challenges, such as re-establishing high intensity heavy ion operation as well as the parallel operation mode for serving multiple users. The status of preparation, including commissioning results, will also be reported.
In keeping with the views of its guru, Stevan Harnad, the open access movement is only prepared to discuss the two models of the "green road" and the "golden road" as the sole alternatives for the future of scientific publishing. The "golden road" is put forward as the royal road to solving the journals crisis. However, no one has drawn attention to the fact that the golden road represents a purely socialist solution to a free-market problem and thus continues the "samizdat" tradition of underground literature in the former Eastern bloc. The present paper reveals the alarmingly low level at which the open access movement intends to publish top-class results from science and research, and the low degree of professionalism with which they are satisfied.
Akrasia, or weakness of will, is a term denoting the phenomenon of acting freely and intentionally contrary to one's better judgment. The discussion of akrasia originates in Plato's Protagoras, where he states that “No one who either knows or believes that there is another possible course of action, better than the one he is following, will ever continue on his present course”. However, in his influential article from 1970, Donald Davidson argued that akrasia is theoretically possible yet irrational. Some other critics of Plato's stance point out that the phenomenon of akrasia is common in our everyday experience, and therefore it must be possible.
These two arguments in favor of the existence of akrasia – theoretical and empirical – will be discussed from both philosophical and psychological points of view. In particular, George Ainslie's argument that akrasia results from hyperbolic discounting will be taken into consideration, to show how it affects traditional thinking about weak-willed actions.
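Ainslie's argument rests on a hyperbolic discount function of the form V = A / (1 + kD), under which preferences between a smaller-sooner and a larger-later reward can reverse as the moment of choice approaches. A minimal sketch with invented amounts and delays:

```python
def hyperbolic_value(amount, delay, k=1.0):
    """Ainslie-style hyperbolic discounting: V = A / (1 + k*D)."""
    return amount / (1 + k * delay)

# Smaller-sooner reward (50 at delay 10) vs. larger-later (100 at delay 13).
# Viewed far in advance, the larger reward is preferred...
print(hyperbolic_value(50, 10) < hyperbolic_value(100, 13))   # True
# ...but at the moment of choice, preference reverses (the akratic act):
print(hyperbolic_value(50, 0) > hyperbolic_value(100, 3))     # True
```

This crossing of the two discount curves is what the psychological literature offers as a mechanism for acting against one's earlier, better judgment.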
Finally, the paper will discuss how the contemporary notion of akrasia may affect the idea of responsibility and free will. Implications for the philosophy of law will be shown, inter alia whether it is possible to claim that a given example of a weak-willed action was indeed free and intentional and that one should be held responsible for its results.
The increase in the volume of litigation observed since the 1990s in Brazilian society made the judiciary open itself to new technologies which facilitate access to justice, as well as a faster resolution of demands. However, the intense insertion of technical rationalization into the process and decision-making operations of the judiciary in recent years has led to a legalization supported by presuppositions of technical-instrumental regulation. According to the goal policy established by the CNJ, the pressure of instrumental rationality is present “with respect to purposes”, which increasingly demands from law operators the mere fulfillment of previously instituted goals. The question is whether the implementation of new technologies to resolve the growing litigation arising from the complexity of societies is enough to adjust the law to a post-conventional platform. If social complexity calls for resources from new technologies, it is not certain that such technologies, on their own, satisfactorily produce a judicial model which, seen through the lens of post-conventional legitimacy and regulation, is adequate to complex societies. This illustrates that a judicial model able to deal with social plurality must take into account not only the rules of instrumental rationality, but also the fundamental issues of communicative rationality. This work intends to evaluate whether the applicability of instrumental rationality in the judiciary also allows the law to extend the useful conditions of communicative rationality to the consensual formation of will and opinion in the Democratic State of Law.
Delayed-onset muscle soreness (DOMS) is a common symptom in people participating in exercise, sport, or recreational physical activities. Several remedies have been proposed to prevent and alleviate DOMS. In 2008 and 2015, two studies have been conducted to investigate the effects of acupuncture on symptoms and muscle function in eccentric exercise-induced DOMS of the biceps brachii muscle. In 2008 a prospective, randomized, controlled, observer and subject-blinded trial was undertaken with 22 healthy subjects (22–30 years; 12 females) being randomly assigned to three treatment groups: real acupuncture (deep needling at classic acupuncture points and tender points; n = 7), sham-acupuncture (superficial needling at non-acupuncture points; n = 8), and control (n = 7). In 2015, a five-arm randomized controlled study was conducted with 60 subjects (22 females, 23.6 ± 2.8 years). Participants were randomly allocated to needle, laser, sham needle, sham laser acupuncture, and no intervention.
In both studies, treatment was applied immediately as well as 24 and 48 hours after DOMS induction.
The outcome measures included pain perception (visual analogue scale; VAS), mechanical pain threshold (MPT), maximum isometric voluntary force (MIVF) and pressure pain threshold (PPT).
Results: In 2008, following nonparametric testing, there were no significant differences between groups in outcome measures at baseline. After 72 hours, pain perception (VAS) was significantly lower in the acupuncture group compared to the sham acupuncture and control subjects. However, the mean MPT and MIVF scores were not significantly different between groups. This led to the conclusion that acupuncture seemed to have no effect on MPT and muscle function, but reduced the perceived pain arising from exercise-induced DOMS.
The more recent results from 2015 indicated that neither verum nor sham interventions significantly improved outcomes within 72 hours when compared with the no treatment control (P > 0.05).
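The nonparametric group comparison described above can be sketched as follows. This is an illustrative assumption, not the studies' actual data or analysis script: the VAS values are invented, and a Kruskal-Wallis H-test (here via SciPy) stands in for whatever nonparametric procedure the authors used.

```python
from scipy.stats import kruskal

# Hypothetical 72 h VAS pain scores; group sizes follow the 2008 design
# (n = 7 / 8 / 7), but the numbers themselves are invented.
vas_acupuncture = [2.1, 1.8, 2.5, 1.9, 2.2, 2.0, 2.4]
vas_sham        = [4.0, 3.6, 4.2, 3.9, 4.1, 3.8, 4.3, 3.7]
vas_control     = [4.1, 4.4, 3.9, 4.2, 4.0, 4.3, 3.8]

# Kruskal-Wallis H-test: nonparametric comparison of three independent groups.
h_stat, p_value = kruskal(vas_acupuncture, vas_sham, vas_control)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
```

With the invented scores above, the acupuncture group is clearly separated from the other two, so the test reports a small p-value; real VAS data would of course be noisier.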
Based on a non-rigorous formalism called the “cavity method”, physicists have made intriguing predictions on phase transitions in discrete structures. One of the most remarkable ones is that in problems such as random k-SAT or random graph k-coloring, very shortly before the threshold for the existence of solutions there occurs another phase transition called condensation [Krzakala et al., PNAS 2007]. The existence of this phase transition seems to be intimately related to the difficulty of proving precise results on, e.g., the k-colorability threshold, as well as to the performance of message passing algorithms. In random graph k-coloring, there is a precise conjecture as to the location of the condensation phase transition in terms of a distributional fixed point problem. In this paper we prove this conjecture, provided that k exceeds a certain constant k0.
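As a toy illustration of the object under study (not of the paper's proof technique), a greedy colouring gives a quick upper bound on the number of colours a random graph needs. The graph size, average degree, and the greedy heuristic itself are all illustrative choices; locating the k-colourability or condensation threshold requires the far more delicate analysis the paper carries out.

```python
import random

def random_graph(n, p, seed=0):
    """Edge set of an Erdős–Rényi random graph G(n, p)."""
    rng = random.Random(seed)
    return {(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p}

def greedy_colors(n, edges):
    """Colour vertices in index order with the smallest colour unused by
    neighbours; returns the number of colours used, an upper bound on the
    chromatic number."""
    adj = {v: set() for v in range(n)}
    for i, j in edges:
        adj[i].add(j)
        adj[j].add(i)
    color = {}
    for v in range(n):
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return max(color.values()) + 1

n, d = 200, 5.0                      # illustrative size and average degree d
k_upper = greedy_colors(n, random_graph(n, d / (n - 1)))
print(k_upper)                       # a handful of colours suffice at this degree
```

The conjectured thresholds are stated in terms of the average degree d; the greedy bound above is loose, which is one way to see why pinning down the exact threshold is hard.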
This paper aims to re-elaborate questions and to discuss them rather than to present answers. It starts with the dialogue concerning specific contributions of the philosophy of language to law, followed by the re-elaboration of some as yet unanswered problems and a discussion of possible paths forward on this issue.
We present results of lattice QCD simulations with mass-degenerate up and down and mass-split strange and charm (Nf = 2+1+1) dynamical quarks using Wilson twisted mass fermions at maximal twist. The tuning of the strange and charm quark masses is performed at three values of the lattice spacing a ~ 0.06 fm, a ~ 0.08 fm and a ~ 0.09 fm with lattice sizes ranging from L ~ 1.9 fm to L ~ 3.9 fm. We perform a preliminary study of SU(2) chiral perturbation theory by combining our lattice data from these three values of the lattice spacing.
We present first results from runs performed with Nf = 2+1+1 flavours of dynamical twisted mass fermions at maximal twist: a degenerate light doublet and a mass split heavy doublet. An overview of the input parameters and tuning status of our ensembles is given, together with a comparison with results obtained with Nf = 2 flavours. The problem of extracting the mass of the K- and D-mesons is discussed, and the tuning of the strange and charm quark masses examined. Finally we compare two methods of extracting the lattice spacings to check the consistency of our data and we present some first results of χPT (chiral perturbation theory) fits in the light meson sector.
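The meson-mass extraction mentioned above typically proceeds via an effective mass. The following minimal sketch is not the collaboration's analysis code: it uses a synthetic single-exponential correlator with an invented mass in lattice units, and it ignores excited states and the backward-propagating contribution present on a periodic lattice.

```python
import math

# Synthetic two-point correlator C(t) = A * exp(-m * t); m_true and amp are
# invented illustrative values in lattice units.
m_true, amp, T = 0.25, 1.3, 16
corr = [amp * math.exp(-m_true * t) for t in range(T)]

# Effective mass m_eff(t) = ln(C(t) / C(t+1)); for real data one looks for a
# plateau in t and fits it to obtain the ground-state mass.
m_eff = [math.log(corr[t] / corr[t + 1]) for t in range(T - 1)]
print(m_eff[0])
```

For a pure single-exponential correlator the effective mass is flat and equal to the input mass at every t; in practice the plateau only sets in once excited-state contamination has died away.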
We present the status of runs performed in the twisted mass formalism with Nf = 2+1+1 flavours of dynamical fermions: a degenerate light doublet and a mass split heavy doublet. The procedure for tuning to maximal twist will be described as well as the current status of the runs using both thin and stout links. Preliminary results for a few observables obtained on ensembles at maximal twist will be given. Finally, a reweighting procedure to tune to maximal twist will be described.
A CW RFQ prototype
(2011)
A short RFQ prototype was built for RF tests of high-power RFQ structures. We will study thermal effects and determine critical points of the design. RF simulations with CST Microwave Studio and measurements have been done. The CW tests with 20 kW/m RF power and simulations of thermal effects with ALGOR were finished successfully. The focus is now on optimizing some details of the RF design. First results and the status of the project will be presented.
Beam measurements with the new RFQ beam matching section at the Frankfurt Funneling Experiment
(2011)
Funneling is a method to increase low-energy beam currents in multiple stages. The Frankfurt Funneling Experiment is a model of such a stage. The experiment consists of two ion sources with electrostatic lens systems, a Two-Beam RFQ accelerator, a funneling deflector and a beam diagnostic system. The two beams are bunched and accelerated in the Two-Beam RFQ, and the funneling deflector combines the bunches onto a common beam axis. A new beam transport system between the RFQ accelerator and the deflector has been constructed and mounted. With these extended RFQ electrodes, the drift between the Two-Beam RFQ and the RF deflector is minimized and unwanted emittance growth thereby reduced. After first RF measurements, current work focuses on beam tests with the improved Two-Beam RFQ. First results will be presented.
We analyze the reaction dynamics of central Pb+Pb collisions at 160 GeV/nucleon. First we estimate the energy density pile-up at mid-rapidity and calculate its excitation function: The energy density is decomposed into hadronic and partonic contributions. A detailed analysis of the collision dynamics in the framework of a microscopic transport model shows the importance of partonic degrees of freedom and rescattering of leading (di)quarks in the early phase of the reaction for E >= 30 GeV/nucleon. The energy density reaches up to 4 GeV/fm^3, 95% of which is contained in partonic degrees of freedom. It is shown that cells of hadronic matter, after the early reaction phase, can be viewed as nearly chemically equilibrated. This matter never exceeds energy densities of 0.4 GeV/fm^3, i.e. a density above which the notion of separated hadrons loses its meaning. The final reaction stage is analyzed in terms of hadron ratios, freeze-out distributions and a source analysis for final state pions.
Stabilized Wilson fermions are a reformulation of Wilson clover fermions that incorporates several numerical stabilizing techniques, but also a local change of the fermion action - the original clover term being replaced with an exponentiated version of it. We intend to apply the stabilized Wilson fermions toolbox to the thermodynamics of QCD, starting on the Nf=3 symmetric line on the Columbia plot, and to compare the results with those obtained with other fermion discretizations.
The debates about the interrelations between reason and law have undergone a change after the eighteenth century. References to the recta ratio of jusnaturalistic tradition have not disappeared, but other comprehensions of legal reason have developed. The European debate over legal positivist science has contributed to this in a manifestation of the rationality of law. This transformation may be considered the basis for the development of true “legal technologies” throughout the twentieth century. On the other hand, in the context of theories of positive law which have taken the relation between ethics and legal reason as a problem, the formation of discourses on coercion (Austin and Holmes), on validity (Kelsen and Hart) and on justification (Alexy and Dworkin) has also contributed to the emergence of new models of legal rationality. In this paper, it is highlighted that the construction of these models is linked to the “points of view” which theories have proposed as legitimate for the interpretation of the legal phenomenon. And it is suggested that the discussion over points of view (defined as “focuses”, a term close to the notions of “attitude”, “stance” or “place of speech”) may aid the debate on the normativity of law.
Background: The most frequent therapy of hydrocephalus is the implantation of ventriculoperitoneal shunts for diverting cerebrospinal fluid from the ventricles into the peritoneum. We compared two adjustable valves, the proGAV and proGAV 2.0, for complications which resulted in revision operations.
Methods: Four hundred patients who underwent primary shunt implantation between 2014 and 2020 were analyzed for overall revision rate, one-year revision rate, revision-free survival and overall survival, taking into account patient age group, gender, etiology of hydrocephalus, implantation site, prior diversion of cerebrospinal fluid and cause of revision.
Results: Data were available for all 400 patients (female/male 208/192). Overall, 99 patients underwent revision surgery after primary implantation. The proGAV valve was implanted in 283 patients, the proGAV 2.0 in 117 patients. There was no significant difference between the two shunt valves concerning revision rate (p=0.8069), one-year revision rate (p=0.9077), revision-free survival (p=0.6921) or overall survival (p=0.3232). Furthermore, regarding the one-year revision rate, we observed no significant difference between the two shunt valves in pediatric patients (40.7% vs 27.6%; p=0.2247). Revision operations had to be performed more frequently in pediatric patients (46.6% vs 24.8%; p=0.0093), with a significantly higher number of total revisions with the proGAV than with the proGAV 2.0 (55.9% vs. 27.6%; p=0.0110), most likely due to the longer follow-up in the proGAV group.
Conclusion: According to the target variables we analyzed, aside from the lifetime revision rate in pediatric patients, there is no significant difference between the two shunt valves. From our subjective point of view, implantation of the newer proGAV 2.0 valve is preferable due to its higher adjustment comfort for both patients and physicians.
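A two-group comparison of revision rates of the kind reported above can be sketched with Fisher's exact test. Only the group sizes (283 and 117 valves) come from the abstract; the per-valve revision split below is invented for illustration so that both rates sit near the overall 99/400, and this is not the study's actual statistical procedure.

```python
from scipy.stats import fisher_exact

# 2x2 contingency table, rows = valve type, columns = [revision, no revision].
# Group totals (283 proGAV, 117 proGAV 2.0) are from the abstract; the 70/29
# split of the 99 revisions is an invented, illustrative assumption.
table = [[70, 283 - 70],
         [29, 117 - 29]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```

With near-identical revision proportions in both rows, the test unsurprisingly finds no significant difference, mirroring the qualitative conclusion of the abstract.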
This paper traces the military role of Tibnīn and its rulers in the Latin East against the Muslims until 1187/583. Tibnīn played a key role in overcoming the Muslims in Tyre and in its capture in 1124. It also played a vital role in the conflict between Damascus and the Kingdom of Jerusalem. Tibnīn participated in defending Antioch, Banyas, Hebron and Transjordan several times. Furthermore, its soldiers and knights joined the army of the Kingdom of Jerusalem to capture Ascalon in 1153, and joined the campaigns of Amaury I, King of Jerusalem, against Egypt from 1164 to 1169. The military situation of Tibnīn under the rule of the royal house until its fall to the Muslims in 1187/583 will be studied as well.
In this increasingly complex world of learned information delivery and discovery - is it possible that the "free lunch" the Publishing world worries about could come true? Although Open Access and Institutional Repositories have not (yet) created the "scorched earth" effect many were predicting, they are slowly and inevitably gaining momentum. Broader access to top-level information via Google (and others) does indeed appear to be "good enough" for many in their search for content. But you rarely get food for free in a good quality restaurant. You pay for the selection, preparation, speed and expertise of the delivery. At the soup kitchen the food can often be filling - but the queue will be long, the wait even longer and there is no chance of silver service or à la carte. If you are unfortunate enough to have little choice then this may be a great solution. Others will be willing to pay for a more satisfactory meal. As in all aspects of life, diversification and specialisation are fundamental forces. The publishing community in the years to come will continue to develop its offerings for a variety of needs that require more than just broth. To stretch the analogy, the ongoing presence of tap water in our lives has done little to halt the extraordinary rise of bottled water as part of our staple diet. Business reality will continue to settle these types of debate; my bet is that the commercial publishers see a role as providing information that commands an intrinsic value proposition to enough customers to remain economically viable for some time to come. Inspired by the comments and ideas expounded by Dr. James O'Donnell of Georgetown University on the liblicense listserv on 20th July this year, this paper will look to expand on the analogy and identify the good, the bad - but importantly the difference in information quality and access that will result in the radically changed (but still co-existent) information landscape of tomorrow.
Recent results on baryon production in relativistic heavy ion collisions show that a revision of the chemical freeze-out conditions is necessary. Particularly, there is evidence that chemical freeze-out does not occur at full chemical equilibrium. We present a method to reconstruct original hadronization conditions and show that the newly found points in the T − µB plane are in very good agreement with extrapolations of the lattice QCD critical line.
Background: A growing interest exists in using polymeric nanoparticles (NPs), especially those functionalized with surface-active substances, as carriers across the blood-brain barrier (BBB) for potentially effective drugs in traumatic brain injury (TBI). However, the organ distribution of intravenously administered biodegradable and non-biodegradable NPs coated with different surfactants, how much of the administered dose reaches the brain parenchyma in areas with intact and opened BBB after trauma, and whether they elicit an inflammatory response are still to be clarified.
Methods: The organ distribution, brain penetration and eventual inflammatory activation of polysorbate-80 (Tw80) and sodium lauryl sulfate (SDS) coated poly-L-lactide (PLLA) and perfluorodecyl acrylate (PFDL) nanoparticles were evaluated after intravenous administration in rats before and after undergoing controlled cortical impact (CCI).
Results: The highest NP uptake at 4 and 24 h was observed in the liver and spleen, followed by the brain and kidney, with minimal concentrations in the lungs and heart for all NPs. After CCI, a significant increase of NP uptake at 4 h and 24 h was observed within the traumatized hemisphere, especially in the perilesional area, although NPs were still found in areas away from the CCI and in the contralateral hemisphere in concentrations similar to those in non-CCI subjects. NPs were localized in neurons, glial and endovascular cells. Immunohistochemical staining against GFAP, Iba1, TNFα and IL1β demonstrated no glial activation or neuroinflammatory changes.
Conclusions: Tw80- and SDS-coated biodegradable (PLLA) and non-biodegradable (PFDL) NPs reach the brain parenchyma in both traumatized and undamaged brain areas with disrupted and intact BBB, even though a large amount of them is retained in the liver and the spleen. No inflammatory reaction is elicited by these NPs within 24 h after application. These preliminary promising results suggest the effectiveness and safety of these NPs as drug carriers for the treatment of TBI.
The Specialised Information Service Performing Arts (SIS PA) is part of a funding programme by the German Research Foundation that enables libraries to develop tailor-made services for individual disciplines in order to provide researchers with direct access to relevant materials and resources from their field. For the field of performing arts, the SIS PA aggregates metadata about theater and dance resources, currently mostly from German-speaking cultural heritage institutions, in a VuFind-based search portal.
In this article, we focus on metadata quality and its impact on the aggregation workflow by describing the different, possibly data provider-specific, process stages of improving data quality in order to achieve a searchable, interlinked knowledge base. We also describe lessons learned and limitations of the process.
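One provider-specific normalisation stage of the kind described might be sketched as follows. All field names, records, and the title-based deduplication rule are invented for illustration and are not the SIS PA production workflow: the point is only that heterogeneous provider schemas are mapped onto a common one before records can be interlinked.

```python
import re

def normalise_title(title):
    """Lower-case, strip punctuation and collapse whitespace for matching."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", title.lower())).strip()

# Hypothetical per-provider field mappings onto a common schema.
def map_provider_a(rec):
    return {"title": rec["Titel"], "year": rec["Jahr"], "type": "theater"}

def map_provider_b(rec):
    return {"title": rec["name"], "year": rec["date"][:4], "type": rec["genre"]}

# Invented example records describing the same play in two provider schemas.
raw_a = [{"Titel": "Der zerbrochne Krug", "Jahr": "1808"}]
raw_b = [{"name": "Der zerbrochne Krug!", "date": "1808-03-02", "genre": "theater"}]

# Merge on the normalised title, keeping the first occurrence per key.
merged = {}
for rec in [map_provider_a(r) for r in raw_a] + [map_provider_b(r) for r in raw_b]:
    merged.setdefault(normalise_title(rec["title"]), rec)

print(len(merged))  # the two provider records collapse into one entry
```

A production pipeline would add per-provider validation and richer matching keys (e.g. dates and identifiers), which is where the quality limitations discussed in the article arise.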