This paper presents the design and implementation of a group-oriented, decentralised, virtual learning setting in which students meet in groups of 3-5 people at different locations all over the world and communicate via the internet. After presenting the objective of such a didactical design, the paper gives an insight into the technical implementation. It presents the advantages and disadvantages of several internet services in such a virtual setting and a way of combining these internet applications according to their special characteristics. While the role of teachers changes to that of coordinators, the communication process within and between the groups becomes more important, as discussed in the following chapter. The paper concludes with the presentation of two practical applications offered by the Institute for Didactics and Economics at the Johann Wolfgang Goethe-University Frankfurt/Main (Germany) and some evaluating remarks.
The paper takes a deeper look at participation rates in cMOOCs. To get a better insight into the behavior of learners in MOOCs, studiumdigitale has developed a tool which helps to analyze the contributions of participants in so-called cMOOCs. These are MOOCs which foster the active participation of learners in various tools and which are based on the concept of connectivism [1]. After examining each part of the definition of MOOCs and discussing the different categories of this quite new phenomenon, a deeper look is taken at the analysis of two cMOOCs, OPCO11 and OPCO12, which took place in 2011 and 2012 [2].
We empirically investigate algorithms for solving Connected Components in the external memory model. In particular, we study whether the randomized O(Sort(E)) algorithm by Karger, Klein, and Tarjan can be implemented to compete with practically promising and simpler algorithms having only slightly worse theoretical cost, namely Borůvka’s algorithm and the algorithm by Sibeyn and collaborators. For all algorithms, we develop and test a number of tuning options. Our experiments are executed on a large set of different graph classes including random graphs, grids, geometric graphs, and hyperbolic graphs. Among our findings are: The Sibeyn algorithm is a very strong contender due to its simplicity and due to an added degree of freedom in its internal workings when used in the Connected Components setting. With the right tunings, the Karger-Klein-Tarjan algorithm can be implemented to be competitive in many cases. Higher graph density seems to benefit Karger-Klein-Tarjan relative to Sibeyn. Borůvka’s algorithm is not competitive with the two others.
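As a concrete reference for the contraction idea shared by Borůvka's algorithm and its external-memory relatives, the following is a minimal in-memory Python sketch; it is illustrative only, and the external-memory algorithms studied above realise each round with O(Sort(E)) scanning and sorting passes rather than the random access used here.

```python
def connected_components(n, edges):
    """Borůvka-style hooking and contraction for connected components.

    Each round, every component picks one arbitrary incident edge and
    merges with the neighbour; the number of components at least halves
    per round, giving O(log n) rounds overall.
    """
    label = list(range(n))          # current component label per vertex

    def find(v):                    # find with path halving
        while label[v] != v:
            label[v] = label[label[v]]
            v = label[v]
        return v

    changed = True
    while changed:
        changed = False
        pick = {}                   # each component "hooks" one neighbour
        for u, v in edges:
            ru, rv = find(u), find(v)
            if ru != rv:
                pick.setdefault(ru, rv)
                pick.setdefault(rv, ru)
        for ru, rv in pick.items(): # contract the picked edges
            a, b = find(ru), find(rv)
            if a != b:
                label[max(a, b)] = min(a, b)
                changed = True
    return [find(v) for v in range(n)]

# e.g. two components {0,1,2} and {3,4}:
# connected_components(5, [(0, 1), (1, 2), (3, 4)]) -> [0, 0, 0, 3, 3]
```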
For the research program with cooled antiprotons at FAIR, a dedicated 70 MeV, 70 mA proton injector is required. The main acceleration of this room-temperature linac will be provided by six CH cavities operated at 325 MHz. Each cavity will be powered by a 2.5 MW klystron. For the second acceleration unit, from 11.5 MeV to 24.2 MeV, a 1:2 scaled model has been built. Low-level RF measurements have been performed to determine the main parameters and to prove the concept of coupled CH cavities. For this second tank, technical and mechanical investigations were performed in 2010 to develop a complete technical concept for the manufacturing. In spring 2011, the construction of the first power prototype started. The main components of this cavity will be ready for measurements in summer 2011. At that time, the cavity will be tested with a preliminary aluminum drift tube structure, which will allow precise frequency and field tuning. This paper reports on the recent technical developments and achievements. It outlines the main fabrication steps towards this novel type of proton DTL. First low-level RF measurements are also expected.
The thermodynamics of Quantum Chromodynamics (QCD) in external (electro-)magnetic fields shows some unexpected features like inverse magnetic catalysis, which have been revealed mainly through lattice studies. Many effective descriptions, on the other hand, use Landau levels or approximate the system by just the lowest Landau level (LLL). Analyzing lattice configurations we ask whether such a picture is justified. We find the LLL to be separated from the rest by a spectral gap in the two-dimensional Dirac operator and analyze the corresponding LLL signature in four dimensions. We determine to what extent the quark condensate is LLL dominated at strong magnetic fields.
We will discuss the issue of Landau levels of quarks in lattice QCD in an external magnetic field. We will show that in the two-dimensional case the lowest Landau level can be identified unambiguously even if the strong interactions are turned on. Starting from this observation, we will then show how one can define a “lowest Landau level” in the four-dimensional case, and discuss to what extent the observed effects of a magnetic field can be explained in terms of it. Our results can be used to test the validity of low-energy models of QCD that make use of the lowest-Landau-level approximation.
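As a free-case reference for the spectral-gap argument in these two abstracts, the two-dimensional Dirac operator in a constant magnetic field B (charge q) has the textbook Landau-level structure quoted below; this is the standard continuum result, not the lattice data of the papers.

```latex
\begin{align}
  \lambda_n^2 &= 2n\,qB, \qquad n = 0, 1, 2, \dots, \\
  N_n &= \frac{qB\,L_x L_y}{2\pi}
        \quad \text{(degeneracy per level on an } L_x \times L_y \text{ torus)}.
\end{align}
% The n = 0 level is separated from n = 1 by the gap \sqrt{2qB};
% the lattice question is whether this separation survives once the
% strong interactions are turned on.
```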
Several articulatory strategies are available during the production of /u/, all resulting in a similar acoustic output. /u/ has two main constrictions, at the velum and at the lips. A perturbation of either constriction can be compensated at the other one, e.g. a wider constriction at the velum by more lip protrusion, or a wider lip opening by more tongue retraction. This study investigates whether speakers use this relation under perturbation. Six speakers were provided with palatal prostheses which were worn for two weeks. Speakers were instructed to make a serious attempt to produce normal speech. Their speech was recorded via EMA and acoustics several times over the adaptation period. Formant values of /u/-productions were measured. Velar constriction width and lip protrusion were estimated. For four speakers a correlation between constriction width and lip protrusion was found. A negative correlation between lip protrusion and F1 or F2 could sometimes be observed, but no correlation occurred between constriction size and either of the formants. The results show that under perturbation speakers use motor-equivalent strategies in order to adapt. The correlation between constriction size and lip protrusion is stronger than in studies investigating unperturbed speech. This could be because under perturbation speakers are inclined to try out several strategies in order to reach the acoustic target, and the co-variability might thus be greater.
A two-week perturbation EMA experiment was carried out with palatal prostheses. Articulatory effort for five speakers was assessed by means of peak acceleration and jerk during the tongue tip gestures from /t/ towards /i, e, o, y, u/. After a period of no change, speakers showed an increase in these values. Towards the end of the experiment the values decreased. The results are interpreted as three phases of carrying out changes in the internal model. At first, the complete production system is shifted in relation to the palatal change; afterwards, speakers explore different production mechanisms, which involves more articulatory effort. This second phase can be seen as a training phase where several articulatory strategies are explored. In the third phase speakers start to select an optimal movement strategy to produce the sounds, so that the values decrease.
The study investigates the contribution of tactile and auditory feedback in the adaptation of /s/ towards a palatal prosthesis. Five speakers were recorded via electromagnetic articulography, at first without the prosthesis, then with the prosthesis and auditory feedback masked, and finally with the prosthesis and auditory feedback available. Tongue position, jaw position and acoustic centre of gravity of productions of the sound were measured. The results show that the initial adaptation attempts without auditory feedback are dependent on the prosthesis type and directed towards reaching the original tongue palate contact pattern. Speakers with a prosthesis which retracted the alveolar ridge retracted the tongue. Speakers with a prosthesis which did not change the place of the alveolar ridge did not retract the tongue. All speakers lowered the jaw. In a second adaptation step with auditory feedback available speakers reorganised tongue and jaw movements in order to produce more subtle acoustic characteristics of the sound such as the high amplitude noise which is typical for sibilants.
Temporal development of compensation strategies for perturbed palate shape in German /S/-production
(2006)
The palate shape of four speakers was changed by a prosthesis which either lowered the palate or retracted the alveolar ridge. Subjects wore the prosthesis for two weeks and were recorded several times via EMA. Results of articulatory measurements show that speakers use different compensation methods at different stages of the adaptation. They lower the tongue immediately after the insertion of the prosthesis. Other compensation methods, such as lip protrusion, are only acquired after longer practising periods. The results are interpreted as supporting the existence of different mappings between motor commands, vocal tract shape and the auditory-acoustic target.
In this paper we describe SOBA, a sub-component of the SmartWeb multi-modal dialog system. SOBA is a component for ontology-based information extraction from soccer web pages for automatic population of a knowledge base that can be used for domain-specific question answering. SOBA realizes a tight connection between the ontology, knowledge base and the information extraction component. The originality of SOBA is in the fact that it extracts information from heterogeneous sources such as tabular structures, text and image captions in a semantically integrated way. In particular, it stores extracted information in a knowledge base, and in turn uses the knowledge base to interpret and link newly extracted information with respect to already existing entities.
This demo abstract describes the SmartWeb Ontology-based Information Extraction System (SOBIE). A key feature of SOBIE is that all information is extracted and stored with respect to the SmartWeb ontology. In this way, other components of the system, which use the same ontology, can access this information in a straightforward way. We will show how information extracted by SOBIE is visualized within its original context, thus enhancing the browsing experience of the end user.
Working closely with teaching and research staff is critical to the success of libraries and information services. Indeed, the degree of integration with a University's academic work is one of the factors that distinguish a successful service from a poor one. This paper will consider the relationship between information services and how universities operate. Using the challenges facing institutions as a starting point - including the move towards a single European higher education market - the impact of information provision on institutional strategies will be explored. Information resources underpin all learning, teaching and research activities and the presentation will consider the professional practice which ensures that libraries and computing services are fully exploited. The focus on the experience of students is leading some institutions to integrate information services with a wide range of other activities and the paper will consider the opportunities and challenges which this brings, including the need to build working relationships with a broader range of professional groups.
Information literacy is a mosaic of attitudes, understandings, capabilities and knowledge about which there are three myths. The first myth is that it is about the ability to use ICTs to access a wealth of information. The second is that students entering higher education are information literate because student-centred, resource-based, and ICT-focused learning are now pervasive in secondary education. The third myth is that information literacy development can be addressed by library-centric generic approaches. This paper addresses those myths and emphasises the need for information literacy to be recognised as a critical whole-of-education and societal issue, fundamental to an information-enabled and better world. In formal education, information literacy can only be developed by infusion into curriculum design, pedagogies, and assessment.
Pseudo-Critical Temperature and Thermal Equation of State from Nf = 2 Twisted Mass Lattice QCD
(2012)
We report on the current status of our ongoing study of the chiral limit of two-flavor QCD at finite temperature with twisted mass quarks. We estimate the pseudo-critical temperature Tc for three values of the pion mass in the range of mPS ~ 300 to 500 MeV and discuss different chiral scenarios. Furthermore, we present first preliminary results for the trace anomaly, pressure and energy density. We have studied several discretizations of Euclidean time up to Nt = 12 in order to assess the continuum limit of the trace anomaly. From its interpolation we evaluate the pressure and energy density employing the integral method. Here, we have focussed on two pion masses with mPS ~ 400 and 700 MeV.
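For orientation, the integral method mentioned above obtains the pressure from the interpolated trace anomaly I(T) = ε − 3p via a standard thermodynamic identity, with T0 a reference temperature; this is the standard continuum form, quoted here for the reader's convenience.

```latex
\begin{align}
  \frac{p(T)}{T^4} - \frac{p(T_0)}{T_0^4}
    &= \int_{T_0}^{T} \frac{\mathrm{d}T'}{T'^{5}}\, I(T'),
    \qquad I = \epsilon - 3p, \\
  \epsilon(T) &= I(T) + 3\,p(T).
\end{align}
```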
At the Institute for Applied Physics (IAP), University of Frankfurt, a superconducting 325 MHz CH cavity is under development for future beam tests at the GSI UNILAC, Darmstadt. The cavity with 7 accelerating cells has a geometrical beta of 0.15, corresponding to 11.4 AMeV. The design gradient is 5 MV/m. The geometry of this resonator was optimized with respect to a compact design, low peak fields, surface processing, power coupling and tuning. Furthermore, a new tuning system based on bellow tuners inside the resonator will control the frequency during operation. After rf tests in Frankfurt, the cavity will be tested with a 10 mA, 11.4 AMeV beam delivered by the GSI UNILAC. In this paper, rf simulations, a multipacting analysis as well as thermal calculations will be presented.
Direct photon emission in heavy-ion collisions is calculated within a relativistic micro+macro hybrid model and compared to the microscopic transport model UrQMD. In the hybrid approach, the high-density part of the collision is calculated by an ideal 3+1-dimensional hydrodynamic calculation, while the early (pre-equilibrium) and late (rescattering) phases are calculated with the transport model. Different scenarios of the transition from the macroscopic description to the transport model description and their effects are studied. The calculations are compared to measurements by the WA98 collaboration and predictions for the future CBM experiment are made.
In his new book, R. Dworkin advocates the unity of values thesis. He wants to circumscribe morality as a proper epistemological domain which is methodologically different from scientific inquiry. The epistemological independence of morality is supposed to be a consequence of the irreducible fact/value dichotomy. This paper maintains that the unity of values thesis is methodologically correct; all moral reasoning must be a constructive interpretation of its meaning. However, the author fails to recognize that not every axiological interpretation implies moral consequences. Drawing on H. Putnam's pragmatic realism, this paper intends to demonstrate that much of scientific inquiry relies on the interpretation of values, and that this kind of reasoning is morally neutral. Finally, it should be clear that epistemological choices in legal positivism, e.g. the decision on which aspects of social interaction are theoretically relevant, should not disturb the soundness of its argument, nor should they be read as if they had moral implications. This paper concludes that positivist theories cannot be ruled out. Since the choice between descriptive and interpretative models requires a circular justification, legal theory is itself an activity governed by the interpretation of epistemic values. Like the natural sciences, it can only be understood from an internal perspective. Accordingly, inclusive positivism holds the advantage of being more consilient than interpretivism, which is arguably parochial.
It is a fact that mediation and other alternative dispute resolution means are becoming increasingly popular. Indeed, governments are encouraging people to use them instead of going to court, as they are quicker, cheaper and more informal than trials, and can be implemented using the internet. The author focuses on the analysis of the structure and purposes of mediation in particular. The paper aims to discuss and understand what kind of justice, if any, is offered by alternative dispute resolution.
The research performed in the DeepThought project aims at demonstrating the potential of deep linguistic processing if combined with shallow methods for robustness. Classical information retrieval is extended by high precision concept indexing and relation detection. On the basis of this approach, the feasibility of three ambitious applications will be demonstrated, namely: precise information extraction for business intelligence; email response management for customer relationship management; creativity support for document production and collective brainstorming. Common to these applications, and the basis for their development is the XML-based, RMRS-enabled core architecture framework that will be described in detail in this paper. The framework is not limited to the applications envisaged in the DeepThought project, but can also be employed e.g. to generate and make use of XML standoff annotation of documents and linguistic corpora, and in general for a wide range of NLP-based applications and research purposes.
Nitrogen pollution is a major constituent of global change, threatening local biodiversity and ecosystem services and causing serious environmental damage. Specifically, in areas with heavy agricultural soil use, excessive use of nitrogen fertilizer pollutes the groundwater with nitrates, but also with ammonia and nitrites. Freshwater fish and other aquatic fauna are especially vulnerable to nitrites, which can cause massive mortalities at concentrations even below 0.1 mg/l NO2-N. Adaptation of fish to environments with relatively high concentrations of chemicals has occurred throughout the history of life, although contemporary evolution acts at a much more rapid pace. The growing use of land for mass agriculture and livestock industries in the last 50 years in Florida has dramatically increased the nutrient loading into the groundwaters that feed the springs. Nitrite poses a serious threat to freshwater fauna as it is rapidly taken up and disturbs ion homeostasis and blood gas transport in fish. In this study, we evaluated, by means of a common-garden experiment, the tolerance of fish to nitrite using three different populations of eastern mosquitofish (Gambusia holbrooki) with different background nitrogen pollution histories. Mosquitofish females were exposed to nitrite in the lab, to either < 0.005 mg/l NO2- (control) or 0.3 mg/l NO2- for ten days, and at the end of the exposure period we assessed their blood O2 transport capacity by measuring the concentration of four different types of hemoglobin, their total hematocrit, and their respiratory rates. Preliminary results show slight but significant varying patterns in the response of the exposed fish, depending on the population source, as evidenced by their respiratory rates and blood erythrocyte counts. Mortality was very low, and hemoglobin profiles indicate high tolerance of G. holbrooki to nitrite contamination, a factor supporting their invasion success in agriculturally dominated regions around the world.
Nitrogen pollution is a major constituent of global change, threatening local biodiversity and ecosystem services and causing serious environmental damage. Specifically, in areas with heavy agricultural soil use, excessive use of nitrogen fertilizer pollutes the groundwater with nitrates, but also with ammonia and nitrites. Freshwater fish and other aquatic fauna are especially vulnerable to nitrites, which can cause massive mortalities at concentrations even below 0.1 mg/l NO2-N. Adaptation of fish to environments with relatively high concentrations of chemicals has occurred throughout the history of life, although contemporary evolution acts at a much more rapid pace. The growing use of land for mass agriculture and livestock industries in the last 50 years in the US has dramatically increased the nutrient loading into the surface and groundwaters. Nitrite poses a serious threat to freshwater fauna as it is rapidly taken up and disturbs ion homeostasis and blood gas transport in fish. In this study, we evaluated, by means of a laboratory experiment, the tolerance of fish to nitrite using six different populations of wild eastern mosquitofish (Gambusia holbrooki) from two regions, North FL and NC, with different background nitrogen pollution histories. Mosquitofish females were exposed to nitrite in the lab, to either < 0.005 mg/l NO2- (control) or 0.3 mg/l NO2- for ten days, and at the end of the exposure period we assessed their blood O2 transport capacity by measuring the concentration of four different types of hemoglobin, their total hematocrit, and their respiratory rates. Preliminary results show significant varying patterns in the response of the exposed fish, depending on the population source, as evidenced by their respiratory rates and blood erythrocyte counts. Mortality was very low, and hemoglobin profiles indicate high tolerance of G. holbrooki to nitrite contamination, a factor supporting their invasion success in agriculturally dominated regions around the world.
In this contribution we lay down a lattice setup that allows for the nonperturbative study of a field-theoretical model where an SU(2) fermion doublet, subjected to non-Abelian gauge interactions, is also coupled to a complex scalar field doublet via a Yukawa and an “irrelevant” Wilson-like term. Using naive fermions in the quenched approximation and based on the renormalized Ward identities induced by purely fermionic chiral transformations, lattice observables are discussed that enable: a) in the Wigner phase, the determination of the critical Yukawa coupling value where the purely fermionic chiral transformations become a symmetry up to lattice artifacts; b) in the Nambu-Goldstone phase of the resulting critical theory, a stringent test of the actual generation of a fermion mass term of non-perturbative origin. A soft twisted fermion mass term is introduced to circumvent the problem of exceptional configurations, and observables are then calculated in the limit of vanishing twisted mass.
Race has been a term avoided in the Swedish debates, while at the same time, protections with respect to unlawful discrimination on the basis of race or ethnic origins have not been vigilantly upheld by the courts. This paper looks at the treatment of race by the Swedish legislature, as well as the treatment by the courts, specifically the Labour Court, with respect to claims of unlawful discrimination in employment on the basis of ethnic origins, against the background of Critical Race Theory. The disparities between the intent of the legislature and the outcome of the cases brought to the Swedish courts can be at least in part explained through the lens of Critical Race Theory, particularly with respect to the liberal approach taken by the courts when applying the law.
The thermodynamics of QCD with sufficiently heavy dynamical quarks can be described by a three-dimensional Polyakov loop effective theory, obtained after a truncated character and hopping expansion. We investigate the resulting phase diagram for low temperatures by mean field methods. Taking into account chemical potentials for both baryon number and isospin, we obtain clear signals for a liquid-gas type transition to baryon matter at μI=0 and a Bose-Einstein condensation transition at μB=0, as well as for their connection when both chemical potentials are non-zero.
H. L. A. Hart thought that a theory of law can be purely descriptive and called his theory a “descriptive sociology”. One of his great contributions to modern legal theory is his emphasis on the internal aspect of social rules. According to him, a theory of law can be built on the basis of the description of the participants' view without sharing it. This descriptivism is totally rejected by Dworkin, who propagates a theory that denies a sharp separation between a legal theory and its implications for adjudication. For Dworkin, a legal theory is only possible as a theory with “the internal, participants' point of view”. Dworkin's position implies a radicalization of legal theory that will transform the statement of an external point of view into that of an internal one. For Dworkin, descriptivism is based on the sociological concept of law, which is an “imprecise criterial concept” and is “not sufficiently precise to yield philosophically interesting essential features.” Hart's position is vulnerable because it takes an impure form of descriptivism that still draws a categorical distinction between fact and norm. This theoretical impurity results from the ambiguity of interpreting the internal aspect of rules. A strategy to rescue Hart's project is to radicalize his descriptivism with Luhmann's systems theory. Adopting the systems-theoretical distinction between internal and external observation of law, with all its implications for the explanation of the legal system and legal communications, Hart's descriptivism finally attains its pure form, which is not only a distinctive paradigm of legal theory, but also possesses the potential to clarify its relationship to the legal theory based on the internal aspect of law.
The concept of biopolitics has its origin in Michel Foucault's works developed from 1975 to 1979. In this period, the author laid the foundations for a new approach to modern government, based both on the growing empowerment of individuals and on the control of populations. The theme has attracted the attention of critical political studies, with many practical uses. However, I believe there is not enough consolidation of biopolitics as a concept and a comprehensive theory of the new political mechanisms. This uncertainty is more evident when the very role of law is questioned in a biopolitical model, due to the archaic nature that Foucault attributes to it. The aim of this paper is therefore to identify the theoretical comprehension of biopolitics in a contemporary author such as Giorgio Agamben, in order to demonstrate his oppositions to and proximities with the original idea of Michel Foucault. I propose that Agamben has the same difficulties as Foucault in dealing with legal theory and law within biopolitics. Nevertheless, after a critical review of the works of these two authors, my conclusion is that a settlement of the concepts of law and biopolitics depends on surpassing the Foucauldian version of law as sovereignty, on a clear delimitation of a common core between the authors and their differences, and on the research and affirmation of the concept of law in Agamben, which is more refined than Foucault's.
Charmonia with different transverse momentum pT usually come from different mechanisms in relativistic heavy-ion collisions. This work reviews the theoretical studies on quarkonium evolution in the deconfined medium produced in p-Pb and Pb-Pb collisions. Charmonia with high pT are mainly from the initial hadronic collisions and are therefore sensitive to the initial energy density of the bulk medium. Charmonia within 0.1 < pT < 5 GeV/c at the energies of the Large Hadron Collider (LHC) are mainly produced by the recombination of charm and anti-charm quarks in the medium. At extremely low pT ∼ 1/RA (RA is the nuclear radius), an additional contribution from the coherent interaction between the electromagnetic fields generated by one nucleus and the target nucleus plays a non-negligible role in J/ψ production even in semi-central Pb-Pb collisions.
There is an increasing interest in incorporating significant citizen participation into the law-making process by developing the use of the internet in the public sphere. However, no well-accepted e-participation model has prevailed. This article points out that, to be successful, we need critical reflection of legal theory and we also need further institutional construction based on the theoretical reflection.
Contemporary dominant legal theories demonstrate too strong an internal legal point of view to empower the informal, social normative development on the internet. Regardless of whether we see the law as a body of rules or principles, the social aspect is always part of people’s background and attracts little attention. In this article, it is advocated that the procedural legal paradigm advanced by Jürgen Habermas represents an important breakthrough in this regard.
Further, Habermas’s co-originality thesis reveals a neglected internal relationship between public autonomy and private autonomy. I believe the co-originality theory provides the essential basis on which a connecting infrastructure between the legal and the social could be developed. In terms of the development of the internet to include the public sphere, co-originality can also help us direct the emphasis on the formation of public opinion away from the national legislative level towards the local level; that is, the network of governance.
This article is divided into two sections. The focus of Part One is to reconstruct the co-originality thesis (sections 2 and 3). This paper uses the application of discourse in the adjudication theory of Habermas as an example. It argues that Habermas would be more coherent, in terms of his insistence on real communication in his discourse theory, if he allowed his judges to initiate improved interaction with society. This change is essential if the internal connection between public autonomy and private autonomy in the sense of court adjudication is to be truly enabled.
In order to demonstrate such improved co-original relationships, the empowering character of state-made law is instrumental in initiating the mobilization of legal intermediaries, both individual and institutional. A mutually enhancing relationship is thus formed between the formal, official organization and its governance counterpart, aided by its associated ‘local’ public sphere. Referring to Susan Sturm, the Harris v Forklift Systems Inc. (1993) decision of the Supreme Court of the United States in the field of sexual harassment is used as an example.
Using only one institutional example to illustrate how the co-originality thesis can be improved is not sufficient to rebuild the thesis but this is as much as can be achieved in this article.
In Part Two, the paper examines, still at the institutional level, how Sturm develops an overlooked sense of impartiality, especially in the derivation of social norms; i.e. multi-partiality instead of neutral detachment (section 4). These two ideas should be combined as the criterion for impartiality to evaluate the legitimacy of the joint decision-making processes of both the formal official organization and ‘local’ public sphere.
Sturm’s emphasis on the deployment of intermediaries, both institutional and individual, can also enlighten the discourse theory. Intermediaries are essential for connecting disassociated social networks, especially when communication breaks down due to a lack of data, information or knowledge, or due to disparities in value orientation; with intermediaries in place, further communication is not blocked by such gaps or misunderstandings.
The institutional impact of the newly constructed co-originality thesis is also discussed in Part Two. Landwehr’s work on institutional design and assessment for deliberative interaction is first discussed. This article concludes with an indication of how the ‘local’ public sphere, through e-rulemaking or online dispute resolution, for example, can be constructed in light of the discussion of this article.
This talk was given at the 5th Frankfurt Scientific Symposium (22-23 October 2005). Unfortunately, the video can only be viewed with Internet Explorer 5.0 or higher, Netscape Navigator 7.0 or higher, or Internet Explorer 5.2.2 or higher for Mac (see document 1.html). All conference contributions are available at http://publikationen.ub.uni-frankfurt.de/volltexte/2005/1992/.
This paper introduces a novel research tool for the field of linguistics: The Linguistik web portal provides a virtual library which offers scientific information on every linguistic subject. It comprises selected internet sources and databases as well as catalogues for linguistic literature, and addresses an interdisciplinary audience. The virtual library is the most recent outcome of the Special Subject Collection Linguistics of the German Research Foundation (DFG), and also integrates the knowledge accumulated in the Bibliography of Linguistic Literature. In addition to the portal, we describe long-term goals and prospects with a special focus on ongoing efforts regarding an extension towards integrating language resources and Linguistic Linked Open Data.
A full session was organized in memory of Helmut Oeschler during the 2017 edition of the Strangeness in Quark Matter conference. It was heart-warming to discuss his main achievements with the audience and to share anecdotes about this exceptionally praised and appreciated colleague, who was also a great friend to many at the conference. A brief summary of the session is provided in these proceedings.
Occasionally, in pursuing their adjudicative duties over the course of a legal hearing, judges are called upon to acquire new concepts – that is, concepts which they did not possess at the commencement of the hearing. In performing their judicial role they are required to learn new things and, as a result, conceptualise the world in a way which differs from the way they conceived of things before the hearing commenced. Some theorists have argued that either as a general matter or as a matter specific to judicial practice and the legal context, judges are, with some degree of necessity, incapacitated from acquiring certain kinds of concepts. Such concepts include those possessed by the members of culturally different minority groups. Drawing on contemporary trends in analytic and naturalistic philosophy of mind, this paper explores the extent to which a judge might be incapacitated from acquiring new concepts over the course of a legal hearing and identifies those factors which condition the success or failure of that process.
The Brazilian Constitution of 1988 declares Brazil a Democratic State of Law. This formally democratic legal status has been facing difficulties when it comes to its material implementation. Brazilian legal procedures are still greatly influenced by the Catholic heritage from Portugal in the times of colonization, translated in the present times into a strong moral set of dogmas that still reflects upon the legal production and interpretation in the country. Recently in Brazil, a debate brought to the Supremo Tribunal Federal, the Brazilian Federal Supreme Court, has evidenced the struggle between ethics and morality in the country's legal scenario. The focus of the discussion was the possibility of abortion of anencephalic fetuses (in Brazil, abortion is considered a crime against life). In order to properly ground its decision, the Court invited scientists, doctors, members of feminist movements and representatives of certain religions to a public dialogue, in which both scientific-technical and purely moral-religious arguments were presented. Although these procedures encouraged and promoted a democratic and pluralistic legal debate, it seems that the crucial point of the discussion was not taken into account: the scientific character of Law. This is the object of the present manuscript: in order to ensure an intersubjective construction and application of Law, it must be perceived as an Applied Social Science, and judges, lawyers, legislators and all other legal actors must proceed in a scientific way. To illustrate the theme, the specific case of abortion of anencephalic fetuses is discussed throughout the text.
We present an architecture for the integration of shallow and deep NLP components which is aimed at flexible combination of different language technologies for a range of practical current and future applications. In particular, we describe the integration of a high-level HPSG parsing system with different high-performance shallow components, ranging from named entity recognition to chunk parsing and shallow clause recognition. The NLP components enrich a representation of natural language text with layers of new XML meta-information using a single shared data structure, called the text chart. We describe details of the integration methods, and show how information extraction and language checking applications for real-world German text benefit from a deep grammatical analysis.
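The text chart described here is in essence a standoff-annotation structure: each component contributes a layer of labelled spans over character offsets of the one shared text instead of rewriting the input. The following is a hypothetical Python sketch of that idea only; the class and layer names are illustrative and not the system's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Span:
    start: int          # character offset into the shared text
    end: int            # exclusive end offset
    label: str          # e.g. a named-entity type or chunk category

@dataclass
class TextChart:
    """Single shared data structure; each NLP component adds a
    standoff layer instead of modifying the text itself."""
    text: str
    layers: dict = field(default_factory=dict)

    def add_layer(self, component: str, spans: list):
        self.layers[component] = spans

    def covering(self, component: str, pos: int):
        """Spans of one layer that cover a character position."""
        return [s for s in self.layers.get(component, [])
                if s.start <= pos < s.end]

chart = TextChart("Siemens baut ein Werk in Dresden.")
chart.add_layer("ner", [Span(0, 7, "ORG"), Span(25, 32, "LOC")])
chart.add_layer("chunks", [Span(0, 7, "NP"), Span(13, 21, "NP")])
```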
Agamben has claimed to work within the tradition inaugurated by the archaeological method of Michel Foucault, but not to fully coincide with it: “My method is archaeological and paradigmatic in a sense which is very close to that of Foucault, but not completely coincident with it. The question is, facing the dichotomies that structuralize our culture, to go beyond the exceptions that have been producing the former, however, not to find a chronologically originary state, but to be able to understand the situation in which we are. Archaeology is, in this sense, the only way to access present” (interview with Flavia Costa, trans. Susana Scramim, in Revista do Departamento de Psicologia – Universidade Federal Fluminense, Niterói, v. 18, n. 1, 131-136, Jan./Jun. 2006, p. 132, translated by the author). However, it has never been very clear in which aspects Agamben follows Foucault's method and in which he does not. This situation seems to change with the publication of Agamben's most extensive and explicit text on method, Signatura Rerum. Sul Metodo (2008, Italian edition). The goal of this article is to identify the points of intersection between their methods and some points in which they differ.
This paper aims to assess the arguments claiming that representative democracy may be enhanced or replaced by an updated electronic version. Focusing on elections and electioneering as the core mechanism of representative democracy, I will discuss: (1) the proximity argument, used to claim the necessity of filling the gap between decision-makers and stakeholders; (2) the transparency argument, which claims to remove obstacles to the publicity of power; (3) the bottom-up argument, which calls for a new form of legitimacy that goes beyond the classical mediation of parties or unions; (4) the public sphere argument, referring to the problem of the hierarchical relation between voters and their representatives; (5) the disintermediation argument, used to describe the (supposed) new form of democracy following the massive use of ICTs. The first way of conceptualizing e-democracy as different from mainstream 20th-century representative democracy regimes is to imagine it as a new form of direct democracy: this conception often underlies contemporary studies of e-voting. To avoid some of the ingenuousness of this conception of e-democracy, we should take a step back and consider a broader range of issues than mere gerrymandering around the electoral moment. I shall therefore problematize the abovementioned approach by analyzing a wider range of problems connected to elections and electioneering in their relation with ICTs.
We report on the status of ongoing investigations aiming at locating the deconfinement critical point with standard Wilson fermions and Nf = 2 flavors towards the continuum limit (standard Columbia plot); locating the tricritical masses at imaginary chemical potential with unimproved staggered fermions at Nf = 2 (extended Columbia plot); and identifying the order of the chiral phase transition at μ = 0 for Nf = 2 via extrapolation from non-integer Nf (alternative Columbia plot).
The order of the chiral phase transition of lattice QCD with unimproved staggered fermions is known to depend on the number of quark flavours, their masses and the lattice spacing. Previous studies in the literature for Nf∈{3,4} show first-order transitions, which weaken with decreasing lattice spacing. Here we investigate what happens when lattices are made coarser to establish contact to the strong coupling region. For Nf∈{4,8} we find a drastic weakening of the transition when going from Nτ=4 to Nτ=2, which is consistent with a second-order chiral transition reported in the literature for Nf=4 in the strong coupling limit. This implies a non-monotonic behaviour of the critical quark or pseudo-scalar meson mass, which separates first-order transitions from crossover behaviour, as a function of lattice spacing.
In this contribution we report the status and plans of the open lattice initiative to generate and share new gauge ensembles using the stabilised Wilson fermion framework. The production strategy is presented in terms of a three-stage plan, alongside summaries of the data management and access policies. Current progress in completing the first stage, generating ensembles at four lattice spacings at the flavor symmetric point, is given.
The OpenLat initiative presents its results of lattice QCD simulations with Stabilized Wilson Fermions (SWF) using 2+1 quark flavors. Focusing on the SU(3) flavor symmetric point mπ=mK=412 MeV, four different lattice spacings (a=0.064, 0.077, 0.094, 0.12 fm) are used to perform the continuum limit and study cutoff effects. We present results on light hadron masses; for the determination we use a Bayesian analysis framework with constraints and model averaging to minimize the bias in the analysis.
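Schematically, the continuum limit from the four lattice spacings amounts to fitting each hadron mass with the leading cutoff dependence; the sketch below assumes O(a²) leading artifacts, as expected for an improved Wilson-type action, and the symbols M_cont and c_2 are illustrative fit parameters, not the collaboration's notation.

```latex
\begin{equation}
  M(a) \;=\; M_{\mathrm{cont}}\left(1 + c_2\, a^2 + \dots\right),
\end{equation}
% with the Bayesian model average taken over fit variants (fit window,
% inclusion of higher-order terms) to quantify the systematic error.
```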
In this joint contribution we announce the formation of the "OPEN LATtice initiative", this https URL, to study Stabilised Wilson Fermions (SWF). They are a new avenue for QCD calculations with Wilson-type fermions and we report results on our continued study of this framework: tuning the clover improvement coefficient, and extending the reach of lattice spacings to a=0.12 fm. We fix the flavor symmetric points mπ=mK=412 MeV at a=0.055, 0.064, 0.077, 0.094, 0.12 fm and define the trajectories to the physical point by fixing the trace of the quark mass matrix. Currently our pion mass range extends down to mπ∼200 MeV. We outline our tuning goals and strategy as well as our future planned ensembles. First scaling studies are performed on fπ and mπ. Additionally, results of a preliminary continuum extrapolation of mN at the flavor symmetric point are presented. Going further, a first determination of the chiral dependence of the light and strange hadron spectrum is shown, which serves to check the quality of the action for precision measurements. We also investigate other quantities such as flowed gauge observables to study how the continuum limit is approached. Taken together, we observe that SWF enable us to perform stable lattice simulations across a large range of parameters in mass, volume and lattice spacing. Pooling resources in our new initiative has made the reported progress possible, and through it we will share generated gauge ensembles under an open science philosophy.
The so-called Columbia plot summarises the order of the QCD thermal transition as a function of the number of quark flavours and their masses. Recently, it was demonstrated that the first-order chiral transition region, as seen for Nf∈[3,6] on coarse lattices, exhibits tricritical scaling while extrapolating to zero on sufficiently fine lattices. Here we extend these studies to imaginary baryon chemical potential. A similar shrinking of the first-order region is observed with decreasing lattice spacing, which again appears compatible with a tricritical extrapolation to zero.
This article presents a multiscale approach for detecting and monitoring soil erosion phenomena (i.e. gully erosion) in the agro-industrial area around the city of Taroudannt, Souss basin, Morocco. The study area is characterized as semi-arid with an annual average precipitation of 200 mm. Water scarcity, high population dynamics and changing land use towards huge areas of irrigation farming present numerous threats to sustainability. The agro-industry produces citrus fruits and vegetables in monocropping, mainly for the European market. Badland areas strongly affected by gully erosion border the agricultural areas as well as residential areas. To counteract the significant loss of land, land-leveling measures are attempted to create space for plantations and greenhouses. In order to develop sustainable approaches to limit gully growth the detection and monitoring of gully systems is fundamental. Specific gully sites are monitored with unmanned aerial vehicle (UAV) taking small-format aerial photographs (SFAP). This enables extremely high-resolution analysis (SFAP resolution: 2-10 cm) of the actual size of the gully channels as well as a detailed continued surveillance of their growth. Transferring the methodology on a larger scale using Quickbird satellite data (resolution: 60 cm) leads to the possibility of a large-scale analysis of the whole area around the city of Taroudannt (Area extent: ca. 350 km²). The results will then reveal possible relationships of gully growth and agro-industrial management and may even illustrate further interdependencies. The main objective is the identification of areas with high gully-erosion risk due to non-sustainable land use and the development of mitigation strategies for the study area.
Communist regimes in general, and especially the one in Albania, destroyed almost every aspect of political, social, cultural and economic life, including the notion of pluralism and the intellectual elite of the country. In Albania, the transition to democracy in the 1990s took place through extrication, which means that the authoritarian government was weakened, but not as thoroughly as in a transition by defeat. As a consequence, the former communist elite was able to negotiate crucial features of the transition and was very quickly transformed into the new pluralist political class. This position enabled the communist elite to be rehabilitated and, together with the newly emerged communist elite, to remain a strongly influential actor in the newly emerged democracy and de facto to continue running the country. The newly emerged communist elite's aim of maintaining control was favored, inter alia, by the absence of a new strong intellectual elite, and was pursued merely by sharing power among its members, divided into different political parties, and by using the 'pluralist' law as a tool for social control over newly emerging intellectual elites. The use of law as a tool for social control by the political class has severely damaged people's understanding of and expectations towards the law and its relations with the state as well as the international community. Indeed, this experience of the use of law by the political class for its own narrow interests has made people lose confidence in law and state and has severely weakened law enforcement in the country. To conclude, the overall purpose of this paper is to analyze law in general and its understanding and development in a post-communist society such as Albania from different points of view.
It is a long-discussed issue whether light scalar mesons have sizeable four-quark components. We present an exploratory study of this question using Nf = 2+1+1 twisted mass lattice QCD. A mixed action approach ignoring disconnected contributions is used to calculate correlator matrices consisting of mesonic molecule, diquark-antidiquark and two-meson interpolating operators with the quantum numbers of the scalar mesons a0(980) (1(0++)) and κ (1/2(0+)). The correlation matrices are analyzed by solving the generalized eigenvalue problem. The theoretically expected free two-particle scattering states are identified, while no additional low-lying states are observed. We do not observe indications for bound four-quark states in the channels investigated.
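The generalized eigenvalue problem referred to is the standard variational one, C(t) vₙ = λₙ(t) C(t₀) vₙ, whose eigenvalues decay as exp(−Eₙ(t − t₀)) at large times. A minimal SciPy sketch follows; the function name and the correlator array are placeholders, not the analysis code of the paper.

```python
import numpy as np
from scipy.linalg import eigh

def gevp_energies(C, t0):
    """Effective energies from a correlator matrix C[t, i, j].

    Solves C(t) v = lambda(t) C(t0) v for each t > t0; the sorted
    eigenvalues behave as exp(-E_n (t - t0)), so the log-ratio of
    eigenvalues at neighbouring times gives effective energies E_n(t).
    C(t0) must be positive definite (choose t0 large enough).
    """
    T = C.shape[0]
    lam = np.array([
        np.sort(eigh(C[t], C[t0], eigvals_only=True))[::-1]
        for t in range(t0 + 1, T)
    ])
    return np.log(lam[:-1] / lam[1:])   # one column per level n
```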
Are Kantian philosophy and its principle of respect for persons inadequate to the protection of environmental values? This paper answers this question by elucidating how Kantian ethics can take environmental values seriously. In the period that starts with the Critique of Judgment in 1790 and ends with the Metaphysics of Morals in 1797, the subject would have been approached by Kant in a different manner; although the respect that we may owe to non-human nature is still grounded in our duties to mankind, the basis for such respect stems from nature’s aesthetic properties, and the duty to preserve nature lies in our duties to ourselves. Compared to the “market paradigm”, as it is called by Gillroy (the reference is to a conception of a public policy based on a criterion of economic efficiency or utility), Kantian philosophy can offer a better explanation of the relationship between environmental policy and the theory of justice. Kantian justice defines the “just state” as the one that protects the moral capacities of its “active” citizens, as presented in the first Part of the Metaphysics of Morals. In the Kantian paradigm, the environmental risk becomes a “public” concern. That means it is not subsumed under an individual decision, based on a calculus.
In the year 2000 the Deutsche Initiative für Netzwerkinformation (DINI) / German Coalition of Network Information was founded: the 10 theses "Changes in information infrastructure – challenges to universities and their information and communications facilities" are DINI's founding charter (see http://www.dini.de).
Thesis 4 states: "The universities need to establish information management structures to integrate departments. University managements, departments and central institutions ought to prepare a university development plan for the areas of information, communication and multimedia." ...
Doubt about certainty as an absolute value in law and as an ideal fully realized in the legal system (the argument about impossibility) is a controversial issue in contemporary legal theory. In this text I examine some contemporary doctrines that take a critical view of the classic understanding of this ideal. I have selected the most representative doctrines: the doctrine of the "open texture of law" (H.L.A. Hart), the starting point of this discussion; the doctrine of "il diritto mite" (G. Zagrebelsky), from the present continental European legal tradition; and the doctrine of "vagueness in law" (T.A.O. Endicott), the most recent, from the Anglo-Saxon legal tradition. Finally, in the conclusions, I analyze whether this doubt (the argument about impossibility) contaminates (in some sense) the concept of law or the characteristics that describe law in the contemporary constitutional state.
The CBM experiment will investigate heavy-ion collisions at beam energies from 8 to 45 AGeV at the future accelerator facility FAIR. The goal of the experiment is to study the QCD phase diagram in the vicinity of the QCD critical point. To do so, CBM aims at measuring rare probes, among them open charm. In order to identify these rare and short-lived particles despite the rich combinatorial background generated in heavy-ion collisions, a micro vertex detector (MVD) providing an unprecedented combination of high rate capability and radiation hardness, very light material budget and excellent granularity is required. In this work, we discuss the concept of this detector and summarize the status of the R&D.
The Compressed Baryonic Matter (CBM) experiment [1] is a fixed-target heavy-ion experiment that will operate at the international Facility for Antiproton and Ion Research (FAIR) [2], now under construction in Darmstadt, Germany. The experiment intends to study rare probes, which are emitted from heavy-ion collisions at beam energies of 4 to 45 AGeV. A focus is laid on the short-lived open charm particles and on particles decaying into di-lepton pairs. Handling the up to 10^7 Au+Au collisions/s required for generating those probes with sufficient statistics, as well as reaching the required sensitivity for observing them, forms a major challenge for the silicon detectors of the experiment. We present the concept and the development status of two central detectors of CBM, the CMOS-pixel-based micro vertex detector (MVD) and the micro-strip-based silicon tracking system (STS).
22nd International Workshop on Vertex Detectors, 15-20 September 2013 Lake Starnberg, Germany
Chopper systems are used to pulse charged particle beams. In most cases, electric deflection systems are used to generate beam pulses of defined lengths and appropriate repetition rates. At high beam intensities, the field distribution of the chopper system needs to be adapted precisely to the beam dynamics in order to avoid aberrations. An additional challenge is a robust design which guarantees reliable operation. For the Frankfurt Neutron Source FRANZ, an E×B chopper system is being developed which combines static magnetic deflection with a pulsed electric field in a Wien filter configuration. It will generate proton pulses with a flat top of 50 ns at a repetition rate of 250 kHz for 120 keV, 200 mA beams. For the electric deflection, pre-experiments with static and pulsed fields were performed using a helium ion beam. In pulsed-mode operation, ion beams of different energies were deflected with voltages of up to ±6 kV and the resulting response was measured using a beam current transformer. A comparison between experiments, theoretical calculations and numerical simulations is presented.
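The Wien-filter configuration rests on the usual crossed-field force balance: the static magnetic field deflects the beam into a dump, and only while the pulsed electric field is on does the net transverse force vanish. The relation below is the standard textbook form, stated here for orientation rather than as the paper's design equation.

```latex
\begin{equation}
  F_\perp = q\,(E - v\,B) = 0
  \quad\Longleftrightarrow\quad
  v = \frac{E}{B},
\end{equation}
% so the statically deflected beam is transmitted only during the
% flat top of the electric pulse, which defines the 50 ns beam pulse.
```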
The influence of an ac current of arbitrary amplitude and frequency on the mixed-state dc-voltage-ac-drive tilting-ratchet response of a superconducting film with uniaxial cosine pinning potential at finite temperature is theoretically investigated. The results are obtained in the single-vortex approximation, within the framework of an exact solution of the Langevin equation for non-interacting vortices. Both experimentally achievable, the dc ratchet response and the absorbed ac power are predicted to demonstrate a pronounced filter-like behavior at microwave frequencies. Based on our findings, we propose a cut-off filter and discuss its operating curves as functions of the driving parameters, i.e., ac amplitude, frequency, and dc bias. The predicted results can be examined, e.g., on superconducting films with a washboard pinning potential landscape.
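For a cosine (washboard) pinning potential, the single-vortex Langevin equation referred to has the schematic form below; the notation is illustrative, following the usual conventions for this class of models rather than the paper's exact definitions.

```latex
\begin{equation}
  \eta\,\dot{x} = F^{\mathrm{dc}} + F^{\mathrm{ac}}\cos(\omega t)
                - \frac{\mathrm{d}U_p(x)}{\mathrm{d}x} + \xi(t),
  \qquad
  U_p(x) = -\frac{U_0}{2}\cos\!\left(\frac{2\pi x}{a}\right),
\end{equation}
% with vortex viscosity \eta and thermal noise obeying
% \langle \xi(t)\,\xi(t') \rangle = 2\eta k_B T\,\delta(t - t').
```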
The demarcation of authority between parents and the State regarding education of children has become an increasingly complex issue over the past three decades. During the same period the number of parents around the world choosing educational alternatives such as homeschooling has grown exponentially, causing significant legislative and jurisprudential shifts in the United States as well as other Western nations. If the State is responsible for education or has a significant interest therein, then it must have broad authority by which to prescribe the method, mechanism, and acceptable outcomes of education; it must also be able to review and enforce these desired outcomes. If parents, on the other hand, are responsible, then it is the State’s duty to defer to parents absent a compelling reason to interfere. A survey of the philosophical foundations from ancient to modern times demonstrates the tension between the State and parents in the realm of education; however, modern human rights norms contained in post-1945 international human rights documents provide explicit grounds on which the State must defer to parental choice in education.
An optimized design of a stellarator-type storage ring for low-energy ion beams was numerically investigated. The magnetic field variation along the circumference, and therefore magnetic heating, is suppressed by using simple circular correction coils. Particle-in-Cell (PIC) simulations in a magnetic flux coordinate system show the ability to accumulate high-current ion beams in such a configuration, with unique features for clockwise and anticlockwise moving beams. Additionally, scaled-down experiments with two 30-degree room-temperature toroidal segments were performed to demonstrate toroidal transport and to develop optical beam diagnostics. Properties of multi-component beams, redistribution of transverse momenta in the non-adiabatic part of the experimental configuration, and investigation of strongly confined beam-induced electron clouds will be addressed.
Background: The extent of preoperative peritumoral edema in glioblastoma (GBM) has been negatively correlated with patient outcome. As several ongoing studies are investigating T-cell-based immunotherapy in GBM, we conducted this study to assess whether peritumoral edema, with potentially increased intracranial pressure, disrupted tissue homeostasis and reduced local blood flow, influences immune infiltration and affects survival.
Methods: A volumetric analysis of preoperative imaging (gadolinium-enhanced T1-weighted MRI sequences for tumor size and T2-weighted sequences for extent of edema, including the infiltrative zone, gliosis, etc.) was conducted in 144 patients using the Brainlab® software. Immunohistochemical staining was analyzed for lymphocytic (CD3+) and myeloid (CD15+) tumor infiltration. A retrospective analysis of patient, surgical, and molecular characteristics was performed using medical records.
Results: The edema to tumor ratio was associated neither with progression-free nor with overall survival (p=0.90, p=0.74). However, GBM patients with IDH-1 wildtype had a significantly higher edema to tumor ratio than patients with an IDH-1 mutation (p=0.01). Immunohistopathological analysis did not show significant differences in lymphocytic or myeloid tumor infiltration (p=0.78, p=0.74) between these groups.
Conclusion: In our cohort, the edema to tumor ratio had no significant correlation with immune infiltration or outcome. However, patients with an IDH-1 wildtype GBM had a significantly higher edema to tumor ratio than their IDH-1 mutated peer group. Further studies are necessary to elucidate the underlying mechanisms.
The growing importance of digital documents has triggered activities on how to deal with them. One line came from the more general field of "scientific publishing", which was handled in detail by DINI (Deutsche Initiative für Netzwerkinformation). But for this initiative long-term archiving was only one field of many and was not its primary focus. DINI first of all concentrated on the elaboration of effective and standardized methods and tools for publishing and related services on the basis of an open access policy via the use of institutional repositories. The second line of projects came from the more general view of maintaining cultural heritage in a digital world as well. Especially under the patronage of the Ministry of Education and Research, important projects have been financed. Strategic solutions involving archives, libraries, and museums are discussed and elaborated within NESTOR, while more technical solutions oriented towards practicability are developed within KOPAL. KOPAL brought together industry (IBM), a publicly funded technical center (GWDG), and two libraries (DNB and SUB Göttingen). Within this project a general software implementation, which takes into consideration all necessary international standards, was completed last month and has now been available for about two weeks. Based on early results within NESTOR, it also seemed important to strengthen all activities by giving them a legal basis. Therefore, when the law concerning the German National Library was changed as of June 22nd of this year (DNBG), the library was equipped with all the necessary instruments to collect digital documents in "non-physical" form as well. With this law, Germany is at the moment in the rare position of being one of the few countries where the collection of network publications is part of the overall legal deposit strategy.
E-democracy as the frame of networked public discourse : information, consensus and complexity
(2012)
The quest for democracy and the political reflection about its future are to be understood nowadays in the horizon of the networked information revolution. Hence, it seems difficult to speak of democracy without speaking of e-democracy, the key issue of which is the re-configuration of models of information production and concentration of attention, which are to be investigated both from a political and an epistemological standpoint. In this perspective, our paper aims at analyzing the multi-agent dimension of networked public discourse, by envisaging two competing models of structuring this discourse (those of dialogue and of claim) and by suggesting the epistemic idea of complementarity as a guiding principle for elaborating a form of partnership between traditional and electronic media.
The paper is concerned with the Hartian idea that the justification of law's normativity can be traced back to a distinctive social fact, viz. a special kind of social convention. After discussing the view that the rule of recognition is a coordinative convention, A. Marmor's idea of constitutive convention is introduced. Relying on J. Dickson's brilliant enquiry, I finally argue that this latter idea is deprived of any explanatory power, which was presupposed by H.L.A. Hart when he himself referred to the conventional rule of recognition as a social fact having full normative significance.
At GSI a new superconducting (sc) continuous wave (cw) LINAC is under design in cooperation with the Institute for Applied Physics (IAP) of Frankfurt University and the Helmholtz Institut Mainz (HIM). This proposed LINAC is strongly requested by a broad community of future users to fulfill the requirements of nuclear chemistry, nuclear physics, and especially the research field of Super Heavy Elements (SHE). In this context the preliminary layout of the LINAC has been carried out by IAP. The main acceleration of up to 7.3 AMeV will be provided by nine sc Crossbar-H-mode (CH) cavities operated at 217 MHz. Currently, a prototype of the cw LINAC, serving as a demonstrator, is under development. The demonstrator comprises a sc CH cavity embedded between two sc solenoids mounted in a horizontal cryomodule. A full performance test of the demonstrator in 2013/14, by injecting and accelerating a beam from the GSI High Charge Injector (HLI), is an important milestone of the project. The status of the demonstrator is presented.
This article summarizes some of the current theoretical developments and the experimental status of hypernuclei in relativistic heavy-ion collisions and elementary collisions. In particular, the most recent results on hyperhydrogen of mass A = 3 and 4 are discussed. The highlight at SQM2022 in this respect was the discovery of anti-hyperhydrogen-4 by the STAR Collaboration, in a large data set consisting of different collision systems. Furthermore, the production yields of hyperhydrogen-4 and hyperhelium-4 from the STAR Collaboration can be described nicely by the thermal model when the excited states of these hypernuclei are taken into account. In contrast, the production measurements in small systems (pp and p–Pb) from the ALICE Collaboration tend to favour the coalescence model over the thermal description. New measurements from the STAR, ALICE and HADES Collaborations of the properties, e.g. the lifetime, of A = 3 and 4 hypernuclei yield consistent results. Also the anti-hyperhydrogen-4 lifetime is in rather good agreement with previous measurements. Interestingly, the new STAR measurement of the R3 value, which is connected to the branching ratio, points to a Λ separation energy that is below 100 keV but still consistent with the value of 130 keV assumed since the 1970s.
An overview is given of the experimental study of physics with relativistic heavy-ion collisions, with emphasis on recent measurements at the Large Hadron Collider (LHC) and the Relativistic Heavy Ion Collider (RHIC). The focus here is laid on p–Pb collisions at the LHC and the corresponding d–Au measurements at RHIC. The topics covered are “collectivity and approach to equilibrium”, “high pT and jets”, “heavy flavour and electroweak bosons” and “search for exotic objects”.
The democratic rule of law has been struggling with the recurring problem of the pluralism of values. It is therefore still faced with the dilemma of ordering the relationship of law and ethics, namely with the question whether, in legal solutions, priority is granted to ethics or to law. Where the positivist paradigm dominates, this is all the more important because the ethical issue is marginalized in it. It turns out that the same authority, deciding on similar issues at the junction of the two areas of ethics and law, can make mutually contradictory decisions: at one time giving priority to ethics, at another to positive law. On closer analysis, this contradiction proves illusory, because under the guise of protecting the positivist paradigm lies the hidden fact that an axiological decision underlies the resolution concerning law. This decision protects the values that rank highest in the scale of preferences of the decision-making body. The example considered in the article concerns the interface between ethical and legal norms in selected rulings of the Constitutional Court. The doubts that arise in this context may in future be avoided or, if necessary, resolved by adopting a two-aspect model of the legal norm. This model, in its vertical approach, has an evaluative element. This allows the seemingly contradictory decisions in similar cases to be deemed justified. It also shows that in practice the rightness of the resolution takes precedence both over ethics and over law.
The paper will provide a brief background to the history of the organization and cooperative efforts of African studies librarians in the United States, including their efforts at international cooperation. Particular emphasis will be placed on the current opportunities for improved cooperation as digitization activities increase. Examples will include the DISA and Aluka initiatives as well as the Timbuktu manuscript digitization project at the Center for Research Libraries. Special emphasis will be placed on the possibilities for German-North American cooperation in the area of digital projects of historical photographs, given the extensive collections held at Northwestern and Frankfurt.
Development of fragmented low-Z ion beams for the NA61 fixed-target experiment at the CERN SPS
(2011)
The NA61 experiment aims to study the properties of the onset of deconfinement at low SPS energies and to find signatures of the critical point of strongly interacting matter. A broad range of the T–μB phase diagram will be covered by performing an energy (13A–158A GeV/c) and system-size (p+p, Be+Be, Ar+Ca, Xe+La) scan. In a first phase, fragmented ion beams of 7Be or 11C, produced as secondaries with the same momentum per nucleon when the incident primary Pb-ion beam hits a thin Be target, will be used. The H2 beam line that transports the beam to the experiment acts as a double spectrometer which, combined with a new thin target (degrader) in which fragments lose energy proportional to the square of their charge, allows the separation of the desired A/Z fragments. Thin scintillators and a TOF measurement for the low-energy points are used as particle identification devices. In this paper, results from the first test of the fragmented ion beam, performed in 2010, are presented, showing that a pure Be beam can be obtained, satisfying the needs of the experiment.
Friedrich Schlegel's lasting contribution to linguistics is usually seen in the impact that his book "Über die Sprache und Weisheit der Indier" from 1808 left on comparative linguistics and on the study of Sanskrit. Schlegel was one of the first European scholars to have studied Sanskrit extensively and he made a number of translations of Sanskrit literature into German which make up one third of "Über die Sprache und Weisheit der Indier". Schlegel's book is widely regarded as a founding document both of comparative linguistics and of indology, a fact which is quite remarkable in light of the development of Schlegel's thought after this text. His interest in Indian studies ceased more or less directly with the publication of this work, while his thoughts on language became more and more suffused by transcendental philosophy.
Computation of masses of quarkonium bound states using heavy quark potentials from lattice QCD
(2022)
We compute masses of bottomonium and charmonium bound states using a Schrödinger equation with a heavy quark-antiquark potential including 1/m and 1/m² corrections previously derived in potential Non-Relativistic QCD and computed with lattice QCD. This is a preparatory step for a future project, where we plan to take into account similar corrections to study quarkonium resonances and tetraquarks above the lowest meson-meson thresholds.
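Schematically, such a computation amounts to solving a radial Schrödinger equation of the type

\[ \left[-\frac{1}{m}\frac{d^2}{dr^2} + \frac{l(l+1)}{m r^2} + V^{(0)}(r) + \frac{V^{(1)}(r)}{m} + \frac{V^{(2)}(r)}{m^2}\right] u_{nl}(r) = E_{nl}\,u_{nl}(r), \]

where m is the heavy quark mass (the reduced mass of the equal-mass pair is m/2) and the bound-state mass follows as M_{nl} = 2m + E_{nl}. The precise normalization of the 1/m and 1/m² correction terms depends on the convention used; this sketch only illustrates the structure.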
Human rights and climate policy – toward a new concept of freedom, protection rights, and balancing
(2012)
Neither the scope of “protection obligations” which are based on fundamental rights, nor the theory of constitutional balancing, nor the issue of “absolute” minimum standards (fundamental rights nuclei, “Grundrechtskerne”), which have to be preserved in the balancing of fundamental rights, can be considered satisfactorily resolved – in spite of intensive, long-standing debates. On closer analysis, the common case-law definitions turn out to be not always consistent. This is true both in general and with respect to environmental fundamental rights at the national, European, and international level. Regarding the theory of balancing, the usual principle of proportionality also proves specifiable for the purpose of a clear balance of powers. This allows a new analysis of whether fundamental rights have absolute cores. This question does not only apply to human dignity and the German Aviation Security Act, but also arises when environmental policy accepts death, e.g. with regard to climate change. Overall, it turns out that an interpretation of fundamental rights which is more multipolar and weighs the conditions for freedom more heavily – as well as the freedom of future generations and of people in other parts of the world – develops a greater commitment to climate protection.
p-process nucleosynthesis via proton-capture reactions in thermonuclear supernovae explosions
(2015)
Model calculations within the framework of the so-called γ process show an underproduction of the p nucleus with the highest isotopic abundance, 92Mo. This discrepancy can be narrowed by taking into account the alternative production site of a type Ia supernova explosion. Here, the nucleus 92Mo can be produced by a sequence of proton-capture reactions. The amount of 92Mo nuclei produced via this reaction chain is most sensitive to the reactions 90Zr(p,γ) and 91Nb(p,γ). Both rates have to be investigated experimentally to study the impact of this nucleosynthesis aspect on the long-standing 92Mo problem. We have already measured the proton-capture reaction on 90Zr using high-resolution in-beam γ-ray spectroscopy. In this contribution, we will present our preliminary results for the total cross sections as well as the partial cross sections. Furthermore, we plan to measure the 91Nb(p,γ) reaction soon. Due to the radioactive target material, the 91Nb nuclei have to be produced prior to the experiment. The current status of this production will be presented in this contribution.
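The reaction chain referred to above is

\[ {}^{90}\mathrm{Zr}(p,\gamma)\,{}^{91}\mathrm{Nb}(p,\gamma)\,{}^{92}\mathrm{Mo}, \]

which is why the rates of both capture reactions directly enter the predicted 92Mo abundance.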
As microscopic transport models usually have difficulties dealing with in-medium effects in heavy-ion collisions, we present an alternative approach that uses coarse-grained output from transport calculations with the UrQMD model to determine thermal dilepton emission rates. A four-dimensional space-time grid is set up to extract local baryon and energy densities, and from them temperature and baryon chemical potential. The lepton pair emission is then calculated for each cell of the grid using thermal equilibrium rates. In the current investigation we include the medium-modified ρ spectral function by Eletsky et al., as well as contributions from the QGP and four-pion interactions for high collision energies. First dielectron invariant mass spectra for Au+Au collisions at 1.25 AGeV and for dimuons from In+In at 158 AGeV are shown. At 1.25 AGeV a clear enhancement of the total dilepton yield as compared to a pure transport result is observed. In the latter case, we compare our outcome with the NA60 dimuon excess data. Here a good agreement is achieved, but the yield in the low-mass tail is underestimated. In general the results show that the coarse-graining approach gives reasonable results and can cover a broad collision-energy range.
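For orientation, thermal dilepton rates of this kind are conventionally expressed through the electromagnetic spectral function; a standard schematic form (not quoted from the paper) is

\[ \frac{dN_{ll}}{d^4x\,d^4q} = -\frac{\alpha_{\mathrm{em}}^2}{\pi^3 M^2}\,\frac{\operatorname{Im}\Pi_{\mathrm{em}}(M,|\vec{q}\,|;T,\mu_B)}{e^{q_0/T}-1}, \]

evaluated cell by cell with the local temperature T and baryon chemical potential μ_B taken from the coarse-grained transport output; the medium-modified ρ enters through Im Π_em.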
The main purpose of my article is to discuss what GMOs are, the controversies about this specific issue, and the related regulations that are put forward by the authorities. GMOs are genetically altered organisms which have been widely produced and bred in certain parts of the world. According to some experts, this special practice of agriculture emerged in order to put an end to famine and prevent food scarcity. As growing GMOs seems to be more convenient than traditional farming, it is better suited to producing food on a large scale, which would be a fine solution for food scarcity. However, there are some objections to GMOs. It is strongly believed that the real cause of famine is not related to production; it is a problem of the distribution of food. Moreover, patenting the seeds leads to an unstoppable control and dominance over food by private enterprises. Therefore, the opponents state that the aims of these companies are solely financial gain and monopolisation of food production. Patenting the seeds is another arguable issue. It poses a great threat to organic farmers, since GMO seeds can contaminate other crops through natural ways. This is not the only danger that organic farmers face; they can also be sued by GMO producers for this unintended exposure to GMO seeds. Not only the diminishing variety of species but also the possible adverse effects of GMOs on human health create a debate between the two groups. These are not the only topics that are open to discussion. In addition, the labelling of products creates a huge problem among poorly educated consumers, as it has not been clearly regulated in some countries. Hence, this subject, having such a close connection to human health, cannot be ignored by the law. In fact, a number of countries have enacted legislation in order to regulate this sensitive field. Turkey, having been dependent on the import of agricultural goods for a period of time, has joined these countries with a recent piece of legislation. All these contemporary issues for Turkey will be highlighted in my article.
ALICE is one of the four major LHC experiments at CERN. When the accelerator enters the Run 3 data-taking period, starting in 2021, ALICE expects almost 100 times more Pb-Pb central collisions than now, resulting in a large increase of data throughput. In order to cope with this new challenge, the collaboration had to extensively rethink the whole data processing chain, with a tighter integration between the Online and Offline computing worlds. Such a system, code-named ALICE O2, is being developed in collaboration with the FAIR experiments at GSI. It is based on the ALFA framework, which provides a generalized implementation of the ALICE High Level Trigger approach, designed around distributed software entities coordinating and communicating via message passing.
We will highlight our efforts to integrate ALFA within the ALICE O2 environment. We analyze the challenges arising from the different running environments for production and development, and derive requirements for a flexible and modular software framework. In particular, we will present the ALICE O2 Data Processing Layer, which deals with ALICE-specific requirements in terms of the data model. The main goal is to reduce the complexity of developing algorithms and managing a distributed system, thereby leading to a significant simplification for the large majority of ALICE users.
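As a purely illustrative sketch of the message-passing style described above – plain C++ threads and a queue, emphatically not the ALFA or O2 API, with all names invented for the example:

// Minimal sketch of two "devices" exchanging messages through a
// thread-safe channel, mimicking the decoupled processing entities
// described in the text. Illustrative only; not the ALFA/O2 API.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

class MessageQueue {
public:
    void send(std::string msg) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(std::move(msg));
        }
        cv_.notify_one();
    }
    std::string receive() {
        std::unique_lock<std::mutex> lock(mutex_);
        cv_.wait(lock, [this] { return !queue_.empty(); });
        std::string msg = std::move(queue_.front());
        queue_.pop();
        return msg;
    }
private:
    std::queue<std::string> queue_;
    std::mutex mutex_;
    std::condition_variable cv_;
};

int main() {
    MessageQueue channel;
    // Producer "device": emits raw data messages.
    std::thread producer([&channel] {
        for (int i = 0; i < 3; ++i)
            channel.send("raw-timeframe-" + std::to_string(i));
        channel.send("EOS");  // end-of-stream marker
    });
    // Processor "device": consumes and transforms messages.
    std::thread processor([&channel] {
        for (std::string msg = channel.receive(); msg != "EOS";
             msg = channel.receive())
            std::cout << "processed " << msg << '\n';
    });
    producer.join();
    processor.join();
}

In the real system the entities are separate processes communicating over a transport layer rather than threads in one process, but the design idea – decoupling producers and processors through an explicit message channel – is the same.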
Technocracy is usually opposed to democracy. Here, another perspective is taken: technocracy is countered with the rule of law. In trying to understand the contemporary dynamics of the rule of law, two main types of legal systems (in a broad sense) have to be distinguished: firstly, the legal norm, studied by the science of law; secondly, the scientific laws (which include the legalities of the different sciences and communities). They both contain normative prescriptions. But they differ in their source: while legal norms are the expression of the will of the normative authority, technical prescriptions can be derived from scientific laws, which are grounded in the commonly supposed objectivity of scientific knowledge about reality. They both impose sanctions too, but in the legal norm these refer to what is established by the norm itself, while in the scientific legality they consist in the reward or punishment derived from the efficacy or inefficacy of the action in reaching its pursued end. The mode of legitimation also differs: while legal norms must have followed the formal procedures and must not have contravened any fundamental right, the validity of technical norms depends on their theoretical foundations or on their efficacy. Nowadays, scientific knowledge has become an important feature in policy-making. Contradictions can arise between these legal systems. These conflicts are especially grave when the recognition or exercise of fundamental rights is used instrumentally, or when such rights are violated in order to increase a policy's efficacy. A political system is technocratic when, in case of contradiction, the scientific law finally prevails.
Focus expressions in Foodo
(2006)
Focus expressions in Yom
(2005)
Focus in Gur and Kwa
(2006)
The project investigates focus phenomena in the two genetically related West African Gur and Kwa language groups of the Niger-Congo phylum. Most of their members are tone languages; they are similar with respect to word order typology (all are SVO languages), but of divergent morphological type (agglutinating Gur versus isolating Kwa).
0. Introduction
1. Observations concerning the structure of morphosyntactically marked focus constructions
1.1 First observation: SF vs. NSF asymmetry
1.2 Second observation: NSF-NAR parallelism
1.3 Affirmative ex-situ focus constructions (SF, NSF), and narrative clauses (NAR)
2. Grammaticalization
2.1 Cleft hypothesis
2.2 Movement hypothesis
2.3 Narrative hypothesis
2.3.1 Back- or Foregrounding?
2.3.2 Converse directionality of FM and conjunction
3. Language specific analysis
4. Conclusionary remarks
References
The role of experts grows in the present, and that is, in part, justifiable: as complexity rises, those who deliberate feel the need for the help of those who have know-how in specific fields. The question that must be asked revolves around the type of expectations developed in modern societies regarding what experts can do. Though specialization is not a peculiarity of our time (the process can be observed since human beings became sedentary), it has presently gained specific characteristics. Two aspects of modern life are particularly significant in that matter: (i.) the fact that the economic system is based on the excitation of new needs (and no longer on the demand for satisfaction of needs); (ii.) the growing pursuit of total administration of conflicts. These factors are constitutive of what Gadamer sees as a great threat to our civilization: the excessive emphasis given in our time to the human ability to adapt. A specific ability is demanded from individuals: the capability of making an apparatus function properly. Less resistance and more adaptability are requested, and because of that, autonomous thought – that is, thought not determined by the function it has in a system – is devalued. The threat we currently face is that the abilities of a good technocrat become the only qualities demanded from those who are responsible for practical decisions (especially in politics and law). Teleological reason, which guides the activity of specialists (it requires know-how in a specific area and consists in choosing means to reach a previously established goal), should not substitute for practical reason, as the latter requires adaptability to experience (not to a previously established plan) and is grounded on solidarity. In order to discuss the limits of the activity of specialists, the paper looks back to phrónesis and the way the ancient Greeks set boundaries – this exercise should help raise new questions revolving around the matter.
The hadronic final state of central Pb+Pb collisions at 20, 30, 40, 80, and 158 AGeV has been measured by the CERN NA49 collaboration. The mean transverse mass of pions and kaons at midrapidity stays nearly constant in this energy range, whereas at lower energies, at the AGS, a steep increase with beam energy was measured. Compared to p+p collisions as well as to model calculations, anomalies in the energy dependence of pion and kaon production at lower SPS energies are observed. These findings can be explained, assuming that the energy density reached in central A+A collisions at lower SPS energies is sufficient to transform the hot and dense nuclear matter into a deconfined phase.
In this article the author, in the context of the fiftieth anniversary of H.L.A. Hart’s “The Concept of Law”, reconsiders the moderate indeterminacy of law thesis, which derives from the open texture of language. For that purpose, he intends: first, to analyze Hart’s moderate indeterminacy thesis, i.e. determinacy in “easy cases” and indeterminacy in “hard cases”, which resembles Aristotle’s "doctrine of the mean"; second, to criticize his moderate indeterminacy thesis as failing to embody the virtues of a center in between the vices of the extremes, by insisting that the exercise of discretion required constitutes an “interstitial” legislation; and, third, to reorganize an argument for a truly “mean” position, which requires a form of weak interpretative discretion, instead of a strong legislative discretion.
In order to understand the impact of new technologies on the law through the science of law, it is essential to observe how legal research is done. This paper draws on the following models of legal science: analytical (theory of the formal rule), hermeneutic (theory of interpretation) and empirical (theory of decision), to appraise methodological procedures used in monograph research in some Brazilian law courses. The aim of this study was to detect which model of legal science was used in the development of legal research. The study was conducted through interviews with Juris Doctors. All of the respondents have written a monograph, which is a requirement for completing a law course in Brazil. The main conclusions of this study were the following: 1) most of the monographs produced do not specify the methodology used for developing the work; 2) when the papers indicate the methodology used, the analytical model was prevalent; in these cases, the science of law appears as a systematization of rules for obtaining possible decisions; 3) hermeneutic and empirical models were also used, but on a smaller scale. This research revealed the inaccuracy of the methodological tools used to apprehend reality. However, these strategies are significant for defining the objects of study of law in contemporary times. Answering the question about how legal research is done in some Brazilian law schools, this paper discusses the construction of the classical models of the science of law, which were taken as the theoretical framework of this work, against the backdrop of today's hypercomplex problems.
It is widely believed that chiral symmetry is spontaneously broken at zero temperature in the strong coupling limit of staggered fermions, for any number of colors and flavors. Using Monte Carlo simulations, we show that this conventional wisdom, based on a mean-field analysis, is wrong. For sufficiently many fundamental flavors, chiral symmetry is restored via a bulk, first-order transition. This chirally symmetric phase appears to be analytically connected with the expected conformal window of many-flavor continuum QCD. We perform simulations in the chirally symmetric phase at zero quark mass for various system sizes L, and measure the torelon mass and the Dirac spectrum. We find that all observables scale with L, which is hence the only infrared length scale. Thus, the strong-coupling chirally restored phase appears as a convenient laboratory to study IR-conformality. Finally, we present a conjecture for the phase diagram of lattice QCD as a function of the bare coupling and the number of quark flavors.
We discuss the diffusion currents occurring in a dilute system and show that the charge currents depend not only on gradients in the corresponding charge density, but also on those of the other conserved charges in the system – the diffusion currents are therefore coupled. Gradients in one charge thus generate dissipative currents in a different charge. In this approach, we model the Navier-Stokes term of the generated currents with a diffusion coefficient matrix, in which the diagonal entries are the usual diffusion coefficients and the off-diagonal entries correspond to the coupling of different diffusion currents. We evaluate the complete diffusion matrix for a specific hadron gas and for a simplified quark-gluon gas, including baryon, electric and strangeness charge. We find that the off-diagonal entries can be of the same order of magnitude as the diagonal ones.
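In first-order (Navier-Stokes) form, the coupled diffusion currents described above can be written as a matrix relation (our notation):

\[ \begin{pmatrix} j_B^{\mu} \\ j_Q^{\mu} \\ j_S^{\mu} \end{pmatrix} = \begin{pmatrix} \kappa_{BB} & \kappa_{BQ} & \kappa_{BS} \\ \kappa_{QB} & \kappa_{QQ} & \kappa_{QS} \\ \kappa_{SB} & \kappa_{SQ} & \kappa_{SS} \end{pmatrix} \nabla^{\mu} \begin{pmatrix} \mu_B/T \\ \mu_Q/T \\ \mu_S/T \end{pmatrix}, \]

where the off-diagonal κ entries encode the cross-coupling: a gradient in, e.g., the strangeness chemical potential generates a baryon diffusion current.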
In the intersection between law, science and technology lies the debate on the overcoming of the boundaries of the biological structure of the human being and its implications for the idea of human rights, for the concept of person and for the conception of equality – the latter being a fundamental tenet of a democracy.
Posthumanism assumes a biological inadequacy of the human body regarding the quantity, complexity and quality of information which it can muster. The same occurs with the needs of accuracy, speed or strength demanded by the contemporary environment. Under such perspective, the body is considered to be an inefficient structure, with a short lifespan, easy to break and hard to fix.
The body, always seen as the locus for the definition of the human, emerges as the object of a commodification process that seeks either to exonerate men from their burden – by a shift towards a virtual existence, totally free and rational – or to enhance them with bionic devices or drugs.
This issue has already been the subject of attention by many scholars, such as Savulescu, Rodotà, Bostrom, Fukuyama and even Habermas.
Therefore, the aim of this paper is to seek, by criticism and revision of the positions on the foreseen problems of this process, an adequate theoretical approach on issues like the concept of person and its connection with the idea of human rights in order to promote the fundamental statement that all men are equal without disregard to the values of diversity and personal identity.