One-photon and multi-photon absorption, spontaneous and stimulated photon emission, resonance Raman scattering and electron transfer are important molecular processes that commonly involve combined vibrational-electronic (vibronic) transitions. The corresponding vibronic transition profiles in the energy domain are usually determined by Franck-Condon factors (FCFs), the squared norm of overlap integrals between vibrational wavefunctions of different electronic states. FC profiles are typically highly congested for large molecular systems, and the spectra usually become poorly resolvable at elevated temperatures. The (theoretical) analysis of such spectra is even more difficult when vibrational mode mixing (Duschinsky) effects are significant, because contributions from different modes are in general not separable, even within the harmonic approximation. A few decades ago, Doktorov, Malkin and Man'ko [1979 J. Mol. Spectrosc. 77, 178] developed a coherent-state-based generating function approach and exploited the dynamical symmetry of vibrational Hamiltonians for the Duschinsky relation to describe FC transitions at zero Kelvin. Recently, the present authors extended the method to incorporate thermal, single vibronic level, non-Condon and multi-photon effects in the energy, time and probability density domains for the efficient calculation and interpretation of vibronic spectra. Herein, recent developments and the corresponding generating functions are presented for single vibronic levels related to fluorescence, resonance Raman scattering and anharmonic transitions.
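For orientation, the central quantities mentioned above can be written out explicitly; these are the standard textbook definitions (notation assumed here, not quoted from the abstract):

```latex
% Franck-Condon factor: squared norm of the overlap integral between
% vibrational wavefunctions of the initial and final electronic states
\mathrm{FCF}_{v'v''} \;=\; \bigl|\langle \chi_{v'} \,|\, \chi_{v''} \rangle\bigr|^{2}

% Duschinsky relation: the normal coordinates of one electronic state are a
% rotation (mode-mixing matrix J) plus a displacement d of those of the other
\mathbf{Q}' \;=\; \mathbf{J}\,\mathbf{Q}'' + \mathbf{d}
```

When J is non-diagonal, the multidimensional overlap integrals no longer factorize into per-mode contributions, which is exactly the congestion problem the generating-function approach addresses.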
The influence of an ac current of arbitrary amplitude and frequency on the mixed-state dc-voltage ac-drive tilting-ratchet response of a superconducting film with a uniaxial cosine pinning potential at finite temperature is theoretically investigated. The results are obtained in the single-vortex approximation, within the framework of an exact solution of the Langevin equation for non-interacting vortices. Both experimentally accessible quantities, the dc ratchet response and the absorbed ac power, are predicted to demonstrate a pronounced filter-like behavior at microwave frequencies. Based on our findings, we propose a cut-off filter and discuss its operating curves as functions of the driving parameters, i.e., ac amplitude, frequency, and dc bias. The predicted results can be examined, e.g., on superconducting films with a washboard pinning potential landscape.
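In schematic form, the single-vortex Langevin dynamics underlying such a tilting-ratchet analysis reads as follows (standard overdamped notation, assumed rather than quoted from the paper):

```latex
% Overdamped motion of a vortex in a tilted washboard pinning potential:
% viscous drag = pinning force + dc drive + ac drive + thermal noise
\eta\,\dot{x} \;=\; -\frac{dU_p(x)}{dx} + F^{\mathrm{dc}} + F^{\mathrm{ac}}\cos(\omega t) + \xi(t),
\qquad U_p(x) = -\frac{U_0}{2}\cos\!\Bigl(\frac{2\pi x}{a}\Bigr)

% Gaussian thermal noise with a correlator fixed by the
% fluctuation-dissipation theorem
\langle \xi(t)\,\xi(t')\rangle \;=\; 2\,\eta\,k_B T\,\delta(t - t')
```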
Health-care personnel (HCP) are exposed to infectious diseases throughout the course of their work. The concerns of pregnant HCP are considerable because certain otherwise mild infections may affect fetal development. We studied 424 pregnant HCP at the University Hospital Frankfurt, Germany, between March 2007 and July 2011. Serological tests were carried out for varicella zoster virus (VZV), measles, mumps, rubella (MMR), cytomegalovirus (CMV) and parvovirus B19. Our overall seroprevalence data with regard to VZV, MMR, CMV and parvovirus B19 corresponded to the general population. It was striking that only 57.1% of the study population was immune against the four vaccine-preventable diseases (MMR, VZV). Our study suggests that a comprehensive approach to improving the vaccination status of said HCP before pregnancy is paramount.
This paper describes the ongoing efforts of the authors to present ancient Greek and Roman numismatic data on the public internet, with an emphasis on efforts to integrate information from multiple sources using Linked Data and Semantic Web techniques. By way of a very modern metaphor, it is useful to think of coins as intentionally created packages of 'named entities'. Each coin was struck by a particular authority, often at a known site, and coins often make reference to familiar concepts such as deities, historical events, or symbols that were widely recognized in the ancient world. The institutions represented among the authors have deployed search interfaces that allow users to take advantage of this aspect of numismatic databases. The American Numismatic Society's database provides faceted search of its collection of over 550,000 objects. The Portable Antiquities Scheme (PAS) in the UK presents individual finds (and hoards) recorded throughout the country. The Römisch-Germanische Kommission and the University of Frankfurt (DBIS) are developing a prototype metaportal (INTERFACE) that accesses national databases of coin finds held in Frankfurt, Vienna and Utrecht. Each of these resources is beginning to explore Semantic Web/Linked Data approaches, so the role of numismatic standards is coming to the fore. DBIS and INTERFACE are developing a numismatic ontology. At the ANS and PAS, the public databases already present RDF serializations based on Dublin Core. Together, the authors have begun to explore standardization of conceptual names on the basis of the vocabulary presented at http://nomisma.org . Nomisma.org is a collaborative effort to provide stable digital representations of numismatic concepts and entities. It provides URIs for such basic concepts as 'coin', 'mint', and 'axis'. All of these are defined within the scope of numismatics but are already being linked to other stable resources where available.
This is particularly the case for mints. For example, the URI http://nomisma.org/id/corinth is intended to represent that ancient city in its role as a minter/issuer of coins. The URI is linked via the SKOS ontology to the Pleiades Gazetteer of ancient places. This allows Nomisma to serve as the basis for a common representation of the concept that an object is a coin minted at Corinth. The ANS has already deployed such relationships in its public database. The work of all these projects is very much in progress, and this paper therefore hopes to generate discussion on how multiple large projects can move forward in their own work while maintaining sufficient commonality to support large-scale research questions undertaken by diverse audiences.
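A minimal sketch of what such a link looks like in practice. The Turtle triple is hand-built here for illustration only; the Pleiades identifier used below is a placeholder, not verified against the gazetteer:

```python
# Illustrative sketch: emitting one SKOS triple that links the nomisma.org
# concept for Corinth-as-mint to a gazetteer entry. The Pleiades place id
# below is a placeholder, not a verified identifier.
NOMISMA = "http://nomisma.org/id/"
PLEIADES = "https://pleiades.stoa.org/places/"  # gazetteer base, assumed
SKOS = "http://www.w3.org/2004/02/skos/core#"

def skos_match(nomisma_id: str, pleiades_id: str) -> str:
    """Return a single Turtle triple asserting an exact-match link."""
    return (f"<{NOMISMA}{nomisma_id}> "
            f"<{SKOS}exactMatch> "
            f"<{PLEIADES}{pleiades_id}> .")

print(skos_match("corinth", "000000"))  # "000000" is a dummy place id
```

With links of this kind in place, a query engine can treat "coin minted at Corinth" uniformly across the ANS, PAS and INTERFACE datasets.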
In terms of the direction of development, I referred to Johanna Nichols' work on head-marking vs. dependent-marking. Nichols did not make reference to any languages in Tibeto-Burman, but all of the Tibeto-Burman languages that do not have verb agreement systems are solidly dependent-marking (i.e., they have marking on the nouns for case or pragmatic function); those languages with verb agreement systems, a type of head-marking, also have many dependent-marking features (of the same types as the non-pronominalized languages). The question, then, is which is older, the dependent-marking type or the head-marking (actually mixed) type?
This is the Proceedings of the "International Conference on Motivation 2012" carried out by the Special Interest Group "Motivation and Emotion" of the European Association for Research on Learning and Instruction (EARLI) in cooperation with the German Institute for International Educational Research (DIPF) and the Goethe University Frankfurt. (DIPF/author).
The aim of this paper is to give a unified account of the way that German demonstrative pronouns (henceforth: D-pronouns) like der, die and das behave (a) in sentences where they receive a coreferential interpretation, and (b) in sentences where they receive a covarying interpretation because they are in some way dependent on a quantificational expression – either via direct binding or indirectly, because the value they receive varies with the value that is assigned to the variable bound by an indefinite determiner.
The paper presents a study based on the hypothesis that wikis initiated bottom-up by students might be used more deliberately than wikis introduced top-down by teachers. It therefore examines the specific effects observed in nine different wiki projects at the University of Frankfurt, ranging from student wiki projects to wikis used in seminars and as an information tool for institutions.
Since 2007, the concept of open online courses has been emerging, leading to many discussions of this new format in blog posts and articles, especially in the US and Canada. In 2011, the first German open online course was launched, addressing the future of learning.
The article discusses the concept of open online courses and the experiences with the first German course, and offers some perspectives on further developments, some of which were implemented in a new course that started in 2012.
Diffusion of e-learning as an innovation and economic aspects of e-learning support structures
(2012)
Meanwhile, many universities and educational institutions have implemented an e-learning center or similar, often smaller, institutional units in order to support the use of new media in teaching and learning processes [1]. This paper addresses questions around the establishment of such e-learning support structures at different levels of an institution and also looks at the diffusion of e-learning as an innovation in educational institutions.
The workshop “Transdisciplinary Research on Biodiversity, Steps towards Integrated Biodiversity Research” was organized on 14-15 November 2011 in Brussels by the German-based Institute for Social-Ecological Research (ISOE) in cooperation with the European Platform for Biodiversity Research Strategy (EPBRS) and the Belgian Biodiversity Platform.
The workshop was a follow-up of the EPBRS summit “Positive Visions for Biodiversity” organized in November 2010, and its aim was to explore ways to further increase the capacities of transdisciplinary biodiversity research in Europe. It brought together researchers and experts, representatives and decision-makers from European institutions and research funding agencies, as well as members from civil society and the private sector.
In working groups, participants discussed and identified key research topics and the added value of transdisciplinary approaches for three main themes of the “Positive Visions for Biodiversity” summit:
1/ The integration of biodiversity into every part of life
2/ Values and behaviours leading to a more harmonious way of life
3/ Governance that is more transparent and effective and that balances global and local responsibilities.
During the final plenary panel discussion, participants highlighted recommendations for promoting transdisciplinary biodiversity research:
➢ Scientists have a role to play in raising awareness on the importance of biodiversity as a transdisciplinary issue.
➢ Environmental policy representatives at national and European level have to open up to and interact with other sectors to better advocate for global biodiversity agreements and mobilize more funding for transdisciplinary research on biodiversity.
➢ There is a need for scientists who are interested in communicating and advocating. The biodiversity community needs people who are able to bridge between worlds, both science and advocacy, to get transdisciplinary biodiversity topics on European research agendas.
➢ Scientific academic training should provide means and opportunities to train these new professionals to become the “in-between” links. Current educational and institutional frameworks need to be adapted to provide such training and career opportunities.
➢ Innovation should be understood in a broader sense than technology and products with market value. Research is needed on innovative ways to increase sustainable use, recycling of natural resources and learning from natural processes.
➢ The biodiversity community needs to reinforce its identity and build up larger influential groups to be able to advocate more efficiently at national and European levels.
Among the main barriers to developing and implementing efficient transdisciplinary research on biodiversity issues, the current trend in European research agendas to focus on technology- and product-oriented research is particularly detrimental. Improving advocacy on biodiversity and the implementation of transdisciplinary biodiversity research will be critical in the next decade to ensure the knowledge necessary for informing political decisions.
The human immunodeficiency virus (HIV) is currently ranked sixth among the worldwide causes of death [1]. One treatment approach is to inhibit reverse transcriptase (RT), an enzyme essential for the reverse transcription of viral RNA into DNA before integration into the host genome [2]. By using non-nucleoside RT inhibitors (NNRTIs) [3], which target an allosteric binding site, major side effects can be evaded. Unfortunately, the high genetic variability of HIV, in combination with the selection pressure introduced by drug treatment, enables the virus to develop resistance against this drug class through point mutations. This situation necessitates treatment with alternative NNRTIs that target the particular RT mutants encountered in a patient.
Previously, proteochemometric approaches have demonstrated some success in predicting the binding of particular NNRTIs to individual mutants; however, a structure-based approach may help to further improve the predictive success of such models. Hence, our aim is to rationalize the experimental activity of known NNRTIs against a variety of RT mutants by combining molecular modeling, long-timescale atomistic molecular dynamics (MD) simulation sampling and ensemble docking. Initial control experiments on known inhibitor-RT mutant complexes using this protocol were successful, and the predictivity for further complexes is currently being evaluated. In addition to predictive power, MD simulations of multiple RT mutants are providing fundamental insight into the dynamics of the allosteric NNRTI binding site, which is useful for the design of future inhibitors. Overall, it is hoped that work of this type will contribute to the development of predictive efficacy models for individual patients, and hence towards personalized HIV treatment options.
Dual- or multi-target ligands have gained increased attention in the past years due to several advantages, including simpler pharmacokinetic and pharmacodynamic properties compared to a combined application of several drugs. Furthermore, multi-target ligands often possess improved efficacy. We present a new approach for the discovery of dual-target ligands using aligned pharmacophore models combined with shape-based scoring. Starting with two sets of known active compounds, one for each target, a number of different pharmacophore models are generated and subjected to pairwise graph-based alignment using the Kabsch algorithm. Since a compound may be able to bind to different targets in different conformations, the algorithm aligns pairs of pharmacophore models sharing the same features, which need not lie at exactly the same spatial distances. Using the aligned models, a pharmacophore search on a multi-conformation database is performed to find compounds matching both models. The potentially “dual” ligands are scored by a shape-based comparison with the known active molecules using ShaEP.
Using this approach, we performed a prospective fragment-based virtual screening for dual 5-LO/sEH inhibitors. Both enzymes play an important role in the arachidonic acid cascade and are involved in inflammatory processes, pain, cardiovascular diseases and allergic reactions. Besides several new selective inhibitors, we were able to find a compound inhibiting both enzymes at low micromolar concentrations. The results indicate that the idea of aligned pharmacophore models can be successfully employed for the discovery of dual-target ligands.
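The pairwise superposition step at the heart of this pipeline is the Kabsch algorithm. A generic, self-contained sketch using NumPy is shown below; this is the textbook SVD construction, not the authors' implementation:

```python
import numpy as np

def kabsch(P, Q):
    """Rotation R minimising || (P - Pbar) @ R.T - (Q - Qbar) ||.

    Generic Kabsch superposition via SVD: a sketch of the alignment
    step only, not the authors' implementation.
    """
    Pc = P - P.mean(axis=0)                  # centre both point sets
    Qc = Q - Q.mean(axis=0)
    H = Pc.T @ Qc                            # 3x3 cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    D = np.diag([1.0, 1.0, d])
    return Vt.T @ D @ U.T                    # apply as: Pc @ R.T

# toy check: recover a 90-degree rotation about the z axis
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
P = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
Q = P @ Rz.T
R = kabsch(P, Q)
print(np.round(R, 6))
```

In a pharmacophore setting, P and Q would hold the coordinates of the matched feature points of the two models; the residual after superposition then measures how compatible the two binding hypotheses are.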
We study the implications for compact star properties of a soft nuclear equation of state determined from kaon production at subthreshold energies in heavy-ion collisions. On the one hand, we apply these results to study radii and moments of inertia of light neutron stars. Heavy-ion data provide constraints on nuclear matter at densities relevant for those stars, in particular on the density dependence of the symmetry energy of nuclear matter. On the other hand, we derive a limit for the highest allowed neutron star mass of three solar masses. For that purpose, we use the information on the nucleon potential obtained from the analysis of the heavy-ion data combined with causality of the nuclear equation of state.
We present and compare new types of algorithms for lattice QCD with staggered fermions in the limit of infinite gauge coupling. These algorithms are formulated on a discrete spatial lattice but with continuous Euclidean time. They make use of the exact Hamiltonian, with the inverse temperature beta as the only input parameter. This formulation turns out to be analogous to that of a quantum spin system. The sign problem is completely absent, at zero and non-zero baryon density. We compare the performance of a continuous-time worm algorithm and of a Stochastic Series Expansion algorithm (SSE), which operates on equivalence classes of time-ordered interactions. Finally, we apply the SSE algorithm to a first exploratory study of two-flavor strong coupling lattice QCD, which is manageable in the Hamiltonian formulation because the sign problem can be controlled.
It is widely believed that chiral symmetry is spontaneously broken at zero temperature in the strong coupling limit of staggered fermions, for any number of colors and flavors. Using Monte Carlo simulations, we show that this conventional wisdom, based on a mean-field analysis, is wrong. For sufficiently many fundamental flavors, chiral symmetry is restored via a bulk, first-order transition. This chirally symmetric phase appears to be analytically connected with the expected conformal window of many-flavor continuum QCD. We perform simulations in the chirally symmetric phase at zero quark mass for various system sizes L, and measure the torelon mass and the Dirac spectrum. We find that all observables scale with L, which is hence the only infrared length scale. Thus, the strong-coupling chirally restored phase appears as a convenient laboratory to study IR-conformality. Finally, we present a conjecture for the phase diagram of lattice QCD as a function of the bare coupling and the number of quark flavors.
We explore the phase diagram of two-flavour QCD at vanishing chemical potential using dynamical O(a)-improved Wilson quarks. In the approach to the chiral limit we use lattices with a temporal extent of Nt = 16 and spatial extents L = 32, 48 and 64 to enable the extrapolation to the thermodynamic limit with small discretisation effects. In addition to an update on the scans at constant k, reported earlier, we present first results from scans along lines of constant physics at a pion mass of 290 MeV. We probe the transition using the Polyakov loop and the chiral condensate, as well as spectroscopic observables such as screening masses.
Pseudo-Critical Temperature and Thermal Equation of State from Nf = 2 Twisted Mass Lattice QCD
(2012)
We report on the current status of our ongoing study of the chiral limit of two-flavor QCD at finite temperature with twisted mass quarks. We estimate the pseudo-critical temperature Tc for three values of the pion mass in the range mPS ≈ 300-500 MeV and discuss different chiral scenarios. Furthermore, we present first preliminary results for the trace anomaly, pressure and energy density. We have studied several discretizations of Euclidean time up to Nt = 12 in order to assess the continuum limit of the trace anomaly. From its interpolation we evaluate the pressure and energy density employing the integral method. Here, we have focussed on two pion masses, mPS ≈ 400 and 700 MeV.
It is a long-discussed issue whether light scalar mesons have sizeable four-quark components. We present an exploratory study of this question using Nf = 2+1+1 twisted mass lattice QCD. A mixed action approach ignoring disconnected contributions is used to calculate correlator matrices consisting of mesonic molecule, diquark-antidiquark and two-meson interpolating operators with the quantum numbers of the scalar mesons a0(980) (1(0++)) and κ (1/2(0+)). The correlation matrices are analyzed by solving the generalized eigenvalue problem. The theoretically expected free two-particle scattering states are identified, while no additional low-lying states are observed. We do not observe indications of bound four-quark states in the channels investigated.
A 5-gap timing RPC equipped with patterned electrodes coupled to both charge-sensitive and timing circuits yields a time accuracy of 77 ps along with a position accuracy of 38 μm. These results were obtained by calculating the straight-line fit residuals to the positions provided by a 3-layer telescope made out of identical detectors, detecting almost perpendicular cosmic-ray muons. The device may be useful for particle identification by time-of-flight, where simultaneous measurements of trajectory and time are necessary.
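The residual analysis mentioned above can be sketched as follows; the plane positions and hit coordinates below are invented toy numbers, not data from the measurement:

```python
import numpy as np

# Toy sketch (numbers invented): three telescope planes measure the
# transverse position x of a track at longitudinal positions z. A straight
# line is fitted, and the residual at the device under test (DUT) is the
# difference between the measured hit and the fitted prediction there.
z_planes = np.array([0.0, 10.0, 20.0])       # cm, telescope layer positions
x_hits   = np.array([0.102, 0.151, 0.199])   # cm, measured hit positions

slope, intercept = np.polyfit(z_planes, x_hits, 1)   # least-squares line

z_dut, x_dut = 15.0, 0.168                   # cm, DUT plane and its hit
residual = x_dut - (slope * z_dut + intercept)
print(f"residual = {residual * 1e4:.0f} um")
```

The width of the distribution of such residuals over many tracks (after unfolding the telescope's own pointing resolution) is what yields a position-accuracy figure like the quoted 38 μm.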
We discuss recent applications of the partonic pQCD-based cascade model BAMPS with a focus on heavy-ion phenomenology in the hard and soft momentum ranges. The nuclear modification factor as well as the elliptic flow are calculated in BAMPS for RHIC and LHC energies. These observables are also discussed within the same framework for charm and bottom quarks. Contributing to the recent jet-quenching investigations, we present first preliminary results on the application of jet reconstruction algorithms in BAMPS. Finally, collective effects induced by jets are investigated: we demonstrate the development of Mach cones in ideal matter as well as in the highly viscous regime.
The results of microscopic transport calculations of antiproton-nucleus interactions within the GiBUU model are presented. The dominating mechanism of hyperon production is the strangeness exchange processes K̄N → Yπ and K̄N → ΞK. The calculated rapidity spectra of Ξ hyperons are significantly shifted to forward rapidities with respect to the spectra of S = −1 hyperons. We argue that this shift should be a sensitive test for possible exotic mechanisms of antiproton-nucleus annihilation. The production of double-Λ hypernuclei by Ξ− interactions with a secondary target is calculated.
We study the light scalar mesons a_0(980) and kappa using N_f = 2+1+1 flavor lattice QCD. In order to probe the internal structure of these scalar mesons, and in particular to identify whether a sizeable tetraquark component is present, we use a large set of operators, including diquark-antidiquark, mesonic molecule and two-meson operators. The inclusion of disconnected diagrams, which are technically rather challenging but which would allow us to extend our work to e.g. the f_0(980) meson, is introduced and discussed.
Background: Undergoing systemic inflammation, the innate immune system releases excessive proinflammatory mediators, which finally can lead to organ failure. Pattern recognition receptors (PRRs), such as Toll-like receptors (TLRs) and NOD-like receptors (NLRs), form the interface between bacterial and viral toxins and innate immunity. During sepsis, patients with diagnosed adrenal gland insufficiency are at high risk of developing a multiorgan dysfunction syndrome, which dramatically increases the risk of mortality. To date, little is known about the mechanisms leading to adrenal dysfunction under septic conditions. Here, we investigated the sepsis-related activation of the PRRs, cell inflammation, and apoptosis within adrenal glands.
Methods: Two sepsis models were performed: the polymicrobial sepsis model (caecal ligation and puncture (CLP)) and the LTA-induced intoxication model. All experiments received institutional approval by the Regierungspräsidium Darmstadt. CLP was performed as previously described [1], wherein one-third of the caecum was ligated and punctured with a 20-gauge needle. For LTA-induced systemic inflammation, TLR2 knockout (TLR2-/-) and WT mice were injected intraperitoneally with pure LTA (pLTA; 1 mg/kg) or PBS for 2 hours. To detect potential direct adrenal dysfunction, mice were additionally injected with adrenocorticotropic hormone (ACTH; 100 μg/kg) 1 hour after pLTA or PBS. Adrenals and plasma samples were taken. Gene expressions in the adrenals (rt-PCR), cytokine release (multiplex assay), and the apoptosis rate (TUNEL assay) within the adrenals were determined.
Results: In both models, adrenals showed increased mRNA expression of TLR2 and TLR4, various NLRs, cytokines as well as inflammasome components, NADPH oxidase subunits, and nitric oxide synthases (data not shown). In WT mice, ACTH alone had no effect on inflammation, while pLTA or pLTA/ACTH administration showed increased levels of the cytokines IL-1β, IL-6, and TNFα. TLR2-/- mice indicated no response as expected (Figure 1, left). Interestingly, surviving CLP mice showed no inflammatory adrenal response, whereas nonsurvivors had elevated cytokine levels (Figure 1, right). Additionally, we identified a marked increase in apoptosis of both chromaffin and steroid-producing cells in adrenal glands obtained from mice with sepsis as compared with their controls (Figure 2).
...
Conclusion: Taken together, sepsis-induced activation of the PRRs may contribute to adrenal impairment by enhancing tissue inflammation and oxidative stress, culminating in cellular apoptosis, while mortality seems to be associated with adrenal inflammation.
Background: Nerve injury-induced protein 1 (Ninjurin 1, Ninj1) was first identified in Schwann cells and neurons, contributing to cell adhesion and nerve regeneration. Recently, the role of Ninj1 has been linked to inflammatory processes in the central nervous system, where functional repression reduced leukocyte infiltration and clinical disease activity during experimental autoimmune encephalomyelitis in mice [1]. However, Ninj1 is also expressed outside the nervous system, in various organs such as the liver and kidney as well as on leukocytes [2,3]. Therefore, we hypothesized that Ninj1 contributes to inflammation in general, that is, also outside the nervous system, with special interest in the pathogenesis of sepsis.
Methods: Ninj1 was repressed by transfecting HMEC-1 cells, a human dermal microvascular endothelial cell line, with siRNA targeting Ninj1 (siNinj1) or a negative control (siC). Subsequently, cells were stimulated with 100 ng/ml LPS (TLR4 agonist), 3 μg/ml LTA (TLR2 agonist) or 100 ng/ml poly(I:C) (TLR3 agonist) for 3 hours. The inflammatory response was analyzed by real-time PCR. In addition, transmigration of neutrophils across an HMEC-1 monolayer was measured using transwell plates (pore size 3 μm).
Results: Repression of Ninj1 by siRNA reduced Ninj1 mRNA expression in HMEC-1 cells by about 90% (Figure 1A). Reduced Ninj1 expression decreased neutrophil migration to 62.5% (Figure 1B) and diminished TLR signaling. In detail, knockdown of Ninj1 significantly reduced TLR-2- and TLR-4-triggered expression of ICAM-1 and IL-6 (Figure 1C,D), while poly(I:C)-induced expression was only slightly reduced. To analyze a more specific TLR-3 target, we measured IP-10 mRNA expression, which was also significantly reduced in siNinj1-transfected cells (Figure 1E).
Conclusion: Our in vitro data strongly indicate that Ninj1 is involved in the regulation of TLR signaling and thereby contributes to inflammation. In vivo experiments will clarify its impact on systemic inflammation.
This volume contains the proceedings of the 12th International Workshop on Termination (WST 2012), held February 19–23, 2012 in Obergurgl, Austria. The goal of the Workshop on Termination is to be a venue for the presentation and discussion of all topics in and around termination. In this way, the workshop tries to bridge the gaps between the different communities interested and active in research in and around termination. The 12th International Workshop on Termination in Obergurgl continues the successful workshops held in St. Andrews (1993), La Bresse (1995), Ede (1997), Dagstuhl (1999), Utrecht (2001), Valencia (2003), Aachen (2004), Seattle (2006), Paris (2007), Leipzig (2009), and Edinburgh (2010). The workshop welcomed contributions on all aspects of termination and complexity analysis. Contributions from the imperative, constraint, functional, and logic programming communities, and papers investigating applications of complexity or termination (for example in program transformation or theorem proving), were particularly welcome. We received 18 submissions, all of which were accepted; each paper was assigned two reviewers. In addition to these 18 contributed talks, WST 2012 hosted three invited talks by Alexander Krauss, Martin Hofmann, and Fausto Spoto.
The diagram-based method to prove correctness of program transformations consists of computing a complete set of (forking and commuting) diagrams, acting on sequences of standard reductions and program transformations. In many cases, the only missing step for proving correctness of a program transformation is to show the termination of the rearrangement of the sequences. Therefore, we encode complete sets of diagrams as term rewriting systems and use an automated tool to show termination, which provides a further step in the automation of the inductive step in correctness proofs.
Although their applications have not yet spread widely owing to their incipient state, nanotechnologies and nanomedicine may be presumed to be at the origin of the next great technological revolution, foreseeably contributing to a new stage in mankind's progress. Their possibilities are truly immense in enormously varied spheres, but the risks and uncertainties they engender are enormous too: access to and use of the ceaselessly growing quantity of information they generate will place further strain on the protection of personal life and privacy, the exercise of freedom, and the safeguarding of other fundamental principles and rights.
The requalification of Habermas' discussions of political philosophy and legal theory after the publication of Zwischen Naturalismus und Religion (2005), and his most recent texts and debates on religion and the public sphere, suggest a revision of the Habermasian theory of rationalization as it was first presented in Theorie des kommunikativen Handelns (1982), especially concerning the processes of desacralization and the linguistification of religious authority. Seeking to contribute to this revision, this paper focuses on the problem of a supposedly "lost" aesthetic-expressive understanding of religious authority in Habermas's theory of rationalization, which may have contributed to a theory of law in Faktizität und Geltung (1992) that does not give a satisfactory account of the aesthetic-expressive character of the modern understanding of legal authority. A better understanding of this special character, however, may contribute not only to the avoidance of fundamentalisms and of new attempts at an "aestheticization of politics", but also to a rational strengthening of the solidarity of the citizens of democratic constitutional states.
This paper aims to discuss in which sense public hearings in the supreme courts of democratic states under the rule of law can be seen as a proceduralization of popular sovereignty policies. These policies constitute expressions of a normative claim for a wider "publicization of law" by the institutional powers and organs of democratic states; a claim that becomes evident when one undertakes an intersubjective interpretation of law. This theoretical argument is presented in the first section of the paper through a new articulation of Jürgen Habermas' discursive theory of law and his most recent studies on the concept of the political public sphere. The theoretical section provides normative and procedural criteria for the second section of the paper, which consists of a critical analysis of the procedures and practical cases of public hearings held at the Brazilian Supreme Court, constituting the first scientific study to date on the Court's use of this legal instrument.
Since the nineteenth century, a pleiad of philosophers and historians have supported the idea that Greek philosophy, usually said to have started with the pre-Socratics, has its basis in an earlier moment: the Greek myths, systematized by Homer and Hesiod, and the Greek arts, in particular lyric and tragic literature. Accordingly, it is important to retrieve philosophical elements from even before the pre-Socratics in order to understand the genesis of specific concepts in the philosophy of law. Moreover, assuming that the West’s core values are inherited from Ancient Greece, it is essential to recover the basis of our own idea of justice through the Greek myths and tragic literature. As a case study, this paper compares two key works, each representing a phase of Greek tragedy: the Oresteia, by Aeschylus, and Orestes, by Euripides. Both contain the same story, telling how the Greeks came to understand the necessity of resolving their conflicts not through blood revenge but through politics, and also the drama of the political. In Aeschylus’s work, however, men are still bound by their fate, while the gods play a major role in punishing human pride (hybris). In Euripides’s work, by contrast, men face their own loneliness in a world full of gods, each demanding divergent actions. This represents a necessary moment for the flourishing of freedom and human subjectivity: once the exterior divinity proves unable to resolve human problems, men must discover their interior divinity. That is how philosophy emerges.
Dworkin’s political theory is characterized by the interpretative integrity of morality, law, and politics, the so-called “hedgehog’s approach”. The interpretative integrity approach functions on multiple levels. First, the philosophical foundations of his theory of justice are linked to his conception of a just liberal society and state. Second, from the perspective of political morality, the interpretative concepts of law and morality are internally connected, as are the interpretative concepts of equality, liberty, and democracy. Third, from the perspective of philosophical foundations, individual ethics, personal morality and political morality are mutually connected. These ethical and moral foundations are also related, in a wider sense of philosophical foundations, to his gnoseological conception of value concepts in law, politics and morality, and to his epistemological conception of objective truth in the field of values: value concepts are interpretative and can be objectively true when articulated in accordance with the methodological rules and standards of a “reflective equilibrium” and interpretative integrity, and in accordance with so-called internal scepticism in the context of value pluralism.
The term “ethics” in a “narrower” sense refers to individual ethics, the study of how to live well, while “ethics” in a “broader” sense refers to personal morality, the study of how we must treat other people. The term “morality”, however, is used primarily to denote political morality, the issue of how a sovereign power should treat its citizens.
The focus of consideration will be the philosophical foundations of Dworkin’s political theory of justice, his conception of the two cardinal values of humanity, and his conception of individual ethics, personal morality and political morality.
Agamben has claimed to work within the tradition inaugurated by the archaeological method of Michel Foucault, but not to coincide fully with it: “My method is archaeological and paradigmatic in a sense which is very close to that of Foucault, but not completely coincident with it. The question is, facing the dichotomies that structure our culture, to go beyond the exceptions that have produced them; not, however, to find a chronologically originary state, but to be able to understand the situation in which we are. Archaeology is, in this sense, the only way to access the present” (interview with Flavia Costa, trans. Susana Scramim, in Revista do Departamento de Psicologia – Universidade Federal Fluminense, Niterói, v. 18, n. 1, 131-136, Jan./Jun. 2006, 132, translated by the author). However, the respects in which Agamben follows Foucault’s method and those in which he does not have never been very clear. This situation seems to change with the publication of Agamben’s most extensive and explicit text on method, Signatura Rerum. Sul Metodo (2008, Italian edition). The goal of this article is to identify the points of intersection between their methods and some points on which they differ.
John Gray is the thinker who has reconstructed the main tenets of the ethical pluralism inherent in the work of its initiator, Isaiah Berlin, and pointed to its consequences for political philosophy. In particular, he singled out three levels of conflict in ethics identifiable in Berlin’s writings: among ultimate values belonging to the same morality or code of conduct, among whole ways or styles of life, and within goods or values which are themselves internally complex and inherently pluralistic.
It is the third, internal kind of conflict that proves richest in implications, because it undermines a whole constellation of contemporary liberal doctrines, informed by the Kantian-Lockean tradition, that conform to the legal paradigm. From the pluralist perspective such monumental theories (e.g. those of Rawls or Dworkin) are no longer sustainable, given the recognition that no ultimate value is immune to the phenomenon of incommensurability. Thus, irresolvable conflicts may break out even within a given regulative value.
Confronting ethical pluralism with general reflection on law has mostly negative consequences. Nevertheless, the incommensurability thesis sheds considerable light on certain legal disputes. This claim will be illustrated by interpreting, from the pluralist perspective, the controversy over the verdict of the European Court of Human Rights of 3 November 2010 concerning the hanging of crosses in classrooms.
Brazil has one of the worst distributions of income in the world. The wealth of the richest 1% of the population is equal to that of the poorest 50%. Brazil has a greater concentration of wealth than ninety-five percent of the countries on which data are available. In the legal field, tax justice is based on the constitutional principle of the “ability to pay”, according to which taxes should be paid according to the economic capacity of the taxpayer. This principle first appeared in the Brazilian legal order in the 1946 Constitution, was excluded from the texts of 1967/69, and reappeared in § 1 of article 145 of the 1988 Constitution. The aim of this paper is to examine two possible grounds for the ability to pay principle (equal sacrifice and proportional sacrifice) in order to show how, in Brazil, the interpretations that seek to assign a positive content to the principle are limited to the horizon of a particular form of state associated with the theory of equal sacrifice. This theory, in turn, is consistent with a theory of justice under which no expense or charge levied by the government may alter the distribution of welfare produced by the market. Because the ability to pay principle is applied within the limits of that horizon, it does not play an important role in reducing inequality in Brazil.
In their book Principles of Biomedical Ethics, Tom Beauchamp and James Childress offer an account of bioethics, called “Principlism”, by way of specifying and balancing four clusters of principles. These principles are found, as the authors state, in a common morality, understood as a set of universally shared moral beliefs.
This paper seeks to introduce the following questions: Does this account of Beauchamp and Childress flow from common morality in a natural way? Can their proposals claim to be endorsed by the authority of common morality? If not, in what way does Principlism contribute to bioethics?
This paper aims to re-elaborate questions and discuss them rather than to present answers. It starts with the dialogue concerning the specific contributions of the philosophy of language to law, followed by the re-elaboration of some as yet unanswered problems, as well as a discussion of possible paths forward on this issue.
Since the advent of what is known as new constitutionalism, jurists have faced the difficult task of overcoming certain failures of normative positivism. In this context, the judiciary has played a renewed role, which can be justified on grounds of legal theory and on institutional reasons. However, this new role has raised several concerns among legal philosophers, such as the relationship between law and ethics. On one hand, Critical Legal Studies points out that the judge always acts informed by his own convictions. On the other hand, according to R. Forst (writing in another context, but also relevant here), this is not really a problem, because a rule can be provided with ethics without being ethically justified. This openness of law to morality makes it difficult for interpretative judicial discourse to be taken, as K. Günther claims it should, as a discourse of application only, and not of justification. All these controversies, however, lead to a common statement: constitutional adjudication has been exercising a different activity. Some legal systems grant such activity legitimacy to some extent, like Brazil’s, which i) establishes a very broad adjudication, ii) provides an extensive catalog of basic rights, and iii) contains several procedural mechanisms for their protection. This empowers adjudication to exercise what can be called a political activity. Therefore, a series of moral issues once exclusive to the political arena have been brought to the judiciary, such as gay marriage, abortion, affirmative action, religious freedom, federation, separation of powers, and the distribution of scarce resources. In a democracy, these moral questions ought to be decided mainly through deliberation outside the judiciary, but this is not always what happens. The paper discusses these issues, showing also how the Brazilian Supreme Court has dealt, technically or not, with this relationship between law and justice in a complex and pluralist society.
Are Kantian philosophy and its principle of respect for persons inadequate for the protection of environmental values? This paper answers this question by elucidating how Kantian ethics can take environmental values seriously. In the period that starts with the Critique of Judgment in 1790 and ends with the Metaphysics of Morals in 1797, the subject would have been approached by Kant in a different manner: although the respect that we may owe to non-human nature is still grounded in our duties to mankind, the basis for such respect stems from nature’s aesthetic properties, and the duty to preserve nature lies in our duties to ourselves. Compared to the “market paradigm”, as it is called by Gillroy (the reference is to a conception of public policy based on a criterion of economic efficiency or utility), Kantian philosophy can offer a better explanation of the relationship between environmental policy and the theory of justice. Kantian justice defines the “just state” as the one that protects the moral capacities of its “active” citizens, as presented in the first part of the Metaphysics of Morals. In the Kantian paradigm, environmental risk becomes a “public” concern. That means it is not subsumed under an individual decision based on a calculus.
The increase in the volume of litigation in Brazilian society since the 1990s has led the judiciary to open itself to new technologies that facilitate access to justice, as well as a faster resolution of demands. However, the intense insertion of technical rationalization into the judiciary’s processing and decision-making operations in recent years has led to a form of legalization supported by the presuppositions of technical-instrumental regulation. Under the goal policy established by the CNJ, the burden of instrumental rationality “with respect to purposes” is present, demanding from legal practitioners, more and more, the mere fulfillment of previously instituted goals. The question is whether the implementation of new technologies to resolve the growing litigation arising from the complexity of societies is enough to adjust the law to a post-conventional platform. If social complexity calls for the resources of new technologies, it is not certain that such technologies, on their own, satisfactorily answer to a judicial model which, viewed from the standpoint of post-conventional legitimacy and regulation, is adequate to complex societies. This illustrates that a judicial model able to deal with social plurality must take into account not only the rules of instrumental rationality but also the fundamental issues of communicative rationality. The present work intends to evaluate whether the applicability of instrumental rationality in the judiciary equally allows the law to extend the useful conditions of communicative rationality to the consensual formation of will and opinion in the Democratic State of Law.
Civil society has become an important theme in recent discussions of political and social theory. Civil society plays a substantial role in the legislative process, as can be seen especially in the activities of international NGOs. It gives a new aspect to the relationship between state and society and, legal-philosophically speaking, to the validity of law. The activities of civil society are socially recognized, and their support systems are gradually being institutionalized domestically in Japan as well. But Japanese NPOs have their own weak point, which arises from the political structure of our society.
Scientific and technical achievements can cause deep changes in the spheres of morals and law. I am going to discuss some philosophical conclusions which follow from two significant ideas of contemporary civilization. The first of them is the thesis of the indistinguishability of the natural from the artificial, and the second is the possibility of the creation of an artificial human.
The first thesis is a consequence of the principle of the relativity of physical reality to the conditions and manner of observation, on which both the interpretations of quantum theory and Einstein’s theories of relativity are based. I show that this principle deprives us of objective criteria to distinguish the natural from the artificial, freedom from necessity, and freedom from violence.
Today the power of technology is directed not only at the external world but also at the person. Owing to information technology and biotechnology, the possibility of creating an artificial and controlled individual increases. Thus the human being loses many features of a person and is transformed into part of a collective super-individual subject. In modern times, the search for the transcendental basis of law and power leads to the impersonal human and the recognition of super-individuality.
Traditional beliefs about natural rights will disappear. There is a need to revise such concepts as the right of freedom. The liberal belief in freedom as a condition of human existence is changing. The prospects of technical development vindicate R. Dworkin’s reflections on the superiority of the right of equality over the right of freedom.
The Brazilian Constitution of 1988 declares Brazil a Democratic State of Law. This formally democratic legal status has faced difficulties in its material implementation. Brazilian legal procedures are still greatly influenced by the Catholic heritage of Portuguese colonization, translated in the present into a strong set of moral dogmas that still shapes legal production and interpretation in the country. Recently in Brazil, a debate brought before the Supremo Tribunal Federal, the Brazilian Federal Supreme Court, has evidenced the struggle between Ethics and Morality in the country’s legal scenario. The focus of the discussion was the possibility of the abortion of anencephalic fetuses (in Brazil, abortion is considered a crime against life). In order to properly ground its decision, the Court invited scientists, doctors, members of feminist movements and representatives of certain religions to a public dialogue, in which both scientific-technical and purely moral-religious arguments were presented. Although these procedures encouraged and promoted a democratic and pluralistic legal debate, it seems that the crucial point of the discussion was not taken into account: the scientific character of Law. This is the object of the present manuscript: in order to ensure an intersubjective construction and application of Law, it must be perceived as an Applied Social Science, and judges, lawyers, legislators and all other legal actors must proceed in a scientific way. To illustrate the theme, the specific case of the abortion of anencephalic fetuses will be discussed throughout the text.
The very idea of the European Convention on Human Rights is to bring the laws of contracting states into line with fundamental human rights principles. Where the Convention is not explicit, the Court should never rule restrictively so as to reduce the scope of a general right. In the case of homeschooling, the Convention sets forth the general principle that “the state shall respect the right of parents to ensure such education and teaching in conformity with their own religious and philosophical convictions.” It must not, therefore, allow a contracting state to eliminate a means of achieving this desired by parents—unless the state can show that the means in question is ineffective.