[Tagungsbericht] Making finance sustainable: Ten years equator principles – success or letdown?
(2013)
In 2003, a number of banks adopted the Equator Principles (EPs), a voluntary code of conduct based on the International Finance Corporation's (IFC) performance standards, to ensure the ecological and social sustainability of project finance. These so-called Equator Principles Financial Institutions (EPFIs) commit to requiring their borrowers to adopt sustainable management plans for the environmental and social risks associated with their projects. The Principles apply to the banks' project finance business segment and cover projects with total costs of US $10 million or more. While developing countries long relied on the World Bank and other public assistance to finance infrastructure projects, recent years have seen a shift to private funding. NGOs were frustrated by this shift, as they had spent their resources exercising pressure on public financial institutions to incorporate environmental and social standards into their project finance activities. Once NGO pressure shifted to private financial institutions, however, the latter adopted the EPs for fear of reputational risks. NGOs had laid down their own, more ambitious ideas about sustainable finance in the Collevecchio Declaration on Financial Institutions and Sustainability. Legally speaking, the EPs are a self-regulatory soft law instrument. They nevertheless have a hard law dimension, as the Equator Banks require their borrowers to comply with the EPs through covenants in loan contracts that may trigger a default in case of violation. ...
The physics of EPOS
(2013)
Exhaustive, automatic testing of dataflow (esp. mapreduce) programs has emerged as an important challenge. Past work demonstrated effective ways to generate small example data sets that exercise operators in the Pig platform, used to generate Hadoop map-reduce programs. Although such prior techniques attempt to cover all cases of operator use, in practice they often fail. Our SEDGE system addresses these completeness problems: for every dataflow operator, we produce data aiming to cover all cases that arise in the dataflow program (e.g., both passing and failing a filter). SEDGE relies on transforming the program into symbolic constraints, and solving the constraints using a symbolic reasoning engine (a powerful SMT solver), while using input data as concrete aids in the solution process. The approach resembles dynamic-symbolic (a.k.a. "concolic") execution in a conventional programming language, adapted to the unique features of the dataflow domain.
In third-party benchmarks, SEDGE achieves higher coverage than past techniques for 5 out of 20 PigMix benchmarks and 7 out of 11 SDSS benchmarks (with equal coverage for the rest). We also show that our targeting of the high-level dataflow language pays off: for complex programs, state-of-the-art dynamic-symbolic execution at the level of the generated map-reduce code (instead of the original dataflow program) requires many more test cases or achieves much lower coverage than our approach.
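The core idea of covering every case of a dataflow operator can be illustrated with a minimal sketch. The names and the brute-force search below are illustrative stand-ins, not SEDGE's actual API: a real system would encode the filter predicate as symbolic constraints and hand them to an SMT solver, whereas this toy version simply searches a small concrete domain for records that exercise both branches of a filter.

```python
# Toy sketch of branch-covering data generation for a dataflow filter
# (in the spirit of SEDGE; an actual implementation would use an SMT
# solver on symbolic constraints instead of a brute-force search).

def covering_inputs(predicate, domain):
    """Return one record that passes the filter and one that fails it,
    or None for a branch the domain cannot cover."""
    passing = next((x for x in domain if predicate(x)), None)
    failing = next((x for x in domain if not predicate(x)), None)
    return passing, failing

# Example: a Pig-style `FILTER rows BY age > 30`
pred = lambda age: age > 30
print(covering_inputs(pred, range(0, 100)))  # → (31, 0)
```

Producing one witness per branch is exactly the completeness goal stated above: a test suite built from such records exercises both the passing and the failing case of every filter in the program.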
Using a partonic transport model, we investigate the evolution of conical structures in ultrarelativistic matter. Using two different source terms and varying the transport properties of the matter, we study the formation of Mach cones. In an additional study, we extract the two-particle correlations from the numerical calculations and compare them to an analytical approximation. The influence of the viscosity on the shape of Mach cones and the corresponding two-particle correlations is studied by adjusting the cross section of the medium.
The paper takes a deeper look at participation rates in cMOOCs. To gain better insight into the behavior of learners in MOOCs, studiumdigitale has developed a tool that helps to analyze the contributions of participants in so-called cMOOCs. These are MOOCs that foster the active participation of learners across various tools and are based on the concept of connectivism [1]. After examining each part of the definition of MOOCs and discussing the different categories of this quite new phenomenon, a deeper look is taken at the analysis of two cMOOCs, OPCO11 and OPCO12, which took place in 2011 and 2012 [2].
The Compressed Baryonic Matter (CBM) experiment [1] is a fixed-target heavy-ion experiment that will operate at the international Facility for Antiproton and Ion Research (FAIR) [2], now under construction in Darmstadt, Germany. The experiment intends to study rare probes emitted from heavy-ion collisions at beam energies of 4 to 45 AGeV. A focus is placed on short-lived open charm particles and on particles decaying into di-lepton pairs. Handling the up to 10^7 Au+Au collisions/s required to generate those probes with sufficient statistics, as well as reaching the required sensitivity for observing them, poses a major challenge for the silicon detectors of the experiment. We present the concept and development status of two central detectors of CBM: the CMOS-pixel-based micro vertex detector (MVD) and the micro-strip-based silicon tracking system (STS).
22nd International Workshop on Vertex Detectors, 15–20 September 2013, Lake Starnberg, Germany
We report on event-by-event multiplicity fluctuations of identified particles in central Pb+Pb collisions measured by the NA49 experiment at the CERN SPS. Employing a novel approach, we unfolded the moments of the unknown multiplicity distributions of protons (p), kaons (K), pions (π) and electrons. Using these moments, we reconstructed an excitation function of the fluctuation measure νdyn[A,B], with A and B denoting different particle types. Specifically, we reconstructed νdyn for the [p, π], [p, K] and [K, π] pairs. The energy dependence of νdyn is in agreement with previously published NA49 results on the related measure σdyn. Moreover, for the [p, K] and [K, π] pairs we discovered a dependence of the fluctuation measure νdyn on the phase space coverage (acceptance). Interestingly, no significant acceptance dependence was observed for the [p, π] case. These observations provide a likely explanation of the reported differences between the measurements of NA49 and those of STAR in central Au+Au collisions.
The study of the energy and system size dependence of fluctuations of identified hadrons is one of the key goals of NA61/SHINE at the CERN SPS. The results may allow the discovery of the critical point (CP) of strongly interacting matter as well as uncovering properties of the onset of deconfinement (OD). Measured fluctuations are affected by numerous other effects, such as volume fluctuations and conservation laws. NA49 appears to observe fluctuations possibly related to the CP in collisions of medium-size nuclei at the top SPS energy. However, this result will remain inconclusive until systematic data on the energy and system size dependence become available. Moreover, fluctuations in p+p as well as in Pb+Pb interactions should be better understood. In this contribution, new results on multiplicity fluctuations of identified hadrons in p+p interactions at CERN SPS energies will be presented. The NA61 data will be compared with the corresponding results on central Pb+Pb collisions from NA49 in the common acceptance region of both experiments. Furthermore, predictions of models (EPOS, UrQMD and HSD) for p+p interactions will be tested.
While the existence of a strongly interacting state of matter, known as "quark-gluon plasma" (QGP), has been established in heavy-ion collision experiments over the past decade, the task remains to map out the transition from hadronic matter to the QGP. This is done by measuring the dependence of key observables (such as particle suppression and elliptic flow) on the collision energy of the heavy ions. This procedure, known as a "beam energy scan", has most recently been performed at the Relativistic Heavy Ion Collider (RHIC).
Utilizing a Boltzmann+hydrodynamics hybrid model, we study the collision energy dependence of initial state eccentricities and the final state elliptic and triangular flow. This approach is well suited to investigate the relative importance of hydrodynamics and hadron transport at different collision energies.