A key prerequisite for decoding prevailing understandings of justice is an examination of the roles that the actors involved assume within a legal system, together with an analysis of the legal and institutional conditions under which these actors operate. This contribution first addresses the allocation of power and tasks between judges and parties. It becomes apparent that this allocation of roles is not uniform but varies with different procedural and institutional preconditions. In jury trials, judicial authority is strongly constrained by party autonomy at its fullest extent. By contrast, judges act as legal honoratiores (in Weber's sense) whenever they adjudicate without a jury. This occurs in particular in the states' highest courts and the federal courts of appeals, but also in first-instance proceedings in which "claims in equity" are to be decided. The contribution concludes by examining the influence that the peculiarities of American legal education exert on the American understanding of justice: they shape and reproduce the roles and self-images of American lawyers, both at the bar and on the bench.
Venture capital (VC) investment has long been conceptualized as a local business, in which a VC's ability to source, syndicate, fund, monitor, and add value to portfolio firms critically depends on access to knowledge obtained through ties to the local (i.e., geographically proximate) network. Consistent with the view that local networks matter, existing research confirms that local and geographically distant portfolio firms are sourced, syndicated, funded, and monitored differently. Curiously, emerging research on VC investment practice within the United States finds that distant investments, as measured by "exits" (either initial public offering or merger and acquisition), outperform local investments. These findings raise important questions about the assumed benefits of local network membership and proximity. To probe these questions more deeply, we contrast the deal structure of cross-border VC investment with domestic VC investment, and contrast the deal structure of cross-border VC investments that include a local partner with those that do not. Evidence from 139,892 rounds of venture capital financing over the period 1980-2009 suggests that cross-border investment practice, in terms of deal sourcing, syndication, and performance, indeed changes with proximity, but that monitoring practices do not. Further, we find that the inclusion of a local partner in the investment syndicate yields surprisingly few benefits. This evidence, we argue, raises important questions about VC investment practice as well as the ability of firms to capture and leverage the presumed benefits of network membership.
This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated and forecasts are derived successively from historical U.S. data vintages synchronized with the Fed's Greenbook projections. Point forecasts of some models are similar in accuracy to the forecasts of nonstructural large-dataset methods. Despite their common underlying New Keynesian modeling philosophy, forecasts of different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of DSGE model forecasts is comparable to that of the Greenbook projections at medium-term horizons. Comparing density forecasts of DSGE models with the actual distribution of observations shows that the models overestimate uncertainty around point forecasts.
This paper investigates the accuracy of forecasts from four DSGE models for inflation, output growth and the federal funds rate using a real-time dataset synchronized with the Fed's Greenbook projections. Conditioning the model forecasts on the Greenbook nowcasts leads to forecasts that are as accurate as the Greenbook projections for output growth and the federal funds rate. Only for inflation are the model forecasts dominated by the Greenbook projections. A comparison with forecasts from Bayesian VARs shows that the economic structure of the DSGE models, which is useful for the interpretation of forecasts, does not lower forecast accuracy. Combining forecasts of several DSGE models increases precision in comparison to individual model forecasts. Comparing density forecasts with the actual distribution of observations shows that DSGE models overestimate uncertainty around point forecasts.
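The forecast-combination idea described above can be sketched in a few lines. This is a hedged illustration only: the forecast and outcome numbers below are made up for demonstration and have nothing to do with the paper's real-time vintages; only the mechanics (equal-weighted pooling, root-mean-square error comparison) are shown.

```python
import math

# Hypothetical one-step-ahead inflation forecasts from three models,
# plus the realized outcomes, over six periods (illustrative numbers).
forecasts = {
    "model_A": [2.1, 1.8, 2.5, 2.0, 1.6, 2.2],
    "model_B": [1.7, 2.2, 2.1, 2.4, 1.9, 1.8],
    "model_C": [2.4, 1.5, 2.8, 1.7, 2.1, 2.5],
}
actual = [2.0, 1.9, 2.3, 2.1, 1.8, 2.1]

def rmse(pred, obs):
    """Root-mean-square forecast error."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

# Equal-weighted pooled forecast: the cross-model mean in each period.
pooled = [sum(f[t] for f in forecasts.values()) / len(forecasts)
          for t in range(len(actual))]

for name, f in forecasts.items():
    print(name, round(rmse(f, actual), 3))
print("average", round(rmse(pooled, actual), 3))
```

With these illustrative numbers the equal-weighted average beats every individual model, since the models' errors partly offset each other, which is the intuition behind the gain from combining DSGE model forecasts.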
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis have come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach, a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models' implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the database and compare the effects of novel extensions to established benchmarks, thereby fostering a comparative instead of an insular approach to model development.
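The Taylor rule cited above as one of the influential insights of model comparison is simple enough to state directly. The sketch below uses Taylor's well-known 1993 specification, with responses of 0.5 to the inflation gap and the output gap and a 2 percent inflation target and equilibrium real rate; it is a textbook illustration, not the benchmark implementation of any particular comparison platform.

```python
def taylor_rule(inflation, output_gap,
                r_star=2.0, pi_star=2.0,
                phi_pi=0.5, phi_y=0.5):
    """Prescribed nominal interest rate (percent) under the
    Taylor (1993) rule: i = r* + pi + 0.5*(pi - pi*) + 0.5*y."""
    return (r_star + inflation
            + phi_pi * (inflation - pi_star)
            + phi_y * output_gap)

# At target inflation and a closed output gap the rule prescribes
# the neutral nominal rate of 4 percent.
print(taylor_rule(2.0, 0.0))   # -> 4.0
# A 2-point inflation overshoot plus a 1-point positive gap:
print(taylor_rule(4.0, 1.0))   # -> 7.5
```

Because the coefficient on the inflation gap exceeds zero, the implied real rate rises when inflation rises (the Taylor principle), which is what stabilizes inflation in most of the models such rules are evaluated in.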
Motivated by the U.S. events of the 2000s, we address whether a "too low for too long" interest rate policy may generate a boom-bust cycle. We simulate anticipated and unanticipated monetary policies in state-of-the-art DSGE models and in a model with bond financing via a shadow banking system, in which the bond spread is calibrated for normal and optimistic times. Our results suggest that the U.S. boom-bust was caused by the combination of (i) interest rates that were too low for too long, (ii) excessive optimism and (iii) a failure of agents to anticipate the extent of the abnormally favorable conditions.
In this paper, I introduce lumpy micro-level capital adjustment into a sticky information general equilibrium model. Lumpy adjustment arises because of inattentiveness in capital investment decisions instead of the more common assumption of non-convex adjustment costs. The model features inattentiveness as the only source of stickiness. I find that the model with lumpy investment yields business cycle dynamics which differ substantially from those of an otherwise identical model with frictionless investment and are much more consistent with the empirical evidence. These results therefore strengthen the case in favour of the relevance of microeconomic investment lumpiness for the business cycle.
This paper outlines relatively easy-to-implement reforms for the supervision of transnational banking groups in the E.U., which should be based not primarily on legal form but on the actual risk structures of the pertinent financial institutions. The proposal also pays close attention to the economics of public administration and international relations in allocating competences among national and supranational supervisory bodies. Before detailing its own proposition, the paper looks into the relationship between the sovereign debt and banking crises that drive regulatory reactions to the financial turmoil in the Euro area. These initiatives inter alia affirm effective prudential supervision as a pivotal element of crisis prevention. In order to arrive at a more informed idea of which determinants, apart from a perceived appetite for regulatory arbitrage, drive banks' organizational choices, the paper scrutinizes the merits of either a branch or a subsidiary structure for the cross-border business of financial institutions. In doing so, it also considers the policy-makers' perspective. The analysis shows that no one-size-fits-all organizational structure is available and concludes that banks' choices should generally not be second-guessed, particularly because they are subject to (some) market discipline. The analysis proceeds by describing and evaluating how competences in prudential supervision are currently allocated among national and supranational supervisory authorities. To assess the findings, the appraisal adopts insights from the economics of public administration and international relations. It argues that the supervisory architecture has to be better aligned with bureaucrats' incentives and that inefficient requirements to cooperate and share information should be reduced. Contrary to a widespread perception, shifting responsibility to a supranational authority cannot solve all the problems identified.
Resting on these foundations, the last part of the paper sketches an alternative solution that dwells on far-reaching mutual recognition of national supervisory regimes and allocates competences in line with supervisors' incentives and the risks inherent in cross-border banking groups.
We examine both the degree and the structural stability of inflation persistence at different quantiles of the conditional inflation distribution. Previous research focused exclusively on persistence at the conditional mean of the inflation rate. Economic theory, however, provides various reasons (for example, downward wage rigidities or menu costs) to expect higher inflation persistence at the upper than at the lower tail of the conditional inflation distribution.
Based on post-war US data, we indeed find slower mean reversion in response to positive than to negative shocks. We find robust evidence for a structural break in persistence at all quantiles of the inflation process in the early 1980s. Inflation persistence has decreased and become more homogeneous across quantiles. Persistence at the conditional mean has become more informative about the degree of persistence across the entire conditional inflation distribution. While prior to the 1980s inflation was not mean reverting in response to large positive shocks, our evidence strongly suggests that since the end of the Volcker disinflation the unit root can be rejected at every quantile, including the upper tail of the conditional inflation distribution.
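The idea of measuring persistence at a chosen quantile can be conveyed with a much-simplified sketch in the spirit of quantile autoregression: fit y_t = a + rho * y_{t-1} + error by minimizing the pinball (check) loss at quantile tau. This is purely illustrative; the grid search, the simulated data and the AR(1) specification are assumptions for the example and are far cruder than the paper's estimator.

```python
import random

def pinball(u, tau):
    """Check loss: tau*u for non-negative errors, (tau-1)*u otherwise."""
    return tau * u if u >= 0 else (tau - 1) * u

def quantile_ar1(series, tau, grid_steps=40):
    """Coarse grid search for (intercept, slope) minimizing the
    check loss of y_t = a + rho * y_{t-1} + e_t at quantile tau.
    Resolution is limited by the grid spacing (here 0.05 for rho)."""
    y, x = series[1:], series[:-1]
    best = None
    for i in range(grid_steps + 1):
        rho = -1.0 + 2.0 * i / grid_steps       # rho in [-1, 1]
        for j in range(grid_steps + 1):
            a = -2.0 + 4.0 * j / grid_steps     # intercept in [-2, 2]
            loss = sum(pinball(yt - a - rho * xt, tau)
                       for yt, xt in zip(y, x))
            if best is None or loss < best[0]:
                best = (loss, a, rho)
    return best[1], best[2]

# Simulate an AR(1) with true persistence 0.7 (illustrative data).
random.seed(0)
y = [0.0]
for _ in range(200):
    y.append(0.7 * y[-1] + random.gauss(0, 1))

a_med, rho_med = quantile_ar1(y, 0.5)
print("persistence at the median:", round(rho_med, 2))
```

Running the same estimator at, say, tau = 0.9 and tau = 0.1 and comparing the slopes is the kind of exercise the abstract describes; on these symmetric simulated data the slopes would not differ systematically.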
The withdrawal of foreign capital from emerging countries at the height of the recent financial crisis and its quick return sparked a debate about the impact of capital flow surges on asset markets. This paper addresses the response of property prices to an inflow of foreign capital. For that purpose we estimate a panel VAR on a set of Asian emerging market economies, for which the waves of inflows were particularly pronounced, and identify capital inflow shocks based on sign restrictions. Our results suggest that capital inflow shocks have a significant effect on the appreciation of house prices and equity prices. Capital inflow shocks account for roughly twice the portion of overall house price changes that they explain in OECD countries. We also address cross-country differences in the house price responses to shocks, which are most likely due to differences in the monetary policy response to capital inflows.
In this paper we investigate the comparative properties of empirically-estimated monetary models of the U.S. economy using a new database of models designed for such investigations. We focus on three representative models due to Christiano, Eichenbaum, Evans (2005), Smets and Wouters (2007) and Taylor (1993a). Although these models differ in terms of structure, estimation method, sample period, and data vintage, we find surprisingly similar economic impacts of unanticipated changes in the federal funds rate. However, optimized monetary policy rules differ across models and lack robustness. Model averaging offers an effective strategy for improving the robustness of policy rules.
Disregard of legal requirements of the TFEU by the Member States and the ECB in the debt crisis
(2012)
Summary and results
1. There are good arguments for a general prohibition of (voluntary) support payments to euro-area Member States.
2. The compatibility of the EU's payments under the EFSM with Art. 122(2) TFEU is questionable. The assessment turns on the question of causality.
3. The compatibility of the Member States' payments under the special aid for Greece and under the EFSF with the TFEU as then in force is not certain.
4. The introduction of Art. 136(3) TFEU modifies the treaty law and was probably still effected in conformity with Art. 48(6) TEU.
5. After the amendment of primary law, the ESM and the Fiscal Compact probably do not violate the TFEU.
6. Indispensable for the creation of the ESM, however, are the entry into force of Art. 136(3) TFEU and
7. The acquisition of claims against Member States over an extended period and for the purpose of easing interest burdens exceeds the powers and competences of the ESCB.
8. The acquisition of claims against Member States over an extended period and for the purpose of easing interest burdens is not compatible with the prohibition of central-bank lending to public authorities under Art. 123 TFEU.
9. The granting of long-term loans to banks likewise violates the allocation of competences under the TFEU and, where the funds are passed on to public authorities, is not compatible with Art. 123 TFEU.
10. The acceptance of default-prone claims as collateral for loans granted by the ESCB violates Art. 18.1, second indent, of the Statute of the ESCB/ECB.
Experiences from a transdisciplinarily guided series of stakeholder workshops on the sustainable climate adaptation of Central European production forests are presented and evaluated with respect to tree-species choice, risk reduction, and the segregation of functions. A prior analysis of the discourse field facilitated both the selection of stakeholders and the subsequent analysis of the stakeholder processes conducted. Sufficient participation by societal interest groups not only helps to identify possible societal demands on the climate adaptation of production forests, but also allows them to be discussed broadly enough that they can be made concrete. Insofar as an atmosphere of mutual learning can be created, known (or suspected) lines of confrontation can be broken up and paths toward conflict-avoiding implementation (e.g., by establishing interdisciplinary accompanying research) can be identified.
Building on interviews with experts from the catchment area and on the relevant literature, the tourism potential of the catchment area was first described and proves to be considerable. The subsequent deficit analysis made clear that this potential currently cannot be realized, in particular owing to inadequate political support. A SWOT analysis then captured the strengths and weaknesses as well as the opportunities and risks of developing tourism in the catchment area.
Based on this SWOT analysis, two different scenarios for the next 15 years were developed: a business-as-usual scenario and a best-case scenario (from the perspective of sustainable development).
This paper shows the equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in LR, the deterministic call-by-need lambda calculus with letrec extended by data constructors, case-expressions and Haskell's seq operator. LR models an untyped version of the core language of Haskell. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations.
The proof is by a fully abstract and surjective transfer of contextual approximation into a call-by-name calculus, which is an extension of Abramsky's lazy lambda calculus. In the latter calculus, the equivalence of similarity and contextual approximation can be shown by Howe's method. Using an equivalent but inductive definition of the behavioral preorder, we then transfer similarity back to the calculus LR.
The translation from the call-by-need letrec calculus into the extended call-by-name lambda calculus is the composition of two translations. The first translation replaces the call-by-need strategy by a call-by-name strategy; its correctness is shown by exploiting the infinite trees which emerge by unfolding the letrec expressions. The second translation encodes letrec-expressions by using multi-fixpoint combinators; its correctness is shown syntactically by comparing reductions of both calculi. A further result of this paper is an isomorphism between the mentioned calculi, and also with a call-by-need letrec calculus with a less complex definition of reduction than LR.
A concurrent implementation of software transactional memory in Concurrent Haskell using a call-by-need functional language with processes and futures is given. The description of the small-step operational semantics is precise and explicit, and employs an early abort of conflicting transactions. A proof of correctness of the implementation is given for a contextual semantics with may- and should-convergence. This implies that our implementation is a correct evaluator for an abstract specification equipped with a big-step semantics.
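The core optimistic-STM mechanism the abstract refers to (buffer writes, validate reads at commit, abort conflicting transactions) can be sketched as a loose analogue in Python rather than Concurrent Haskell. All class and method names below are invented for illustration; real STM implementations, including the one in the paper, are considerably more involved.

```python
import threading

_commit_lock = threading.Lock()   # serializes commits only

class TVar:
    """A transactional variable with a version counter."""
    def __init__(self, value):
        self.value = value
        self.version = 0

class Transaction:
    def __init__(self):
        self.read_log = {}    # TVar -> version observed at first read
        self.write_log = {}   # TVar -> pending value

    def read(self, tvar):
        if tvar in self.write_log:            # read-your-own-writes
            return self.write_log[tvar]
        self.read_log.setdefault(tvar, tvar.version)
        return tvar.value

    def write(self, tvar, value):
        self.write_log[tvar] = value          # buffered until commit

    def commit(self):
        """Atomically apply writes; return False if any read was stale."""
        with _commit_lock:
            if any(tvar.version != seen
                   for tvar, seen in self.read_log.items()):
                return False                  # conflict: caller retries
            for tvar, value in self.write_log.items():
                tvar.value = value
                tvar.version += 1
            return True

# Conflict demo: t1 and t2 both read the account; t2 commits first,
# so t1's read is stale and its commit must abort.
acct = TVar(100)
t1, t2 = Transaction(), Transaction()
t1.write(acct, t1.read(acct) - 30)
t2.write(acct, t2.read(acct) + 50)
assert t2.commit() is True
assert t1.commit() is False
print(acct.value)   # -> 150
```

The "early abort" idea in the abstract goes further: a conflicting transaction is killed as soon as the conflict is detected rather than at commit time, which this sketch does not model.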
This chapter aims to provide a hands-on approach to New Keynesian models and their uses for macroeconomic policy analysis. It starts by reviewing the origins of the New Keynesian approach, the key model ingredients and representative models. Building blocks of current-generation dynamic stochastic general equilibrium (DSGE) models are discussed in detail. These models address the famous Lucas critique by deriving behavioral equations systematically from the optimizing and forward-looking decision-making of households and firms subject to well-defined constraints. State-of-the-art methods for solving and estimating such models are reviewed and presented in examples. The chapter goes beyond the mere presentation of the most popular benchmark model by providing a framework for model comparison along with a database that includes a wide variety of macroeconomic models. Thus, it offers a convenient approach for comparing new models to available benchmarks and for investigating whether particular policy recommendations are robust to model uncertainty. Such robustness analysis is illustrated by evaluating the performance of simple monetary policy rules across a range of recently-estimated models including some with financial market imperfections and by reviewing recent comparative findings regarding the magnitude of government spending multipliers. The chapter concludes with a discussion of important objectives for on-going and future research using the New Keynesian framework.
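The New Keynesian core that such DSGE models build on can be made concrete with the textbook three-equation model. The sketch below solves, by the method of undetermined coefficients, for the response of inflation and the output gap to an AR(1) cost-push shock under a simple Taylor-type rule. The parameter values are standard illustrative choices, not estimates from the chapter, and the model is a bare-bones special case of the models discussed.

```python
# Textbook three-equation NK model, cost-push shock u_t with AR(1)
# persistence rho. Guess pi_t = a*u_t and x_t = b*u_t.
beta, kappa, sigma = 0.99, 0.10, 1.0   # discount factor, PC slope,
                                       # interest sensitivity of demand
phi_pi, rho = 1.5, 0.5                 # policy response, shock AR(1)

# Phillips curve: pi = beta*E[pi'] + kappa*x + u
#   -> a*(1 - beta*rho) - kappa*b = 1
# IS curve with i = phi_pi*pi: x = E[x'] - sigma*(i - E[pi'])
#   -> sigma*(phi_pi - rho)*a + (1 - rho)*b = 0
# Solve the resulting 2x2 linear system by Cramer's rule.
a11, a12, r1 = 1 - beta * rho, -kappa, 1.0
a21, a22, r2 = sigma * (phi_pi - rho), 1 - rho, 0.0
det = a11 * a22 - a12 * a21
a = (r1 * a22 - a12 * r2) / det
b = (a11 * r2 - r1 * a21) / det

print("inflation loading a =", round(a, 3))    # positive
print("output-gap loading b =", round(b, 3))   # negative: a trade-off
```

A cost-push shock raises inflation (a > 0) while the policy tightening it triggers opens a negative output gap (b < 0), the classic stabilization trade-off that richer DSGE models inherit.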
I characterize optimal monetary and fiscal policy in a stochastic New Keynesian model when nominal interest rates may occasionally hit the zero lower bound. The benevolent policymaker controls the short-term nominal interest rate and the level of government spending. Under discretionary policy, accounting for fiscal stabilization policy eliminates to a large extent the welfare losses associated with the presence of the zero bound. Under commitment, the gains associated with the use of the fiscal policy tool remain modest, even though fiscal stabilization policy is part of the optimal policy mix.
In May 2008, cyclone Nargis devastated Myanmar/Burma, killing 140,000 people. The autocratically governed country, however, rejected disaster relief as interference in its internal affairs and refused the import of medicine and food. In view of this situation, the French foreign minister Kouchner urged the UN to act on the basis of the Responsibility to Protect (R2P).
This act of securitization, however, stands in contrast to the media coverage, as Gabi Schlag examines in this paper. The visual material from the disaster area in particular tells a different story. The photographs in the BBC.com coverage of the topic form a visual narrative that suggests not helplessness but a controlled, level-headed response by local forces. This contrast points to the proverbial power of images, which pre-structure the respective conditions of possibility for action.
The calculus CHF models Concurrent Haskell extended by concurrent, implicit futures. It is a process calculus with concurrent threads, monadic concurrent evaluation, and includes a pure functional lambda-calculus which comprises data constructors, case-expressions, letrec-expressions, and Haskell’s seq. Futures can be implemented in Concurrent Haskell using the primitive unsafeInterleaveIO, which is available in most implementations of Haskell. Our main result is conservativity of CHF, that is, all equivalences of pure functional expressions are also valid in CHF. This implies that compiler optimizations and transformations from pure Haskell remain valid in Concurrent Haskell even if it is extended by futures. We also show that this is no longer valid if Concurrent Haskell is extended by the arbitrary use of unsafeInterleaveIO.
We show how Sestoft’s abstract machine for lazy evaluation of purely functional programs can be extended to evaluate expressions of the calculus CHF – a process calculus that models Concurrent Haskell extended by imperative and implicit futures. The abstract machine is modularly constructed by first adding monadic IO-actions to the machine and then in a second step we add concurrency. Our main result is that the abstract machine coincides with the original operational semantics of CHF, w.r.t. may- and should-convergence.
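The machine described above is too elaborate for an abstract, but the small-step, state-transition style it refers to can be illustrated with a far simpler relative: a Krivine-style machine for call-by-name evaluation of pure lambda terms with de Bruijn indices. This sketch is an assumption-laden stand-in, not Sestoft's machine: it has no call-by-need sharing, no monadic IO-actions and no concurrency.

```python
# Terms: ("var", n) | ("lam", body) | ("app", fun, arg)
# Machine state: (term, environment, stack), where environments and
# stack entries are closures, i.e. (term, environment) pairs.

def run(term, env=(), stack=()):
    """Reduce a closed term to weak head normal form."""
    while True:
        tag = term[0]
        if tag == "app":                        # push argument closure
            stack = ((term[2], env),) + stack
            term = term[1]
        elif tag == "lam":
            if not stack:                       # WHNF reached
                return term, env
            closure, stack = stack[0], stack[1:]
            env = (closure,) + env              # bind the argument
            term = term[1]
        else:                                   # variable: enter closure
            term, env = env[term[1]]

identity = ("lam", ("var", 0))
# (\x. x) (\y. y)  evaluates to  \y. y
whnf, _ = run(("app", identity, identity))
print(whnf)   # -> ('lam', ('var', 0))
```

Extending such a machine in the modular way the paper describes would mean first adding a transition rule set for monadic actions and then interleaving several such machine states to model concurrent threads.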
Apocalypses rest on traditional images, fictional imaginations, and cultural patterns of interpretation. They are therefore neither reproducible nor describable with scientifically valid methods. Even the traditionally strong risk concept that the social sciences use to describe the future of their research object does not apply here. This contribution therefore attempts, within the framework of these social-science approaches, to ask about the specifically security-cultural aspects of apocalypses. To this end, a typology is proposed that is historically confined to the 20th century and divided into three phases. Whereas at the beginning of the 20th century apocalyptic threat scenarios still revolved around subjects (the apocalyptic threat to humanity emanated from the modern social order, that is, from humanity itself), toward the middle of the 20th century the objective world of things and technologies increasingly came under suspicion of triggering an apocalypse. With the transition to the 21st century, a third phase of apocalyptic scenarios now seems to be emerging: existential threats no longer emanate from identifiable sources such as social alienation or nuclear weapons. Rather, the non-identifiable, indistinguishability itself, counts as the existential threat. The apocalypse of subjects and the apocalypse of objects, this paper proposes, are followed by the 'apocalyptoid', that is, apocalypse-like, situation.
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as laboratory this paper explores the design of robust policy guides aiming to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model data base to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
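The "simple difference rule" highlighted above can be sketched directly: the change in the policy rate responds to current inflation and output growth, so no estimate of the output gap level or the equilibrium real rate is needed. The coefficients of 0.5 and the reference values below are illustrative assumptions, not the values estimated in the paper.

```python
def difference_rule(i_prev, inflation, output_growth,
                    pi_star=2.0, g_star=2.0,
                    alpha=0.5, beta=0.5):
    """Next-period nominal rate (percent) under a first-difference
    rule: the rate changes with deviations of inflation from target
    and of output growth from its reference growth rate."""
    return (i_prev
            + alpha * (inflation - pi_star)
            + beta * (output_growth - g_star))

# No change when both variables sit at their reference values...
print(difference_rule(3.0, 2.0, 2.0))   # -> 3.0
# ...and a 1-point inflation overshoot raises the rate by 0.5.
print(difference_rule(3.0, 3.0, 2.0))   # -> 3.5
```

Responding to growth rather than the gap level is precisely what makes such rules robust to the output gap mismeasurement the paper documents.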
This paper investigates how an office-motivated incumbent can use transparency enhancement on public spending to signal his budgetary-management ability and win re-election. We show that when the incumbent faces a popular challenger, transparency policy can be an effective signaling device. A more popular challenger can reduce the probability that transparency is enhanced, while voters can be better off thanks to more informative signaling. It is also shown that a higher level of public interest in fiscal issues can increase the probability of enhancing transparency, while voters can be worse off because of less informative signaling.
I evaluate the effect of inflation targeting on inflation and how it interacts with product market deregulation during the disinflationary process in the 1990s. Using a sample of 21 OECD countries, I show that, after controlling for product market deregulation, the effect of inflation targeting is quantitatively important and statistically significant. Moreover, product market deregulation also matters in particular in countries that adopted an inflation targeting regime. I propose a New Keynesian Phillips curve with an explicit role for market deregulation to rationalize the empirical evidence.
Law making is becoming an increasingly important function of the higher courts in civil law matters. This observation leads to the question of whether the law making function is nevertheless carried out in a "classical" legal-principled way or whether the courts increasingly employ a political-formative style. To answer this question, one should not only focus on the content of the courts' reasoning but also on their procedural-institutional framework. From that perspective, the processing of so-called legislative facts is a key issue in determining the role of courts between legal reasoning and social engineering. The paper shows that Germany, England and the United States pursue different lines in processing legislative facts. Notwithstanding these differences, it seems to be the case that the increasing importance of law making will also change the institutional framework of appellate courts towards a quasi-legislative forum.
This paper examines data on financial sophistication among the U.S. older population, using a special-purpose module implemented in the Health and Retirement Study. We show that financial sophistication is deficient for older respondents (aged 55+). Specifically, many in this group lack a basic grasp of asset pricing, risk diversification, portfolio choice, and investment fees. Subpopulations with particular deficits include women, the least educated, persons over the age of 75, and non-Whites. In view of the fact that people are increasingly being asked to take on responsibility for their own retirement security, such lack of knowledge can have serious implications.
How do changes in market structure affect the US business cycle? We estimate a monetary DSGE model with endogenous firm/product entry and a translog expenditure function by Bayesian methods. The dynamics of net business formation allow us to identify the 'competition effect', by which desired price markups and inflation decrease when entry rises. We find that a 1 percent increase in the number of competitors lowers desired markups by 0.18 percent. Most of the cyclical variability in inflation is driven by markup fluctuations due to sticky prices or exogenous shocks rather than endogenous changes in desired markups.
This paper characterises optimal monetary policy in an economy with endogenous firm entry, a cash-in-advance constraint and preset wages. Firms must make profits to cover entry costs; thus the markup on goods prices is efficient. However, because leisure is not priced at a markup, the consumption-leisure tradeoff is distorted. Consequently, the real wage, hours and production are suboptimally low. Due to the labour requirement in entry, insufficient labour supply also implies that entry is too low. The paper shows that in the absence of fiscal instruments such as labour income subsidies, the optimal monetary policy under sticky wages achieves higher welfare than under flexible wages. The policy maker uses the money supply instrument to raise the real wage - the cost of leisure - above its flexible-wage level, in response to expansionary shocks to productivity and entry costs. This raises labour supply, expanding production and firm entry.
Since 2005, protection against grave human rights violations, war crimes and genocide has been elevated by the UN to the overarching goal of national, regional and global security. This Responsibility to Protect (R2P) thus illustrates the departure from an "old" global security culture that for decades had rested on the seemingly unshakeable cornerstones of sovereign equality and strict non-interference. But does this global norm take regional security complexes sufficiently into account? This working paper examines the perception of the Responsibility to Protect in the regional organisations of Southeast Asia and Africa through the lens of their security cultures. Rather than treating the Responsibility to Protect merely as a normative innovation, it is conceptualised as an expression of "cultural change", so that the analysis covers not only the "abstract norm" but also "concrete practice".
This paper investigates the effect of anticipated/experienced regret and pride on individual investors' decisions to hold or sell a winning or losing investment, in the form of the disposition effect. As expected, the results suggest that in the loss domain, low anticipated regret predicts a greater probability of selling a losing investment, while in the gain domain, high anticipated pride predicts a greater probability of selling a winning investment. Effects of high experienced regret/pride on the selling probability are found as well. An unexpected finding is that regret (pride) appears to be relevant not only in the loss (gain) domain but also in the gain (loss) domain. In addition, this paper presents evidence of interconnectedness between anticipated and experienced emotions. The authors discuss the implications of these findings and possible avenues for further research.
The functionality of subsidy and support instruments for security of supply in private households
(2012)
Household-related services, as low-threshold offerings, are an established and growing component of the social economy. However, the persistent dominance of undeclared work in this segment destabilises security of supply, because such arrangements lack sufficient bindingness. This is particularly relevant because the group of older people in need of support is growing steadily, and they are considered the main consumers of such services. Experience, including from abroad, shows that formalised employment relationships for domestic workers can create greater bindingness, with reliable quality and transparency regarding the services provided. Formalised employment can be promoted through subsidies, and corresponding instruments are already in place: tax reductions under § 35a EStG, marginal employment in private households in the form of mini-jobs, support instruments for business start-ups, wage subsidies, and labour-market policy support instruments. They intervene at different points in the provision and use of household-related services: at the households themselves (tax reductions and mini-jobs), at the employees (subsidised qualification, integration assistance and wage subsidies), and at the companies offering these services (start-up support). The employees are a particular focus, since the segment is also considered suitable for opening up a low-threshold entry into gainful employment for target groups distant from the labour market. The expertise presented here explores the extent to which these subsidy instruments are actually suitable for promoting formalised employment in private households, and which modifications could improve their effects.
Power and law in enlightened absolutism : Carl Gottlieb Svarez' theoretical and practical approach
(2012)
The term Enlightened Absolutism reflects a certain tension between its two components. This tension is in a way a continuation of the dichotomy between power on one hand and law on the other. The present paper shall provide an analysis of these two concepts from the perspective of Carl Gottlieb Svarez, who, in his position as a high-ranking Prussian civil servant and legal reformist, has had unparalleled influence on the legislative history of the
Prussian states towards the end of the 18th century. Working side-by-side with Johann Heinrich Casimir von Carmer, who held the post of Prussian minister of justice from 1779 to 1798, Svarez was able to make use of his talent for reforming and legislating. From 1780 to 1794 he was primarily responsible for the elaboration of the codification of the Prussian private law – the “Allgemeines Landrecht für die Preußischen Staaten” in 1794. In the present paper, Svarez’ approach to the relation between law and power shall be analysed on two different levels. Firstly, on a theoretical level, the reformist’s thoughts and reflections as laid down in his numerous works, papers and memorandums, shall be discussed. Secondly, on a practical level, the question of the extent to which he implemented his ideas in Prussian legal reality shall be explored.
After nearly two decades of US leadership during the 1980s and 1990s, are Europe's venture capital (VC) markets in the 2000s finally catching up regarding the provision of financing and successful exits, or is the performance gap as wide as ever? Are we amid an overall VC performance slump with no encouraging news? We attempt to answer these questions by tracking over 40,000 VC-backed firms from six industries in 13 European countries and the US between 1985 and 2009, and determining the type of exit, if any, each particular firm's investors choose for the venture.
This paper studies constrained portfolio problems that may involve constraints on the probability or the expected size of a shortfall of wealth or consumption. Our first contribution is that we solve the problems by dynamic programming, which is in contrast to the existing literature that applies the martingale method. More precisely, we construct the non-separable value function by formalizing the optimal constrained terminal wealth to be a (conjectured) contingent claim on the optimal non-constrained terminal wealth. This is relevant by itself, but also opens up the opportunity to derive new solutions to constrained problems. As a second contribution, we thus derive new results for non-strict constraints on the shortfall of intermediate wealth and/or consumption.
This position paper addresses the future prospects of the German water industry with regard to its products and concepts. Starting from the difficult challenges facing the global water sector, it sets out how the German water industry is positioned in relation to these challenges and identifies the measures to be taken to improve the industry's economic prospects in the long term. Sections 1-5 explain the starting position of the German water industry, describe newly emerging challenges, and situate within this context the BMBF joint project "Wasser 2050", in which these key points and recommendations were developed. Sections 6-10 then turn in detail to the strategies and approaches to be adopted that contribute to developing the competitive position of the German water industry sustainably. The concluding Section 11 summarises the project's recommendations.
We test whether investor mood affects trading with data on all stock market transactions in Finland, utilizing variation in daylight and local weather. We find some evidence that environmental mood variables (local weather, length of day, daylight saving and lunar phase) affect investors’ direction of trade and volume. The effect magnitudes are roughly comparable to those of classical seasonals, such as the Monday effect. The statistical significance of the mood variables is weak in many cases, however. Only very little of the day-to-day variation in trading is collectively explained by all mood variables and calendar effects, but lower frequency variation seems connected to holiday seasons.
In this paper, we provide some reflections on the development of monetary theory and monetary policy over the last 150 years. Rather than presenting an encompassing overview, which would be overambitious, we simply concentrate on a few selected aspects that we view as milestones in the development of this subject. We also try to illustrate some of the interactions with the political and financial system, academic discussion and the views and actions of central banks.