Motivated by the U.S. events of the 2000s, we address whether a too-low-for-too-long interest rate policy may generate a boom-bust cycle. We simulate anticipated and unanticipated monetary policies in state-of-the-art DSGE models and in a model with bond financing via a shadow banking system, in which the bond spread is calibrated for normal and optimistic times. Our results suggest that the U.S. boom-bust was caused by the combination of (i) too-low-for-too-long interest rates, (ii) excessive optimism and (iii) a failure of agents to anticipate the extent of the abnormally favorable conditions.
In May 2008, cyclone Nargis swept across Myanmar/Burma, killing 140,000 people. The autocratically governed country, however, rejected disaster relief as interference in its internal affairs and refused to allow medicine and food into the country. Faced with this situation, the French foreign minister Kouchner urged the UN to act on the basis of the Responsibility to Protect (R2P).
This act of securitization, however, stands in contrast to the media coverage, as Gabi Schlag shows in this paper. The visual material from the disaster area in particular tells a different story: the photos in BBC.com's coverage of the events form a visual narrative that suggests not helplessness but controlled, level-headed action by local forces. This contrast points to the proverbial power of images, which pre-structure the respective conditions of possible action.
This paper studies constrained portfolio problems that may involve constraints on the probability or the expected size of a shortfall of wealth or consumption. Our first contribution is that we solve the problems by dynamic programming, in contrast to the existing literature, which applies the martingale method. More precisely, we construct the non-separable value function by formalizing the optimal constrained terminal wealth as a (conjectured) contingent claim on the optimal non-constrained terminal wealth. This is relevant in itself, but it also opens up the opportunity to derive new solutions to constrained problems. As a second contribution, we thus derive new results for non-strict constraints on the shortfall of intermediate wealth and/or consumption.
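As a point of reference, a minimal formalization of this problem class (our gloss, not the paper's exact setup): choose a strategy to maximize expected utility of terminal wealth subject to a shortfall constraint,

```latex
\max_{\pi}\ \mathbb{E}\!\left[U(W_T)\right]
\quad \text{s.t.} \quad
\mathbb{P}\!\left(W_T < L\right) \le \varepsilon ,
```

where $W_T$ is terminal wealth under strategy $\pi$, $L$ a shortfall level, and $\varepsilon$ the admissible shortfall probability; the expected-shortfall variant replaces the constraint by $\mathbb{E}[(L - W_T)^+] \le \delta$.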
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis have come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach, a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the database and compare the effects of novel extensions to established benchmarks, thereby fostering a comparative instead of insular approach to model development.
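As an illustration of the kind of policy rule that such comparison exercises made famous, the original Taylor (1993) rule sets the federal funds rate according to

```latex
i_t = \pi_t + 0.5\,(\pi_t - 2) + 0.5\,y_t + 2 ,
```

where $\pi_t$ is inflation over the previous four quarters (in percent), $y_t$ the output gap, and 2 the assumed equilibrium real rate and inflation target.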
The nineteenth century in Britain saw tumultuous changes that reshaped the fabric of society and altered the course of modernization. It also saw the rise of the novel to the height of its cultural power as the most important literary form of the period. This paper reports on a long-term experiment in tracing such macroscopic changes in the novel during this crucial period. Specifically, we present findings on two interrelated transformations in novelistic language that reveal a systemic concretization in language and fundamental change in the social spaces of the novel. We show how these shifts have consequences for setting, characterization, and narration as well as implications for the responsiveness of the novel to the dramatic changes in British society.
This paper has a second strand as well: the project was simultaneously an experiment in developing quantitative and computational methods for tracing changes in literary language. We wanted to see how far quantifiable features such as word usage could be pushed toward the investigation of literary history. Could we leverage quantitative methods in ways that respect the nuance and complexity we value in the humanities? To this end, we present a second set of results: the techniques and methodological lessons gained in the course of designing and running this project.
We show how Sestoft’s abstract machine for lazy evaluation of purely functional programs can be extended to evaluate expressions of the calculus CHF, a process calculus that models Concurrent Haskell extended by imperative and implicit futures. The abstract machine is constructed modularly: we first add monadic IO-actions to the machine and then, in a second step, add concurrency. Our main result is that the abstract machine coincides with the original operational semantics of CHF with respect to may- and should-convergence.
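To make the notion of a future concrete, here is a minimal sketch in plain Concurrent Haskell (GHC's Control.Concurrent, not the paper's CHF machine): a future is an MVar that a forked thread fills once, and forcing it blocks until the value is available.

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (MVar, newEmptyMVar, putMVar, readMVar)

newtype Future a = Future (MVar a)

-- Spawn the computation in its own thread; the result is written exactly once.
future :: IO a -> IO (Future a)
future act = do
  box <- newEmptyMVar
  _ <- forkIO (act >>= putMVar box)
  pure (Future box)

-- Block until the value arrives; readMVar leaves the value in place,
-- so a future can be forced any number of times.
force :: Future a -> IO a
force (Future box) = readMVar box

main :: IO ()
main = do
  f <- future (pure (sum [1 .. 1000000 :: Int]))
  force f >>= print
```

Implicit futures, as modeled in CHF, additionally hide the explicit `force`: every use of the future's value synchronizes automatically.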
This paper constructs a dynamic model of health insurance to evaluate the short- and long-run effects of policies that prevent firms from conditioning wages on the health conditions of their workers, and that prevent health insurance companies from charging individuals with adverse health conditions higher insurance premia. Our study is motivated by recent US legislation that has tightened regulations on wage discrimination against workers with poorer health status (Americans with Disabilities Act of 2009, ADA, and ADA Amendments Act of 2008, ADAAA) and that will prohibit health insurance companies from charging different premiums for workers of different health status starting in 2014 (Patient Protection and Affordable Care Act, PPACA). In the model, a trade-off arises between the static gains from better insurance against poor health induced by these policies and their adverse dynamic incentive effects on household efforts to lead a healthy life. Using household panel data from the PSID, we estimate and calibrate the model and then use it to evaluate the static and dynamic consequences of no-wage-discrimination and no-prior-conditions laws for the evolution of the cross-sectional health and consumption distribution of a cohort of households, as well as for the ex-ante lifetime utility of a typical member of this cohort. In our quantitative analysis we find that although a combination of both policies is effective in providing full consumption insurance period by period, it is suboptimal to introduce both policies jointly, since such a policy innovation induces a more rapid deterioration of the cohort health distribution over time. This is due to the fact that the combination of both laws severely undermines the incentives to lead healthier lives. The resulting negative effects on health outcomes in society more than offset the static gains from better consumption insurance, so that expected discounted lifetime utility is lower under both policies, relative to only implementing wage nondiscrimination legislation.
Apocalypses rest on traditional images, fictional imaginations, and cultural patterns of interpretation. They are therefore neither reproducible nor describable with valid scientific methods. The risk concept, traditionally strong in the social sciences for describing the future of their object of research, also fails here. This contribution therefore attempts, within the framework of these social-science approaches, to ask about the specifically security-cultural aspects of apocalypses. To this end, a typology is proposed that is historically confined to the twentieth century and divided into three phases. While at the beginning of the twentieth century apocalyptic threat scenarios still revolved around subjects (the apocalyptic threat to humanity emanated from the modern social order, that is, from humanity itself), toward the middle of the twentieth century the objective world of things and technologies increasingly came under suspicion of triggering an apocalypse. With the transition to the twenty-first century, a third phase of apocalyptic scenarios now seems to be taking shape: existential threats no longer emanate from identifiable dangers such as social alienation or nuclear weapons. Rather, the non-identifiable, indistinguishability itself, counts as the existential threat. The apocalypse of subjects and the apocalypse of objects, this paper proposes, are followed by the 'apocalyptoid', that is, apocalypse-like, situation.
This paper investigates whether preference interactions can explain why risk preferences change over time and across contexts. We conduct an experiment in which subjects accept or reject gambles involving real money gains and losses. We introduce within-subject variation by alternating subjectively liked and disliked music in the background. We find that favourite music increases risk-taking, and disliked music suppresses risk-taking, compared to a baseline of no music. Several theories in psychology propose mechanisms by which mood affects risk-taking, but none of them fully explain our results. The results are, however, consistent with preference complementarities that extend to risk preference.
The OPTUM project investigated what environmental benefits electric vehicles could deliver in the future. An integrative approach was pursued that, in addition to the vehicle-side analysis, also took interactions with the electricity market into account. Specifically, analyses addressed the following central aspects: acceptance and attractiveness of electric vehicles, market potentials for electric vehicles, interaction of electric vehicles with the electricity sector, CO2 reduction potentials of electromobility, an economic assessment of the storage media, and the resource efficiency of the electromobility system. This study presents the research results on the attractiveness and acceptance of electric cars. It draws on two empirical studies carried out in OPTUM to determine the attractiveness and acceptance of electric vehicles: a qualitative study using focus groups, and a standardized survey of new-car buyers. The standardized survey was coupled with a conjoint analysis of vehicle choice, in which respondents had to decide between vehicles with a combustion engine, a plug-in hybrid drive, and a fully electric drive. The empirical analyses make clear that there is considerable acceptance potential for both electric vehicle concepts, plug-in hybrids and fully electric vehicles. For fully electric vehicles in particular, the acceptance potential ranges from 12 to 25 percent, depending on the scenario and vehicle class. Furthermore, both empirical surveys provide indications of how this acceptance potential can be tapped or even enlarged.
In its Fresenius decision (Der Konzern 2012, 420), the German Federal Court of Justice (Bundesgerichtshof) held that the management board of a stock corporation acts in breach of its duties if it pays a supervisory board member the agreed remuneration for consulting services before the supervisory board has approved the contract. In this context, the court confirmed the previously prevailing doctrine according to which § 114 AktG also covers consulting agreements between a supervisory board member and a company dependent on the stock corporation. Finally, the court refined its case law under which § 114 AktG also applies where the consulting services are rendered not by the supervisory board member personally but by a company in which the member holds a stake, provided the member participates in the remuneration to a more than negligible extent. This contribution takes a critical position on all of the aforementioned aspects of the Fresenius judgment.
What is "neoliberalism", and what is its current state? What role does the concept play in politics and the social sciences today, after all the obituaries and revivals? This contribution aims to survey the confusing field of research on neoliberalism and to present the most important debates and their developments in order to ease orientation. Starting from a brief overview of current statements on neoliberalism in political discourse, the two most important theoretical perspectives from which neoliberalism is studied, hegemony theory and governmentality studies, are presented, before various key sites of neoliberalism are surveyed. The critical interest of researchers working mostly from one of these two perspectives is directed, among other things, at the role of the nation state, the restructuring of urban spaces, neoliberalism's effects on gender relations, and the way life under neoliberalism transforms subjects' relations to themselves. The article closes with reflections on the theoretical price to be paid for the enormously broad concept of neoliberalism, a price that consists not least in an apparent lack of alternatives which, ironically, emerges from the countless critically intended invocations of neoliberalism.
The withdrawal of foreign capital from emerging countries at the height of the recent financial crisis and its quick return sparked a debate about the impact of capital flow surges on asset markets. This paper addresses the response of property prices to an inflow of foreign capital. For that purpose we estimate a panel VAR on a set of Asian emerging market economies, for which the waves of inflows were particularly pronounced, and identify capital inflow shocks based on sign restrictions. Our results suggest that capital inflow shocks have a significant effect on the appreciation of house prices and equity prices. Capital inflow shocks account for roughly twice the portion of overall house price changes that they explain in OECD countries. We also address cross-country differences in the house price responses to shocks, which are most likely due to differences in the monetary policy response to capital inflows.
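As a gloss on the method (the reduced form below is the textbook setup; the exact specification is the paper's), a panel VAR with sign-restriction identification can be written as

```latex
y_{it} = c_i + A(L)\,y_{i,t-1} + B\,\varepsilon_{it} ,
```

where $y_{it}$ stacks, say, capital inflows, house prices, equity prices and a policy rate for country $i$. A capital-inflow shock is then identified by drawing orthogonal rotation matrices $Q$ and keeping only those candidate impact matrices $BQ$ whose implied impulse responses carry the postulated signs, for instance that inflows rise on impact.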
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as a laboratory, this paper explores the design of robust policy guides aiming to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model database to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
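For concreteness, a first-difference rule of this type (the coefficients here are the conventional illustrative ones, not necessarily the paper's estimates) reads

```latex
i_t = i_{t-1} + 0.5\,(\pi_t - \pi^{*}) + 0.5\,\Delta y_t ,
```

adjusting the interest rate in response to current inflation relative to target and to output growth. Because only the growth rate of output enters, the rule sidesteps the need for a real-time estimate of the output gap level, which is exactly where the documented mismeasurement is largest.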
A concurrent implementation of software transactional memory in Concurrent Haskell using a call-by-need functional language with processes and futures is given. The description of the small-step operational semantics is precise and explicit, and employs an early abort of conflicting transactions. A proof of correctness of the implementation is given for a contextual semantics with may- and should-convergence. This implies that our implementation is a correct evaluator for an abstract specification equipped with a big-step semantics.
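For comparison, the programmer-facing side can be illustrated with GHC's stm library (a minimal sketch of the interface, not the paper's own implementation): a transaction reads and writes TVars and is rerun automatically if a conflicting commit intervenes.

```haskell
import Control.Concurrent.STM

-- Atomically move n units between two accounts; 'check' suspends the
-- transaction (and retries it later) until the guard holds.
transfer :: TVar Int -> TVar Int -> Int -> STM ()
transfer from to n = do
  a <- readTVar from
  check (a >= n)
  writeTVar from (a - n)
  modifyTVar' to (+ n)

main :: IO ()
main = do
  from <- newTVarIO 100
  to   <- newTVarIO 0
  atomically (transfer from to 40)
  balances <- atomically ((,) <$> readTVar from <*> readTVar to)
  print balances  -- (60,40)
```

The early abort of conflicting transactions described above is an implementation strategy behind `atomically`; it is not visible at this interface.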
Using the mathematical toolkit for filtering noise processes, which played a major role in the early days of operational numerical weather prediction, the linearized equations of a barotropic shallow-water model in β-plane form are used to examine how the Rossby-type wave mode evolves when the simultaneous inertia-gravity wave modes are suppressed by varying filtering equations.
First, some basic analytical relations of a noise-free adjustment of initial fields are set out. The investigation then turns to noise-filtering prognostic model equations (Sections 5, 6, 7). From this analysis, as a function of the filtering level, one finds, in comparison with the conventional synoptic-scale Rossby formula, which corresponds to filtering level zero, partly modified wave speeds of the Rossby-type physics. Unexpected new effects even appear, which presumably matter primarily in the long-wave spectrum.
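For orientation (this is the standard linearized system that such filtering starts from; the paper's filtered variants modify it), the linearized barotropic shallow-water equations on a β-plane read

```latex
\partial_t u - f v = -g\,\partial_x h, \qquad
\partial_t v + f u = -g\,\partial_y h, \qquad
\partial_t h + H\,(\partial_x u + \partial_y v) = 0, \qquad f = f_0 + \beta y ,
```

where $(u,v)$ is the velocity, $h$ the free-surface displacement, and $H$ the mean depth. The system supports slow Rossby-type modes and fast inertia-gravity modes; noise filtering amounts to suppressing the latter while retaining the former.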
Law making is becoming an increasingly important function of the higher courts in civil law matters. This observation leads to the question of whether the law making function is nevertheless carried out in a “classical” legal-principled way or whether the courts increasingly employ a political-formative style. To answer this question, one should focus not only on the content of the courts’ reasoning but also on their procedural-institutional framework. From that perspective, the processing of so-called legislative facts is a key issue in determining the role of courts between legal reasoning and social engineering. The paper shows that Germany, England and the United States pursue different lines in processing legislative facts. Notwithstanding these differences, it seems to be the case that the increasing importance of law making will also change the institutional framework of appellate courts towards a quasi-legislative forum.
We argue that the U.S. personal saving rate’s long stability (1960s–1980s), subsequent steady decline (1980s–2007), and recent substantial rise (2008–2011) can be interpreted using a parsimonious ‘buffer stock’ model of consumption in the presence of labor income uncertainty and credit constraints. Saving in the model is affected by the gap between ‘target’ and actual wealth, with the target determined by credit conditions and uncertainty. An estimated structural version of the model suggests that increased credit availability accounts for most of the long-term saving decline, while fluctuations in wealth and uncertainty capture the bulk of the business-cycle variation.
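A stylized rendering of the mechanism (our gloss, not the paper's estimated equation): with a wealth target $w^{*}$ pinned down by credit conditions and uncertainty, saving adjusts wealth toward the target,

```latex
s_t = \bar{s} + \gamma\,(w^{*} - w_t), \qquad \gamma > 0 ,
```

so a credit loosening, which lowers $w^{*}$, depresses the saving rate until wealth has drifted down to the new target, while a rise in uncertainty, which raises $w^{*}$, pushes the saving rate up.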
We test whether investor mood affects trading with data on all stock market transactions in Finland, utilizing variation in daylight and local weather. We find some evidence that environmental mood variables (local weather, length of day, daylight saving and lunar phase) affect investors’ direction of trade and volume. The effect magnitudes are roughly comparable to those of classical seasonals, such as the Monday effect. The statistical significance of the mood variables is weak in many cases, however. Only very little of the day-to-day variation in trading is collectively explained by all mood variables and calendar effects, but lower frequency variation seems connected to holiday seasons.
The comprehension and production of single words involve a variety of processing stages. Which stages need to be accessed differs depending on whether objects (pictures in an experimental environment) or words are supposed to be named. Naming tasks are often employed in psycholinguistic studies in order to provide an insight into the function of mental processes during word production. Differences in naming latencies and naming accuracy between words suggest that the retrieval of some lexical items is easier or more difficult in contrast to others. The relative ease of word retrieval has been found to be strongly influenced by properties of these words, such as familiarity and written or spoken frequency.
Exploring which variables affect naming speed and accuracy will provide more information about the storage and processing of words in general. If a variable has a discernable effect on a specific experimental task, the localization of this effect is of interest for psycholinguistic research. This is because finding the locus of the effect can help specify models of speech production with respect to what processes occur at which stage of lexical retrieval. Additionally, identifying which variables influence language processing is essential in order to control for these variables when necessary. Otherwise, variance in naming latencies could not be attributed to the variable under test, because other, uncontrolled variables could have altered the results.
The 'de-allative' pattern (Heine/Kuteva 2008: 103) gives rise to the French grammaticalized periphrasis aller + INF and the Spanish grammaticalized periphrasis ir a + INF. The construction (anar + INF) also exists in Catalan, but here the periphrasis expresses a past tense. Concerning the grammaticalization path, ir a + INF and aller + INF were formerly used to express a past (historical present), whereas anar + INF also expressed a future (and can still take on this function). This paper discusses possible reasons for the development, and thus the exceptional position, of the Catalan past periphrasis. In addition to morphological and normative explanations, language contact between Catalan and Spanish/French as well as sociolinguistic circumstances are factors which may account for the development of the Catalan construction. After a separate presentation of the development, the former and current use(s), and the forms of the three periphrases, the cognitive processes which took place during grammaticalization are presented. Afterwards, the three periphrases are compared using Lehmann's parameters. The second part of this paper consists of a corpus study which verifies and illustrates the results of the first part.
The Future Fleet field trial pursued a large number of research questions. With regard to travel behaviour and the acceptance of electric cars, the following questions were central:
· What factors determine the attractiveness and acceptance of electric vehicles in the context of company use?
· How does users' travel behaviour develop?
· How do the changed technology and the pool concepts affect users' travel behaviour and attitudes? How do users integrate the changed vehicle characteristics into their everyday routines?
Interest in the field test among SAP employees was very high; only a fraction of those interested could take part. Two usage scenarios were developed: scenario 1, "week-long provision", for use of an electric vehicle by one person over several days for both business and private trips, and scenario 2, "business use (pool vehicle)", for employees' trips to another site or to external appointments. ...
An essential precondition for decoding prevailing understandings of justice is an examination of the roles that the actors involved assume in a legal system, as well as an analysis of the legal and institutional conditions under which these actors operate. This contribution first addresses the distribution of power and tasks between judges and parties. It becomes apparent that the allocation of roles is not uniform but varies with different procedural and institutional conditions. In jury proceedings, judicial authority is strongly constrained by a maximally developed party autonomy. By contrast, judges act as legal honoratiores (in Weber's sense) whenever they adjudicate without a jury. This happens in particular in the state supreme courts and the federal courts of appeals, but also in first-instance proceedings in which "claims in equity" are to be decided. The contribution concludes with the influence that the particularities of American legal education exert on the American understanding of justice: they shape and reproduce the roles and self-images of American lawyers, both in the bar and on the bench.
This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated and forecasts are derived successively from historical U.S. data vintages synchronized with the Fed’s Greenbook projections. Point forecasts of some models are similar in accuracy to the forecasts of nonstructural large-dataset methods. Despite their common underlying New Keynesian modeling philosophy, forecasts of different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of DSGE model forecasts is comparable to Greenbook projections for medium-term horizons. Comparing density forecasts of DSGE models with the actual distribution of observations shows that the models overestimate uncertainty around point forecasts.
This paper investigates the accuracy of forecasts from four DSGE models for inflation, output growth and the federal funds rate using a real-time dataset synchronized with the Fed’s Greenbook projections. Conditioning the model forecasts on the Greenbook nowcasts leads to forecasts that are as accurate as the Greenbook projections for output growth and the federal funds rate. Only for inflation are the model forecasts dominated by the Greenbook projections. A comparison with forecasts from Bayesian VARs shows that the economic structure of the DSGE models, which is useful for the interpretation of forecasts, does not lower the accuracy of the forecasts. Combining forecasts of several DSGE models increases precision in comparison to individual model forecasts. Comparing density forecasts with the actual distribution of observations shows that DSGE models overestimate uncertainty around point forecasts.
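The combination step referred to in both abstracts is, in its simplest equal-weight form, just the pooled forecast

```latex
\hat{y}^{\mathrm{comb}}_{t+h} = \frac{1}{N}\sum_{i=1}^{N} \hat{y}^{(i)}_{t+h} ,
```

the average of the $N$ individual model forecasts for horizon $h$; weighted variants replace $1/N$ with weights reflecting, for example, past predictive performance.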
This paper examines data on financial sophistication among the U.S. older population, using a special-purpose module implemented in the Health and Retirement Study. We show that financial sophistication is deficient for older respondents (aged 55+). Specifically, many in this group lack a basic grasp of asset pricing, risk diversification, portfolio choice, and investment fees. Subpopulations with particular deficits include women, the least educated, persons over the age of 75, and non-Whites. In view of the fact that people are increasingly being asked to take on responsibility for their own retirement security, such lack of knowledge can have serious implications.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact focussing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
Functionality of subsidy and support instruments for reliable service provision in private households (2012)
Household-related services, as low-threshold offerings, are a firm and growing component of the social economy. The unbroken dominance of undeclared work in this segment, however, has a destabilizing effect on the reliability of provision because it lacks sufficient bindingness. This is particularly relevant because the group of older people in need of support is growing steadily, and they are considered the main source of demand for such services. Experience, including from abroad, shows that formalized employment of domestic helpers can create more bindingness, with reliable quality and transparency regarding the services provided. Formalized employment can be promoted through subsidies. Corresponding instruments are already in place: tax reductions under § 35a EStG, marginal employment in private households in the form of mini-jobs, support instruments for business start-ups, wage subsidies, and labour-market-policy support instruments. They apply at different points in the provision and use of household-related services: at the households themselves (tax reductions and mini-jobs), at the employees (subsidized qualification, integration assistance, and wage subsidies), and at the firms offering these services (start-up support). The employees are a particular focus, since the segment is also considered suitable for opening a low-threshold entry into gainful employment for target groups distant from the labour market. The expertise presented here explores to what extent these subsidy instruments are actually suitable for promoting formalized employment in private households, and through which modifications their effects can be improved.
We investigate the decisions of listed firms to go private once again. We start by revealing that while a significant share of firms that go public are VC-backed, a disproportionate share of these VC-backed firms go private later on (they stay on the exchange for an average of 8.5 years). We interpret this very robust pattern as indicating that IPOs of VC-backed firms are to a large extent a temporary rather than a permanent feature of the corporate governance of these firms. We investigate various potential hypotheses as to why VCs actually seem to be able to bring marginal firms to the exchange by relating the going-private decisions to various characteristics of the IPO market as well as to VC characteristics. We find strong support for the certification ability of VCs: more experienced and reputable VCs are better able to bring marginal firms to public exchanges via an IPO. These marginal firms backed by more reputable and experienced VCs are more likely to go private later on. Hence, our analysis suggests that IPOs backed by experienced VCs are most likely a temporary rather than the final stage in the life of the portfolio firm. We find no support for the notion that reputable VCs underprice their IPO exits more, implying that they have no need to leave more money on the table to take the marginal firms public.
In Germany, as in almost all industrial countries, active pharmaceutical substances can now be found in virtually all water bodies and occasionally also in drinking water. Even though the concentrations in question tend to be very low, there are initial signs of their impact on aquatic life. There is no evidence as yet of any acute consequences for human health. It is, however, impossible to rule out long-term consequences from these minimal concentrations or unexpected effects from the interaction between various active ingredients (cocktail effect). At special risk here are sensitive segments of the population such as children and the chronically ill. There is thus a need for action on precautionary grounds.
The main actors in the health system are largely unaware of the problem posed by drug residues in water. Although knowledge cannot be equated with awareness – given the existence of the ‘not wanting to know' phenomenon – the first step is to generate a consolidated knowledge base. Only by creating awareness of the problem can further strategies be implemented to ultimately enlighten and bring about behavioural change. At stake here is the overall everyday handling of medications, including prescription, compliance, and drug-free disease prevention down to the doctor-patient relationship. The latter, namely, is often characterised by misunderstandings and a lack of communication about the – supposed – need to prescribe drugs.
The first part of the strategy for the general public involves using various channels and media to address three different target groups. These were identified by ISOE in an empirical survey as reacting differently to the problem under review:
· ‘The Deniers/Relativists'
· ‘The Truth-Seekers'
· ‘The Hypersensitives'
The intention is to address each target group in the right tone and using the most suitable line of reasoning via specific media and with the proper degree of differentiation. The ‘Truth-Seekers' play an opinion-leading role here. They can be provided with highly differentiated information through sophisticated media which they then pass on to their dialogue partners in an appropriate form.
The second part of the strategy for the general public relates to the communication of proper disposal routes for expired drugs. The goal is to confine disposal to pharmacies so that on no account are they flushed down the sink or toilet. Based on an analysis of typical errors in existing communications media on this topic, ISOE prepared recommendations for drafting proper information materials.
In addressing pharmacists, the first priority is to convey hard facts: to this end we propose a PR campaign to place articles in the main specialist media. At the same time, the subject should feature in training and continuing education programmes. Another aim is to strengthen the advisory function of the pharmacies. The environmentally sensitive target group would indeed react positively to having their attention drawn to the issue of drug residues in water. For all other customers, the pharmacists can and should act as consultants: they emphasise how important it is to take medication as instructed (compliance) and use suitable pack sizes, and warn older customers in particular about the potential hazards of improper drug intake.
The first stage of the communications strategy for doctors likewise revolves around knowledge. Here, however, it is important to take into account their self-image as scientists, even though they in fact have little grasp of this specific area. The line to take is that of ‘discursive self-enlightenment'. This means that the issue of drug residues in water cannot be conveyed to doctors by laymen but must be taken up and imparted via the major media of the medical profession and by medical association officials (top-down).
The second stage, namely that of raising doctors’ awareness of the problem, is likely to encounter strong resistance from some of the medical profession. They may fear a threat of interference in treatment plans from an environmental perspective and feel the need to emphasise that doctors are not responsible for environmental issues. As shown in empirical surveys by ISOE, such a defensive reaction is ultimately down to an underlying taboo: people are loath to discuss the over-prescription taking place in countless doctors' surgeries. And it is a fact that this problem cannot be tackled from the environmental perspective, although the goals of water protection are indeed consistent with the economic objectives of restraint in the deployment of drugs. Any communications measure for this target group has to bear in mind that doctors feel restricted by what they see as a ‘perpetual health reform' no matter which government is in power. On no account are they prepared to tolerate any new form of regulation, in this case for environmental reasons.
An entirely different view of the problem is taken by ‘critical doctors' such as specialists in environmental health and those with a naturopathic focus. They are interested in the problem because they see a connection between the quality of our environment and our health. What is more, they have patients keen to be prescribed as few drugs as possible and who are instead interested in ‘talking medicine'. So, any communication strategy intent on tackling the difficult problem of over-prescribing drugs needs to look carefully at the experiences of these medical professionals and also at a ‘bottom-up strategy'.
Implementation of strategic communications should be entrusted to an agency with experience in ‘issue management'. Knowledge of social marketing and the influencing of behaviour are further prerequisites. All important decisions should be taken by a consensus committee (‘MeriWa' round table), in which the medical profession, pharmacists and consumers are represented.
After nearly two decades of US leadership during the 1980s and 1990s, are Europe’s venture capital (VC) markets in the 2000s finally catching up regarding the provision of financing and successful exits, or is the performance gap as wide as ever? Are we amid an overall VC performance slump with no encouraging news? We attempt to answer these questions by tracking over 40,000 VC-backed firms stemming from six industries in 13 European countries and the US between 1985 and 2009; determining the type of exit – if any – each particular firm’s investors choose for the venture.
Debt-induced crises, including the subprime, are usually attributed exclusively to supply-side factors. We examine the role of social influences on debt culture, emanating from perceived average income of peers. Utilizing unique information from a household survey representative of the Dutch population, that circumvents the issue of defining the social circle, we consider collateralized, consumer, and informal loans. We find robust social effects on borrowing, especially among those who consider themselves poorer than their peers; and on indebtedness, suggesting a link to financial distress. We employ a number of approaches to rule out spurious associations and to handle correlated effects.
I evaluate the effect of inflation targeting on inflation and how it interacts with product market deregulation during the disinflationary process in the 1990s. Using a sample of 21 OECD countries, I show that, after controlling for product market deregulation, the effect of inflation targeting is quantitatively important and statistically significant. Moreover, product market deregulation also matters in particular in countries that adopted an inflation targeting regime. I propose a New Keynesian Phillips curve with an explicit role for market deregulation to rationalize the empirical evidence.
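For reference (this is the textbook baseline; the paper's contribution is to give market deregulation an explicit role in it), the standard New Keynesian Phillips curve is

```latex
\pi_t = \beta\,\mathbb{E}_t[\pi_{t+1}] + \kappa\,x_t ,
```

where $x_t$ is real marginal cost (or the output gap), $\beta$ the discount factor, and the slope $\kappa$ falls with the degree of price rigidity. One natural channel, and a common one in this literature, runs through the markup that links marginal cost to the output gap.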
Venture capital (VC) investment has long been conceptualized as a local business, in which the VC’s ability to source, syndicate, fund, monitor, and add value to portfolio firms critically depends on its access to knowledge obtained through ties to the local (i.e., geographically proximate) network. Consistent with the view that local networks matter, existing research confirms that local and geographically distant portfolio firms are sourced, syndicated, funded, and monitored differently. Curiously, emerging research on VC investment practice within the United States finds that distant investments, as measured by “exits” (either initial public offering or merger & acquisition), out-perform local investments. These findings raise important questions about the assumed benefits of local network membership and proximity. To probe these questions more deeply, we contrast the deal structure of cross-border VC investment with domestic VC investment, and contrast the deal structure of cross-border VC investments that include a local partner with those that do not. Evidence from 139,892 rounds of venture capital financing in the period 1980-2009 suggests that cross-border investment practice, in terms of deal sourcing, syndication, and performance, does indeed change with proximity, but that monitoring practices do not. Further, we find that the inclusion of a local partner in the investment syndicate yields surprisingly few benefits. This evidence, we argue, raises important questions about VC investment practice as well as the ability of firms to capture and lever the presumed benefits of network membership.
Experiences from a transdisciplinarily guided series of stakeholder workshops on the sustainable climate adaptation of Central European production forests are presented and evaluated with regard to tree species choice, risk reduction, and the segregation of functions. A prior analysis of the discourse field facilitated both the selection of stakeholders and the subsequent analysis of the stakeholder processes. Adequate participation of societal interest groups not only helps to identify possible societal demands on the climate adaptation of production forests, but also allows them to be discussed broadly enough that they can be made concrete. Insofar as an atmosphere of mutual learning can be created, known (or suspected) lines of confrontation can be broken up and ways toward conflict-avoiding implementation (for example, by establishing interdisciplinary accompanying research) can be identified.