This paper studies constrained portfolio problems that may involve constraints on the probability or the expected size of a shortfall of wealth or consumption. Our first contribution is that we solve the problems by dynamic programming, which is in contrast to the existing literature that applies the martingale method. More precisely, we construct the non-separable value function by formalizing the optimal constrained terminal wealth to be a (conjectured) contingent claim on the optimal non-constrained terminal wealth. This is relevant by itself, but also opens up the opportunity to derive new solutions to constrained problems. As a second contribution, we thus derive new results for non-strict constraints on the shortfall of intermediate wealth and/or consumption.
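The two constraint types mentioned above — on the probability and on the expected size of a shortfall — can be written generically as follows; the symbols (floor wealth, tolerance levels) are illustrative, not the paper's notation:

```latex
% Generic shortfall constraints on terminal wealth W_T relative to a
% floor \underline{W}; \varepsilon and \delta are tolerance parameters.
\[
  \mathbb{P}\bigl(W_T < \underline{W}\bigr) \le \varepsilon,
  \qquad
  \mathbb{E}\bigl[(\underline{W} - W_T)^{+}\bigr] \le \delta .
\]
```

The first is a probability (value-at-risk-type) constraint; the second bounds the expected shortfall itself.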
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as laboratory this paper explores the design of robust policy guides aiming to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model data base to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
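The "simple difference rule" discussed above has the generic form i_t = i_{t-1} + a(π_t − π*) + b·Δy_t. A minimal sketch follows; the coefficients and inflation target are assumed illustrative values, not the estimates from the paper:

```python
# Sketch of a first-difference interest rate rule responding to current
# outcomes of inflation and output growth. The coefficients a, b and the
# target pi_star are placeholder values chosen for illustration only.

def difference_rule(i_prev, inflation, output_growth,
                    pi_star=2.0, a=0.5, b=0.5):
    """Return the new policy rate given last period's rate and
    current inflation and output growth outcomes."""
    return i_prev + a * (inflation - pi_star) + b * output_growth
```

For example, with the rate at 1.0, inflation at 3.0 and output growth at 1.0, the rule raises the rate to 2.0. Because the rule adjusts the *level* of the previous rate, it needs no estimate of the equilibrium real rate or the output gap, which is one reason difference rules tend to be robust to mismeasurement.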
The 'de-allative' pattern (Heine/Kuteva 2008: 103) gives rise to the French grammaticalized periphrasis aller + INF and the Spanish grammaticalized periphrasis ir a + INF. This construction (anar + INF) also exists in Catalan, but here the periphrasis expresses a past tense. Concerning the grammaticalization path, ir a + INF and aller + INF were formerly used to express a past (historical present), whereas anar + INF also expressed a future (and can still take on this function). This paper discusses possible reasons for the development, and thus the exceptional position, of the Catalan past periphrasis. In addition to morphological and normative explanations, language contact between Catalan and Spanish/French as well as sociolinguistic circumstances are factors which may account for the development of the Catalan construction. After a separate presentation of the development and the earlier and current use(s) and forms of the three periphrases, the cognitive processes which took place during grammaticalization are presented. The three periphrases are then compared using Lehmann's parameters. The second part of this paper consists of a corpus study which verifies and illustrates the results of the previous part.
In this paper we investigate the comparative properties of empirically-estimated monetary models of the U.S. economy using a new database of models designed for such investigations. We focus on three representative models due to Christiano, Eichenbaum, Evans (2005), Smets and Wouters (2007) and Taylor (1993a). Although these models differ in terms of structure, estimation method, sample period, and data vintage, we find surprisingly similar economic impacts of unanticipated changes in the federal funds rate. However, optimized monetary policy rules differ across models and lack robustness. Model averaging offers an effective strategy for improving the robustness of policy rules.
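The model-averaging logic described above can be sketched in a few lines; the quadratic loss surfaces and candidate coefficients below are hypothetical stand-ins for the estimated models and optimized rules, chosen only to show why averaging improves robustness:

```python
# Minimal sketch of model averaging for policy-rule evaluation: a rule is
# judged by its (weighted) average loss across competing models rather
# than its loss in any single model. The three loss functions below are
# made-up quadratic surfaces, each preferring a different coefficient.

def average_loss(rule_param, models, weights=None):
    """Weighted average of the loss a candidate rule incurs across models."""
    if weights is None:
        weights = [1.0 / len(models)] * len(models)
    return sum(w * m(rule_param) for w, m in zip(weights, models))

# Hypothetical models: each has a different model-specific optimum.
models = [lambda p, opt=opt: (p - opt) ** 2 for opt in (0.5, 1.0, 1.5)]
```

A rule optimized for any single model (e.g. the coefficient 0.5 or 1.5) does poorly on average, while the coefficient minimizing the average loss (here 1.0) performs acceptably in all three — the fragility-versus-robustness pattern the abstract describes.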
We show how Sestoft’s abstract machine for lazy evaluation of purely functional programs can be extended to evaluate expressions of the calculus CHF – a process calculus that models Concurrent Haskell extended by imperative and implicit futures. The abstract machine is constructed modularly, by first adding monadic IO-actions to the machine and then, in a second step, adding concurrency. Our main result is that the abstract machine coincides with the original operational semantics of CHF w.r.t. may- and should-convergence.
This contribution works out d’Alembert’s intention to discredit the dogmatism of religious faith as well as the deference to authority of the estates-based society by means of a mathematically grounded order of knowledge claiming universal validity, and thereby to destabilize the existing power relations. The universalist order of knowledge of the Encyclopédie is ultimately also meant to establish a new normative order. In this, the role of the encyclopedists as enlighteners of an unenlightened society is itself part of the universalist order of knowledge. Although the emancipatory effect claimed for encyclopedic universalism is comprehensible in the historical context of dogmatism and despotism, the question arises to what extent this universalist claim actually satisfies a ‘critical attitude’ as defined by Foucault with reference to Kant’s essay „Was ist Aufklärung?“, or whether it remains merely a ‘critique of knowledge’. This question of attitude points to d’Alembert’s ambivalent relationship to the political authorities of his time.
In Germany, as in almost all industrial countries, active pharmaceutical substances can now be found in virtually all water bodies and occasionally also in drinking water. Even though the concentrations in question tend to be very low, there are initial signs of their impact on aquatic life. There is no evidence as yet of any acute consequences for human health. It is, however, impossible to rule out long-term consequences from these minimal concentrations or unexpected effects from the interaction between various active ingredients (cocktail effect). At special risk here are sensitive segments of the population such as children and the chronically ill. There is thus a need for action on precautionary grounds.
The main actors in the health system are largely unaware of the problem posed by drug residues in water. Although knowledge cannot be equated with awareness – given the existence of the ‘not wanting to know' phenomenon – the first step is to generate a consolidated knowledge base. Only by creating awareness of the problem can further strategies be implemented to ultimately enlighten and bring about behavioural change. At stake here is the overall everyday handling of medications, including prescription, compliance, and drug-free disease prevention down to the doctor-patient relationship. The latter, namely, is often characterised by misunderstandings and a lack of communication about the – supposed – need to prescribe drugs.
The first part of the strategy for the general public involves using various channels and media to address three different target groups. These were identified by ISOE in an empirical survey as reacting differently to the problem under review:
· ‘The Deniers/Relativists'
· ‘The Truth-Seekers'
· ‘The Hypersensitives'
The intention is to address each target group in the right tone and using the most suitable line of reasoning via specific media and with the proper degree of differentiation. The ‘Truth-Seekers' play an opinion-leading role here. They can be provided with highly differentiated information through sophisticated media which they then pass on to their dialogue partners in an appropriate form.
The second part of the strategy for the general public relates to the communication of proper disposal routes for expired drugs. The goal is to confine disposal to pharmacies so that on no account are they flushed down the sink or toilet. Based on an analysis of typical errors in existing communications media on this topic, ISOE prepared recommendations for drafting proper information materials.
In addressing pharmacists, the first priority is to convey hard facts: to this end we propose a PR campaign to place articles in the main specialist media. At the same time, the subject should feature in training and continuing education programmes. Another aim is to strengthen the advisory function of the pharmacies. The environmentally sensitive target group would indeed react positively to having their attention drawn to the issue of drug residues in water. For all other customers, the pharmacists can and should act as consultants: they emphasise how important it is to take medication as instructed (compliance) and use suitable pack sizes, and warn older customers in particular about the potential hazards of improper drug intake.
The first stage of the communications strategy for doctors likewise revolves around knowledge. Here, however, it is important to take into account their self-image as scientists, even though they in fact have little grasp of this specific area. The line to take is that of ‘discursive self-enlightenment’. This means that the issue of drug residues in water cannot be conveyed to doctors by laymen but must be taken up and imparted via the major media of the medical profession and by medical association officials (top-down).
The second stage, namely that of raising doctors’ awareness of the problem, is likely to encounter strong resistance from some of the medical profession. They may fear a threat of interference in treatment plans from an environmental perspective and feel the need to emphasise that doctors are not responsible for environmental issues. As shown in empirical surveys by ISOE, such a defensive reaction is ultimately down to an underlying taboo: people are loath to discuss the over-prescription taking place in countless doctors' surgeries. And it is a fact that this problem cannot be tackled from the environmental perspective, although the goals of water protection are indeed consistent with the economic objectives of restraint in the deployment of drugs. Any communications measure for this target group has to bear in mind that doctors feel restricted by what they see as a ‘perpetual health reform' no matter which government is in power. On no account are they prepared to tolerate any new form of regulation, in this case for environmental reasons.
An entirely different view of the problem is taken by ‘critical doctors’ such as specialists in environmental health and those with a naturopathic focus. They are interested in the problem because they see a connection between the quality of our environment and our health. What is more, they have patients keen to be prescribed as few drugs as possible and who are instead interested in ‘talking medicine’. So, any communication strategy intent on tackling the difficult problem of over-prescribing drugs needs to look carefully at the experiences of these medical professionals and also at a ‘bottom-up strategy’.
Implementation of strategic communications should be entrusted to an agency with experience in ‘issue management’. Knowledge of social marketing and of influencing behaviour are further prerequisites. All important decisions should be taken by a consensus committee (a ‘MeriWa’ round table) in which the medical profession, pharmacists and consumers are represented.
To date, mainly broad-based instruments have been available for informing homeowners about, and sensitizing them to, the topic of energy and CO2 savings in the building sector. Dialogue-based communication formats, which are employed in other areas of sustainability marketing, are so far little used here.
This working paper examines the building blocks of an integrated communication strategy for energy-efficient building refurbishment. As the authors understand it, such a strategy comprises monological and dialogical marketing, energy consulting, and brand building. Drawing on conceptual considerations and empirical findings, the first part explains the basic goals and elements of a dialogue-based communication strategy for energy-efficient refurbishment. In the second part, concrete examples illustrate, among other things, how dialogue-based communication can be designed in practice for different refurbishment occasions.
This paper examines data on financial sophistication among the U.S. older population, using a special-purpose module implemented in the Health and Retirement Study. We show that financial sophistication is deficient for older respondents (aged 55+). Specifically, many in this group lack a basic grasp of asset pricing, risk diversification, portfolio choice, and investment fees. Subpopulations with particular deficits include women, the least educated, persons over the age of 75, and non-Whites. In view of the fact that people are increasingly being asked to take on responsibility for their own retirement security, such lack of knowledge can have serious implications.
We argue that the U.S. personal saving rate’s long stability (1960s–1980s), subsequent steady decline (1980s–2007), and recent substantial rise (2008–2011) can be interpreted using a parsimonious ‘buffer stock’ model of consumption in the presence of labor income uncertainty and credit constraints. Saving in the model is affected by the gap between ‘target’ and actual wealth, with the target determined by credit conditions and uncertainty. An estimated structural version of the model suggests that increased credit availability accounts for most of the long-term saving decline, while fluctuations in wealth and uncertainty capture the bulk of the business-cycle variation.
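The gap-to-target mechanism in the buffer-stock story above can be sketched numerically; the target level, adjustment speed, and base saving rate below are assumed values for illustration, not the paper's structural estimates:

```python
# Toy sketch of buffer-stock saving: the saving rate responds to the gap
# between a wealth target and actual wealth. All parameter values are
# placeholders, not estimates from the model in the abstract.

def saving_rate(wealth, target, speed=0.1, base=0.05):
    """Saving rises when wealth is below target and falls when above."""
    return base + speed * (target - wealth)
```

In this spirit, easier credit lowers the wealth target and hence the saving rate (the long decline), while a crisis that destroys wealth and raises uncertainty widens the gap to target and pushes saving up (the post-2008 rise).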
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis has come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the data base and compare the effects of novel extensions to established benchmarks thereby fostering a comparative instead of insular approach to model development.
A. Introduction
B. The legal foundations of compliance in the stock corporation and the corporate group
I. Codification of compliance in banking and insurance supervisory law
II. German Corporate Governance Code (Deutscher Corporate Governance Kodex)
III. Component of the early-risk-detection system under stock corporation law
IV. Organizational duties of the management board in the internal relationship
1. Duty of legality
2. Duty to monitor legality
a. Residual duties in cases of vertical delegation
b. Duty to avert damage
3. Interim conclusion
V. Organizational duties of the company in the external relationship
1. Supervisory measures under § 130 OWiG
a. Supervisory measures
b. Limited scope
c. Application within the corporate group
2. Liability for vicarious agents under § 831 BGB
a. Scope of duties
aa. Supervision of suitability
bb. Instruction and guidance
b. Limited scope
aa. Decentralized exculpatory evidence
bb. Pointillistic concept
c. Application within the corporate group
3. Operational organizational duties under § 823 BGB
a. Content
b. Creation through delegation of duties of care
c. Creation through division of labor
d. Application within the corporate group
aa. Delegation of duties of care
bb. Outsourcing of a dangerous activity
cc. Duty of care arising from division of labor
C. Conclusion
This paper outlines relatively easy-to-implement reforms for the supervision of transnational banking groups in the E.U. that should be based not primarily on legal form but on the actual risk structures of the financial institutions concerned. The proposal also pays close attention to the economics of public administration and international relations in allocating competences among national and supranational supervisory bodies. Before detailing its own proposal, the paper looks into the relationship between sovereign debt and banking crises that drives regulatory reactions to the financial turmoil in the Euro area. These initiatives inter alia affirm effective prudential supervision as a pivotal element of crisis prevention. In order to arrive at a more informed idea of which determinants, apart from a perceived appetite for regulatory arbitrage, drive banks’ organizational choices, the paper scrutinizes the merits of either a branch or a subsidiary structure for the cross-border business of financial institutions. In doing so, it also considers the policy-makers’ perspective. The analysis shows that no one-size-fits-all organizational structure is available and concludes that banks’ choices should generally not be second-guessed, particularly because they are subject to (some) market discipline. The analysis proceeds by describing and evaluating how competences in prudential supervision are currently allocated among national and supranational supervisory authorities. To assess the findings, the appraisal adopts insights from the economics of public administration and international relations. It argues that the supervisory architecture has to be more closely aligned with bureaucrats’ incentives and that inefficient requirements to cooperate and share information should be reduced. Contrary to a widespread perception, shifting responsibility to a supranational authority cannot solve all the problems identified.
Resting on these foundations, the last part of the paper sketches an alternative solution that builds on far-reaching mutual recognition of national supervisory regimes and allocates competences in line with supervisors’ incentives and the risks inherent in cross-border banking groups.
Motivated by the U.S. events of the 2000s, we address whether a too low for too long interest rate policy may generate a boom-bust cycle. We simulate anticipated and unanticipated monetary policies in state-of-the-art DSGE models and in a model with bond financing via a shadow banking system, in which the bond spread is calibrated for normal and optimistic times. Our results suggest that the U.S. boom-bust was caused by the combination of (i) too low for too long interest rates, (ii) excessive optimism and (iii) a failure of agents to anticipate the extent of the abnormally favorable conditions.
This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated and forecasts are derived successively from historical U.S. data vintages synchronized with the Fed’s Greenbook projections. Point forecasts of some models are of similar accuracy as the forecasts of nonstructural large dataset methods. Despite their common underlying New Keynesian modeling philosophy, forecasts of different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of DSGE model forecasts is comparable to Greenbook projections for medium term horizons. Comparing density forecasts of DSGE models with the actual distribution of observations shows that the models overestimate uncertainty around point forecasts.
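The gain from combining point forecasts can be illustrated with a minimal sketch; the forecast and outcome numbers below are invented, chosen only so that the two hypothetical models' errors partially offset:

```python
# Sketch of forecast combination: compare the root mean squared error of
# two individual (hypothetical) model forecasts with that of their simple
# average. All numbers are made up for illustration.
import math

def rmse(forecasts, actuals):
    """Root mean squared forecast error."""
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecasts, actuals))
                     / len(actuals))

actuals = [2.0, 2.5, 1.5]
model_a = [2.4, 2.1, 1.9]   # hypothetical model A: errors +0.4, -0.4, +0.4
model_b = [1.7, 2.9, 1.2]   # hypothetical model B: errors -0.3, +0.4, -0.3
average = [(a + b) / 2 for a, b in zip(model_a, model_b)]
```

Because the two models err in partly opposite directions, the averaged forecast has a lower RMSE than either model alone — the mechanism behind the finding that weighted forecasts beat individual DSGE model forecasts.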
In this paper, I introduce lumpy micro-level capital adjustment into a sticky information general equilibrium model. Lumpy adjustment arises because of inattentiveness in capital investment decisions instead of the more common assumption of non-convex adjustment costs. The model features inattentiveness as the only source of stickiness. I find that the model with lumpy investment yields business cycle dynamics which differ substantially from those of an otherwise identical model with frictionless investment and are much more consistent with the empirical evidence. These results therefore strengthen the case in favour of the relevance of microeconomic investment lumpiness for the business cycle.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact focussing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
We examine both the degree and the structural stability of inflation persistence at different quantiles of the conditional inflation distribution. Previous research focused exclusively on persistence at the conditional mean of the inflation rate. Economic theory, however, provides various reasons – for example, downward wage rigidities or menu costs – to expect higher inflation persistence at the upper than at the lower tail of the conditional inflation distribution.
Based on post-war US data we indeed find slower mean reversion in response to positive than to negative shocks. We find robust evidence for a structural break in persistence at all quantiles of the inflation process in the early 1980s. Inflation persistence has decreased and become more homogeneous across quantiles. Persistence at the conditional mean became more informative about the degree of persistence across the entire conditional inflation distribution. While prior to the 1980s inflation was not mean reverting in response to large positive shocks, our evidence strongly suggests that since the end of the Volcker disinflation the unit root can be rejected at every quantile including the upper tail of the conditional inflation distribution.
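The asymmetric-persistence idea can be sketched roughly as follows: fit AR(1) slopes separately on observations that follow positive and negative shocks. The data below are synthetic, with higher persistence after positive shocks built in by construction; nothing here reproduces the paper's quantile-regression estimates:

```python
# Rough sketch: simulate an AR(1)-style inflation process whose
# persistence is higher after positive shocks, then recover the asymmetry
# by fitting separate slopes on the two subsamples. Synthetic data only.
import random

def ar1_slope(pairs):
    """OLS slope of pi_t on pi_{t-1} through the origin (no intercept)."""
    num = sum(prev * cur for prev, cur in pairs)
    den = sum(prev * prev for prev, _ in pairs)
    return num / den

random.seed(0)
prev, up_pairs, down_pairs = 0.0, [], []
for _ in range(5000):
    shock = random.gauss(0.0, 1.0)
    rho = 0.9 if shock > 0 else 0.5      # asymmetry imposed by design
    cur = rho * prev + shock
    (up_pairs if shock > 0 else down_pairs).append((prev, cur))
    prev = cur
```

The slope fitted on observations following positive shocks comes out higher than the one following negative shocks, mirroring the slower mean reversion after positive shocks reported above. The paper itself works with quantile regressions of the conditional inflation distribution, which this subsample device only approximates.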