Renewed interest in fiscal policy has increased the use of quantitative models to evaluate policy. Because of modeling uncertainty, it is essential that policy evaluations be robust to alternative assumptions. We find that models currently being used in practice to evaluate fiscal policy stimulus proposals are not robust. Government spending multipliers in an alternative empirically-estimated and widely-cited new Keynesian model are much smaller than in these old Keynesian models; the estimated stimulus is extremely small with GDP and employment effects only one-sixth as large.
This paper introduces adaptive learning and endogenous indexation in the New-Keynesian Phillips curve and studies disinflation under inflation targeting policies. The analysis is motivated by the disinflation performance of many inflation-targeting countries, in particular the gradual Chilean disinflation with temporary annual targets. At the start of the disinflation episode price-setting firms expect inflation to be highly persistent and opt for backward-looking indexation. As the central bank acts to bring inflation under control, price-setting firms revise their estimates of the degree of persistence. Such adaptive learning lowers the cost of disinflation. This reduction can be exploited by a gradual approach to disinflation. Firms that choose the rate for indexation also re-assess the likelihood that announced inflation targets determine steady-state inflation and adjust indexation of contracts accordingly. A strategy of announcing and pursuing short-term targets for inflation is found to influence the likelihood that firms switch from backward-looking indexation to the central bank’s targets. As firms abandon backward-looking indexation the costs of disinflation decline further. We show that an inflation targeting strategy that employs temporary targets can benefit from lower disinflation costs due to the reduction in backward-looking indexation.
Monetary policy analysts often rely on rules-of-thumb, such as the Taylor rule, to describe historical monetary policy decisions and to compare current policy to historical norms. Analysis along these lines also permits evaluation of episodes where policy may have deviated from a simple rule and examination of the reasons behind such deviations. One interesting question is whether such rules-of-thumb should draw on policymakers' forecasts of key variables such as inflation and unemployment or on observed outcomes. Importantly, deviations of the policy from the prescriptions of a Taylor rule that relies on outcomes may be due to systematic responses to information captured in policymakers' own projections. We investigate this proposition in the context of FOMC policy decisions over the past 20 years using publicly available FOMC projections from the semiannual monetary policy reports to the Congress (Humphrey-Hawkins reports). Our results indicate that FOMC decisions can indeed be predominantly explained in terms of the FOMC's own projections rather than observed outcomes. Thus, a forecast-based rule-of-thumb better characterizes FOMC decision-making. We also confirm that many of the apparent deviations of the federal funds rate from an outcome-based Taylor-style rule may be considered systematic responses to information contained in FOMC projections.
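The distinction between outcome-based and forecast-based rules can be sketched in a few lines (a hypothetical illustration; the coefficients of 0.5 and the 2 percent equilibrium real rate are the classic Taylor (1993) values, not estimates from this paper):

```python
def taylor_rule(inflation, gap, r_star=2.0, pi_star=2.0):
    """Taylor (1993) rule, all variables in percent:
    i = r* + pi + 0.5*(pi - pi*) + 0.5*gap."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * gap

# Outcome-based: feed in observed inflation and the output gap.
i_outcome = taylor_rule(inflation=3.0, gap=1.0)    # -> 6.0

# Forecast-based: feed in the committee's projections instead.
i_forecast = taylor_rule(inflation=2.5, gap=0.5)   # -> 5.0
```

A deviation of the actual funds rate from `i_outcome` that disappears under `i_forecast` would count as a systematic response to projections rather than a true departure from the rule.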
The European Central Bank
(2007)
The establishment of the ECB and with it the launch of the euro has arguably been a unique endeavor in economic history, representing an important experiment in central banking. This note aims to summarize some of the main lessons learned from this experiment and sketch some of the prospects for the ECB. It is written for "The New Palgrave Dictionary of Economics", 2nd edition. JEL Classification: E52, E58
This paper investigates the accuracy and heterogeneity of output growth and inflation forecasts during the current and the four preceding NBER-dated U.S. recessions. We generate forecasts from six different models of the U.S. economy and compare them to professional forecasts from the Federal Reserve’s Greenbook and the Survey of Professional Forecasters (SPF). The model parameters and model forecasts are derived from historical data vintages so as to ensure comparability to historical forecasts by professionals. The mean model forecast comes surprisingly close to the mean SPF and Greenbook forecasts in terms of accuracy even though the models only make use of a small number of data series. Model forecasts compare particularly well to professional forecasts at a horizon of three to four quarters and during recoveries. The extent of forecast heterogeneity is similar for model and professional forecasts but varies substantially over time. Thus, forecast heterogeneity constitutes a potentially important source of economic fluctuations. While the particular reasons for diversity in professional forecasts are not observable, the diversity in model forecasts can be traced to different modeling assumptions, information sets and parameter estimates. JEL Classification: C53, D84, E31, E32, E37 Keywords: Forecasting, Business Cycles, Heterogeneous Beliefs, Forecast Distribution, Model Uncertainty, Bayesian Estimation
Price stability and monetary policy effectiveness when nominal interest rates are bounded at zero
(2003)
This paper employs stochastic simulations of a small structural rational expectations model to investigate the consequences of the zero bound on nominal interest rates. We find that if the economy is subject to stochastic shocks similar in magnitude to those experienced in the U.S. over the 1980s and 1990s, the consequences of the zero bound are negligible for target inflation rates as low as 2 percent. However, the effects of the constraint are non-linear with respect to the inflation target and produce a quantitatively significant deterioration of the performance of the economy with targets between 0 and 1 percent. The variability of output increases significantly and that of inflation also rises somewhat. Also, we show that the asymmetry of the policy ineffectiveness induced by the zero bound generates a non-vertical long-run Phillips curve. Output falls increasingly short of potential with lower inflation targets.
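The nonlinearity of the zero-bound effect with respect to the inflation target can be illustrated with a deliberately simple simulation (a sketch under assumed parameters, not the paper's structural model; the 2-percentage-point shock standard deviation is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)

def bound_frequency(pi_target, periods=10_000, r_star=2.0):
    """Draw shocks to a notional Taylor-style rate and report how
    often the zero lower bound truncates it."""
    shocks = rng.normal(0.0, 2.0, periods)    # assumed shock s.d.
    desired = r_star + pi_target + shocks     # notional policy rate
    actual = np.maximum(desired, 0.0)         # zero lower bound
    return np.mean(actual == 0.0)

for target in (0.0, 1.0, 2.0):
    print(f"target {target}%: bound binds {bound_frequency(target):.1%} of periods")
```

Lowering the target from 2 to 0 percent raises the binding frequency far more than proportionally, which is the nonlinearity the paper quantifies within a full structural model.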
In this paper we estimate a small model of the euro area to be used as a laboratory for evaluating the performance of alternative monetary policy strategies. We start with the relationship between output and inflation and investigate the fit of the nominal wage contracting model due to Taylor (1980) and three different versions of the relative real wage contracting model proposed by Buiter and Jewitt (1981) and estimated by Fuhrer and Moore (1995a) for the United States. While Fuhrer and Moore reject the nominal contracting model in favor of the relative contracting model which induces more inflation persistence, we find that both models fit euro area data reasonably well. When considering France, Germany and Italy separately, however, we find that the nominal contracting model fits German data better, while the relative contracting model does quite well in countries which transitioned out of a high inflation regime such as France and Italy. We close the model by estimating an aggregate demand relationship and investigate the consequences of the different wage contracting specifications for the inflation-output variability tradeoff, when interest rates are set according to Taylor's rule.
In this paper we study the role of the exchange rate in conducting monetary policy in an economy with near-zero nominal interest rates as experienced in Japan since the mid-1990s. Our analysis is based on an estimated model of Japan, the United States and the euro area with rational expectations and nominal rigidities. First, we provide a quantitative analysis of the impact of the zero bound on the effectiveness of interest rate policy in Japan in terms of stabilizing output and inflation. Then we evaluate three concrete proposals that focus on depreciation of the currency as a way to ameliorate the effect of the zero bound and evade a potential liquidity trap. Finally, we investigate the international consequences of these proposals.
Inflation-targeting central banks have only imperfect knowledge about the effect of policy decisions on inflation. An important source of uncertainty is the relationship between inflation and unemployment. This paper studies the optimal monetary policy in the presence of uncertainty about the natural unemployment rate, the short-run inflation-unemployment tradeoff and the degree of inflation persistence in a simple macroeconomic model, which incorporates rational learning by the central bank as well as private sector agents. Two conflicting motives drive the optimal policy. In the static version of the model, uncertainty provides a motive for the policymaker to move more cautiously than she would if she knew the true parameters. In the dynamic version, uncertainty also motivates an element of experimentation in policy. I find that the optimal policy that balances the cautionary and activist motives typically exhibits gradualism, that is, it still remains less aggressive than a policy that disregards parameter uncertainty. Exceptions occur when uncertainty is very high and when inflation is close to the target.
We investigate the performance of forecast-based monetary policy rules using five macroeconomic models that reflect a wide range of views on aggregate dynamics. We identify the key characteristics of rules that are robust to model uncertainty: such rules respond to the one-year-ahead inflation forecast and to the current output gap and incorporate a substantial degree of policy inertia. In contrast, rules with longer forecast horizons are less robust and are prone to generating indeterminacy. Finally, we identify a robust benchmark rule that performs very well in all five models over a wide range of policy preferences.
In this study, we perform a quantitative assessment of the role of money as an indicator variable for monetary policy in the euro area. We document the magnitude of revisions to euro area-wide data on output, prices, and money, and find that monetary aggregates have a potentially significant role in providing information about current real output. We then proceed to analyze the information content of money in a forward-looking model in which monetary policy is optimally determined subject to incomplete information about the true state of the economy. We show that monetary aggregates may have substantial information content in an environment with high variability of output measurement errors, low variability of money demand shocks, and a strong contemporaneous linkage between money demand and real output. As a practical matter, however, we conclude that money has fairly limited information content as an indicator of contemporaneous aggregate demand in the euro area.
In this paper, we examine the cost of insurance against model uncertainty for the euro area considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a Minimax policy comes at moderate costs in terms of lower expected performance. We extract priors that would rationalize the Minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity-aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E61.
In this paper, we study the effectiveness of monetary policy in a severe recession and deflation when nominal interest rates are bounded at zero. We compare two alternative proposals for ameliorating the effect of the zero bound: an exchange-rate peg and price-level targeting. We conduct this quantitative comparison in an empirical macroeconometric model of Japan, the United States and the euro area. Furthermore, we use a stylized micro-founded two-country model to check our qualitative findings. We find that both proposals succeed in generating inflationary expectations and work almost equally well under full credibility of monetary policy. However, price-level targeting may be less effective under imperfect credibility, because the announced price-level target path is not directly observable. JEL Classification: E31, E52, E58, E61.
Under a conventional policy rule, a central bank adjusts its policy rate linearly according to the gap between inflation and its target, and the gap between output and its potential. Under "the opportunistic approach to disinflation" a central bank controls inflation aggressively when inflation is far from its target, but concentrates more on output stabilization when inflation is close to its target, allowing supply shocks and unforeseen fluctuations in aggregate demand to move inflation within a certain band. We use stochastic simulations of a small-scale rational expectations model to contrast the behavior of output and inflation under opportunistic and linear rules. JEL Classification: E31, E52, E58, E61. July 2005.
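The contrast between the two rules can be made concrete with a stylized sketch (hypothetical coefficients and a 1-percentage-point band, chosen for illustration; the paper's simulations use a full rational expectations model):

```python
def linear_rule(pi, gap, pi_star=2.0, r_star=2.0):
    # Conventional rule: always lean against both gaps.
    return r_star + pi + 0.5 * (pi - pi_star) + 0.5 * gap

def opportunistic_rule(pi, gap, pi_star=2.0, r_star=2.0, band=1.0):
    # Inside the band around target: concentrate on output stabilization,
    # letting shocks move inflation within the band.
    # Outside the band: fight inflation aggressively.
    deviation = pi - pi_star
    if abs(deviation) <= band:
        return r_star + pi + 0.5 * gap
    return r_star + pi + 1.0 * deviation + 0.5 * gap
```

With inflation 0.5 points above target the opportunistic rule prescribes a lower rate than the linear rule (4.5 vs. 4.75 percent); 3 points above target it prescribes a higher one (10.0 vs. 8.5 percent).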
The Russian war of aggression against Ukraine since 24 February 2022 has intensified the discussion of Europe’s reliance on energy imports from Russia. A ban on Russian imports of oil, natural gas and coal has already been imposed by the United States, while the United Kingdom plans to cease imports of oil and coal from Russia by the end of 2022. The German Federal Government is currently opposing an energy embargo against Russia. However, the Federal Ministry for Economic Affairs and Climate Action is working on a strategy to reduce energy imports from Russia. In this paper, the authors give an overview of the German and European reliance on energy imports from Russia with a focus on gas imports and discuss price effects, alternative suppliers of natural gas, and the potential for saving and replacing natural gas. They also provide an overview of estimates of the consequences for the economic outlook if the conflict intensifies.
Veronika Grimm, Lukas Nöh, and Volker Wieland assess the possible development of government interest expenditures as a share of GDP for Germany, France, Italy and Spain. Until 2021, these and other member states could anticipate a further reduction of interest expenditure in the future. This outlook has changed considerably with the recent surge in inflation and government bond rates. Nevertheless, under reasonable assumptions current yield curves still imply that interest expenditure relative to GDP can be stabilized at the current level. The authors also review the implications of a further upward shift in the yield curves of 1 or 2 percentage points. These implications suggest significant medium-term risks for highly indebted member states with interest expenditure approaching or exceeding levels last observed on the eve of the euro area debt crisis. In light of these risks, governments of euro area member states should take substantive action to achieve a sustained decline in debt-to-GDP ratios towards safer levels. They bear the responsibility for making sure that government finances can weather the higher interest rates which are required to achieve price stability in the euro area.
This note argues that the European Central Bank should adjust its strategy in order to consider broader measures of inflation in its policy deliberations and communications. In particular, it points out that a broad measure of domestic goods and services price inflation such as the GDP deflator has increased along with the euro area recovery and the expansion of monetary policy since 2013, while HICP inflation has become more variable and, on average, has declined. Similarly, the cost of owner-occupied housing, which is excluded from the HICP, has risen during this period. Furthermore, it shows that optimal monetary policy at the effective lower bound on nominal interest rates aims to return inflation more slowly to the inflation target from below than in normal times because of uncertainty about the effects and potential side effects of quantitative easing.
This paper proposes a new approach for modeling investor fear after rare disasters. The key element is to take into account that investors’ information about fundamentals driving rare downward jumps in the dividend process is not perfect. Bayesian learning implies that beliefs about the likelihood of rare disasters drop to a much more pessimistic level once a disaster has occurred. Such a shift in beliefs can trigger massive declines in price-dividend ratios. Pessimistic beliefs persist for some time. Thus, belief dynamics are a source of apparent excess volatility relative to a rational expectations benchmark. Due to the low frequency of disasters, even an infinitely-lived investor will remain uncertain about the exact probability. Our analysis is conducted in continuous time and offers closed-form solutions for asset prices. We distinguish between rational and adaptive Bayesian learning. Rational learners account for the possibility of future changes in beliefs in determining their demand for risky assets, while adaptive learners take beliefs as given. Thus, risky assets tend to be lower-valued and price-dividend ratios vary less under adaptive versus rational learning for identical priors. Keywords: beliefs, Bayesian learning, controlled diffusions and jump processes, learning about jumps, adaptive learning, rational learning. JEL classification: D83, G11, C11, D91, E21, D81, C61
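The belief dynamics described above can be illustrated with a discrete-time Beta-Bernoulli sketch (an assumption made for illustration; the paper itself works in continuous time with jump processes and closed-form prices):

```python
def posterior_mean(alpha, beta, disasters, periods):
    """Beta(alpha, beta) prior on the per-period disaster probability;
    observing `disasters` jumps in `periods` periods yields the
    posterior Beta(alpha + disasters, beta + periods - disasters)."""
    a = alpha + disasters
    b = beta + periods - disasters
    return a / (a + b)

prior_belief = posterior_mean(1, 99, 0, 0)    # prior mean: 1 percent
after_crash = posterior_mean(1, 99, 1, 10)    # one disaster in 10 periods
```

A single observed disaster nearly doubles the believed probability (from 1.0 to about 1.8 percent), and because disasters are rare the belief drifts back down only slowly; this is the kind of shift that can trigger large declines in price-dividend ratios.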
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as laboratory this paper explores the design of robust policy guides aiming to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model data base to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
In this paper we investigate the comparative properties of empirically-estimated monetary models of the U.S. economy using a new database of models designed for such investigations. We focus on three representative models due to Christiano, Eichenbaum, Evans (2005), Smets and Wouters (2007) and Taylor (1993a). Although these models differ in terms of structure, estimation method, sample period, and data vintage, we find surprisingly similar economic impacts of unanticipated changes in the federal funds rate. However, optimized monetary policy rules differ across models and lack robustness. Model averaging offers an effective strategy for improving the robustness of policy rules.
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis has come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the data base and compare the effects of novel extensions to established benchmarks thereby fostering a comparative instead of insular approach to model development.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact focusing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
This chapter aims to provide a hands-on approach to New Keynesian models and their uses for macroeconomic policy analysis. It starts by reviewing the origins of the New Keynesian approach, the key model ingredients and representative models. Building blocks of current-generation dynamic stochastic general equilibrium (DSGE) models are discussed in detail. These models address the famous Lucas critique by deriving behavioral equations systematically from the optimizing and forward-looking decision-making of households and firms subject to well-defined constraints. State-of-the-art methods for solving and estimating such models are reviewed and presented in examples. The chapter goes beyond the mere presentation of the most popular benchmark model by providing a framework for model comparison along with a database that includes a wide variety of macroeconomic models. Thus, it offers a convenient approach for comparing new models to available benchmarks and for investigating whether particular policy recommendations are robust to model uncertainty. Such robustness analysis is illustrated by evaluating the performance of simple monetary policy rules across a range of recently-estimated models including some with financial market imperfections and by reviewing recent comparative findings regarding the magnitude of government spending multipliers. The chapter concludes with a discussion of important objectives for on-going and future research using the New Keynesian framework.
Recently, we evaluated a fiscal consolidation strategy for the United States that would bring the government budget into balance by gradually reducing government spending relative to GDP to the ratio that prevailed prior to the crisis (Cogan et al., JEDC 2013). Specifically, we published an analysis of the macroeconomic consequences of the 2013 Budget Resolution that was passed by the U.S. House of Representatives in March 2012. In this note, we provide an update of our research that evaluates this year’s budget reform proposal that is to be discussed and voted on in the House of Representatives in March 2013. Contrary to the views voiced by critics of fiscal consolidation, we show that such a reduction in government purchases and transfer payments can increase GDP immediately and permanently relative to a policy without spending restraint. Our research makes use of a modern structural model of the economy that incorporates the long-standing essential features of economics: opportunity costs, efficiency, foresight and incentives. GDP rises because households take into account that spending restraint helps avoid future increases in tax rates. Lower taxes imply less distorted incentives for work, investment and production relative to a scenario without fiscal consolidation and lead to higher growth.
This note reviews the legal issues and concerns that are likely to play an important role in the ongoing deliberations of the Federal Constitutional Court of Germany concerning the legality of ECB government bond purchases such as those conducted in the context of its earlier Securities Market Programme or potential future Outright Monetary Transactions.
In this paper, we provide some reflections on the development of monetary theory and monetary policy over the last 150 years. Rather than presenting an encompassing overview, which would be overambitious, we simply concentrate on a few selected aspects that we view as milestones in the development of this subject. We also try to illustrate some of the interactions with the political and financial system, academic discussion and the views and actions of central banks.
Central banks today do not have the task of controlling the money supply. Their job is to stabilize the value of money, and thus the price of goods and services in the respective currency. But how is price stability best achieved? Does one not have to keep an eye on the money supply after all? Among monetary economists, there is an ongoing scholarly debate on this question.
This contribution draws on two recent publications in which the macroeconomic model data base (www.macromodelbase.com) is employed for model comparisons. The comparative approach is used to base policy analysis on a systematic evaluation of the different implications that a certain economic policy can have when submitted to different modeling approaches. In this manner, policy recommendations are more robust to modeling uncertainty. By extending the comparative approach to forecasting, the authors investigate the accuracy of different forecasting models and obtain more reliable mean forecasts.
In 2011, the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel was awarded to the US economists Thomas J. Sargent of New York University and Christopher A. Sims of Princeton University. German newspaper commentaries in particular frequently criticized the two researchers for using "unrealistic" assumptions such as utility maximization and rational expectations. This criticism fails to recognize the seminal contribution of Sargent and Sims to the development of modern macroeconomics. Their empirical methods are now standard tools of academic research and are also used by economists at central banks, finance ministries and international organizations. They have made fundamental new insights possible, for example about the transmission of monetary and fiscal policy.
How to be a good European...
(2010)
Under the heading "Ich kaufe griechische Staatsanleihen weil..." ("I am buying Greek government bonds because..."), public figures from politics, business and culture were asked to explain briefly why they have bought or will buy Greek government bonds, ideally with evidence of their financial commitment. At the present time, I am not buying Greek government bonds...
Schlechte Erfahrungen
(2012)
A transaction tax on financial trades would raise less revenue than many of its proponents hope, and it carries serious economic and legal risks. The German federal government should be aware of the burdens a financial transaction tax would entail and should not introduce it without the participation of the world's leading financial centers. An international agreement on stricter capital requirements for banks must take priority.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact. We consider two types of dynamic stochastic general equilibrium models: a neoclassical growth model and more complicated models with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the initial model simulations, GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run.
For some time now, structural macroeconomic models used at central banks have been predominantly New Keynesian DSGE models featuring nominal rigidities and forward-looking decision-making. While these features are widely deemed crucial for policy evaluation exercises, most central banks have added more detailed characterizations of the financial sector to these models following the Great Recession in order to improve their fit to the data and their forecasting performance. We employ a comparative approach to investigate the characteristics of this new generation of New Keynesian DSGE models and document an elevated degree of model uncertainty relative to earlier model generations. Policy transmission is highly heterogeneous across types of financial frictions, and monetary policy causes larger effects, on average. The New Keynesian DSGE models we analyze suggest that a simple policy rule robust to model uncertainty involves a weaker response to inflation and the output gap in the presence of financial frictions as compared to earlier generations of such models. Leaning-against-the-wind policies in models of this class estimated for the euro area do not lead to substantial gains. With regard to forecasting performance, the inclusion of financial frictions can generate improvements, if conditioned on appropriate data. Looking forward, we argue that model-averaging and embracing alternative modelling paradigms is likely to yield a more robust framework for the conduct of monetary policy.
This working paper presents the summary statement by Prof. Volker Wieland on the European Central Bank's purchase programme for public sector bonds (Public Sector Purchase Programme, PSPP), delivered before the German Federal Constitutional Court on 30 July 2019. The focus lies on whether the PSPP qualifies as a monetary policy measure and on the proportionality of the programme and its implementation. Further questions regarding implementation are also addressed briefly, in particular the announcement of purchases, purchase limits, and the distance to the primary market for government bonds.
There is substantial disagreement about the consequences of the Tax Cuts and Jobs Act (TCJA) of 2017, which constitutes the most extensive tax reform in the United States in more than 30 years. Using a large-scale two-country dynamic general equilibrium model with nominal rigidities, we find that the TCJA increases GDP by about 2% in the medium-run and by about 2.5% in the long-run. The short-run impact depends crucially on the degree and costs of variable capital utilization, with GDP effects ranging from 1 to 3%. At the same time, the TCJA does not pay for itself. In our analysis, the reform decreases tax revenues and raises the debt-to-GDP ratio by about 15 percentage points in the medium-run until 2025. We show that combining the TCJA with spending cuts can dampen the increase in government indebtedness without reducing its expansionary effect.
The ruling of the German Federal Constitutional Court and its call for conducting and communicating proportionality assessments regarding monetary policy have been the subject of some controversy. However, it can also be understood as a way to strengthen the de facto independence of the European Central Bank. The authors show how a regular proportionality check could be integrated in the ECB’s strategy that is currently undergoing a systematic review. In particular, they propose to include quantitative benchmarks for policy rates and the central bank balance sheet. Deviations from such benchmarks can have benefits in terms of the intended path for inflation while involving costs in terms of risks and side effects that need to be balanced. Practical applications to the euro area are provided.
Since 2014 the ECB has implemented a massive expansion of monetary policy including large-scale asset purchases and negative policy rates. As the euro area economy has improved and inflation has risen, questions concerning the future normalization of monetary policy are starting to dominate the public debate.
The study argues that the ECB should develop a strategy for policy normalization and communicate it very soon to prepare the ground for subsequent steps towards tightening. It provides analysis and makes proposals concerning key aspects of this strategy. The aim is to facilitate the emergence of expectations among market participants that are consistent with a smooth process of policy normalization.
Recently there has been an explosion of research on whether the equilibrium real interest rate has declined, an issue with significant implications for monetary policy. A common finding is that the rate has declined. In this paper we provide evidence that contradicts this finding. We show that the perceived decline may well be due to shifts in regulatory policy and monetary policy that have been omitted from the research. In developing the monetary policy implications, it is promising that much of the research approaches the policy problem through the framework of monetary policy rules, as uncertainty in the equilibrium real rate is not a reason to abandon rules in favor of discretion. But the results are still inconclusive and too uncertain to incorporate into policy rules in the ways that have been suggested.
This paper summarizes key elements of the German Federal Constitutional Court’s decision on the European Central Bank’s Public Sector Asset Purchase Programme. It briefly explains how it is possible for the German Court to disagree with the ruling of the Court of Justice of the European Union. Finally, it makes suggestions concerning a practical way forward for the Governing Council of the ECB in light of these developments.
The recent decline in euro area inflation has triggered new calls for additional monetary stimulus by the ECB in order to counter the threat of a self-reinforcing deflation and recession spiral. This note reviews the available evidence on inflation expectations, output gaps and other factors driving current inflation through the lens of the Phillips curve. It also draws a comparison to the Japanese experience with deflation in the late 1990s and the evidence from Japan concerning the output-inflation nexus at low trend inflation. The note concludes from this evidence that the risk of a self-reinforcing deflation remains very small. Thus, the ECB is best advised to await the impact of the long-term refinancing operations decided in June, which have the potential to induce substantial monetary accommodation once implemented for the first time in September.
The purpose of the data presented in this article is to use it in ex post estimations of interest rate decisions by the European Central Bank (ECB), as done by Bletzinger and Wieland (2017) [1]. The data is of quarterly frequency from 1999 Q1 until 2013 Q2 and consists of the ECB's policy rate, the inflation rate, real output growth and potential output growth in the euro area. To account for forward-looking decision making in the interest rate rule, the data includes expectations about future inflation and output dynamics. While potential output is constructed based on data from the European Commission's annual macro-economic database, inflation and real output growth are taken from two different sources, both provided by the ECB: the Survey of Professional Forecasters and projections made by ECB staff. Careful attention was given to the publication date of the collected data to ensure a real-time dataset consisting only of information that was available to the decision makers at the time of the decision.
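The kind of forward-looking interest rate rule such real-time data can be used to estimate may be sketched as follows; this is a minimal illustration of a standard Taylor-type rule with interest-rate smoothing, in which the function name and all coefficient values are hypothetical and are not the estimates of Bletzinger and Wieland (2017):

```python
def policy_rate(prev_rate, expected_inflation, expected_output_growth,
                potential_growth, inflation_target=1.9,
                rho=0.8, alpha=1.5, beta=0.5, neutral_real_rate=2.0):
    """Forward-looking interest rate rule with smoothing (illustrative):

    i_t = rho * i_{t-1} + (1 - rho) * [ r* + pi*
          + alpha * (E[pi] - pi*)
          + beta  * (E[dy] - dy*) ]

    where E[pi] and E[dy] are expected inflation and expected real output
    growth (e.g. from the SPF or ECB staff projections), dy* is potential
    output growth, pi* the inflation target and r* the neutral real rate.
    All coefficients here are placeholders, not estimated values.
    """
    # Notional target rate implied by expected inflation and growth gaps
    target = (neutral_real_rate + inflation_target
              + alpha * (expected_inflation - inflation_target)
              + beta * (expected_output_growth - potential_growth))
    # Partial adjustment toward the target rate (interest-rate smoothing)
    return rho * prev_rate + (1 - rho) * target
```

With expected inflation at target and expected growth at potential, the rule gradually pulls the policy rate toward the neutral nominal rate `r* + pi*`; in an estimation exercise, `rho`, `alpha` and `beta` would be fitted to the real-time series described above.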