The complexity resulting from intertwined uncertainties about model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as a laboratory, this paper explores the design of robust policy guides that aim to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model data base to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
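The simple difference rule described in this abstract adjusts the policy rate from its previous level in response to inflation and output growth. A minimal sketch is given below; the coefficient and target values are illustrative assumptions, not the paper's estimates.

```python
def difference_rule(i_prev, inflation, output_growth,
                    inflation_target=2.0, growth_target=2.0,
                    alpha=0.5, beta=0.5):
    """First-difference interest rate rule: the change in the policy rate
    responds to current deviations of inflation and output growth from
    their targets. Coefficients and targets here are illustrative only."""
    return (i_prev
            + alpha * (inflation - inflation_target)
            + beta * (output_growth - growth_target))

# Example: inflation at target, output growth one point above trend
rate = difference_rule(i_prev=1.0, inflation=2.0, output_growth=3.0)
```

Because the rule responds to growth rates rather than the level of the output gap, it does not require a real-time estimate of potential output, which is one reason such rules fare well under the mismeasurement documented in the paper.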
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis have come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights, such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach, a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models' implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the data base and compare the effects of novel extensions to established benchmarks, thereby fostering a comparative instead of insular approach to model development.
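The Taylor rule mentioned above has a well-known closed form. The sketch below uses Taylor's original 1993 parameterization (a 2 percent equilibrium real rate, a 2 percent inflation target, and coefficients of 0.5); it is included only to make the formula concrete.

```python
def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Taylor (1993) rule: nominal policy rate (percent) as a function of
    current inflation and the output gap, both measured in percent."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# With inflation at its 2 percent target and a closed output gap,
# the rule prescribes a 4 percent nominal rate.
print(taylor_rule(2.0, 0.0))  # → 4.0
```

Because the inflation coefficient exceeds one in total (1.5), the rule raises the real interest rate when inflation rises, the stabilizing property that model comparison exercises repeatedly confirmed.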
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially, from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact, focusing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations, GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
This chapter aims to provide a hands-on approach to New Keynesian models and their uses for macroeconomic policy analysis. It starts by reviewing the origins of the New Keynesian approach, the key model ingredients and representative models. Building blocks of current-generation dynamic stochastic general equilibrium (DSGE) models are discussed in detail. These models address the famous Lucas critique by deriving behavioral equations systematically from the optimizing and forward-looking decision-making of households and firms subject to well-defined constraints. State-of-the-art methods for solving and estimating such models are reviewed and presented in examples. The chapter goes beyond the mere presentation of the most popular benchmark model by providing a framework for model comparison along with a database that includes a wide variety of macroeconomic models. Thus, it offers a convenient approach for comparing new models to available benchmarks and for investigating whether particular policy recommendations are robust to model uncertainty. Such robustness analysis is illustrated by evaluating the performance of simple monetary policy rules across a range of recently estimated models, including some with financial market imperfections, and by reviewing recent comparative findings regarding the magnitude of government spending multipliers. The chapter concludes with a discussion of important objectives for ongoing and future research using the New Keynesian framework.
This contribution draws on two recent publications in which the macroeconomic model data base (www.macromodelbase.com) is employed for model comparisons. The comparative approach bases policy analysis on a systematic evaluation of the different implications that a given economic policy has across different modeling approaches. In this manner, policy recommendations become more robust to modeling uncertainty. By extending the comparative approach to forecasting, the authors investigate the accuracy of different forecasting models and obtain more reliable mean forecasts.
In 2011, the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel was awarded to the US economists Thomas J. Sargent of New York University and Christopher A. Sims of Princeton University. German newspaper commentaries in particular criticized the researchers for their use of "unrealistic" assumptions such as utility maximization and rational expectations. This criticism fails to recognize the decisive contribution of Sargent and Sims to the development of modern macroeconomics. Their empirical methods are now standard tools of academic research and are also used by economists at central banks, finance ministries, and international organizations. They have enabled fundamental new insights, for example into the workings of monetary and fiscal policy.
Bad Experiences
(2012)
Eine Transaktionssteuer auf Finanzgeschäfte würde weniger Geld einbringen, als viele ihrer Anhänger hoffen - und sie birgt gravierende ökonomische und juristische Risiken. Die Bundesregierung sollte sich der Belastungen durch eine Finanztransaktionssteuer bewusst sein – und sie nicht ohne Beteiligung der weltweit führenden Finanzplätze einführen. Eine internationale Einigung auf strengere Eigenkapitalvorschriften für Banken muss Vorrang haben.