CFS working paper series
https://gfk-cfs.de/working-papers/
2012, 11
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as a laboratory, this paper explores the design of robust policy guides that aim to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model data base to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
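A first-difference rule of the kind described above can be sketched in a few lines: the change in the policy rate, rather than its level, responds to current inflation and output growth. The coefficients and the 2% inflation target below are illustrative placeholders, not the values estimated in the paper.

```python
# Illustrative sketch of a first-difference ("difference") interest rate rule:
# the policy rate is adjusted from its previous level in response to current
# outcomes of inflation and output growth. Coefficients are hypothetical.

def difference_rule(i_prev, inflation, output_growth,
                    pi_target=2.0, coef_pi=0.5, coef_dy=0.5):
    """Return the new policy rate (percent) given last period's rate and
    current outcomes of inflation and output growth (percent)."""
    return i_prev + coef_pi * (inflation - pi_target) + coef_dy * output_growth

# Example: rate was 1.0%, inflation 2.5%, output growth 1.0%
new_rate = difference_rule(1.0, 2.5, 1.0)  # 1.0 + 0.5*0.5 + 0.5*1.0 = 1.75
```

Because only the change in the rate is prescribed, such a rule needs no estimate of the equilibrium real rate or the output gap level, which is one reason it holds up well under mismeasurement.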
2012, 03
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis have come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the data base and compare the effects of novel extensions to established benchmarks, thereby fostering a comparative instead of insular approach to model development.
2012, 12
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact, focusing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
2012, 20
In this paper, we provide some reflections on the development of monetary theory and monetary policy over the last 150 years. Rather than presenting an encompassing overview, which would be overambitious, we simply concentrate on a few selected aspects that we view as milestones in the development of this subject. We also try to illustrate some of the interactions with the political and financial system, academic discussion and the views and actions of central banks.
2010, 08
This paper investigates the accuracy and heterogeneity of output growth and inflation forecasts during the current and the four preceding NBER-dated U.S. recessions. We generate forecasts from six different models of the U.S. economy and compare them to professional forecasts from the Federal Reserve’s Greenbook and the Survey of Professional Forecasters (SPF). The model parameters and model forecasts are derived from historical data vintages so as to ensure comparability to historical forecasts by professionals. The mean model forecast comes surprisingly close to the mean SPF and Greenbook forecasts in terms of accuracy even though the models only make use of a small number of data series. Model forecasts compare particularly well to professional forecasts at a horizon of three to four quarters and during recoveries. The extent of forecast heterogeneity is similar for model and professional forecasts but varies substantially over time. Thus, forecast heterogeneity constitutes a potentially important source of economic fluctuations. While the particular reasons for diversity in professional forecasts are not observable, the diversity in model forecasts can be traced to different modeling assumptions, information sets and parameter estimates. JEL Classification: C53, D84, E31, E32, E37 Keywords: Forecasting, Business Cycles, Heterogeneous Beliefs, Forecast Distribution, Model Uncertainty, Bayesian Estimation
2009, 19
In the New-Keynesian model, optimal interest rate policy under uncertainty is formulated without reference to monetary aggregates as long as certain standard assumptions on the distributions of unobservables are satisfied. The model has been criticized for failing to explain common trends in money growth and inflation; critics argue that money should therefore be used as a cross-check in policy formulation (see Lucas (2007)). We show that the New-Keynesian model can explain such trends if one allows for the possibility of persistent central bank misperceptions. Such misperceptions motivate the search for policies that include additional robustness checks. In earlier work, we proposed an interest rate rule that is near-optimal in normal times but includes a cross-check with monetary information. In case of unusual monetary trends, interest rates are adjusted. In this paper, we show in detail how to derive the appropriate magnitude of the interest rate adjustment following a significant cross-check with monetary information, when the New-Keynesian model is the central bank’s preferred model. The cross-check is shown to be effective in offsetting persistent deviations of inflation due to central bank misperceptions. Keywords: Monetary Policy, New-Keynesian Model, Money, Quantity Theory, European Central Bank, Policy Under Uncertainty
2009, 30
This paper reviews the rationale for quantitative easing when central bank policy rates reach near zero levels in light of recent announcements regarding direct asset purchases by the Bank of England, the Bank of Japan, the U.S. Federal Reserve and the European Central Bank. Empirical evidence from the previous period of quantitative easing in Japan between 2001 and 2006 is presented. During this earlier period the Bank of Japan was able to expand the monetary base very quickly and significantly. Quantitative easing translated into a greater and more lasting expansion of M1 relative to nominal GDP. Deflation subsided by 2005. As soon as inflation appeared to stabilize near a rate of zero, the Bank of Japan rapidly reduced the monetary base as a share of nominal income as it had announced in 2001. The Bank was able to exit from extensive quantitative easing within less than a year. Some implications for the current situation in Europe and the United States are discussed.
2009, 25
The global financial crisis has led to a renewed interest in discretionary fiscal stimulus. Advocates of discretionary measures emphasize that government spending can stimulate additional private spending — the so-called Keynesian multiplier effect. Thus, we investigate whether the discretionary spending announced by euro area governments for 2009 and 2010 is likely to boost euro area GDP by more than one-for-one. Because of modeling uncertainty, it is essential that such policy evaluations be robust to alternative modeling assumptions and different parameterizations. Therefore, we use five different empirical macroeconomic models with Keynesian features such as price and wage rigidities to evaluate the impact of fiscal stimulus. Four of them suggest that the planned increase in government spending will reduce private spending for consumption and investment purposes significantly. If announced government expenditures are implemented with delay the initial effect on euro area GDP, when stimulus is most needed, may even be negative. Traditional Keynesian multiplier effects only arise in a model that ignores the forward-looking behavioral response of consumers and firms. Using a multi-country model, we find that spillovers between euro area countries are negligible or even negative, because direct demand effects are offset by the indirect effect of euro appreciation.
2009, 26
Recent evaluations of the fiscal stimulus packages enacted in the United States and Europe, such as Cogan, Cwik, Taylor and Wieland (2009) and Cwik and Wieland (2009), suggest that the GDP effects will be modest due to crowding-out of private consumption and investment. Corsetti, Meier and Mueller (2009a,b) argue that spending shocks are typically followed by consolidations with substantive spending cuts, which enhance the short-run stimulus effect. This note investigates the implications of this argument for the estimated impact of recent stimulus packages and the case for discretionary fiscal policy.
2009, 21
In this paper we investigate the comparative properties of empirically-estimated monetary models of the U.S. economy. We make use of a new data base of models designed for such investigations. We focus on three representative models: the Christiano, Eichenbaum, Evans (2005) model, the Smets and Wouters (2007) model, and the Taylor (1993a) model. Although the three models differ in terms of structure, estimation method, sample period, and data vintage, we find surprisingly similar economic impacts of unanticipated changes in the federal funds rate. However, the optimal monetary policy responses to other sources of economic fluctuations differ widely across the three models. We show that simple optimal policy rules that respond to the growth rate of output and smooth the interest rate are not robust. In contrast, policy rules with no interest rate smoothing and no response to the growth rate, as distinct from the level, of output are more robust. Robustness can be improved further by optimizing rules with respect to the average loss across the three models.
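The final step above — optimizing a rule against the average loss across models — can be sketched as a small grid search. The three quadratic "loss surfaces" below are toy stand-ins, not the actual CEE, Smets-Wouters, or Taylor models.

```python
# Minimal sketch of choosing rule coefficients by minimizing the average loss
# across several models. Each "model" is a hypothetical quadratic loss
# centered on that model's own optimal coefficients (illustrative numbers).

from itertools import product

def loss_in_model(coef_pi, coef_y, optimum):
    """Toy quadratic loss of a rule (coef_pi, coef_y) in one model."""
    best_pi, best_y = optimum
    return (coef_pi - best_pi) ** 2 + (coef_y - best_y) ** 2

model_optima = [(1.5, 0.5), (2.0, 0.1), (1.2, 0.8)]  # invented for illustration

def average_loss(coef_pi, coef_y):
    return sum(loss_in_model(coef_pi, coef_y, m) for m in model_optima) / len(model_optima)

grid = [round(0.1 * k, 1) for k in range(31)]  # coefficients 0.0 .. 3.0
robust = min(product(grid, grid), key=lambda c: average_loss(*c))
```

The model-averaged optimum sits between the individual models' optima, which is exactly why it sacrifices a little performance in each model in exchange for avoiding large losses in any of them.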
2009, 17
Renewed interest in fiscal policy has increased the use of quantitative models to evaluate policy. Because of modeling uncertainty, it is essential that policy evaluations be robust to alternative assumptions. We find that models currently being used in practice to evaluate fiscal policy stimulus proposals are not robust. Government spending multipliers in an alternative empirically-estimated and widely-cited new Keynesian model are much smaller than in these old Keynesian models; the estimated stimulus is extremely small with GDP and employment effects only one-sixth as large.
2008, 25
Research with Keynesian-style models has emphasized the importance of the output gap for policies aimed at controlling inflation while declaring monetary aggregates largely irrelevant. Critics, however, have argued that these models need to be modified to account for observed money growth and inflation trends, and that monetary trends may serve as a useful cross-check for monetary policy. We identify an important source of monetary trends in form of persistent central bank misperceptions regarding potential output. Simulations with historical output gap estimates indicate that such misperceptions may induce persistent errors in monetary policy and sustained trends in money growth and inflation. If interest rate prescriptions derived from Keynesian-style models are augmented with a cross-check against money-based estimates of trend inflation, inflation control is improved substantially.
2008, 17
This paper introduces adaptive learning and endogenous indexation in the New-Keynesian Phillips curve and studies disinflation under inflation targeting policies. The analysis is motivated by the disinflation performance of many inflation-targeting countries, in particular the gradual Chilean disinflation with temporary annual targets. At the start of the disinflation episode price-setting firms expect inflation to be highly persistent and opt for backward-looking indexation. As the central bank acts to bring inflation under control, price-setting firms revise their estimates of the degree of persistence. Such adaptive learning lowers the cost of disinflation. This reduction can be exploited by a gradual approach to disinflation. Firms that choose the rate for indexation also re-assess the likelihood that announced inflation targets determine steady-state inflation and adjust indexation of contracts accordingly. A strategy of announcing and pursuing short-term targets for inflation is found to influence the likelihood that firms switch from backward-looking indexation to the central bank’s targets. As firms abandon backward-looking indexation the costs of disinflation decline further. We show that an inflation targeting strategy that employs temporary targets can benefit from lower disinflation costs due to the reduction in backward-looking indexation.
2008, 16
Monetary policy analysts often rely on rules-of-thumb, such as the Taylor rule, to describe historical monetary policy decisions and to compare current policy to historical norms. Analysis along these lines also permits evaluation of episodes where policy may have deviated from a simple rule and examination of the reasons behind such deviations. One interesting question is whether such rules-of-thumb should draw on policymakers’ forecasts of key variables such as inflation and unemployment or on observed outcomes. Importantly, deviations of the policy from the prescriptions of a Taylor rule that relies on outcomes may be due to systematic responses to information captured in policymakers’ own projections. We investigate this proposition in the context of FOMC policy decisions over the past 20 years using publicly available FOMC projections from the biannual monetary policy reports to the Congress (Humphrey-Hawkins reports). Our results indicate that FOMC decisions can indeed be predominantly explained in terms of the FOMC’s own projections rather than observed outcomes. Thus, a forecast-based rule-of-thumb better characterizes FOMC decision-making. We also confirm that many of the apparent deviations of the federal funds rate from an outcome-based Taylor-style rule may be considered systematic responses to information contained in FOMC projections.
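The outcome-based versus forecast-based distinction drawn above amounts to feeding the same rule different inputs. The sketch below uses the classic Taylor (1993) functional form and coefficients as a stand-in; the paper estimates its own specification.

```python
# Sketch of an outcome-based vs. forecast-based Taylor-style rule: the same
# functional form, evaluated once on observed data and once on projections.
# Coefficients (0.5, 0.5) and the 2% real rate / target are the textbook
# Taylor (1993) values, used here purely for illustration.

def taylor_rule(inflation, gap, r_star=2.0, pi_target=2.0):
    """Taylor (1993)-style funds rate prescription, in percent."""
    return r_star + inflation + 0.5 * (inflation - pi_target) + 0.5 * gap

# Outcome-based: feed in observed inflation and the observed output gap.
outcome_rate = taylor_rule(inflation=3.0, gap=-1.0)   # 2 + 3 + 0.5 - 0.5 = 5.0

# Forecast-based: feed in the policymakers' projections instead.
forecast_rate = taylor_rule(inflation=2.0, gap=0.5)   # 2 + 2 + 0 + 0.25 = 4.25
```

When projections and outcomes differ, the two prescriptions diverge — and that divergence is precisely what can make an outcome-based rule show apparent "deviations" that a forecast-based rule explains.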
2007, 03
The European Central Bank
(2007)
The establishment of the ECB and with it the launch of the euro has arguably been a unique endeavor in economic history, representing an important experiment in central banking. This note aims to summarize some of the main lessons learned from this experiment and sketch some of the prospects for the ECB. It is written for "The New Palgrave Dictionary of Economics", 2nd edition. JEL Classification: E52, E58
2007, 17
The European Central Bank has assigned a special role to money in its two pillar strategy and has received much criticism for this decision. In this paper, we explore possible justifications. The case against including money in the central bank’s interest rate rule is based on a standard model of the monetary transmission process that underlies many contributions to research on monetary policy in the last two decades. Of course, if one allows for a direct effect of money on output or inflation as in the empirical “two-pillar” Phillips curves estimated in some recent contributions, it would be optimal to include a measure of (long-run) money growth in the rule. In this paper, we develop a justification for including money in the interest rate rule by allowing for imperfect knowledge regarding unobservables such as potential output and equilibrium interest rates. We formulate a novel characterization of ECB-style monetary cross-checking and show that it can generate substantial stabilization benefits in the event of persistent policy misperceptions regarding potential output. Such misperceptions cause a bias in policy setting. We find that cross-checking and changing interest rates in response to sustained deviations of long-run money growth helps the central bank to overcome this bias. Our argument in favor of ECB-style cross-checking does not require direct effects of money on output or inflation. JEL Classification: E32, E41, E43, E52, E58
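The cross-checking idea described above can be stylized as a trigger: follow the usual interest rate rule, but adjust the rate when a money-based estimate of trend inflation deviates from target for several consecutive periods. The threshold, window length, and adjustment size below are invented for illustration, not taken from the paper.

```python
# Stylized sketch of ECB-style monetary cross-checking: an interest rate
# adjustment fires only when money-based trend inflation has deviated from
# target, in the same direction, for a sustained run of periods.
# All numeric parameters here are hypothetical.

def cross_check_adjustment(trend_inflation_estimates, pi_target=2.0,
                           threshold=0.5, window=4, step=0.25):
    """Return an interest rate adjustment (percentage points) if the last
    `window` money-based trend inflation estimates all deviate from target
    by more than `threshold` in the same direction; otherwise 0.0."""
    recent = trend_inflation_estimates[-window:]
    if len(recent) < window:
        return 0.0
    devs = [x - pi_target for x in recent]
    if all(d > threshold for d in devs):
        return step       # sustained upward money growth: tighten
    if all(d < -threshold for d in devs):
        return -step      # sustained downward trend: loosen
    return 0.0
```

Because the adjustment only responds to sustained deviations, transitory money growth noise leaves policy untouched, while a persistent potential-output misperception eventually triggers a correction — the bias-offsetting mechanism the abstract describes.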
2007, 18
The European Central Bank has assigned a special role to money in its two pillar strategy and has received much criticism for this decision. The case against including money in the central bank’s interest rate rule is based on a standard model of the monetary transmission process that underlies many contributions to research on monetary policy in the last two decades. In this paper, we develop a justification for including money in the interest rate rule by allowing for imperfect knowledge regarding unobservables such as potential output and equilibrium interest rates. We formulate a novel characterization of ECB-style monetary cross-checking and show that it can generate substantial stabilization benefits in the event of persistent policy misperceptions regarding potential output. JEL Classification: E32, E41, E43, E52, E58
2006, 03
In this paper, we consider expected value, variance and worst-case optimization of nonlinear models. We present algorithms, based on iterative Taylor expansions, for computing optimal expected values and variances. We establish convergence and consider the relative merits of policies based on expected value optimization and worst-case robustness. The latter is a minimax strategy and ensures optimal performance under the worst-case scenario(s), while the former delivers optimal expected performance in a stochastic setting. Both approaches are used with a macroeconomic policy model to illustrate relative performances, robustness and trade-offs between the strategies. Classification: C61, E43
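The contrast between the two strategies compared above can be shown in miniature: expected value optimization weights losses by scenario probabilities, while minimax guards against the single worst scenario. The scenario losses and probabilities below are toy numbers, not model output.

```python
# Minimal sketch of expected-value vs. worst-case (minimax) policy choice
# over a discrete set of scenarios. All numbers are illustrative.

scenarios = {                 # loss of each candidate policy in each scenario
    "policy_A": [1.0, 2.0, 9.0],
    "policy_B": [3.0, 3.0, 4.0],
}
probs = [0.5, 0.3, 0.2]       # hypothetical scenario probabilities

def expected_loss(losses):
    return sum(p * l for p, l in zip(probs, losses))

ev_choice = min(scenarios, key=lambda k: expected_loss(scenarios[k]))  # best on average
minimax_choice = min(scenarios, key=lambda k: max(scenarios[k]))       # best worst case
```

Here the two criteria disagree: policy A is better on average (expected loss 2.9 vs. 3.2) but exposes the policymaker to a loss of 9 in the worst scenario, so minimax prefers policy B — exactly the trade-off the paper quantifies in a macroeconomic setting.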
2005, 13
In this paper, we examine the cost of insurance against model uncertainty for the euro area considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a Minimax policy comes at moderate costs in terms of lower expected performance. We extract priors that would rationalize the Minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity-aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E61.
2005, 14
In this paper, we examine the cost of insurance against model uncertainty for the euro area considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a Minimax policy comes at moderate costs in terms of lower expected performance. We extract priors that would rationalize the Minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity-aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E61