The Federal Reserve has been publishing federal funds rate prescriptions from Taylor rules in its Monetary Policy Report since 2017. The signals from the rules aligned with Fed action on many occasions, but in some cases the Fed opted for a different route. This paper reviews the implications of the rules during the coronavirus pandemic and the subsequent inflation surge and derives projections for the future.
In 2020, the Fed took the negative prescribed rates, which were far below the effective lower bound on the nominal interest rate, as support for extensive and long-lasting quantitative easing. Yet, the calculations overstate the extent of the constraint, because they neglect the supply side effects of the pandemic.
The paper proposes a simple model-based adjustment to the resource gap used by the rules for 2020. In 2021, the rules clearly signaled the need for tightening because of the rise of inflation, yet the Fed waited until spring 2022 to raise the federal funds rate. With the decline of inflation over the course of 2023, the rules’ prescriptions have also come down. They fall below the actual federal funds rate target range in 2024. Several caveats concerning the projections of the interest rate prescriptions are discussed.
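The kind of Taylor-rule prescription discussed above can be sketched in a few lines. This is only an illustration using the classic Taylor (1993) coefficients, not the exact set of rules reported in the Fed's Monetary Policy Report, and the inputs are hypothetical:

```python
def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Classic Taylor (1993) prescription for the federal funds rate (percent):
    i = r* + pi + 0.5*(pi - pi*) + 0.5*output_gap
    where r* is the equilibrium real rate and pi* the inflation target."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# Hypothetical inputs: 4% inflation, output 1% below potential
rate = taylor_rule(4.0, -1.0)  # 2 + 4 + 0.5*2 + 0.5*(-1) = 6.5
```

With inflation well above target, the prescribed rate rises roughly one-and-a-half for one with inflation, which is why such rules signaled tightening in 2021 well before the actual funds rate moved.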
Veronika Grimm, Lukas Nöh, and Volker Wieland assess the possible development of government interest expenditures as a share of GDP for Germany, France, Italy and Spain. Until 2021, these and other member states could anticipate a further reduction of interest expenditure in the future. This outlook has changed considerably with the recent surge in inflation and government bond rates. Nevertheless, under reasonable assumptions current yield curves still imply that interest expenditure relative to GDP can be stabilized at the current level. The authors also review the implications of a further upward shift in the yield curves of 1 or 2 percentage points. These implications suggest significant medium-term risks for highly indebted member states with interest expenditure approaching or exceeding levels last observed on the eve of the euro area debt crisis. In light of these risks, governments of euro area member states should take substantive action to achieve a sustained decline in debt-to-GDP ratios towards safer levels. They bear the responsibility for making sure that government finances can weather the higher interest rates which are required to achieve price stability in the euro area.
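The arithmetic behind such interest-expenditure scenarios can be sketched with a stylized debt-rollover model. The parameters below (debt ratio, average maturity, market rate) are illustrative assumptions, not the authors' figures or method:

```python
def interest_expenditure_path(debt_gdp, avg_rate, market_rate, maturity, years):
    """Stylized path of interest expenditure (% of GDP): each year roughly a
    1/maturity share of the debt stock rolls over at the current market rate,
    gradually pulling the average coupon toward it. The debt-to-GDP ratio is
    held constant for simplicity."""
    path = []
    rate = avg_rate
    for _ in range(years):
        rate += (market_rate - rate) / maturity  # rollover shifts average coupon
        path.append(debt_gdp * rate / 100.0)
    return path

# Hypothetical highly indebted member state: 140% of GDP, average coupon 2%,
# market rate 4%, 7-year average maturity
path = interest_expenditure_path(140.0, 2.0, 4.0, 7.0, 5)
```

The sketch shows why a yield-curve shift feeds through only gradually: expenditure rises each year as maturing debt is refinanced at the higher rate, which is the medium-term risk the authors emphasize.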
The Russian war of aggression against Ukraine since 24 February 2022 has intensified the discussion of Europe’s reliance on energy imports from Russia. A ban on imports of Russian oil, natural gas and coal has already been imposed by the United States, while the United Kingdom plans to cease imports of oil and coal from Russia by the end of 2022. The German Federal Government is currently opposing an energy embargo against Russia. However, the Federal Ministry for Economic Affairs and Climate Action is working on a strategy to reduce energy imports from Russia. In this paper, the authors give an overview of the German and European reliance on energy imports from Russia with a focus on gas imports and discuss price effects, alternative suppliers of natural gas, and the potential for saving and replacing natural gas. They also provide an overview of estimates of the consequences for the economic outlook if the conflict intensifies.
This note argues that the European Central Bank should adjust its strategy in order to consider broader measures of inflation in its policy deliberations and communications. In particular, it points out that a broad measure of domestic goods and services price inflation such as the GDP deflator has increased along with the euro area recovery and the expansion of monetary policy since 2013, while HICP inflation has become more variable and, on average, has declined. Similarly, the cost of owner-occupied housing, which is excluded from the HICP, has risen during this period. Furthermore, it shows that optimal monetary policy at the effective lower bound on nominal interest rates aims to return inflation more slowly to the inflation target from below than in normal times because of uncertainty about the effects and potential side effects of quantitative easing.
The ruling of the German Federal Constitutional Court and its call for conducting and communicating proportionality assessments regarding monetary policy have been the subject of some controversy. However, it can also be understood as a way to strengthen the de-facto independence of the European Central Bank. The authors show how a regular proportionality check could be integrated into the ECB’s strategy, which is currently undergoing a systematic review. In particular, they propose to include quantitative benchmarks for policy rates and the central bank balance sheet. Deviations from such benchmarks can have benefits in terms of the intended path for inflation while involving costs in terms of risks and side effects that need to be balanced. Practical applications to the euro area are provided.
This paper summarizes key elements of the German Federal Constitutional Court’s decision on the European Central Bank’s Public Sector Asset Purchase Programme. It briefly explains how it is possible for the German Court to disagree with the ruling of the Court of Justice of the European Union. Finally, it makes suggestions concerning a practical way forward for the Governing Council of the ECB in light of these developments.
This working paper provides the summary statement by Prof. Volker Wieland on the European Central Bank’s purchase programme for public-sector bonds (Public Sector Purchase Programme, PSPP), delivered before the German Federal Constitutional Court on 30 July 2019. The focus lies on whether the PSPP qualifies as a monetary policy measure and on the proportionality of the programme and its implementation. Further implementation issues are also addressed briefly, in particular the announcement of purchases, purchase limits, and the distance to the primary market for government bonds.
There is substantial disagreement about the consequences of the Tax Cuts and Jobs Act (TCJA) of 2017, which constitutes the most extensive tax reform in the United States in more than 30 years. Using a large-scale two-country dynamic general equilibrium model with nominal rigidities, we find that the TCJA increases GDP by about 2% in the medium-run and by about 2.5% in the long-run. The short-run impact depends crucially on the degree and costs of variable capital utilization, with GDP effects ranging from 1 to 3%. At the same time, the TCJA does not pay for itself. In our analysis, the reform decreases tax revenues and raises the debt-to-GDP ratio by about 15 percentage points in the medium-run until 2025. We show that combining the TCJA with spending cuts can dampen the increase in government indebtedness without reducing its expansionary effect.
Since 2014 the ECB has implemented a massive expansion of monetary policy including large-scale asset purchases and negative policy rates. As the euro area economy has improved and inflation has risen, questions concerning the future normalization of monetary policy are starting to dominate the public debate.
The study argues that the ECB should develop a strategy for policy normalization and communicate it very soon to prepare the ground for subsequent steps towards tightening. It provides analysis and makes proposals concerning key aspects of this strategy. The aim is to facilitate the emergence of expectations among market participants that are consistent with a smooth process of policy normalization.
For some time now, structural macroeconomic models used at central banks have been predominantly New Keynesian DSGE models featuring nominal rigidities and forward-looking decision-making. While these features are widely deemed crucial for policy evaluation exercises, most central banks have added more detailed characterizations of the financial sector to these models following the Great Recession in order to improve their fit to the data and their forecasting performance. We employ a comparative approach to investigate the characteristics of this new generation of New Keynesian DSGE models and document an elevated degree of model uncertainty relative to earlier model generations. Policy transmission is highly heterogeneous across types of financial frictions and monetary policy causes larger effects, on average. The New Keynesian DSGE models we analyze suggest that a simple policy rule robust to model uncertainty involves a weaker response to inflation and the output gap in the presence of financial frictions as compared to earlier generations of such models. Leaning-against-the-wind policies in models of this class estimated for the Euro Area do not lead to substantial gains. With regard to forecasting performance, the inclusion of financial frictions can generate improvements, if conditioned on appropriate data. Looking forward, we argue that model-averaging and embracing alternative modelling paradigms is likely to yield a more robust framework for the conduct of monetary policy.
The current debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. Beyer and Wieland re-estimate the U.S. equilibrium rate with the methodology of Laubach and Williams and further modifications. They provide new estimates for the United States, the euro area and Germany and subject them to sensitivity tests. Beyer and Wieland conclude that due to the great uncertainty and sensitivity, the observed decline in the estimates is not a reliable indicator of a need for expansionary monetary and fiscal policy. Yet, if such estimates are employed to determine the appropriate monetary policy stance, they are best used together with the consistent estimate of the level of potential output.
The global financial crisis and the ensuing criticism of macroeconomics have inspired researchers to explore new modeling approaches. There are many new models that deliver improved estimates of the transmission of macroeconomic policies and aim to better integrate the financial sector in business cycle analysis. Policy making institutions need to compare available models of policy transmission and evaluate the impact and interaction of policy instruments in order to design effective policy strategies. This paper reviews the literature on model comparison and presents a new approach for comparative analysis. Its computational implementation enables individual researchers to conduct systematic model comparisons and policy evaluations easily and at low cost. This approach also contributes to improving reproducibility of computational research in macroeconomic modeling. Several applications serve to illustrate the usefulness of model comparison and the new tools in the area of monetary and fiscal policy. They include an analysis of the impact of parameter shifts on the effects of fiscal policy, a comparison of monetary policy transmission across model generations and a cross-country comparison of the impact of changes in central bank rates in the United States and the euro area. Furthermore, the paper includes a large-scale comparison of the dynamics and policy implications of different macro-financial models. The models considered account for financial accelerator effects in investment financing, credit and house price booms and a role for bank capital. A final exercise illustrates how these models can be used to assess the benefits of leaning against credit growth in monetary policy.
Recently there has been an explosion of research on whether the equilibrium real interest rate has declined, an issue with significant implications for monetary policy. A common finding is that the rate has declined. In this paper we provide evidence that contradicts this finding. We show that the perceived decline may well be due to shifts in regulatory policy and monetary policy that have been omitted from the research. In developing the monetary policy implications, it is promising that much of the research approaches the policy problem through the framework of monetary policy rules, as uncertainty in the equilibrium real rate is not a reason to abandon rules in favor of discretion. But the results are still inconclusive and too uncertain to incorporate into policy rules in the ways that have been suggested.
A number of contributions to research on monetary policy have suggested that policy should be asymmetric near the lower bound on nominal interest rates. As inflation and economic activity decline, policy should ease more aggressively than it would in the absence of the lower bound. As activity recovers and inflation picks up, the central bank should act to keep interest rates lower for longer than without the bound. In this note, we investigate to what extent the policy easing implemented by the ECB since summer 2013 mirrors the rate recommendations of a simple policy rule or deviates from it in a way that indicates a “lower for longer” approach to policy near zero interest rates.
Estimates of medium-term equilibrium interest rates based on the method of Laubach and Williams (2003) are now widely cited in the debate on monetary and fiscal policy. Among other things, they have been put forward by Summers (2014a) as evidence of secular stagnation and used by Yellen (2015) to justify the zero interest rate policy. In this paper, we conduct an extensive examination and sensitivity analysis of these estimates for the United States, Germany and the euro area. Given the high uncertainty and sensitivity associated with estimates of medium-term equilibrium rates obtained with the Laubach-Williams method and similar approaches, such estimates should not be decisive for major policy choices in monetary and fiscal policy.
The recent decline in euro area inflation has triggered new calls for additional monetary stimulus by the ECB in order to counter the threat of a self-reinforcing deflation and recession spiral. This note reviews the available evidence on inflation expectations, output gaps and other factors driving current inflation through the lens of the Phillips curve. It also draws a comparison to the Japanese experience with deflation in the late 1990s and the evidence from Japan concerning the output-inflation nexus at low trend inflation. The note concludes from this evidence that the risk of a self-reinforcing deflation remains very small. Thus, the ECB should await the impact of the long-term refinancing operations decided in June, which have the potential to induce substantial monetary accommodation once implemented for the first time in September.
On July 4, 2013 the ECB Governing Council provided more specific forward guidance than in the past by stating that it expects ECB interest rates to remain at present or lower levels for an extended period of time. As explained by ECB President Mario Draghi this expectation is based on the Council’s medium-term outlook for inflation conditional on economic activity and money and credit. Draghi also stressed that there is no precise deadline for this extended period of time, but that a reasonable period can be estimated by extracting a reaction function. In this note, we use such a reaction function, namely the interest rate rule from Orphanides and Wieland (2013) that matches past ECB interest rate decisions quite well, to project the rate path consistent with inflation and growth forecasts from the survey of professional forecasters published by the ECB on August 8, 2013. This evaluation suggests an increase in ECB interest rates by May 2014 at the latest. We also use the Eurosystem staff projection from June 6, 2013 for comparison. While it would imply a longer period of low rates, it does not match past ECB decisions as well as the reaction function with SPF forecasts.
This note reviews the legal issues and concerns that are likely to play an important role in the ongoing deliberations of the Federal Constitutional Court of Germany concerning the legality of ECB government bond purchases such as those conducted in the context of its earlier Securities Market Programme or potential future Outright Monetary Transactions.
Recently, we evaluated a fiscal consolidation strategy for the United States that would bring the government budget into balance by gradually reducing government spending relative to GDP to the ratio that prevailed prior to the crisis (Cogan et al., JEDC 2013). Specifically, we published an analysis of the macroeconomic consequences of the 2013 Budget Resolution that was passed by the U.S. House of Representatives in March 2012. In this note, we provide an update of our research that evaluates this year’s budget reform proposal that is to be discussed and voted on in the House of Representatives in March 2013. Contrary to the views voiced by critics of fiscal consolidation, we show that such a reduction in government purchases and transfer payments can increase GDP immediately and permanently relative to a policy without spending restraint. Our research makes use of a modern structural model of the economy that incorporates the long-standing essential features of economics: opportunity costs, efficiency, foresight and incentives. GDP rises because households take into account that spending restraint helps avoid future increases in tax rates. Lower taxes imply less distorted incentives for work, investment and production relative to a scenario without fiscal consolidation and lead to higher growth.
In this paper, we provide some reflections on the development of monetary theory and monetary policy over the last 150 years. Rather than presenting an encompassing overview, which would be overambitious, we simply concentrate on a few selected aspects that we view as milestones in the development of this subject. We also try to illustrate some of the interactions with the political and financial system, academic discussion and the views and actions of central banks.
In this paper we investigate the comparative properties of empirically-estimated monetary models of the U.S. economy using a new database of models designed for such investigations. We focus on three representative models due to Christiano, Eichenbaum, Evans (2005), Smets and Wouters (2007) and Taylor (1993a). Although these models differ in terms of structure, estimation method, sample period, and data vintage, we find surprisingly similar economic impacts of unanticipated changes in the federal funds rate. However, optimized monetary policy rules differ across models and lack robustness. Model averaging offers an effective strategy for improving the robustness of policy rules.
In the aftermath of the global financial crisis and great recession, many countries face substantial deficits and growing debts. In the United States, federal government outlays as a ratio to GDP rose substantially from about 19.5 percent before the crisis to over 24 percent after the crisis. In this paper we consider a fiscal consolidation strategy that brings the budget to balance by gradually reducing this spending ratio over time to the level that prevailed prior to the crisis. A crucial issue is the impact of such a consolidation strategy on the economy. We use structural macroeconomic models to estimate this impact focussing primarily on a dynamic stochastic general equilibrium model with price and wage rigidities and adjustment costs. We separate out the impact of reductions in government purchases and transfers, and we allow for a reduction in both distortionary taxes and government debt relative to the baseline of no consolidation. According to the model simulations GDP rises in the short run upon announcement and implementation of this fiscal consolidation strategy and remains higher than the baseline in the long run. We explore the role of the mix of expenditure cuts and tax reductions as well as gradualism in achieving this policy outcome. Finally, we conduct sensitivity studies regarding the type of model used and its parameterization.
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as laboratory this paper explores the design of robust policy guides aiming to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model data base to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
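The "simple difference rule" highlighted above can be sketched as follows. The equal 0.5 coefficients mirror the equal weights on inflation and output growth mentioned in the text, but the function itself is an illustrative sketch, not the paper's estimated rule:

```python
def difference_rule(prev_rate, inflation, pi_star, output_growth, potential_growth,
                    alpha=0.5, beta=0.5):
    """First-difference interest rate rule:
    i_t = i_{t-1} + alpha*(pi - pi*) + beta*(dy - dy*)
    It requires no estimate of the equilibrium real rate or of the output gap
    level, which is why it is robust to the mismeasurement discussed above."""
    return prev_rate + alpha * (inflation - pi_star) + beta * (output_growth - potential_growth)

# Inflation 1 point above target, growth at potential -> raise the rate by 50 bp
new_rate = difference_rule(2.0, 3.0, 2.0, 1.5, 1.5)  # 2.5
```

Because the rule responds to growth rather than the gap level, persistent output gap mismeasurement drops out of the prescription entirely.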
This chapter aims to provide a hands-on approach to New Keynesian models and their uses for macroeconomic policy analysis. It starts by reviewing the origins of the New Keynesian approach, the key model ingredients and representative models. Building blocks of current-generation dynamic stochastic general equilibrium (DSGE) models are discussed in detail. These models address the famous Lucas critique by deriving behavioral equations systematically from the optimizing and forward-looking decision-making of households and firms subject to well-defined constraints. State-of-the-art methods for solving and estimating such models are reviewed and presented in examples. The chapter goes beyond the mere presentation of the most popular benchmark model by providing a framework for model comparison along with a database that includes a wide variety of macroeconomic models. Thus, it offers a convenient approach for comparing new models to available benchmarks and for investigating whether particular policy recommendations are robust to model uncertainty. Such robustness analysis is illustrated by evaluating the performance of simple monetary policy rules across a range of recently-estimated models including some with financial market imperfections and by reviewing recent comparative findings regarding the magnitude of government spending multipliers. The chapter concludes with a discussion of important objectives for on-going and future research using the New Keynesian framework.
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis has come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the data base and compare the effects of novel extensions to established benchmarks thereby fostering a comparative instead of insular approach to model development.
This paper proposes a new approach for modeling investor fear after rare disasters. The key element is to take into account that investors’ information about fundamentals driving rare downward jumps in the dividend process is not perfect. Bayesian learning implies that beliefs about the likelihood of rare disasters drop to a much more pessimistic level once a disaster has occurred. Such a shift in beliefs can trigger massive declines in price-dividend ratios. Pessimistic beliefs persist for some time. Thus, belief dynamics are a source of apparent excess volatility relative to a rational expectations benchmark. Due to the low frequency of disasters, even an infinitely-lived investor will remain uncertain about the exact probability. Our analysis is conducted in continuous time and offers closed-form solutions for asset prices. We distinguish between rational and adaptive Bayesian learning. Rational learners account for the possibility of future changes in beliefs in determining their demand for risky assets, while adaptive learners take beliefs as given. Thus, risky assets tend to be lower-valued and price-dividend ratios vary less under adaptive versus rational learning for identical priors. Keywords: beliefs, Bayesian learning, controlled diffusions and jump processes, learning about jumps, adaptive learning, rational learning. JEL classification: D83, G11, C11, D91, E21, D81, C61
This paper investigates the accuracy and heterogeneity of output growth and inflation forecasts during the current and the four preceding NBER-dated U.S. recessions. We generate forecasts from six different models of the U.S. economy and compare them to professional forecasts from the Federal Reserve’s Greenbook and the Survey of Professional Forecasters (SPF). The model parameters and model forecasts are derived from historical data vintages so as to ensure comparability to historical forecasts by professionals. The mean model forecast comes surprisingly close to the mean SPF and Greenbook forecasts in terms of accuracy even though the models only make use of a small number of data series. Model forecasts compare particularly well to professional forecasts at a horizon of three to four quarters and during recoveries. The extent of forecast heterogeneity is similar for model and professional forecasts but varies substantially over time. Thus, forecast heterogeneity constitutes a potentially important source of economic fluctuations. While the particular reasons for diversity in professional forecasts are not observable, the diversity in model forecasts can be traced to different modeling assumptions, information sets and parameter estimates. JEL Classification: C53, D84, E31, E32, E37 Keywords: Forecasting, Business Cycles, Heterogeneous Beliefs, Forecast Distribution, Model Uncertainty, Bayesian Estimation
This paper reviews the rationale for quantitative easing when central bank policy rates reach near zero levels in light of recent announcements regarding direct asset purchases by the Bank of England, the Bank of Japan, the U.S. Federal Reserve and the European Central Bank. Empirical evidence from the previous period of quantitative easing in Japan between 2001 and 2006 is presented. During this earlier period the Bank of Japan was able to expand the monetary base very quickly and significantly. Quantitative easing translated into a greater and more lasting expansion of M1 relative to nominal GDP. Deflation subsided by 2005. As soon as inflation appeared to stabilize near a rate of zero, the Bank of Japan rapidly reduced the monetary base as a share of nominal income as it had announced in 2001. The Bank was able to exit from extensive quantitative easing within less than a year. Some implications for the current situation in Europe and the United States are discussed.
Evaluations of the fiscal stimulus packages recently enacted in the United States and Europe, such as Cogan, Cwik, Taylor and Wieland (2009) and Cwik and Wieland (2009), suggest that the GDP effects will be modest due to crowding-out of private consumption and investment. Corsetti, Meier and Mueller (2009a,b) argue that spending shocks are typically followed by consolidations with substantive spending cuts, which enhance the short-run stimulus effect. This note investigates the implications of this argument for the estimated impact of recent stimulus packages and the case for discretionary fiscal policy.
The global financial crisis has led to a renewed interest in discretionary fiscal stimulus. Advocates of discretionary measures emphasize that government spending can stimulate additional private spending — the so-called Keynesian multiplier effect. Thus, we investigate whether the discretionary spending announced by euro area governments for 2009 and 2010 is likely to boost euro area GDP by more than one for one. Because of modeling uncertainty, it is essential that such policy evaluations be robust to alternative modeling assumptions and different parameterizations. Therefore, we use five different empirical macroeconomic models with Keynesian features such as price and wage rigidities to evaluate the impact of fiscal stimulus. Four of them suggest that the planned increase in government spending will reduce private spending for consumption and investment purposes significantly. If announced government expenditures are implemented with delay, the initial effect on euro area GDP, when stimulus is most needed, may even be negative. Traditional Keynesian multiplier effects only arise in a model that ignores the forward-looking behavioral response of consumers and firms. Using a multi-country model, we find that spillovers between euro area countries are negligible or even negative, because direct demand effects are offset by the indirect effect of euro appreciation.
In this paper we investigate the comparative properties of empirically-estimated monetary models of the U.S. economy. We make use of a new data base of models designed for such investigations. We focus on three representative models: the Christiano, Eichenbaum, Evans (2005) model, the Smets and Wouters (2007) model, and the Taylor (1993a) model. Although the three models differ in terms of structure, estimation method, sample period, and data vintage, we find surprisingly similar economic impacts of unanticipated changes in the federal funds rate. However, the optimal monetary policy responses to other sources of economic fluctuations are widely different in the different models. We show that simple optimal policy rules that respond to the growth rate of output and smooth the interest rate are not robust. In contrast, policy rules with no interest rate smoothing and no response to the growth rate, as distinct from the level, of output are more robust. Robustness can be improved further by optimizing rules with respect to the average loss across the three models.
In the New-Keynesian model, optimal interest rate policy under uncertainty is formulated without reference to monetary aggregates as long as certain standard assumptions on the distributions of unobservables are satisfied. The model has been criticized for failing to explain common trends in money growth and inflation; critics argue that money should therefore be used as a cross-check in policy formulation (see Lucas (2007)). We show that the New-Keynesian model can explain such trends if one allows for the possibility of persistent central bank misperceptions. Such misperceptions motivate the search for policies that include additional robustness checks. In earlier work, we proposed an interest rate rule that is near-optimal in normal times but includes a cross-check with monetary information. In case of unusual monetary trends, interest rates are adjusted. In this paper, we show in detail how to derive the appropriate magnitude of the interest rate adjustment following a significant cross-check with monetary information, when the New-Keynesian model is the central bank’s preferred model. The cross-check is shown to be effective in offsetting persistent deviations of inflation due to central bank misperceptions. Keywords: Monetary Policy, New-Keynesian Model, Money, Quantity Theory, European Central Bank, Policy Under Uncertainty
Renewed interest in fiscal policy has increased the use of quantitative models to evaluate policy. Because of modeling uncertainty, it is essential that policy evaluations be robust to alternative assumptions. We find that models currently being used in practice to evaluate fiscal policy stimulus proposals are not robust. Government spending multipliers in an alternative empirically-estimated and widely-cited new Keynesian model are much smaller than in these old Keynesian models; the estimated stimulus is extremely small with GDP and employment effects only one-sixth as large.
Research with Keynesian-style models has emphasized the importance of the output gap for policies aimed at controlling inflation while declaring monetary aggregates largely irrelevant. Critics, however, have argued that these models need to be modified to account for observed money growth and inflation trends, and that monetary trends may serve as a useful cross-check for monetary policy. We identify an important source of monetary trends in the form of persistent central bank misperceptions regarding potential output. Simulations with historical output gap estimates indicate that such misperceptions may induce persistent errors in monetary policy and sustained trends in money growth and inflation. If interest rate prescriptions derived from Keynesian-style models are augmented with a cross-check against money-based estimates of trend inflation, inflation control is improved substantially.
This paper introduces adaptive learning and endogenous indexation in the New-Keynesian Phillips curve and studies disinflation under inflation targeting policies. The analysis is motivated by the disinflation performance of many inflation-targeting countries, in particular the gradual Chilean disinflation with temporary annual targets. At the start of the disinflation episode, price-setting firms expect inflation to be highly persistent and opt for backward-looking indexation. As the central bank acts to bring inflation under control, price-setting firms revise their estimates of the degree of persistence. Such adaptive learning lowers the cost of disinflation. This reduction can be exploited by a gradual approach to disinflation. Firms that choose the rate for indexation also re-assess the likelihood that announced inflation targets determine steady-state inflation and adjust indexation of contracts accordingly. A strategy of announcing and pursuing short-term targets for inflation is found to influence the likelihood that firms switch from backward-looking indexation to the central bank’s targets. As firms abandon backward-looking indexation, the costs of disinflation decline further. We show that an inflation targeting strategy that employs temporary targets can benefit from lower disinflation costs due to the reduction in backward-looking indexation.
Monetary policy analysts often rely on rules-of-thumb, such as the Taylor rule, to describe historical monetary policy decisions and to compare current policy to historical norms. Analysis along these lines also permits evaluation of episodes where policy may have deviated from a simple rule and examination of the reasons behind such deviations. One interesting question is whether such rules-of-thumb should draw on policymakers’ forecasts of key variables such as inflation and unemployment or on observed outcomes. Importantly, deviations of the policy from the prescriptions of a Taylor rule that relies on outcomes may be due to systematic responses to information captured in policymakers’ own projections. We investigate this proposition in the context of FOMC policy decisions over the past 20 years using publicly available FOMC projections from the semiannual monetary policy reports to the Congress (Humphrey-Hawkins reports). Our results indicate that FOMC decisions can indeed be predominantly explained in terms of the FOMC’s own projections rather than observed outcomes. Thus, a forecast-based rule-of-thumb better characterizes FOMC decision-making. We also confirm that many of the apparent deviations of the federal funds rate from an outcome-based Taylor-style rule may be considered systematic responses to information contained in FOMC projections.
The European Central Bank has assigned a special role to money in its two-pillar strategy and has received much criticism for this decision. The case against including money in the central bank’s interest rate rule is based on a standard model of the monetary transmission process that underlies many contributions to research on monetary policy in the last two decades. In this paper, we develop a justification for including money in the interest rate rule by allowing for imperfect knowledge regarding unobservables such as potential output and equilibrium interest rates. We formulate a novel characterization of ECB-style monetary cross-checking and show that it can generate substantial stabilization benefits in the event of persistent policy misperceptions regarding potential output. JEL Classification: E32, E41, E43, E52, E58
The European Central Bank has assigned a special role to money in its two-pillar strategy and has received much criticism for this decision. In this paper, we explore possible justifications. The case against including money in the central bank’s interest rate rule is based on a standard model of the monetary transmission process that underlies many contributions to research on monetary policy in the last two decades. Of course, if one allows for a direct effect of money on output or inflation as in the empirical “two-pillar” Phillips curves estimated in some recent contributions, it would be optimal to include a measure of (long-run) money growth in the rule. In this paper, we develop a justification for including money in the interest rate rule by allowing for imperfect knowledge regarding unobservables such as potential output and equilibrium interest rates. We formulate a novel characterization of ECB-style monetary cross-checking and show that it can generate substantial stabilization benefits in the event of persistent policy misperceptions regarding potential output. Such misperceptions cause a bias in policy setting. We find that cross-checking and changing interest rates in response to sustained deviations of long-run money growth helps the central bank to overcome this bias. Our argument in favor of ECB-style cross-checking does not require direct effects of money on output or inflation. JEL Classification: E32, E41, E43, E52, E58
The European Central Bank
(2007)
The establishment of the ECB and with it the launch of the euro has arguably been a unique endeavor in economic history, representing an important experiment in central banking. This note aims to summarize some of the main lessons learned from this experiment and sketch some of the prospects for the ECB. It is written for "The New Palgrave Dictionary of Economics", 2nd edition. JEL Classification: E52, E58
In this paper, we consider expected value, variance and worst-case optimization of nonlinear models. We present algorithms for computing optimal expected values and variances based on iterative Taylor expansions. We establish convergence and consider the relative merits of policies based on expected value optimization and worst-case robustness. The latter is a minimax strategy and ensures optimal cover in view of the worst-case scenario(s), while the former delivers optimal expected performance in a stochastic setting. Both approaches are used with a macroeconomic policy model to illustrate relative performances, robustness and trade-offs between the strategies. JEL Classification: C61, E43
Under a conventional policy rule, a central bank adjusts its policy rate linearly according to the gap between inflation and its target, and the gap between output and its potential. Under "the opportunistic approach to disinflation" a central bank controls inflation aggressively when inflation is far from its target, but concentrates more on output stabilization when inflation is close to its target, allowing supply shocks and unforeseen fluctuations in aggregate demand to move inflation within a certain band. We use stochastic simulations of a small-scale rational expectations model to contrast the behavior of output and inflation under opportunistic and linear rules. JEL Classification: E31, E52, E58, E61. July 2005.
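The two rule types described in this abstract can be sketched in a few lines. This is a minimal illustration, not the paper's model: the response coefficients (0.5 on each gap), the 2% equilibrium real rate and inflation target, and the 1-percentage-point band are all illustrative assumptions.

```python
# Illustrative sketch: a linear Taylor-type rule versus an "opportunistic"
# rule that ignores small inflation gaps. All coefficients are assumptions
# chosen for illustration, not estimates from the paper.

def linear_rule(inflation, output_gap, target=2.0, r_star=2.0):
    """Conventional rule: respond linearly to both gaps."""
    return r_star + inflation + 0.5 * (inflation - target) + 0.5 * output_gap

def opportunistic_rule(inflation, output_gap, target=2.0, r_star=2.0, band=1.0):
    """Respond to inflation only when it is outside a band around target;
    inside the band, concentrate on output stabilization."""
    inflation_gap = inflation - target
    if abs(inflation_gap) <= band:
        inflation_gap = 0.0  # tolerate inflation drift within the band
    return r_star + inflation + 0.5 * inflation_gap + 0.5 * output_gap

# Near target the two rules prescribe different rates; far from target
# (outside the band) they coincide:
print(linear_rule(2.5, 0.0))         # 4.75
print(opportunistic_rule(2.5, 0.0))  # 4.5
print(opportunistic_rule(4.0, 0.0) == linear_rule(4.0, 0.0))  # True
```

The nonlinearity is entirely in the dead band on the inflation gap, which captures the idea of concentrating on output stabilization when inflation is already close to target.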
In this paper, we examine the cost of insurance against model uncertainty for the euro area considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a Minimax policy comes at moderate costs in terms of lower expected performance. We extract priors that would rationalize the Minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity-aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E61
In this paper, we study the effectiveness of monetary policy in a severe recession and deflation when nominal interest rates are bounded at zero. We compare two alternative proposals for ameliorating the effect of the zero bound: an exchange-rate peg and price-level targeting. We conduct this quantitative comparison in an empirical macroeconometric model of Japan, the United States and the euro area. Furthermore, we use a stylized micro-founded two-country model to check our qualitative findings. We find that both proposals succeed in generating inflationary expectations and work almost equally well under full credibility of monetary policy. However, price-level targeting may be less effective under imperfect credibility, because the announced price-level target path is not directly observable. JEL Classification: E31, E52, E58, E61
Price stability and monetary policy effectiveness when nominal interest rates are bounded at zero
(2003)
This paper employs stochastic simulations of a small structural rational expectations model to investigate the consequences of the zero bound on nominal interest rates. We find that if the economy is subject to stochastic shocks similar in magnitude to those experienced in the U.S. over the 1980s and 1990s, the consequences of the zero bound are negligible for target inflation rates as low as 2 percent. However, the effects of the constraint are non-linear with respect to the inflation target and produce a quantitatively significant deterioration of the performance of the economy with targets between 0 and 1 percent. The variability of output increases significantly and that of inflation also rises somewhat. Also, we show that the asymmetry of the policy ineffectiveness induced by the zero bound generates a non-vertical long-run Phillips curve. Output falls increasingly short of potential with lower inflation targets.
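The constraint this abstract analyzes can be stated very compactly: the actual policy rate is the notional rate prescribed by a linear rule, truncated at zero. The sketch below is a stylized illustration under assumed coefficients (0.5 responses, 2% equilibrium real rate), not the paper's estimated model.

```python
# Illustrative sketch of the zero bound on nominal interest rates:
# the notional rate from a linear Taylor-type rule is truncated at zero.
# Coefficients and the 2% equilibrium real rate are assumptions.

def notional_rate(inflation, output_gap, target, r_star=2.0):
    """Unconstrained rate prescribed by a linear rule."""
    return r_star + inflation + 0.5 * (inflation - target) + 0.5 * output_gap

def actual_rate(inflation, output_gap, target):
    """The zero bound binds whenever the notional rate is negative."""
    return max(0.0, notional_rate(inflation, output_gap, target))

# With a 0% inflation target, a moderate slump already pushes the
# notional rate below zero, so policy can no longer ease further:
print(notional_rate(-1.0, -2.0, 0.0))  # -0.5
print(actual_rate(-1.0, -2.0, 0.0))    # 0.0
```

The asymmetry discussed in the abstract comes from exactly this truncation: the rule can always tighten, but it cannot ease below zero, and the constraint binds more often the lower the inflation target.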
In this paper we study the role of the exchange rate in conducting monetary policy in an economy with near-zero nominal interest rates as experienced in Japan since the mid-1990s. Our analysis is based on an estimated model of Japan, the United States and the euro area with rational expectations and nominal rigidities. First, we provide a quantitative analysis of the impact of the zero bound on the effectiveness of interest rate policy in Japan in terms of stabilizing output and inflation. Then we evaluate three concrete proposals that focus on depreciation of the currency as a way to ameliorate the effect of the zero bound and evade a potential liquidity trap. Finally, we investigate the international consequences of these proposals.
In this paper we estimate a small model of the euro area to be used as a laboratory for evaluating the performance of alternative monetary policy strategies. We start with the relationship between output and inflation and investigate the fit of the nominal wage contracting model due to Taylor (1980) and three different versions of the relative real wage contracting model proposed by Buiter and Jewitt (1981) and estimated by Fuhrer and Moore (1995a) for the United States. While Fuhrer and Moore reject the nominal contracting model in favor of the relative contracting model which induces more inflation persistence, we find that both models fit euro area data reasonably well. When considering France, Germany and Italy separately, however, we find that the nominal contracting model fits German data better, while the relative contracting model does quite well in countries which transitioned out of a high inflation regime such as France and Italy. We close the model by estimating an aggregate demand relationship and investigate the consequences of the different wage contracting specifications for the inflation-output variability tradeoff, when interest rates are set according to Taylor's rule.
In this study, we perform a quantitative assessment of the role of money as an indicator variable for monetary policy in the euro area. We document the magnitude of revisions to euro area-wide data on output, prices, and money, and find that monetary aggregates have a potentially significant role in providing information about current real output. We then proceed to analyze the information content of money in a forward-looking model in which monetary policy is optimally determined subject to incomplete information about the true state of the economy. We show that monetary aggregates may have substantial information content in an environment with high variability of output measurement errors, low variability of money demand shocks, and a strong contemporaneous linkage between money demand and real output. As a practical matter, however, we conclude that money has fairly limited information content as an indicator of contemporaneous aggregate demand in the euro area.
We investigate the performance of forecast-based monetary policy rules using five macroeconomic models that reflect a wide range of views on aggregate dynamics. We identify the key characteristics of rules that are robust to model uncertainty: such rules respond to the one-year-ahead inflation forecast and to the current output gap and incorporate a substantial degree of policy inertia. In contrast, rules with longer forecast horizons are less robust and are prone to generating indeterminacy. Finally, we identify a robust benchmark rule that performs very well in all five models over a wide range of policy preferences.
Inflation-targeting central banks have only imperfect knowledge about the effect of policy decisions on inflation. An important source of uncertainty is the relationship between inflation and unemployment. This paper studies the optimal monetary policy in the presence of uncertainty about the natural unemployment rate, the short-run inflation-unemployment tradeoff and the degree of inflation persistence in a simple macroeconomic model, which incorporates rational learning by the central bank as well as private sector agents. Two conflicting motives drive the optimal policy. In the static version of the model, uncertainty provides a motive for the policymaker to move more cautiously than she would if she knew the true parameters. In the dynamic version, uncertainty also motivates an element of experimentation in policy. I find that the optimal policy that balances the cautionary and activist motives typically exhibits gradualism, that is, it still remains less aggressive than a policy that disregards parameter uncertainty. Exceptions occur when uncertainty is very high and when inflation is close to target.