CFS working paper series
https://gfk-cfs.de/working-papers/
2009, 19
In the New-Keynesian model, optimal interest rate policy under uncertainty is formulated without reference to monetary aggregates as long as certain standard assumptions on the distributions of unobservables are satisfied. The model has been criticized for failing to explain common trends in money growth and inflation, and it has therefore been argued that money should be used as a cross-check in policy formulation (see Lucas (2007)). We show that the New-Keynesian model can explain such trends if one allows for the possibility of persistent central bank misperceptions. Such misperceptions motivate the search for policies that include additional robustness checks. In earlier work, we proposed an interest rate rule that is near-optimal in normal times but includes a cross-check with monetary information. In the event of unusual monetary trends, interest rates are adjusted. In this paper, we show in detail how to derive the appropriate magnitude of the interest rate adjustment following a significant cross-check with monetary information, when the New-Keynesian model is the central bank's preferred model. The cross-check is shown to be effective in offsetting persistent deviations of inflation due to central bank misperceptions. Keywords: Monetary Policy, New-Keynesian Model, Money, Quantity Theory, European Central Bank, Policy Under Uncertainty
2009, 30
This paper reviews the rationale for quantitative easing when central bank policy rates reach near zero levels in light of recent announcements regarding direct asset purchases by the Bank of England, the Bank of Japan, the U.S. Federal Reserve and the European Central Bank. Empirical evidence from the previous period of quantitative easing in Japan between 2001 and 2006 is presented. During this earlier period the Bank of Japan was able to expand the monetary base very quickly and significantly. Quantitative easing translated into a greater and more lasting expansion of M1 relative to nominal GDP. Deflation subsided by 2005. As soon as inflation appeared to stabilize near a rate of zero, the Bank of Japan rapidly reduced the monetary base as a share of nominal income as it had announced in 2001. The Bank was able to exit from extensive quantitative easing within less than a year. Some implications for the current situation in Europe and the United States are discussed.
2009, 01
Opting out of the great inflation: German monetary policy after the breakdown of Bretton Woods
(2009)
During the turbulent 1970s and 1980s the Bundesbank established an outstanding reputation in the world of central banking. Germany achieved a high degree of domestic stability and provided a safe haven for investors in times of turmoil in the international financial system. Eventually the Bundesbank provided the role model for the European Central Bank. Hence, we examine an episode of lasting importance in European monetary history. The purpose of this paper is to highlight how the Bundesbank's monetary policy strategy contributed to this success. We analyze the strategy as it was conceived, communicated and refined by the Bundesbank itself. We propose a theoretical framework (following Söderström, 2005) where monetary targeting is interpreted, first and foremost, as a commitment device. In our setting, a monetary target helps anchor inflation and inflation expectations. We derive an interest rate rule and show empirically that it approximates the way the Bundesbank conducted monetary policy over the period 1975-1998. We compare the Bundesbank's monetary policy rule with those of the Fed and of the Bank of England. We find that the Bundesbank's policy reaction function was characterized by strong persistence of policy rates as well as a strong response to deviations of inflation from target and to the activity growth gap. In contrast, the response to the level of the output gap was not significant. In our empirical analysis we use real-time data, as available to policy-makers at the time. JEL Classification: E31, E32, E41, E52, E58
2008, 29
This paper explores the role of trade integration—or openness—for monetary policy transmission in a medium-scale New Keynesian model. Allowing for strategic complementarities in price-setting, we highlight a new dimension of the exchange rate channel by which monetary policy directly impacts domestic inflation. Although the strength of this effect increases with economic openness, it also requires that import prices respond to exchange rate changes. In this case domestic producers find it optimal to adjust their prices to exchange rate changes which alter the domestic currency price of their foreign competitors. We pin down key parameters of the model by matching impulse responses obtained from a vector autoregression on U.S. time series relative to an aggregate of industrialized countries. While we find evidence for strong complementarities, exchange rate pass-through is limited. Openness has therefore little bearing on monetary transmission in the estimated model.
2008, 16
Monetary policy analysts often rely on rules-of-thumb, such as the Taylor rule, to describe historical monetary policy decisions and to compare current policy to historical norms. Analysis along these lines also permits evaluation of episodes where policy may have deviated from a simple rule and examination of the reasons behind such deviations. One interesting question is whether such rules-of-thumb should draw on policymakers' forecasts of key variables such as inflation and unemployment or on observed outcomes. Importantly, deviations of the policy from the prescriptions of a Taylor rule that relies on outcomes may be due to systematic responses to information captured in policymakers' own projections. We investigate this proposition in the context of FOMC policy decisions over the past 20 years using publicly available FOMC projections from the biannual monetary policy reports to the Congress (Humphrey-Hawkins reports). Our results indicate that FOMC decisions can indeed be predominantly explained in terms of the FOMC's own projections rather than observed outcomes. Thus, a forecast-based rule-of-thumb better characterizes FOMC decision-making. We also confirm that many of the apparent deviations of the federal funds rate from an outcome-based Taylor-style rule may be considered systematic responses to information contained in FOMC projections.
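The distinction between outcome-based and forecast-based rules can be made concrete with a minimal sketch. The coefficients below follow the classic Taylor (1993) parameterization (1.5 on the inflation gap, 0.5 on the output gap); the data points fed to the rule are purely illustrative assumptions, not FOMC figures.

```python
def taylor_rate(inflation, gap, r_star=2.0, pi_star=2.0,
                a_pi=1.5, a_gap=0.5):
    """Prescribed nominal rate: r* + pi + a_pi*(pi - pi*) + a_gap*gap."""
    return r_star + inflation + a_pi * (inflation - pi_star) + a_gap * gap

# An outcome-based rule feeds in realized data; a forecast-based rule feeds
# in the committee's own projections. Different inputs, same functional form.
outcome_rate = taylor_rate(inflation=3.0, gap=-1.0)   # realized data
forecast_rate = taylor_rate(inflation=2.2, gap=0.5)   # projected data
```

With these illustrative inputs the two variants prescribe noticeably different rates, which is why the choice of inputs matters when judging apparent "deviations" from the rule.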
2008, 30
We study the responses of residential property and equity prices, inflation and economic activity to monetary policy shocks in 17 countries, using data spanning 1986-2006 and employing single-country VARs and panel VARs in which we distinguish between groups of countries depending on their financial systems. The effect of monetary policy on property prices is about three times as large as its impact on GDP. Using monetary policy to guard against financial instability by offsetting asset-price movements thus has sizable effects on economic activity. While the financial structure influences the impact of policy on asset prices, its importance appears limited.
2007, 14
This paper uses factor-augmented vector autoregressions (FAVAR) estimated using a large data set to disentangle fluctuations in disaggregated consumer and producer prices which are due to macroeconomic factors from those due to sectoral conditions. This allows us to provide consistent estimates of the effects of US monetary policy on disaggregated prices. While sectoral prices respond quickly to sector-specific shocks, we find that for a large number of price series, there is a significant delay in the response of prices to monetary policy shocks. In addition, price responses display little evidence of a “price puzzle,” contrary to existing studies based on traditional VARs. The observed dispersion in the reaction of producer prices is relatively well explained by the degree of market power, as predicted by models with monopolistic competition. JEL Classification: E32, E52
2007, 16
This paper proposes a possible way of assessing the effect of interest rate dynamics on changes in the decision-making approach, communication strategy and operational framework of a central bank. Through a GARCH specification we show that the U.S. and the euro area displayed a limited but significant spillover of volatility from money market to longer-term rates. We then check the stability of this phenomenon in the most recent period of improved policymaking and find empirical evidence that the transmission of overnight volatility along the yield curve vanished soon after specific policy changes of the Fed and the ECB.
2007, 07
We focus on a quantitative assessment of rigid labor markets in an environment of stable monetary policy. We ask how wages and labor market shocks feed into the inflation process and derive monetary policy implications. Towards that aim, we structurally model matching frictions and rigid wages in line with an optimizing rationale in a New Keynesian closed economy DSGE model. We estimate the model using Bayesian techniques for German data from the late 1970s to the present. Given the pre-euro heterogeneity in wage bargaining, we take this as the first-best approximation at hand for modelling monetary policy in the presence of labor market frictions in the current European regime. In our framework, we find that labor market structure is of prime importance for the evolution of the business cycle, and for monetary policy in particular. Yet shocks originating in the labor market itself may contain only limited information for the conduct of stabilization policy. JEL Classification: E32, E52, J64, C11
2007, 10
Mortgage markets, collateral constraints, and monetary policy: do institutional factors matter?
(2006)
We study the role of institutional characteristics of mortgage markets in affecting the strength and timing of the effects of monetary policy shocks on house prices and consumption in a sample of OECD countries. We document three facts: (1) there is significant divergence in the structure of mortgage markets across the main industrialised countries; (2) at the business cycle frequency, the correlation between consumption and house prices increases with the degree of flexibility/development of mortgage markets; (3) the transmission of monetary policy shocks on consumption and house prices is stronger in countries with more flexible/developed mortgage markets. We then build a two-sector dynamic general equilibrium model with price stickiness and collateral constraints, where the ability of borrowing is endogenously linked to the nominal value of a durable asset (housing). We study how the response of consumption to monetary policy shocks is affected by alternative values of three key institutional parameters: (i) down-payment rate; (ii) mortgage repayment rate; (iii) interest rate mortgage structure (variable vs. fixed interest rate). In line with our empirical evidence, the sensitivity of consumption to monetary policy shocks increases with lower values of (i) and (ii), and is larger under a variable-rate mortgage structure. JEL Classification: E21, E44, E52
2007, 12
The paper considers optimal monetary stabilization policy in a forward-looking model, when the central bank recognizes that private-sector expectations need not be precisely model-consistent, and wishes to choose a policy that will be as good as possible in the case of any beliefs that are close enough to model-consistency. It is found that commitment continues to be important for optimal policy, that the optimal long-run inflation target is unaffected by the degree of potential distortion of beliefs, and that optimal policy is even more history-dependent than if rational expectations are assumed. JEL Classification: E52, E58, E42
2007, 11
We study the problem of a policymaker who seeks to set policy optimally in an economy where the true economic structure is unobserved, and policymakers optimally learn from their observations of the economy. This is a classic problem of learning and control, variants of which have been studied in the past, but little with forward-looking variables which are a key component of modern policy-relevant models. As in most Bayesian learning problems, the optimal policy typically includes an experimentation component reflecting the endogeneity of information. We develop algorithms to solve numerically for the Bayesian optimal policy (BOP). However the BOP is only feasible in relatively small models, and thus we also consider a simpler specification we term adaptive optimal policy (AOP) which allows policymakers to update their beliefs but shortcuts the experimentation motive. In our setting, the AOP is significantly easier to compute, and in many cases provides a good approximation to the BOP. We provide a simple example to illustrate the role of learning and experimentation in an MJLQ framework. JEL Classification: E42, E52, E58
2007, 17
The European Central Bank has assigned a special role to money in its two pillar strategy and has received much criticism for this decision. In this paper, we explore possible justifications. The case against including money in the central bank’s interest rate rule is based on a standard model of the monetary transmission process that underlies many contributions to research on monetary policy in the last two decades. Of course, if one allows for a direct effect of money on output or inflation as in the empirical “two-pillar” Phillips curves estimated in some recent contributions, it would be optimal to include a measure of (long-run) money growth in the rule. In this paper, we develop a justification for including money in the interest rate rule by allowing for imperfect knowledge regarding unobservables such as potential output and equilibrium interest rates. We formulate a novel characterization of ECB-style monetary cross-checking and show that it can generate substantial stabilization benefits in the event of persistent policy misperceptions regarding potential output. Such misperceptions cause a bias in policy setting. We find that cross-checking and changing interest rates in response to sustained deviations of long-run money growth helps the central bank to overcome this bias. Our argument in favor of ECB-style cross-checking does not require direct effects of money on output or inflation. JEL Classification: E32, E41, E43, E52, E58
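The cross-checking idea described above can be sketched as a Taylor-type rule whose prescription is adjusted only when filtered long-run money growth deviates persistently from a reference value. All coefficients, the trigger threshold, and the adjustment size below are illustrative assumptions, not the paper's estimates.

```python
def cross_checked_rate(pi, gap, money_trend, *,
                       pi_star=2.0, mu_star=4.5, r_neutral=4.0,
                       a_pi=1.5, a_gap=0.5, threshold=1.0, adjustment=0.5):
    """Taylor-type rule with a monetary cross-check (illustrative parameters)."""
    rate = r_neutral + a_pi * (pi - pi_star) + a_gap * gap
    # Cross-check: respond to sustained money-growth deviations only when
    # they exceed the trigger threshold; otherwise money plays no role.
    if abs(money_trend - mu_star) > threshold:
        rate += adjustment * (money_trend - mu_star)
    return rate
```

In normal times the cross-check is silent and the rule coincides with the standard one; only an unusual monetary trend shifts the prescribed rate, which is the sense in which money enters as a robustness check rather than as a routine input.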
2007, 18
The European Central Bank has assigned a special role to money in its two pillar strategy and has received much criticism for this decision. The case against including money in the central bank’s interest rate rule is based on a standard model of the monetary transmission process that underlies many contributions to research on monetary policy in the last two decades. In this paper, we develop a justification for including money in the interest rate rule by allowing for imperfect knowledge regarding unobservables such as potential output and equilibrium interest rates. We formulate a novel characterization of ECB-style monetary cross-checking and show that it can generate substantial stabilization benefits in the event of persistent policy misperceptions regarding potential output. JEL Classification: E32, E41, E43, E52, E58
2006, 30
The paper constructs a global monetary aggregate, namely the sum of the key monetary aggregates of the G5 economies (US, Euro area, Japan, UK, and Canada), and analyses its indicator properties for global output and inflation. Using a structural VAR approach we find that after a monetary policy shock output declines temporarily, with the downward effect reaching a peak within the second year, and the global monetary aggregate drops significantly. In addition, the price level rises permanently in response to a positive shock to the global liquidity aggregate. The similarity of our results with those found in country studies supports the use of a global monetary aggregate as a summary measure of worldwide monetary trends. JEL Classification: E52, F01
2005, 31
Using a set of regional inflation rates we examine the dynamics of inflation dispersion within the U.S.A., Japan and across U.S. and Canadian regions. We find that inflation rate dispersion is significant throughout the sample period in all three samples. Based on methods applied in the empirical growth literature, we provide evidence in favor of significant mean reversion (β-convergence) in inflation rates in all considered samples. The evidence on σ-convergence is mixed, however. Observed declines in dispersion are usually associated with decreasing overall inflation levels, which indicates a positive relationship between mean inflation and overall inflation rate dispersion. Our findings for the within-distribution dynamics of regional inflation rates show that dynamics are largest for Japanese prefectures, followed by U.S. metropolitan areas. For the combined U.S.-Canadian sample, we find a pattern of within-distribution dynamics that is comparable to that found for regions within the European Monetary Union (EMU). In line with findings in the so-called 'border literature' these results suggest that frictions across European markets are at least as large as they are, e.g., across North American markets. JEL Classification: E31, E52, E58
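The two convergence concepts used above can be illustrated with a minimal sketch borrowed from the empirical growth literature: β-convergence is a negative slope in a regression of the change in regional inflation on its initial level, while σ-convergence is a decline in cross-regional dispersion. The regional data below are invented for illustration.

```python
import numpy as np

# Hypothetical regional inflation rates (percent) at the start and end of a sample.
start = np.array([1.0, 2.5, 4.0, 3.0, 0.5])
end = np.array([1.8, 2.2, 3.0, 2.6, 1.2])

# Beta-convergence: regress the change in inflation on its initial level.
# A negative slope means high-inflation regions revert toward the mean.
change = end - start
beta, alpha = np.polyfit(start, change, 1)

# Sigma-convergence: has cross-regional dispersion fallen over the sample?
sigma_fell = np.std(end) < np.std(start)
```

In this toy sample both criteria happen to hold; the abstract's point is that in the actual data β-convergence is robust while the evidence on σ-convergence is mixed, so the two concepts need not move together.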
2005, 01
This study offers a historical review of the monetary policy reform of October 6, 1979, and discusses the influences behind it and its significance. We lay out the record from the start of 1979 through the spring of 1980, relying almost exclusively upon contemporaneous sources, including the recently released transcripts of Federal Open Market Committee (FOMC) meetings during 1979. We then present and discuss in detail the reasons for the FOMC's adoption of the reform and the communications challenge presented to the Committee during this period. Further, we examine whether the essential characteristics of the reform were consistent with monetarism, new, neo, or old-fashioned Keynesianism, nominal income targeting, and inflation targeting. The record suggests that the reform was adopted when the FOMC became convinced that its earlier gradualist strategy using finely tuned interest rate moves had proved inadequate for fighting inflation and reversing inflation expectations. The new plan had to break dramatically with established practice, allow for the possibility of substantial increases in short-term interest rates, yet be politically acceptable, and convince financial market participants that it would be effective. The new operating procedures were also adopted for the pragmatic reason that they would likely succeed. JEL Classification: E52, E58, E61, E65
2005, 13
In this paper, we examine the cost of insurance against model uncertainty for the Euro area considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a Minimax policy comes at moderate costs in terms of lower expected performance. We extract priors that would rationalize the Minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity-aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E61
2005, 14
In this paper, we examine the cost of insurance against model uncertainty for the Euro area considering four alternative reference models, all of which are used for policy analysis at the ECB. We find that maximal insurance across this model range in terms of a Minimax policy comes at moderate costs in terms of lower expected performance. We extract priors that would rationalize the Minimax policy from a Bayesian perspective. These priors indicate that full insurance is strongly oriented towards the model with highest baseline losses. Furthermore, this policy is not as tolerant towards small perturbations of policy parameters as the Bayesian policy rule. We propose to strike a compromise and use preferences for policy design that allow for intermediate degrees of ambiguity-aversion. These preferences allow the specification of priors but also give extra weight to the worst uncertain outcomes in a given context. JEL Classification: E52, E58, E61
2005, 19
Under a conventional policy rule, a central bank adjusts its policy rate linearly according to the gap between inflation and its target, and the gap between output and its potential. Under "the opportunistic approach to disinflation" a central bank controls inflation aggressively when inflation is far from its target, but concentrates more on output stabilization when inflation is close to its target, allowing supply shocks and unforeseen fluctuations in aggregate demand to move inflation within a certain band. We use stochastic simulations of a small-scale rational expectations model to contrast the behavior of output and inflation under opportunistic and linear rules. JEL Classification: E31, E52, E58, E61. July 2005.
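The contrast between the two rules described above can be sketched in a few lines. The linear rule reacts to the inflation gap everywhere; the opportunistic rule ignores inflation deviations inside a band around target and reacts aggressively only to the part of the gap beyond the band. All coefficients and the band width are illustrative assumptions, not the paper's calibration.

```python
def linear_rule(pi, gap, pi_star=2.0, a_pi=0.5, a_gap=0.5, r_neutral=4.0):
    """Conventional rule: react linearly to both gaps everywhere."""
    return r_neutral + a_pi * (pi - pi_star) + a_gap * gap

def opportunistic_rule(pi, gap, pi_star=2.0, band=1.0,
                       a_pi=1.5, a_gap=0.5, r_neutral=4.0):
    """Opportunistic rule: inside the band around target, stabilize output
    only; outside it, react aggressively to the excess inflation gap."""
    excess = max(abs(pi - pi_star) - band, 0.0)
    direction = 1.0 if pi > pi_star else -1.0
    return r_neutral + a_pi * direction * excess + a_gap * gap
```

With inflation just above target the opportunistic rule leaves the rate at neutral (given a zero output gap), while far from target its stronger coefficient makes it more aggressive than the linear rule.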