In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis have come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach, a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models' implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the database and compare the effects of novel extensions to established benchmarks, thereby fostering a comparative instead of insular approach to model development.
The withdrawal of foreign capital from emerging countries at the height of the recent financial crisis and its quick return sparked a debate about the impact of capital flow surges on asset markets. This paper addresses the response of property prices to an inflow of foreign capital. For that purpose we estimate a panel VAR on a set of Asian emerging market economies, for which the waves of inflows were particularly pronounced, and identify capital inflow shocks based on sign restrictions. Our results suggest that capital inflow shocks have a significant effect on the appreciation of house prices and equity prices. Capital inflow shocks account for roughly twice the portion of overall house price changes they explain in OECD countries. We also address cross-country differences in the house price responses to shocks, which are most likely due to differences in the monetary policy response to capital inflows.
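The sign-restriction identification used in the paper above can be sketched schematically: a candidate shock is accepted only if its impulse responses carry the assumed signs over a short horizon. This is a minimal illustration, not the paper's estimation; the variable names, restriction set, horizon, and numbers are invented for the example.

```python
# Schematic sign-restriction check for a capital-inflow shock:
# the candidate is kept only if capital inflows and house prices
# both rise over the restricted horizon. Numbers are illustrative.

def satisfies_signs(irf, restrictions, horizon=2):
    """irf: dict variable -> list of impulse responses by horizon;
    restrictions: dict variable -> +1 (positive) or -1 (negative)."""
    return all(sign * irf[var][h] > 0
               for var, sign in restrictions.items()
               for h in range(horizon))

candidate = {"capital_inflows": [0.8, 0.5, 0.2],
             "house_prices":    [0.3, 0.4, 0.2]}
restrictions = {"capital_inflows": +1, "house_prices": +1}
print(satisfies_signs(candidate, restrictions))  # True: signs match
```

In practice this check is applied to many random rotations of the reduced-form VAR shocks, and inference is based on the set of accepted candidates.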
The Eurosystem and the Deutsche Bundesbank will incur substantial losses in 2023 that are likely to persist for several years. Due to the massive purchases of securities in the last 10 years, especially of government bonds, the banks' excess reserves have risen sharply. The resulting high interest payments to the banks since the turnaround in monetary policy, with little income from the large-scale securities holdings, have led to massive criticism. The banks were said to be making "unfair" profits as a result, while the fiscal authorities had to forego the previously customary transfers of central bank profits. Populist demands to limit bank profits by, for example, drastically increasing the minimum reserve ratios in the Eurosystem to reduce excess reserves would create severe new problems and are neither justified nor helpful. Ultimately, the EU member states have benefited for a very long time from historically low interest rates because of the Eurosystem's extraordinarily loose monetary policy and must now bear the flip-side consequences of the massive expansion of central bank balance sheets during the necessary period of monetary policy normalisation.
The complexity resulting from intertwined uncertainties regarding model misspecification and mismeasurement of the state of the economy defines the monetary policy landscape. Using the euro area as a laboratory, this paper explores the design of robust policy guides aiming to maintain stability in the economy while recognizing this complexity. We document substantial output gap mismeasurement and make use of a new model database to capture the evolution of model specification. A simple interest rate rule is employed to interpret ECB policy since 1999. An evaluation of alternative policy rules across 11 models of the euro area confirms the fragility of policy analysis optimized for any specific model and shows the merits of model averaging in policy design. Interestingly, a simple difference rule with the same coefficients on inflation and output growth as the one used to interpret ECB policy is quite robust as long as it responds to current outcomes of these variables.
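The model-averaging idea in the abstract above can be illustrated with a toy comparison: instead of picking the rule that minimizes the loss in one favored model, pick the rule with the lowest average loss across all models. This is a minimal sketch; the rule names and loss numbers are made up for illustration and are not the paper's results.

```python
# Hypothetical model-averaged rule evaluation: each candidate rule is
# scored by its loss in each of several models, and the robust choice
# minimizes the average loss rather than the loss in any single model.

losses = {                        # loss of each rule in models 1..3
    "rule_A": [1.0, 4.0, 1.2],    # optimized for model 1: fragile elsewhere
    "rule_B": [1.5, 1.6, 1.4],    # a simple robust rule: never the best, never bad
}

def average_loss(rule):
    vals = losses[rule]
    return sum(vals) / len(vals)

robust_rule = min(losses, key=average_loss)
print(robust_rule)  # rule_B wins on average despite losing in model 1
```

The pattern mirrors the abstract's finding: a rule optimized for one model can perform poorly elsewhere, while a simple rule holds up across the whole model set.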
The recent financial crisis has highlighted the limits of the “originate to distribute” model of banking, but its nexus with the macroeconomy and monetary policy remains unexplored. I build a DSGE model with banks (along the lines of Holmström and Tirole [28] and Parlour and Plantin [39]) and examine its properties with and without active secondary markets for credit risk transfer. The possibility of transferring credit risk reduces the impact of liquidity shocks on bank balance sheets, but also reduces banks' incentive to monitor. As a result, secondary markets allow banks to release capital and exacerbate the effect of productivity and other macroeconomic shocks on output and inflation. By offering a possibility of capital recycling and by reducing bank monitoring, secondary credit markets in general equilibrium allow banks to take on more risk. Keywords: Credit Risk Transfer, Dual Moral Hazard, Monetary Policy, Liquidity, Welfare JEL Classification: E3, E5, G3 First Draft: December 2009, This Draft: September 2010
How does the need to preserve government debt sustainability affect the optimal monetary and fiscal policy response to a liquidity trap? To provide an answer, we employ a small stochastic New Keynesian model with a zero bound on nominal interest rates and characterize optimal time-consistent stabilization policies. We focus on two policy tools, the short-term nominal interest rate and debt-financed government spending. The optimal policy response to a liquidity trap critically depends on the prevailing debt burden. While the optimal amount of government spending is decreasing in the level of outstanding government debt, future monetary policy becomes more accommodative, triggering a change in private sector expectations that helps to dampen the fall in output and inflation at the outset of the liquidity trap.
Monetary policy analysts often rely on rules-of-thumb, such as the Taylor rule, to describe historical monetary policy decisions and to compare current policy to historical norms. Analysis along these lines also permits evaluation of episodes where policy may have deviated from a simple rule and examination of the reasons behind such deviations. One interesting question is whether such rules-of-thumb should draw on policymakers' forecasts of key variables such as inflation and unemployment or on observed outcomes. Importantly, deviations of the policy from the prescriptions of a Taylor rule that relies on outcomes may be due to systematic responses to information captured in policymakers' own projections. We investigate this proposition in the context of FOMC policy decisions over the past 20 years using publicly available FOMC projections from the biannual monetary policy reports to the Congress (Humphrey-Hawkins reports). Our results indicate that FOMC decisions can indeed be predominantly explained in terms of the FOMC's own projections rather than observed outcomes. Thus, a forecast-based rule-of-thumb better characterizes FOMC decision-making. We also confirm that many of the apparent deviations of the federal funds rate from an outcome-based Taylor-style rule may be considered systematic responses to information contained in FOMC projections.
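The outcome-based versus forecast-based distinction in the abstract above can be made concrete with a small sketch of a Taylor-style rule. The coefficients follow the classic Taylor (1993) parameterization (0.5 on the inflation gap and the output gap, a 2% equilibrium real rate and 2% inflation target); the inflation and gap numbers fed in are invented for illustration, not FOMC data.

```python
# A Taylor-style rule-of-thumb evaluated on observed outcomes vs. on
# projections. Coefficients are the standard Taylor (1993) values;
# the input data below are hypothetical.

def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0,
                a_pi=0.5, a_gap=0.5):
    """Prescribed nominal rate: r* + pi + a_pi*(pi - pi*) + a_gap*gap."""
    return r_star + inflation + a_pi * (inflation - pi_star) + a_gap * output_gap

# Outcome-based prescription: observed inflation 3.0%, output gap -1.0%
outcome_rate = taylor_rate(3.0, -1.0)
# Forecast-based prescription: projected inflation 2.5%, projected gap 0.0%
forecast_rate = taylor_rate(2.5, 0.0)
print(outcome_rate, forecast_rate)  # 5.0 4.75
```

When the two prescriptions differ, a funds rate tracking the forecast-based value looks like a "deviation" from the outcome-based rule, which is exactly the systematic response to projections the paper documents.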
The paper constructs a global monetary aggregate, namely the sum of the key monetary aggregates of the G5 economies (US, Euro area, Japan, UK, and Canada), and analyses its indicator properties for global output and inflation. Using a structural VAR approach we find that after a monetary policy shock output declines temporarily, with the downward effect reaching a peak within the second year, and the global monetary aggregate drops significantly. In addition, the price level rises permanently in response to a positive shock to the global liquidity aggregate. The similarity of our results with those found in country studies might support the use of a global monetary aggregate as a summary measure of worldwide monetary trends. JEL Classification: E52, F01
We study the redistributive effects of inflation combining administrative bank data with an information provision experiment during an episode of historic inflation. On average, households are well-informed about prevailing inflation and are concerned about its impact on their wealth; yet, while many households know about inflation eroding nominal assets, most are unaware of nominal-debt erosion. Once they receive information on the debt-erosion channel, households revise their beliefs about nominal debt and their own real net wealth upwards. These changes in beliefs causally affect actual consumption and hypothetical debt decisions. Our findings suggest that real wealth mediates the sensitivity of consumption to inflation once households are aware of the wealth effects of inflation.
This paper investigates the role that imperfect knowledge about the structure of the economy plays in the formation of expectations, macroeconomic dynamics, and the efficient formulation of monetary policy. Economic agents rely on an adaptive learning technology to form expectations and to continuously update their beliefs regarding the dynamic structure of the economy based on incoming data. The process of perpetual learning introduces an additional layer of dynamic interaction between monetary policy and economic outcomes. We find that policies that would be efficient under rational expectations can perform poorly when knowledge is imperfect. In particular, policies that fail to maintain tight control over inflation are prone to episodes in which the public's expectations of inflation become uncoupled from the policy objective and stagflation results, in a pattern similar to that experienced in the United States during the 1970s. Our results highlight the value of effective communication of a central bank's inflation objective and of continued vigilance against inflation in anchoring inflation expectations and fostering macroeconomic stability. July 2003.
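The perpetual-learning mechanism described above is commonly modeled as constant-gain recursive least squares: agents re-estimate a perceived law of motion each period, discounting old data. The sketch below shows one such update for an agent learning the persistence of an AR(1) inflation process; the gain value and the data are illustrative assumptions, not the paper's calibration.

```python
# Minimal constant-gain (perpetual) learning sketch: an agent updates a
# belief b about inflation persistence in pi_t = b * pi_{t-1} + noise.
# Gain and data are invented for illustration.

def update_beliefs(b, R, x, y, gain=0.05):
    """One constant-gain recursive-least-squares step.
    b: current persistence belief; R: second-moment estimate;
    x: lagged inflation; y: current inflation."""
    R = R + gain * (x * x - R)             # update moment estimate
    b = b + gain * (x / R) * (y - b * x)   # move belief toward forecast error
    return b, R

b, R = 0.5, 1.0                            # initial belief and moment
inflation = [1.0, 0.9, 0.85, 0.8, 0.78]    # hypothetical inflation path
for x, y in zip(inflation[:-1], inflation[1:]):
    b, R = update_beliefs(b, R, x, y)
print(round(b, 3))
```

Because the gain never shrinks to zero, beliefs keep drifting with recent data, which is what lets expectations become "uncoupled" from the policy objective after a run of bad inflation outcomes.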
As of today, estimating interest rate reaction functions for the Euro Area is hampered by the short time span since the conduct of a single monetary policy. In this paper we circumvent the common use of aggregated data before 1999 by estimating interest rate reaction functions based on a panel including actual EMU Member States. We find that exploiting the cross-section dimension of a multi-country panel and accounting for cross-country heterogeneity in advance of the single monetary policy pays off with regard to the estimated reaction functions' ability to describe actual interest rate dynamics. We retrieve a panel reaction function which is demonstrated to be a valuable tool for evaluating episodes of monetary policy since 1999. JEL Classification: E43, E58, C33
No. And not only for the reason you think. In a world with multiple inefficiencies the single policy tool the central bank has control over will not undo all inefficiencies; this is well understood. We argue that the world is better characterized by multiple inefficiencies and multiple policy makers with various objectives. Asking the policy question only in terms of optimal monetary policy effectively turns the central bank into the residual claimant of all policy and gives the other policymakers a free hand in pursuing their own goals. This further worsens the tradeoffs faced by the central bank. The optimal monetary policy literature and the optimal simple rules, often labeled flexible inflation targeting, assign all of the cyclical policymaking duties to central banks. This distorts the policy discussion and narrows the policy choices to a suboptimal set. We highlight this issue and call for a broader thinking of optimal policies.
On 15 August 2017, the Bundesverfassungsgericht (BVerfG) referred the case against the European Central Bank’s policy of Quantitative Easing (QE) to the European Court of Justice (ECJ). The author argues that this event differs in several aspects from the OMT case in 2015 – in content as well as in form. The BVerfG recognizes that it is a legitimate goal of the ECB’s monetary policy to bring inflation up close to 2%, and that the instrument employed for QE is one of monetary policy. However, it doubts whether the sheer volume of QE would not distort the character of the program as one of monetary policy. The ECJ will now have to clarify the extent to which the ECJ’s findings in its OMT judgment are relevant for QE as well as the standard of review applicable to monetary policy. The author raises the questions of whether the principle of democracy under German constitutional law can actually provide the standard by which the ECB is to be measured, and how tight judicial review could be exercised over the ECB without encroaching upon its autonomy in monetary policy matters – and thus upon the very essence of central bank independence.
This paper introduces adaptive learning and endogenous indexation in the New-Keynesian Phillips curve and studies disinflation under inflation targeting policies. The analysis is motivated by the disinflation performance of many inflation-targeting countries, in particular the gradual Chilean disinflation with temporary annual targets. At the start of the disinflation episode price-setting firms expect inflation to be highly persistent and opt for backward-looking indexation. As the central bank acts to bring inflation under control, price-setting firms revise their estimates of the degree of persistence. Such adaptive learning lowers the cost of disinflation. This reduction can be exploited by a gradual approach to disinflation. Firms that choose the rate for indexation also re-assess the likelihood that announced inflation targets determine steady-state inflation and adjust indexation of contracts accordingly. A strategy of announcing and pursuing short-term targets for inflation is found to influence the likelihood that firms switch from backward-looking indexation to the central bank’s targets. As firms abandon backward-looking indexation the costs of disinflation decline further. We show that an inflation targeting strategy that employs temporary targets can benefit from lower disinflation costs due to the reduction in backward-looking indexation.
In a plain-vanilla New Keynesian model with two-period staggered price-setting, discretionary monetary policy leads to multiple equilibria. Complementarity between the pricing decisions of forward-looking firms underlies the multiplicity, which is intrinsically dynamic in nature. At each point in time, the discretionary monetary authority optimally accommodates the level of predetermined prices when setting the money supply because it is concerned solely about real activity. Hence, if other firms set a high price in the current period, an individual firm will optimally choose a high price because it knows that the monetary authority next period will accommodate with a high money supply. Under commitment, the mechanism generating complementarity is absent: the monetary authority commits not to respond to future predetermined prices. Multiple equilibria also arise in other similar contexts where (i) a policymaker cannot commit, and (ii) forward-looking agents determine a state variable to which future policy responds. JEL Classification: E5, E61, D78
The development of tractable forward-looking models of monetary policy has led to an explosion of research on the implications of adopting Taylor-type interest rate rules. Indeterminacies have been found to arise for some specifications of the interest rate rule, raising the possibility of inefficient fluctuations due to the dependence of expectations on extraneous "sunspots". Separately, recent work by a number of authors has shown that sunspot equilibria previously thought to be unstable under private agent learning can in some cases be stable when the observed sunspot has a suitable time series structure. In this paper we generalize the "common factor" technique used in this analysis to examine standard monetary models that combine forward-looking expectations and predetermined variables. We consider a variety of specifications that incorporate both lagged and expected inflation in the Phillips Curve, and both expected inflation and inertial elements in the policy rule. We find that some policy rules can indeed lead to learnable sunspot solutions and we investigate the conditions under which this phenomenon arises.
In the New-Keynesian model, optimal interest rate policy under uncertainty is formulated without reference to monetary aggregates as long as certain standard assumptions on the distributions of unobservables are satisfied. The model has been criticized for failing to explain common trends in money growth and inflation, and it has been argued that money should therefore be used as a cross-check in policy formulation (see Lucas (2007)). We show that the New-Keynesian model can explain such trends if one allows for the possibility of persistent central bank misperceptions. Such misperceptions motivate the search for policies that include additional robustness checks. In earlier work, we proposed an interest rate rule that is near-optimal in normal times but includes a cross-check with monetary information. In case of unusual monetary trends, interest rates are adjusted. In this paper, we show in detail how to derive the appropriate magnitude of the interest rate adjustment following a significant cross-check with monetary information, when the New-Keynesian model is the central bank’s preferred model. The cross-check is shown to be effective in offsetting persistent deviations of inflation due to central bank misperceptions. Keywords: Monetary Policy, New-Keynesian Model, Money, Quantity Theory, European Central Bank, Policy Under Uncertainty
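The cross-check logic in the abstract above, adjusting the interest rate only when filtered money growth deviates persistently from the target-consistent rate, can be sketched as a simple trigger rule. Everything below (the target rate, the adjustment coefficient, the threshold, the window, the data) is a hypothetical illustration, not the paper's derived magnitudes.

```python
# Illustrative monetary cross-check: the policy rate is adjusted only
# when trend money growth deviates from the target-consistent rate by
# more than a critical threshold. All parameters and data are assumed.

def cross_check_adjustment(money_growth, target=4.5, kappa=0.5,
                           threshold=1.0, window=4):
    """Return a policy-rate adjustment if average money growth over the
    last `window` observations deviates from `target` by more than
    `threshold` percentage points; otherwise return zero."""
    recent = money_growth[-window:]
    trend_dev = sum(recent) / len(recent) - target
    return kappa * trend_dev if abs(trend_dev) > threshold else 0.0

normal  = [4.3, 4.6, 4.4, 4.7]   # money growth near target: no adjustment
unusual = [6.5, 6.8, 7.0, 6.9]   # persistent excess money growth: adjust up
print(cross_check_adjustment(normal), cross_check_adjustment(unusual))
```

The two-sided threshold captures the "normal times" property: in the absence of unusual monetary trends the rule behaves exactly like the near-optimal interest rate rule, and the cross-check only binds when deviations are large and persistent.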
Mortgage markets, collateral constraints, and monetary policy: do institutional factors matter?
(2006)
We study the role of institutional characteristics of mortgage markets in affecting the strength and timing of the effects of monetary policy shocks on house prices and consumption in a sample of OECD countries. We document three facts: (1) there is significant divergence in the structure of mortgage markets across the main industrialised countries; (2) at the business cycle frequency, the correlation between consumption and house prices increases with the degree of flexibility/development of mortgage markets; (3) the transmission of monetary policy shocks on consumption and house prices is stronger in countries with more flexible/developed mortgage markets. We then build a two-sector dynamic general equilibrium model with price stickiness and collateral constraints, where the ability to borrow is endogenously linked to the nominal value of a durable asset (housing). We study how the response of consumption to monetary policy shocks is affected by alternative values of three key institutional parameters: (i) down-payment rate; (ii) mortgage repayment rate; (iii) interest rate mortgage structure (variable vs. fixed interest rate). In line with our empirical evidence, the sensitivity of consumption to monetary policy shocks increases with lower values of (i) and (ii), and is larger under a variable-rate mortgage structure. JEL Classification: E21, E44, E52
Central banks have recently introduced new policy initiatives, including a policy called ‘Quantitative Easing’ (QE). Since it has been argued by the Bank of England that “Standard economic models are of limited use in these unusual circumstances, and the empirical evidence is extremely limited” (Bank of England, 2009b), we have taken an entirely empirical approach and have focused on the QE experience on which substantial data is available, namely that of Japan (2001-2006). Recent literature on the effectiveness of QE has neglected any reference to final policy goals. In this paper, we adopt the view that ultimately effectiveness will be measured by whether QE is able to “boost spending” (Bank of England, 2009b) and that such policies “will ultimately be judged by their impact on the wider macroeconomy” (Bank of England, 2010). In line with a widely held view among leading macroeconomists of various persuasions, and while remaining agnostic and open-minded about the distribution of demand changes between real output and inflation, we identify nominal GDP growth as the key final policy goal of monetary policy. The empirical research finds that the policy conducted by the Bank of Japan between 2001 and 2006 made little empirical difference, while an alternative policy targeting credit creation (the original definition of QE) would likely have been more successful.
This paper undertakes a quantitative investigation of the effects of anticipated inflation on the distribution of household wealth and welfare. Survey of Consumer Finances data on household financial wealth suggest that about a third of the US population holds all its financial assets in transaction accounts. The remaining two-thirds of the US population hold most of their financial assets outside transaction accounts. To account for this evidence, I introduce a portfolio choice in a standard incomplete markets model with heterogeneous agents. I calibrate the model economy to SCF 2010 US data and use this environment to study the distributive effects of changes in anticipated inflation. An increase in anticipated inflation leads households to reshuffle their portfolio towards real assets. This crowding-in of supply for real assets lowers equilibrium interest rates and thereby redistributes wealth from creditors to borrowers. Because borrowers have a higher marginal utility, this redistribution improves aggregate welfare. First, this paper shows that inflation acts not only as a regressive consumption tax as in Erosa and Ventura (2002), but also as a progressive tax. Second, this paper shows that the welfare costs of inflation are even lower than the estimates computed by Lucas (2000) and Ireland (2009). Finally, this paper offers insights into why deflationary environments should be avoided.