University Publications
The bail-in tool as implemented in the European bank resolution framework suffers from severe shortcomings. To some extent, the regulatory framework can remedy the impediments to the desirable incentive effect of private sector involvement (PSI) that emanate from a lack of predictability of outcomes, if it compels banks to issue a sufficiently sized minimum of high-quality, easy-to-bail-in (subordinated) liabilities. Yet even the limited improvements any prescription of bail-in capital can offer for PSI’s operational effectiveness seem compromised in important respects.
The main problem, echoing the general concerns voiced against the European bail-in regime, is that the specifications for minimum requirements for own funds and eligible liabilities (MREL) are also highly detailed and discretionary and thus alleviate the predicament of investors in bail-in debt, at best, only insufficiently. Importantly, given the character of typical MREL instruments as non-runnable long-term debt, even if investors are able to gauge the relevant risk of PSI in a bank’s failure correctly at the time of purchase, subsequent adjustments of MREL prescriptions by competent or resolution authorities potentially change the risk profile of the pertinent instruments. Original pricing decisions may therefore prove inadequate, and so may the market discipline that follows from them.
The pending European legislation aims to implement the already complex specifications of the Financial Stability Board (FSB) for Total Loss Absorbing Capacity (TLAC) through very detailed and case-specific amendments to both the regulatory capital and the resolution regime, with an exorbitant emphasis on proportionality and technical fine-tuning. What gets lost in this approach, however, is the key policy objective of enhanced market discipline through predictable PSI: it is hardly conceivable that the pricing of MREL instruments reflects an accurate risk assessment by investors, given the many discretionary choices a multitude of agencies are supposed to make and revisit in administering the new regime. To substantiate this conclusion, this chapter looks in more detail at the regulatory objectives of the BRRD’s prescriptions for MREL and their implementation in the prospectively amended European supervisory and resolution framework.
The purpose of the data presented in this article is its use in ex post estimations of interest rate decisions by the European Central Bank (ECB), as in Bletzinger and Wieland (2017) [1]. The data are of quarterly frequency from 1999 Q1 until 2013 Q2 and consist of the ECB's policy rate, the inflation rate, real output growth and potential output growth in the euro area. To account for forward-looking decision making in the interest rate rule, the data include expectations about future inflation and output dynamics. While potential output is constructed from the European Commission's annual macroeconomic database, inflation and real output growth are taken from two different sources, both provided by the ECB: the Survey of Professional Forecasters and projections made by ECB staff. Careful attention was paid to the publication date of the collected data to ensure a real-time dataset consisting only of information that was available to the decision makers at the time of each decision.
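As a hedged illustration of how a forward-looking interest rate rule of the kind described above can be estimated ex post, the sketch below fits a simple linear rule to synthetic data by ordinary least squares. All series, coefficients, and the functional form are hypothetical assumptions for illustration; they are not the actual dataset or the specification used by Bletzinger and Wieland.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative quarterly series (NOT the real-time ECB dataset described above):
# inflation expectations, an expected output-growth gap, and a policy rate
# generated from a hypothetical rule i_t = a0 + a1*pi_e + a2*gap + noise.
n = 58                                   # 1999 Q1 - 2013 Q2 spans 58 quarters
pi_e = 2.0 + rng.normal(0, 0.5, n)       # expected inflation (%)
gap = rng.normal(0, 1.0, n)              # expected growth gap (%)
true = np.array([1.0, 1.5, 0.5])         # hypothetical rule coefficients
i = true[0] + true[1] * pi_e + true[2] * gap + rng.normal(0, 0.1, n)

# Ex post OLS estimation of the rule coefficients
X = np.column_stack([np.ones(n), pi_e, gap])
coef, *_ = np.linalg.lstsq(X, i, rcond=None)
print(coef)  # estimates should lie close to the true coefficients
```

In an application to real data, the right-hand-side variables would be the real-time expectation series described in the article, so that the estimated coefficients reflect only information available at each decision date.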
The current debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. Beyer and Wieland re-estimate the U.S. equilibrium rate using the methodology of Laubach and Williams with further modifications. They provide new estimates for the United States, the euro area and Germany and subject them to sensitivity tests. Beyer and Wieland conclude that, due to the great uncertainty and sensitivity of the estimates, the observed decline is not a reliable indicator of a need for expansionary monetary and fiscal policy. If such estimates are nevertheless employed to determine the appropriate monetary policy stance, they are best used together with a consistent estimate of the level of potential output.
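The Laubach-Williams methodology mentioned above links the equilibrium real rate to trend growth through a decomposition of the form r* = c·g + z, where g is trend growth and z captures other determinants. The minimal sketch below uses purely hypothetical numbers to illustrate how sensitive r* is to the assumed link coefficient c; none of the values are the authors' estimates.

```python
# Hypothetical illustration of the Laubach-Williams decomposition
# r*_t = c * g_t + z_t. All numbers below are assumptions for exposition,
# not estimates from Beyer and Wieland or Laubach and Williams.
def natural_rate(c, trend_growth, z):
    """Equilibrium real rate under the Laubach-Williams specification."""
    return c * trend_growth + z

# Halving the assumed link coefficient c moves r* substantially,
# illustrating the sensitivity the authors emphasize.
base = natural_rate(c=1.0, trend_growth=1.5, z=-0.5)   # 1.0
alt = natural_rate(c=0.5, trend_growth=1.5, z=-0.5)    # 0.25
print(base, alt)
```

In the actual methodology, c, g and z are not fixed numbers but latent states estimated jointly by the Kalman filter, which is precisely where the uncertainty discussed above enters.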
During the 1970s, industrial countries, including the US and continental Europe, experienced a combination of slow productivity growth and high unemployment. Subsequent research has shown that the standard model of unemployment yields counterfactual predictions. Motivated by the observation that the 1970s were also characterized by high and rising inflation, Tesfaselassie and Wolters examine the effect of growth on unemployment in the presence of nominal price rigidity.
The authors demonstrate that the effect of growth on unemployment may be positive or negative. There is a threshold level of inflation below which faster growth leads to higher unemployment and above which faster growth leads to lower unemployment. The threshold level in turn depends on labor market characteristics, such as hiring efficiency, the job destruction rate, workers' relative bargaining power and the opportunity cost of work.
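The sign-switching comparative static described above can be sketched schematically: the direction of the growth effect on unemployment depends on whether inflation is above or below the threshold. The functional form and the threshold value below are illustrative assumptions, not the authors' model.

```python
# Schematic sketch of the threshold result described above.
# The threshold would, in the model, depend on hiring efficiency, the job
# destruction rate, bargaining power and the opportunity cost of work;
# here it is simply an assumed number.
def growth_effect_on_unemployment(inflation, threshold):
    """Sign of d(unemployment)/d(growth): negative means faster growth
    lowers unemployment, positive means it raises unemployment."""
    if inflation > threshold:
        return -1.0
    if inflation < threshold:
        return 1.0
    return 0.0

threshold = 3.0  # hypothetical threshold inflation rate (% p.a.)
print(growth_effect_on_unemployment(5.0, threshold))  # high inflation: -1.0
print(growth_effect_on_unemployment(1.0, threshold))  # low inflation: 1.0
```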
For some time now, structural macroeconomic models used at central banks have been predominantly New Keynesian DSGE models featuring nominal rigidities and forward-looking decision-making. While these features are widely deemed crucial for policy evaluation exercises, most central banks have added more detailed characterizations of the financial sector to these models following the Great Recession in order to improve their fit to the data and their forecasting performance. We employ a comparative approach to investigate the characteristics of this new generation of New Keynesian DSGE models and document an elevated degree of model uncertainty relative to earlier model generations. Policy transmission is highly heterogeneous across types of financial frictions, and monetary policy causes larger effects, on average. The New Keynesian DSGE models we analyze suggest that a simple policy rule robust to model uncertainty involves a weaker response to inflation and the output gap in the presence of financial frictions than in earlier generations of such models. Leaning-against-the-wind policies in models of this class estimated for the euro area do not lead to substantial gains. With regard to forecasting performance, the inclusion of financial frictions can generate improvements if conditioned on appropriate data. Looking forward, we argue that model-averaging and embracing alternative modelling paradigms are likely to yield a more robust framework for the conduct of monetary policy.
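The weaker responses to inflation and the output gap under financial frictions can be illustrated with a simple Taylor-type rule of the form i = r* + π + a·(π − π*) + b·gap. All coefficients below are hypothetical values chosen to mirror the qualitative finding; they are not estimates from the models compared in the paper.

```python
# Illustrative Taylor-type rule; coefficients are hypothetical, not the
# optimized robust rules from the model comparison described above.
def policy_rate(pi, gap, a, b, r_star=2.0, pi_target=2.0):
    """Nominal rate implied by i = r* + pi + a*(pi - pi*) + b*gap."""
    return r_star + pi + a * (pi - pi_target) + b * gap

# Benchmark parameterization vs. a weaker-response parameterization of the
# kind the comparison suggests is more robust under financial frictions.
i_benchmark = policy_rate(pi=3.0, gap=1.0, a=0.5, b=0.5)    # 6.0
i_robust = policy_rate(pi=3.0, gap=1.0, a=0.25, b=0.1)      # 5.35
print(i_benchmark, i_robust)
```

The robust parameterization prescribes a less aggressive rate for the same inflation and output-gap readings, which is the qualitative sense in which the response to inflation and the output gap is "weaker".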
Since 2014 the ECB has implemented a massive expansion of monetary policy including large-scale asset purchases and negative policy rates. As the euro area economy has improved and inflation has risen, questions concerning the future normalization of monetary policy are starting to dominate the public debate.
The study argues that the ECB should develop a strategy for policy normalization and communicate it very soon to prepare the ground for subsequent steps towards tightening. It provides analysis and makes proposals concerning key aspects of this strategy. The aim is to facilitate the emergence of expectations among market participants that are consistent with a smooth process of policy normalization.
Financial market interactions can lead to large and persistent booms and recessions. Instability is an inherent threat to economies with speculative financial markets. A central bank’s interest rate setting can amplify the expectation feedback in the financial market, which can lead to unstable dynamics and excess volatility. The paper suggests that policy institutions may be well advised to handle tools like asset price targeting with care, since such instruments might add a structural link between asset prices and macroeconomic aggregates. Neither stock prices nor stock indices are a good indicator on which to base policy decisions.
I propose a dynamic stochastic general equilibrium model in which the leverage of borrowers as well as banks and housing finance play a crucial role in the model dynamics. The model is used to evaluate the relative effectiveness of a policy to inject capital into banks versus a policy to relieve households of mortgage debt. In normal times, when the economy is near the steady state and policy rates are set according to a Taylor-type rule, capital injections to banks are more effective in stimulating the economy in the long run. However, in the middle of a housing debt crisis, when households are highly leveraged, the short-run output effects of the debt relief are more substantial. When the zero lower bound (ZLB) is additionally considered, the debt relief policy can be much more powerful in boosting the economy both in the short run and in the long run. Moreover, the output effects of the debt relief become increasingly larger the longer the ZLB is binding.