We theoretically and empirically study large-scale portfolio allocation problems when transaction costs are taken into account in the optimization problem. We show that transaction costs act on the one hand as a turnover penalization and on the other hand as a regularization, which shrinks the covariance matrix. As an empirical framework, we propose a flexible econometric setting for portfolio optimization under transaction costs, which incorporates parameter uncertainty and combines predictive distributions of individual models using optimal prediction pooling. We consider predictive distributions resulting from high-frequency-based covariance matrix estimates, daily stochastic volatility factor models, and regularized rolling-window covariance estimates, among others. Using data capturing several hundred Nasdaq stocks over more than 10 years, we illustrate that transaction cost regularization (even to a small extent) is crucial for producing allocations with positive Sharpe ratios. We moreover show that performance differences between individual models decline when transaction costs are considered. Nevertheless, it turns out that adaptive mixtures based on high-frequency and low-frequency information yield the highest performance. A portfolio bootstrap reveals that naive 1/N allocations and global minimum variance allocations (with and without short-sales constraints) are significantly outperformed in terms of Sharpe ratios and utility gains.
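The shrinkage effect described in the abstract can be sketched in a few lines. With a quadratic transaction-cost penalty, the minimum-variance rebalancing problem has a closed form in which the cost parameter adds a multiple of the identity to the covariance matrix and tilts the solution toward the previous allocation. This is an illustrative sketch under assumptions of my own (quadratic costs, a minimum-variance objective, and the names `rebalance_with_costs` and `beta`), not the paper's exact setup.

```python
import numpy as np

def rebalance_with_costs(sigma, w_prev, beta=1.0):
    """Minimum-variance weights under quadratic transaction costs.

    Solves  min_w  w' sigma w + beta * ||w - w_prev||^2  s.t. 1'w = 1.
    The cost term shrinks sigma toward the identity (regularization)
    and pulls the solution toward w_prev (turnover penalization).
    """
    n = len(w_prev)
    sigma_star = sigma + beta * np.eye(n)          # shrunk covariance
    ones = np.ones(n)
    a = np.linalg.solve(sigma_star, ones)
    b = beta * np.linalg.solve(sigma_star, w_prev) # tilt toward old weights
    # Enforce the full-investment constraint via the Lagrange multiplier
    return b + a * (1.0 - ones @ b) / (ones @ a)
```

For `beta = 0` this reduces to the global minimum variance portfolio; for very large `beta` it simply keeps the previous allocation, which is the turnover-penalization effect in its purest form.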
A counterparty credit limit (CCL) is a limit imposed by a financial institution to cap its maximum possible exposure to a specified counterparty. Although CCLs are designed to help institutions mitigate counterparty risk by selective diversification of their exposures, their implementation restricts the liquidity that institutions can access in an otherwise centralized pool. We address the question of how this mechanism impacts trade prices and volatility, both empirically and via a new model of trading with CCLs. We find empirically that CCLs cause little impact on trade. However, our model highlights that in extreme situations, CCLs could serve to destabilize prices and thereby influence systemic risk.
We show an ambivalent role of high-frequency traders (HFTs) in the Eurex Bund Futures market around high-impact macroeconomic announcements and extreme events. Around macroeconomic announcements, HFTs serve as market makers, post competitive spreads, and earn most of their profits through liquidity supply. Right before the announcement, however, HFTs significantly widen spreads and cause a rapid but short-lived drying-out of liquidity. In turbulent periods, such as after the U.K. Brexit announcement, HFTs shift their focus from market making activities to aggressive (but not necessarily profitable) directional strategies. Then, HFT activity becomes dominant and market quality can degrade.
Optimal trend inflation
(2017)
We present a sticky-price model incorporating heterogeneous firms and systematic firm-level productivity trends. Aggregating the model in closed form, we show that it delivers radically different predictions for the optimal inflation rate than canonical sticky-price models featuring homogeneous firms:
(1) the optimal steady-state inflation rate generically differs from zero and,
(2) inflation optimally responds to productivity disturbances.
Using micro data from the US Census Bureau to estimate the inflation-relevant productivity trends at the firm level, we find that the optimal US inflation rate is positive. It was slightly above 2 percent in the year 1986, but continuously declined thereafter, reaching about 1 percent in the year 2013.
Monetary policy communication is particularly important during unconventional times, because high uncertainty about the economy, the introduction of new policy tools and possible limits to the central bank’s toolkit could hamper the predictability of policy actions. We study how monetary policy communication should and has worked under such circumstances. Our main results relate to announcements of asset purchase programmes and the use of forward guidance. We show that announcements of asset purchase programmes have lowered market uncertainty, particularly when accompanied by a contextual release of implementation details such as the envisaged size of the programme. We also show that forward guidance reduces uncertainty more effectively when it is state-contingent or when it provides guidance about a long horizon than when it is open-ended or covers only a short horizon, and that the credibility of forward guidance is strengthened if the central bank also has embarked on an asset purchase programme.
Recent work has analyzed the forecasting performance of standard dynamic stochastic general equilibrium (DSGE) models, but little attention has been given to DSGE models that incorporate nonlinearities in exogenous driving processes. Against that background, we explore whether incorporating stochastic volatility improves DSGE forecasts (point, interval, and density). We examine real-time forecast accuracy for key macroeconomic variables including output growth, inflation, and the policy rate. We find that incorporating stochastic volatility in DSGE models of macroeconomic fundamentals markedly improves their density forecasts, just as incorporating stochastic volatility in models of financial asset returns improves their density forecasts.
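A minimal illustration of the nonlinearity in question: an AR(1) observable whose shock variance itself follows a log-AR(1) stochastic-volatility process. The function name and all parameter values below are illustrative assumptions, not quantities from the paper; the point is only that the innovation variance moves over time, which is what widens and narrows the model's density forecasts.

```python
import numpy as np

def simulate_ar1_sv(T, rho=0.5, phi=0.95, sigma_eta=0.2, seed=0):
    """Simulate y_t = rho*y_{t-1} + exp(h_t/2)*eps_t,
    where the log variance h_t follows its own AR(1):
    h_t = phi*h_{t-1} + sigma_eta*eta_t (stochastic volatility)."""
    rng = np.random.default_rng(seed)
    h = np.zeros(T)   # log volatility
    y = np.zeros(T)
    for t in range(1, T):
        h[t] = phi * h[t - 1] + sigma_eta * rng.standard_normal()
        y[t] = rho * y[t - 1] + np.exp(h[t] / 2) * rng.standard_normal()
    return y, h
```

A constant-volatility AR(1) implies forecast intervals of fixed width; conditioning on the latent `h` lets interval and density forecasts tighten in calm periods and widen in volatile ones.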
Commodity connectedness
(2017)
We use variance decompositions from high-dimensional vector autoregressions to characterize connectedness in 19 key commodity return volatilities, 2011-2016. We study both static (full-sample) and dynamic (rolling-sample) connectedness. We summarize and visualize the results using tools from network analysis. The results reveal clear clustering of commodities into groups that match traditional industry groupings, but with some notable differences. The energy sector is most important in terms of sending shocks to others, and energy, industrial metals, and precious metals are themselves tightly connected.
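The variance-decomposition mechanics can be sketched compactly. The function below fits a VAR(1) by OLS and builds a generalized forecast-error variance decomposition table whose off-diagonal entries measure directional connectedness; the rolling-sample variant simply applies it to moving windows. A VAR(1) and the names `connectedness` and `horizon` are simplifying assumptions for exposition; the paper works with higher-order, high-dimensional VARs.

```python
import numpy as np

def connectedness(data, horizon=10):
    """Generalized FEVD connectedness table from a VAR(1) fit by OLS.

    Row i of theta gives the shares of i's forecast-error variance
    attributable to shocks in each variable; the average off-diagonal
    mass is the total connectedness index (in percent).
    """
    y = data - data.mean(axis=0)
    x, ylead = y[:-1], y[1:]
    B = np.linalg.lstsq(x, ylead, rcond=None)[0]   # ylead ~ x @ B
    A = B.T                                        # VAR(1) coefficient matrix
    sigma = np.cov(ylead - x @ B, rowvar=False)    # residual covariance
    n = sigma.shape[0]
    num = np.zeros((n, n))
    den = np.zeros(n)
    for h in range(horizon):
        phi = np.linalg.matrix_power(A, h)         # MA coefficient Phi_h = A^h
        num += (phi @ sigma) ** 2 / np.diag(sigma) # generalized FEVD numerator
        den += np.diag(phi @ sigma @ phi.T)        # forecast-error variances
    theta = num / den[:, None]
    theta /= theta.sum(axis=1, keepdims=True)      # row-normalize shares
    total = 100.0 * (theta.sum() - np.trace(theta)) / n
    return theta, total
```

Column sums of the off-diagonal entries give "to others" (shock transmission, e.g. the energy sector's dominance noted above), row sums give "from others".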
We analyze older individuals’ debt and financial vulnerability using data from the Health and Retirement Study (HRS) and the National Financial Capability Study (NFCS). Specifically, in the HRS we examine three different cohorts (individuals age 56–61) in 1992, 2004, and 2010 to evaluate cross-cohort changes in debt over time. We also use two waves of the NFCS (2012 and 2015) to gain additional insights into debt management and older individuals’ capacity to shield themselves against shocks. We show that recent cohorts have taken on more debt and face more financial insecurity, mostly due to having purchased more expensive homes with smaller down payments.
The growth and popularity of defined contribution pensions, along with the government’s increasing attention to retirement plan costs and investment choices provided, make it important to understand how people select their retirement plan investments. This paper shows how employees in a large firm altered their fund allocations when the employer streamlined its pension fund menu and deleted nearly half of the offered funds. Using administrative data, we examine the changes in plan participant investment choices that resulted from the streamlining and how these changes might affect participants’ eventual retirement wellbeing. We show that streamlined participants’ new allocations exhibited significantly lower within-fund turnover rates and expense ratios, and we estimate this could lead to aggregate savings for these participants over a 20-year period of $20.2M, or in excess of $9,400 per participant. Moreover, after the reform, streamlined participants’ portfolios held significantly less equity and exhibited significantly lower risks by way of reduced exposures to most systematic risk factors, compared to their non-streamlined counterparts.
The long-run consumption risk model provides a theoretically appealing explanation for prominent asset pricing puzzles, but its intricate structure presents a challenge for econometric analysis. This paper proposes a two-step indirect inference approach that disentangles the estimation of the model's macroeconomic dynamics and the investor's preference parameters. A Monte Carlo study explores the feasibility and efficiency of the estimation strategy. We apply the method to recent U.S. data and provide a critical re-assessment of the long-run risk model's ability to reconcile the real economy and financial markets. This two-step indirect inference approach is potentially useful for the econometric analysis of other prominent consumption-based asset pricing models that are equally difficult to estimate.
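Indirect inference in miniature: the textbook MA(1)-estimated-through-an-auxiliary-AR(1) example below shows the matching logic such estimators rely on, where the structural parameter is chosen so that simulated data reproduce the auxiliary statistic computed on the observed data. This is a generic sketch, not the paper's long-run risk model; all function names and values are illustrative.

```python
import numpy as np

def ar1_coef(y):
    """Auxiliary statistic: OLS slope of y_t on y_{t-1}."""
    return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

def simulate_ma1(theta, T, rng):
    """Structural model: MA(1), y_t = e_t + theta * e_{t-1}."""
    e = rng.standard_normal(T + 1)
    return e[1:] + theta * e[:-1]

def indirect_inference(y_data, sim_T=20000, seed=1):
    """Pick theta so the simulated auxiliary AR(1) slope
    matches the slope estimated from the data (grid search)."""
    beta_data = ar1_coef(y_data)
    grid = np.linspace(-0.9, 0.9, 181)
    losses = []
    for theta in grid:
        # Common random numbers: reuse the seed for every candidate
        # so the simulated statistic is a smooth function of theta.
        rng = np.random.default_rng(seed)
        beta_sim = ar1_coef(simulate_ma1(theta, sim_T, rng))
        losses.append((beta_sim - beta_data) ** 2)
    return grid[int(np.argmin(losses))]
```

The auxiliary model is deliberately misspecified but easy to estimate; identification only requires that the binding function from structural to auxiliary parameters be invertible, which is the same logic that lets the paper's second step recover preference parameters given first-step macroeconomic dynamics.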