- A blocking and regularization approach to high dimensional realized covariance estimation (2009)
- We introduce a regularization and blocking ('RnB') estimator for well-conditioned estimates of high-dimensional daily covariance matrices using high-frequency data. Using the kernel estimator of Barndorff-Nielsen, Hansen, Lunde, and Shephard (2008a), we estimate the covariance matrix block-wise and regularize it. A data-driven grouping of assets of similar trading frequency reduces the data loss due to refresh-time sampling. In an extensive simulation study mimicking the empirical features of the S&P 1500 universe, we show that the RnB estimator yields efficiency gains and outperforms competing kernel estimators across liquidity settings, noise-to-signal ratios, and dimensions. An empirical application to forecasting daily covariances of the S&P 500 index confirms the simulation results. Keywords: Covariance Estimation, Blocking, Realized Kernel, Regularization, Microstructure, Asynchronous Trading
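The paper's blocking and regularization details are specific to the RnB estimator; as a minimal sketch of why regularization is needed at all when the dimension exceeds the number of observations, a simple shrinkage-toward-diagonal step (an illustrative choice, not the authors' method) can be written as:

```python
import numpy as np

def shrink_to_diagonal(cov, alpha):
    """Linear shrinkage of a covariance matrix toward its diagonal.

    Blending with the (positive) diagonal lifts the smallest eigenvalues,
    turning a singular sample covariance into a well-conditioned,
    positive-definite estimate.
    """
    target = np.diag(np.diag(cov))
    return (1.0 - alpha) * cov + alpha * target

rng = np.random.default_rng(0)
returns = rng.standard_normal((50, 100))   # 50 observations, 100 assets: n < p
raw = np.cov(returns, rowvar=False)        # rank-deficient sample covariance
reg = shrink_to_diagonal(raw, alpha=0.3)   # alpha chosen arbitrarily here
```

With 50 observations of 100 assets, `raw` is singular by construction, while `reg` has strictly positive eigenvalues.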
- Analyzing interest rate risk : stochastic volatility in the term structure of government bond yields (2009)
- We propose a Nelson-Siegel type interest rate term structure model in which the underlying yield factors follow autoregressive processes with stochastic volatility. The factor volatilities parsimoniously capture risk inherent to the term structure and are associated with the time-varying uncertainty of the yield curve's level, slope and curvature. Estimating the model on U.S. government bond yields using Markov chain Monte Carlo techniques, we find that the factor volatilities follow highly persistent processes. We show that slope and curvature risk have explanatory power for bond excess returns and illustrate that the yield and volatility factors are closely related to industrial capacity utilization, inflation, monetary policy and employment growth. JEL Classification: C5, E4, G1
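The stochastic-volatility factor dynamics require MCMC, but the underlying Nelson-Siegel factor loadings are standard and can be sketched directly (the decay parameter `lam` and the factor values below are illustrative):

```python
import numpy as np

def nelson_siegel_loadings(tau, lam):
    """Standard Nelson-Siegel loadings (level, slope, curvature)
    for a vector of maturities tau and decay parameter lam."""
    x = lam * tau
    slope = (1.0 - np.exp(-x)) / x        # loads heavily at the short end
    curv = slope - np.exp(-x)             # hump-shaped in maturity
    level = np.ones_like(tau)             # loads equally on all maturities
    return np.column_stack([level, slope, curv])

tau = np.array([0.25, 1.0, 2.0, 5.0, 10.0])   # maturities in years
L = nelson_siegel_loadings(tau, lam=0.7)
beta = np.array([4.0, -1.0, 1.5])             # level, slope, curvature factors
yields = L @ beta                             # fitted yield curve
```

In the paper the factors `beta` follow AR processes whose innovation variances are themselves stochastic; the loadings above are the deterministic part of the model.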
- Capturing common components in high-frequency financial time series : a multivariate stochastic multiplicative error model (2007)
- We introduce a multivariate multiplicative error model which is driven by component-specific observation-driven dynamics as well as a common latent autoregressive factor. The model is designed to explicitly account for (information-driven) common factor dynamics as well as idiosyncratic effects in the processes of high-frequency return volatilities, trade sizes and trading intensities. The model is estimated by simulated maximum likelihood using efficient importance sampling. Analyzing five-minute data from four liquid stocks traded at the New York Stock Exchange, we find that volatilities, volumes and intensities are driven by idiosyncratic dynamics as well as a highly persistent common factor capturing most causal relations and cross-dependencies between the individual variables. This confirms economic theory and suggests more parsimonious specifications of high-dimensional trading processes. It turns out that common shocks affect the return volatility and the trading volume rather than the trading intensity. JEL Classification: C15, C32, C52
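The building block of such specifications is the univariate multiplicative error model (MEM): a positive variable equals its conditional mean times a unit-mean error. A minimal simulation sketch, assuming an Exp(1) error and a GARCH-type mean recursion (the parameter values are illustrative):

```python
import numpy as np

def simulate_mem(n, omega, alpha, beta, rng):
    """Simulate a univariate MEM: x_t = mu_t * eps_t with eps_t ~ Exp(1)
    and mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1}."""
    x = np.empty(n)
    mu = omega / (1.0 - alpha - beta)  # start at the unconditional mean
    for t in range(n):
        x[t] = mu * rng.exponential(1.0)
        mu = omega + alpha * x[t] + beta * mu
    return x

rng = np.random.default_rng(1)
x = simulate_mem(100_000, omega=0.1, alpha=0.1, beta=0.8, rng=rng)
# unconditional mean: omega / (1 - alpha - beta) = 1.0
```

The paper's multivariate version adds a common latent autoregressive factor across several such component processes, which is what necessitates simulated maximum likelihood.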
- Capturing the zero: a new class of zero-augmented distributions and multiplicative error processes (2010)
- We propose a novel approach to model serially dependent positive-valued variables which realize a non-trivial proportion of zero outcomes. This is a typical phenomenon in financial time series observed at high frequencies, such as cumulated trading volumes or the time between potentially simultaneously occurring market events. We introduce a flexible point-mass mixture distribution and develop a semiparametric specification test explicitly tailored for such distributions. Moreover, we propose a new type of multiplicative error model (MEM) based on a zero-augmented distribution, which incorporates an autoregressive binary choice component and thus captures the (potentially different) dynamics of both zero occurrences and of strictly positive realizations. Applying the proposed model to high-frequency cumulated trading volumes of liquid NYSE stocks, we show that the model captures both the dynamic and distributional properties of the data very well and is able to correctly predict future distributions. Keywords: High-frequency Data, Point-mass Mixture, Multiplicative Error Model, Excess Zeros, Semiparametric Specification Test, Market Microstructure JEL Classification: C22, C25, C14, C16, C51
- Capturing the zero: a new class of zero-augmented distributions and multiplicative error processes (2011)
- We propose a novel approach to model serially dependent positive-valued variables which realize a non-trivial proportion of zero outcomes. This is a typical phenomenon in financial time series observed at high frequencies, such as cumulated trading volumes. We introduce a flexible point-mass mixture distribution and develop a semiparametric specification test explicitly tailored for such distributions. Moreover, we propose a new type of multiplicative error model (MEM) based on a zero-augmented distribution, which incorporates an autoregressive binary choice component and thus captures the (potentially different) dynamics of both zero occurrences and of strictly positive realizations. Applying the proposed model to high-frequency cumulated trading volumes of both liquid and illiquid NYSE stocks, we show that the model captures the dynamic and distributional properties of the data well and is able to correctly predict future distributions.
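The core idea of the zero-augmented distribution is a point-mass mixture: a probability mass at exactly zero plus a continuous distribution on the strictly positive reals. A minimal sampling sketch (the Gamma component and all parameter values are illustrative choices, not the paper's specification):

```python
import numpy as np

def sample_zero_augmented(n, pi_zero, shape, scale, rng):
    """Draw from a point-mass mixture: exactly zero with probability
    pi_zero, otherwise a strictly positive Gamma draw."""
    is_zero = rng.random(n) < pi_zero
    pos = rng.gamma(shape, scale, size=n)
    return np.where(is_zero, 0.0, pos)

rng = np.random.default_rng(2)
v = sample_zero_augmented(100_000, pi_zero=0.25, shape=2.0, scale=0.5, rng=rng)
```

In the paper's dynamic MEM, both `pi_zero` (via an autoregressive binary choice component) and the conditional mean of the positive part are time-varying.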
- Copula-based dynamic conditional correlation multiplicative error processes : [Version 18 April 2013] (2013)
- We introduce a copula-based dynamic model for multivariate processes of (non-negative) high-frequency trading variables revealing time-varying conditional variances and correlations. Modeling the variables' conditional mean processes using a multiplicative error model, we map the resulting residuals into a Gaussian domain using a Gaussian copula. Based on high-frequency volatility, cumulative trading volumes, trade counts and market depth of various stocks traded at the NYSE, we show that the proposed copula-based transformation is supported by the data and allows capturing (multivariate) dynamics in higher order moments. The latter are modeled using a DCC-GARCH specification. We suggest estimating the model by composite maximum likelihood, which is sufficiently flexible to be applicable in high dimensions. Strong empirical evidence for time-varying conditional (co-)variances in trading processes supports the usefulness of the approach. Taking these higher-order dynamics explicitly into account significantly improves the goodness-of-fit of the multiplicative error model and allows capturing time-varying liquidity risks.
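The mapping into the Gaussian domain is a probability integral transform: apply the residual distribution's CDF, then the standard normal quantile function. A sketch assuming Exp(1) MEM residuals (the paper's residual distribution may differ):

```python
import numpy as np
from statistics import NormalDist

def to_gaussian_domain(eps):
    """Map Exp(1)-distributed MEM residuals into the Gaussian domain:
    z = Phi^{-1}(F(eps)), where F is the exponential CDF."""
    u = 1.0 - np.exp(-eps)  # PIT: u is uniform on (0, 1) if eps ~ Exp(1)
    nd = NormalDist()
    # clip to avoid inv_cdf(0) or inv_cdf(1) at the numerical boundary
    return np.array([nd.inv_cdf(min(max(ui, 1e-12), 1.0 - 1e-12)) for ui in u])

rng = np.random.default_rng(3)
eps = rng.exponential(1.0, size=20_000)
z = to_gaussian_domain(eps)  # approximately standard normal
```

The DCC-GARCH dynamics are then fitted to the transformed series `z`, where Gaussian-copula correlation modeling is natural.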
- Efficient iterative maximum likelihood estimation of high-parameterized time series models (2014)
- We propose an iterative procedure to efficiently estimate models with complex log-likelihood functions and a potentially high number of parameters relative to the number of observations. Given consistent but inefficient estimates of sub-vectors of the parameter vector, the procedure yields computationally tractable, consistent and asymptotically efficient estimates of all parameters. We show asymptotic normality and derive the estimator's asymptotic covariance as a function of the number of iteration steps. To mitigate the curse of dimensionality in high-parameterized models, we combine the procedure with a penalization approach yielding sparsity and reducing model complexity. Small-sample properties of the estimator are illustrated for two time series models in a simulation study. In an empirical application, we use the proposed method to estimate the connectedness between companies by extending the approach of Diebold and Yilmaz (2014) to a high-dimensional non-Gaussian setting.
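The flavor of iterating over parameter sub-vectors can be illustrated with a coordinate-descent analogue on a least-squares objective: re-estimate each block holding the other fixed, which converges to the joint estimate. This is a toy sketch, not the paper's likelihood-based procedure:

```python
import numpy as np

def iterative_blockwise_ols(y, X1, X2, n_iter=20):
    """Alternately re-estimate each coefficient sub-vector holding the
    other fixed; the fixed point solves the full OLS normal equations."""
    b1 = np.zeros(X1.shape[1])
    b2 = np.zeros(X2.shape[1])
    for _ in range(n_iter):
        b1 = np.linalg.lstsq(X1, y - X2 @ b2, rcond=None)[0]
        b2 = np.linalg.lstsq(X2, y - X1 @ b1, rcond=None)[0]
    return b1, b2

rng = np.random.default_rng(4)
X = rng.standard_normal((500, 4))
beta_true = np.array([1.0, -2.0, 0.5, 3.0])
y = X @ beta_true + 0.1 * rng.standard_normal(500)
b1, b2 = iterative_blockwise_ols(y, X[:, :2], X[:, 2:])
```

Each sub-problem is cheap even when the full parameter vector is large, which is the computational motivation for block-wise iteration.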
- Estimating the spot covariation of asset prices – statistical theory and empirical evidence (2014)
- We propose a new estimator for the spot covariance matrix of a multi-dimensional continuous semi-martingale log asset price process which is subject to noise and non-synchronous observations. The estimator is constructed from a local average of block-wise parametric spectral covariance estimates. The latter originate from a local method of moments (LMM) recently introduced by Bibinger et al. (2014). We extend the LMM estimator to allow for autocorrelated noise and propose a method to adaptively infer the autocorrelations from the data. We prove the consistency and asymptotic normality of the proposed spot covariance estimator. Based on extensive simulations, we provide empirical guidance on the optimal implementation of the estimator and apply it to high-frequency data of a cross-section of NASDAQ blue chip stocks. Employing the estimator to compute spot covariances, correlations and betas in normal as well as extreme-event periods yields novel insights into intraday covariance and correlation dynamics. We show that intraday (co-)variations (i) follow underlying periodicity patterns, (ii) reveal substantial intraday variability associated with (co-)variation risk, (iii) are strongly serially correlated, and (iv) can increase strongly and nearly instantaneously when new information arrives.
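The "local average" idea can be sketched in its most naive form: average outer products of returns in a window around the target time. This ignores microstructure noise and asynchronicity, which is exactly what the LMM machinery in the paper handles:

```python
import numpy as np

def spot_covariance(returns, t, window):
    """Naive local-average sketch of the spot covariance at index t:
    the mean outer product of returns in a window around t."""
    lo, hi = max(0, t - window), min(len(returns), t + window)
    block = returns[lo:hi]
    return block.T @ block / (hi - lo)

rng = np.random.default_rng(5)
cov_true = np.array([[1.0, 0.5], [0.5, 1.0]])
returns = rng.multivariate_normal(np.zeros(2), cov_true, size=10_000)
est = spot_covariance(returns, t=5_000, window=2_000)
```

Shrinking the window localizes the estimate in time at the cost of higher variance; the paper's spectral approach improves this trade-off under noise.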
- Financial network systemic risk contributions (2013)
- We propose the realized systemic risk beta as a measure of financial companies' contribution to systemic risk given network interdependence between firms' tail risk exposures. Conditional on statistically pre-identified network spillover effects as well as market and balance sheet information, we define the realized systemic risk beta as the total time-varying marginal effect of a firm's Value-at-Risk (VaR) on the system's VaR. Statistical inference reveals a multitude of relevant risk spillover channels and determines companies' systemic importance in the U.S. financial system. Our approach can be used to monitor companies' systemic importance, allowing for transparent macroprudential supervision.
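The building block of the measure is the Value-at-Risk of a return series. An empirical-quantile sketch (the paper estimates conditional VaRs within a network model, which is considerably richer than this):

```python
import numpy as np

def empirical_var(returns, level=0.95):
    """Empirical Value-at-Risk at the given confidence level: the loss
    threshold exceeded with probability 1 - level."""
    return -np.quantile(returns, 1.0 - level)

rng = np.random.default_rng(6)
r = rng.standard_normal(200_000)       # toy returns
var_95 = empirical_var(r, level=0.95)  # close to 1.645 for standard normal
```

The realized systemic risk beta then measures how the system's VaR moves with a firm's VaR, conditional on the identified network spillovers.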
- Modelling and forecasting liquidity supply using semiparametric factor dynamics (2009)
- We model the dynamics of ask and bid curves in a limit order book market using a dynamic semiparametric factor model. The shape of the curves is captured by a factor structure which is estimated nonparametrically. The corresponding factor loadings are assumed to follow multivariate dynamics and are modelled using a vector autoregressive model. Applying the framework to four stocks traded at the Australian Stock Exchange (ASX) in 2002, we show that the suggested model captures the spatial and temporal dependencies of the limit order book. Relating the shape of the curves to variables reflecting the current state of the market, we show that recent liquidity demand has the strongest impact. In an extensive forecasting analysis, we show that the model successfully forecasts the liquidity supply over various time horizons during a trading day. Moreover, we show that the model's forecasting power can be used to improve optimal order execution strategies. Keywords: Limit Order Book, Liquidity Risk, Semiparametric Model, Factor Structure, Prediction
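Once the curve shapes are compressed into a low-dimensional vector of factor loadings, the dynamic part reduces to fitting a VAR. A minimal VAR(1) sketch on simulated two-dimensional loadings (the true dynamics and dimensions here are illustrative):

```python
import numpy as np

def fit_var1(Z):
    """OLS fit of a VAR(1), Z_t = A Z_{t-1} + u_t, returning A."""
    Y, X = Z[1:], Z[:-1]
    # lstsq solves X B = Y column-wise, so B = A' and we transpose back
    return np.linalg.lstsq(X, Y, rcond=None)[0].T

rng = np.random.default_rng(7)
A_true = np.array([[0.8, 0.1], [0.0, 0.7]])
Z = np.zeros((5_000, 2))
for t in range(1, 5_000):
    Z[t] = A_true @ Z[t - 1] + 0.1 * rng.standard_normal(2)
A_hat = fit_var1(Z)
```

Forecasting the loadings with the fitted VAR and mapping them back through the nonparametric factor curves yields the liquidity supply forecasts used in the paper's application.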