This paper addresses the open debate about the usefulness of high-frequency (HF) data in large-scale portfolio allocation. Daily covariances are estimated based on HF data of the S&P 500 universe employing a blocked realized kernel estimator. We propose forecasting covariance matrices using a multi-scale spectral decomposition where volatilities, correlation eigenvalues and eigenvectors evolve on different frequencies. In an extensive out-of-sample forecasting study, we show that the proposed approach yields less risky and more diversified portfolio allocations than prevailing methods employing daily data. These performance gains hold over longer horizons than previous studies have shown.
We develop a model of an order-driven exchange competing for order flow with off-exchange trading mechanisms. Liquidity suppliers face a trade-off between the benefits and costs of order exposure. If they display trading intentions, they attract additional trade demand. We show that, in equilibrium, hiding trade intentions can induce mis-coordination between liquidity supply and demand, generate excess price fluctuations and harm price efficiency. Econometric high-frequency analysis based on unique data on hidden orders from NASDAQ reveals strong empirical support for these predictions: we find abnormal reactions in prices and order flow after periods of high excess supply of hidden liquidity.
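As an illustration of the kind of decomposition described above (the notation below is ours, not the paper's), a daily covariance matrix can be factorized into volatilities and the eigenstructure of the correlation matrix, with each component forecast at its own frequency:

```latex
% Illustrative notation, not the paper's own:
\Sigma_t = D_t R_t D_t, \qquad
D_t = \operatorname{diag}(\sigma_{1,t},\ldots,\sigma_{N,t}), \qquad
R_t = Q_t \Lambda_t Q_t^{\top},
```

where the individual volatilities in D_t, the correlation eigenvalues in \Lambda_t and the eigenvectors in Q_t are modelled and forecast on different (faster vs. slower) frequencies before being recombined into a covariance forecast.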
We model the dynamics of ask and bid curves in a limit order book market using a dynamic semiparametric factor model. The shape of the curves is captured by a factor structure which is estimated nonparametrically. Corresponding factor loadings are assumed to follow multivariate dynamics and are modelled using a vector autoregressive model. Applying the framework to four stocks traded at the Australian Stock Exchange (ASX) in 2002, we show that the suggested model captures the spatial and temporal dependencies of the limit order book. Relating the shape of the curves to variables reflecting the current state of the market, we show that the recent liquidity demand has the strongest impact. In an extensive forecasting analysis we show that the model is successful in forecasting the liquidity supply over various time horizons during a trading day. Moreover, it is shown that the model’s forecasting power can be used to improve optimal order execution strategies.
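A minimal sketch of a dynamic semiparametric factor structure of this type follows; the notation is illustrative and not taken from the paper:

```latex
% X_t(p): bid or ask curve at price level p in interval t (illustrative notation)
X_t(p) \approx \sum_{k=1}^{K} Z_{k,t}\, m_k(p), \qquad
Z_t = c + \sum_{j=1}^{q} A_j Z_{t-j} + \eta_t,
```

where the factor functions m_k(\cdot) capturing the curve shapes are estimated nonparametrically and the loading vector Z_t follows a vector autoregression.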
Capturing the zero: a new class of zero-augmented distributions and multiplicative error processes
(2011)
We propose a novel approach to model serially dependent positive-valued variables which realize a non-trivial proportion of zero outcomes. This is a typical phenomenon in financial time series observed at high frequencies, such as cumulated trading volumes. We introduce a flexible point-mass mixture distribution and develop a semiparametric specification test explicitly tailored for such distributions. Moreover, we propose a new type of multiplicative error model (MEM) based on a zero-augmented distribution, which incorporates an autoregressive binary choice component and thus captures the (potentially different) dynamics of both zero occurrences and strictly positive realizations. Applying the proposed model to high-frequency cumulated trading volumes of both liquid and illiquid NYSE stocks, we show that the model captures the dynamic and distributional properties of the data well and is able to correctly predict future distributions.
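A hedged sketch of a zero-augmented multiplicative error specification along these lines; the distributional choices and notation are assumptions, not the paper's:

```latex
x_t = \mu_t \varepsilon_t, \qquad
\varepsilon_t \mid \mathcal{F}_{t-1} \sim
\begin{cases}
0 & \text{with probability } \pi_t,\\
F^{+} \ \text{(a positive-valued distribution)} & \text{with probability } 1-\pi_t,
\end{cases}
```

where \mu_t follows a standard MEM recursion (e.g. \mu_t = \omega + \alpha x_{t-1} + \beta \mu_{t-1}) and the zero probability \pi_t is driven by an autoregressive binary-choice (e.g. logit-type) equation, so that zero outcomes and strictly positive volumes can have separate dynamics.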
We extend the classical "martingale-plus-noise" model for high-frequency prices with an error correction mechanism originating from prevailing mispricing. The speed of price reversal is a natural measure for informational efficiency. The strength of the price reversal relative to the signal-to-noise ratio determines the signs of the return serial correlation and the bias in standard realized variance estimates. We derive the model's properties and locally estimate it based on mid-quote returns of the NASDAQ 100 constituents. There is evidence of mildly persistent local regimes of positive and negative serial correlation, arising from lagged feedback effects and sluggish price adjustments. The model's performance is decidedly superior to that of existing stylized microstructure models. Finally, we document intraday periodicities in the speed of price reversion and noise-to-signal ratios.
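One common way to write such an error-correction extension of the martingale-plus-noise setup; the notation below is illustrative rather than the paper's own:

```latex
m_t = m_{t-1} + \varepsilon_t, \qquad
p_t = p_{t-1} + \alpha\,(m_{t-1} - p_{t-1}) + \varepsilon_t + u_t,
```

where m_t is the efficient (martingale) price, p_t the observed mid-quote, u_t microstructure noise, and \alpha the speed at which the mispricing p_{t-1} - m_{t-1} is reverted; the sign of the return serial correlation then depends on \alpha relative to the noise-to-signal ratio \sigma_u / \sigma_\varepsilon.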
We show an ambivalent role of high-frequency traders (HFTs) in the Eurex Bund Futures market around high-impact macroeconomic announcements and extreme events. Around macroeconomic announcements, HFTs serve as market makers, post competitive spreads, and earn most of their profits through liquidity supply. Right before the announcement, however, HFTs significantly widen spreads and cause a rapid but short-lived drying-out of liquidity. In turbulent periods, such as after the U.K. Brexit announcement, HFTs shift their focus from market making activities to aggressive (but not necessarily profitable) directional strategies. Then, HFT activity becomes dominant and market quality can degrade.
We theoretically and empirically study large-scale portfolio allocation problems when transaction costs are taken into account in the optimization problem. We show that transaction costs act on the one hand as a turnover penalization and on the other hand as a regularization, which shrinks the covariance matrix. As an empirical framework, we propose a flexible econometric setting for portfolio optimization under transaction costs, which incorporates parameter uncertainty and combines predictive distributions of individual models using optimal prediction pooling. We consider predictive distributions resulting from high-frequency-based covariance matrix estimates, daily stochastic volatility factor models and regularized rolling window covariance estimates, among others. Using data capturing several hundred Nasdaq stocks over more than 10 years, we illustrate that transaction cost regularization (even to a small extent) is crucial in order to produce allocations with positive Sharpe ratios. Moreover, we show that performance differences between individual models decline when transaction costs are considered. Nevertheless, it turns out that adaptive mixtures based on high-frequency and low-frequency information yield the highest performance. A portfolio bootstrap reveals that naive 1/N allocations and global minimum variance allocations (with and without short-sale constraints) are significantly outperformed in terms of Sharpe ratios and utility gains.
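To see why transaction costs act as covariance regularization, consider a simplified one-period mean-variance problem with a quadratic penalty on turnover (a stylized illustration, not the paper's full setting):

```latex
\max_{\omega}\; \omega^{\top}\mu - \frac{\gamma}{2}\,\omega^{\top}\Sigma\,\omega
              - \beta\,\lVert \omega - \omega^{+} \rVert_2^2
\;=\;
\max_{\omega}\; \omega^{\top}\!\left(\mu + 2\beta\,\omega^{+}\right)
 - \frac{\gamma}{2}\,\omega^{\top}\!\left(\Sigma + \tfrac{2\beta}{\gamma}\, I\right)\omega
 \;+\; \text{const},
```

so the quadratic turnover penalty is equivalent to shrinking the covariance matrix toward (a multiple of) the identity, while the current holdings \omega^{+} enter like an adjustment to the return signal.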
A counterparty credit limit (CCL) is a limit imposed by a financial institution to cap its maximum possible exposure to a specified counterparty. Although CCLs are designed to help institutions mitigate counterparty risk by selective diversification of their exposures, their implementation restricts the liquidity that institutions can access in an otherwise centralized pool. We address the question of how this mechanism impacts trade prices and volatility, both empirically and via a new model of trading with CCLs. We find empirically that CCLs have little impact on trade. However, our model highlights that in extreme situations, CCLs could serve to destabilize prices and thereby influence systemic risk.
Exploiting NASDAQ order book data and difference-in-differences methodology, we identify the distinct effects of trading pause mechanisms introduced on U.S. stock exchanges after May 2010. We show that the mere existence of such a regulation constitutes a safeguard which makes market participants behave differently in anticipation of a pause. Pauses tend to break local price trends, make liquidity suppliers revise positions, and enhance price discovery. In contrast, pauses do not have a "cool off" effect on markets, but rather increase volatility and bid-ask spreads. This implies a regulatory trade-off between the protective role of trading pauses and their adverse effects on market quality.
Despite the impressive success of deep neural networks in many application areas, neural network models have so far not been widely adopted in the context of volatility forecasting. In this work, we aim to bridge the conceptual gap between established time series approaches, such as the Heterogeneous Autoregressive (HAR) model (Corsi, 2009), and state-of-the-art deep neural network models. The newly introduced HARNet is based on a hierarchy of dilated convolutional layers, which allows the receptive field of the model to grow exponentially in the number of model parameters. HARNets allow for an explicit initialization scheme such that, before optimization, a HARNet yields predictions identical to those of the respective baseline HAR model. Particularly when considering the QLIKE error as a loss function, we find that this approach significantly stabilizes the optimization of HARNets. We evaluate the performance of HARNets with respect to three different stock market indexes. Based on this evaluation, we formulate clear guidelines for the optimization of HARNets and show that HARNets can substantially improve upon the forecasting accuracy of their respective HAR baseline models. In a qualitative analysis of the filter weights learnt by a HARNet, we report clear patterns regarding the predictive power of past information. Among information from the previous week, yesterday and the day before, yesterday's volatility contributes by far the most to today's realized volatility forecast. Moreover, within the previous month, the importance of individual weeks diminishes almost linearly when moving further into the past.
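As a hedged sketch of the baseline that HARNets are initialized to replicate, the code below fits a plain HAR-RV regression on synthetic data; the data, variable names and helper function are illustrative and the HARNet architecture itself is not reproduced here. The point is only that the forecast combines daily, weekly and monthly averages of past realized volatility.

```python
# Illustrative HAR-RV baseline (Corsi, 2009) on synthetic data; not the paper's code.
import numpy as np

rng = np.random.default_rng(0)
rv = np.exp(rng.normal(-9.0, 0.5, size=1500))  # synthetic daily realized variances

def har_features(rv, weekly=5, monthly=22):
    """Build daily, weekly and monthly averages of past realized volatility."""
    t = np.arange(monthly, len(rv))
    daily = rv[t - 1]
    weekly_avg = np.array([rv[i - weekly:i].mean() for i in t])
    monthly_avg = np.array([rv[i - monthly:i].mean() for i in t])
    X = np.column_stack([np.ones_like(daily), daily, weekly_avg, monthly_avg])
    y = rv[t]
    return X, y

X, y = har_features(rv)
# OLS fit: RV_t = b0 + b_d RV_{t-1} + b_w RV^(weekly) + b_m RV^(monthly) + error
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# One-step-ahead forecast from the most recent daily, weekly and monthly averages
x_next = np.array([1.0, rv[-1], rv[-5:].mean(), rv[-22:].mean()])
forecast = x_next @ beta
print(beta, forecast)
```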