We extend the classical "martingale-plus-noise" model for high-frequency prices by an error correction mechanism originating from prevailing mispricing. The speed of price reversal is a natural measure of informational efficiency. The strength of the price reversal relative to the signal-to-noise ratio determines the signs of the return serial correlation and the bias in standard realized variance estimates. We derive the model's properties and estimate it locally based on mid-quote returns of the NASDAQ 100 constituents. There is evidence of mildly persistent local regimes of positive and negative serial correlation, arising from lagged feedback effects and sluggish price adjustments. The model's performance is decidedly superior to that of existing stylized microstructure models. Finally, we document intraday periodicities in the speed of price reversion and noise-to-signal ratios.
We propose a framework for estimating network-driven time-varying systemic risk contributions that is applicable to a high-dimensional financial system. Tail risk dependencies and contributions are estimated based on a penalized two-stage fixed-effects quantile approach, which explicitly links bank interconnectedness to systemic risk contributions. The framework is applied to a system of 51 large European banks and 17 sovereigns over the period 2006 to 2013, utilizing both equity and CDS prices. We provide new evidence on how banking sector fragmentation and sovereign-bank linkages evolved over the European sovereign debt crisis and how this is reflected in network statistics and systemic risk measures. Illustrating the usefulness of the framework as a monitoring tool, we provide indications that the fragmentation of the European financial system has peaked and that a recovery has started.
We propose a new estimator for the spot covariance matrix of a multi-dimensional continuous semi-martingale log asset price process which is subject to noise and non-synchronous observations. The estimator is constructed based on a local average of block-wise parametric spectral covariance estimates. The latter originate from a local method of moments (LMM) recently introduced by Bibinger et al. (2014). We extend the LMM estimator to allow for autocorrelated noise and propose a method to adaptively infer the autocorrelations from the data. We prove the consistency and asymptotic normality of the proposed spot covariance estimator. Based on extensive simulations, we provide empirical guidance on the optimal implementation of the estimator and apply it to high-frequency data of a cross-section of NASDAQ blue chip stocks. Employing the estimator to estimate spot covariances, correlations and betas in normal but also extreme-event periods yields novel insights into intraday covariance and correlation dynamics. We show that intraday (co-)variations (i) follow underlying periodicity patterns, (ii) reveal substantial intraday variability associated with (co-)variation risk, (iii) are strongly serially correlated, and (iv) can increase strongly and nearly instantaneously if new information arrives.
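The local averaging step described above can be sketched as follows. This is a minimal illustration that averages block-wise sample covariances of synchronized returns; the noise correction and handling of asynchronous observations that the LMM machinery provides are deliberately omitted, and the function name and block sizes are hypothetical.

```python
import numpy as np

def spot_covariance(returns, block_size=30, n_blocks_avg=5):
    """Sketch: spot covariance at the end of the sample as a local
    average of block-wise sample covariance estimates.
    `returns`: (T, d) array of synchronized high-frequency returns.
    Noise correction and asynchronicity handling are omitted."""
    T, d = returns.shape
    blocks = []
    # walk backwards from the end of the sample, one block at a time
    for start in range(T - block_size, -1, -block_size):
        blk = returns[start:start + block_size]
        # per-block realized covariance, scaled to a per-observation rate
        blocks.append(blk.T @ blk / block_size)
        if len(blocks) == n_blocks_avg:
            break
    # local average over the most recent blocks
    return np.mean(blocks, axis=0)

rng = np.random.default_rng(0)
r = rng.normal(scale=0.001, size=(300, 3))
cov = spot_covariance(r)
```

Averaging adjacent blocks trades bias (the covariance may move within the window) against variance, which is the same localization trade-off the estimator above resolves formally.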
We introduce a copula-based dynamic model for multivariate processes of (non-negative) high-frequency trading variables revealing time-varying conditional variances and correlations. Modeling the variables’ conditional mean processes using a multiplicative error model, we map the resulting residuals into a Gaussian domain using a Gaussian copula. Based on high-frequency volatility, cumulative trading volumes, trade counts and market depth of various stocks traded at the NYSE, we show that the proposed copula-based transformation is supported by the data and allows capturing (multivariate) dynamics in higher-order moments. The latter are modeled using a DCC-GARCH specification. We suggest estimating the model by composite maximum likelihood, which is sufficiently flexible to be applicable in high dimensions. Strong empirical evidence for time-varying conditional (co-)variances in trading processes supports the usefulness of the approach. Taking these higher-order dynamics explicitly into account significantly improves the goodness-of-fit of the multiplicative error model and allows capturing time-varying liquidity risks.
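The Gaussian-domain mapping described above can be sketched with a probability integral transform: evaluate a CDF at each residual, then apply the inverse standard normal CDF. Here the empirical CDF stands in for whichever marginal distribution the model actually specifies, and the helper name is hypothetical.

```python
import numpy as np
from scipy import stats

def to_gaussian_domain(resid):
    """Sketch: map positive-valued MEM residuals into the Gaussian
    domain via the probability integral transform (empirical CDF)
    followed by the inverse standard normal CDF."""
    n = len(resid)
    # empirical CDF via ranks, scaled to (0, 1) to avoid +/- infinity
    u = stats.rankdata(resid) / (n + 1)
    return stats.norm.ppf(u)

rng = np.random.default_rng(1)
eps = rng.gamma(shape=2.0, scale=0.5, size=1000)  # positive "residuals"
z = to_gaussian_domain(eps)
```

After this transform, the (approximately) standard normal `z` series can be handed to a multivariate Gaussian dependence model such as the DCC-GARCH specification mentioned in the abstract.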
We develop a model of an order-driven exchange competing for order flow with off-exchange trading mechanisms. Liquidity suppliers face a trade-off between benefits and costs of order exposure. If they display trading intentions, they attract additional trade demand. We show that, in equilibrium, hiding trade intentions can induce mis-coordination between liquidity supply and demand, generate excess price fluctuations and harm price efficiency. Econometric high-frequency analysis based on unique data on hidden orders from NASDAQ reveals strong empirical support for these predictions: We find abnormal reactions in prices and order flow after periods of high excess supply of hidden liquidity.
Revisiting the stealth trading hypothesis: does time-varying liquidity explain the size-effect?
(2019)
Large trades have a smaller price impact per share than medium-sized trades. So far, the literature has attributed this effect to the informational content of trades. In this paper, we show that this effect can arise from strategic order placement. We introduce the concept of a liquidity elasticity, measuring the responsiveness of liquidity demand with respect to changes in liquidity supply, as a major driver for a declining price impact per share. Empirical evidence based on Nasdaq stocks strongly supports theoretical predictions and shows that the aspect of liquidity coordination is an important complement to rationales based on asymmetric information.
A counterparty credit limit (CCL) is a limit imposed by a financial institution to cap its maximum possible exposure to a specified counterparty. Although CCLs are designed to help institutions mitigate counterparty risk by selective diversification of their exposures, their implementation restricts the liquidity that institutions can access in an otherwise centralized pool. We address the question of how this mechanism impacts trade prices and volatility, both empirically and via a new model of trading with CCLs. We find empirically that CCLs have little impact on trades. However, our model highlights that in extreme situations, CCLs could serve to destabilize prices and thereby influence systemic risk.
We examine intra-day market reactions to news in stock-specific sentiment disclosures. Using pre-processed data from an automated news analytics tool based on linguistic pattern recognition, we extract information on the relevance as well as the direction of company-specific news. Information-implied reactions in returns, volatility as well as liquidity demand and supply are quantified by a high-frequency VAR model using 20-second intervals. Analyzing a cross-section of stocks traded at the London Stock Exchange (LSE), we find market-wide robust news-dependent responses in volatility and trading volume. However, this is only true if news items are classified as highly relevant. Liquidity supply reacts less distinctly due to a stronger influence of idiosyncratic noise. Furthermore, evidence for abnormal high-frequency returns after news in sentiments is shown. JEL-Classification: G14, C32
We introduce a multivariate multiplicative error model which is driven by component-specific observation-driven dynamics as well as a common latent autoregressive factor. The model is designed to explicitly account for (information-driven) common factor dynamics as well as idiosyncratic effects in the processes of high-frequency return volatilities, trade sizes and trading intensities. The model is estimated by simulated maximum likelihood using efficient importance sampling. Analyzing five-minute data from four liquid stocks traded at the New York Stock Exchange, we find that volatilities, volumes and intensities are driven by idiosyncratic dynamics as well as a highly persistent common factor capturing most causal relations and cross-dependencies between the individual variables. This confirms economic theory and suggests more parsimonious specifications of high-dimensional trading processes. It turns out that common shocks affect the return volatility and the trading volume rather than the trading intensity. JEL Classification: C15, C32, C52
We propose a multivariate dynamic intensity peaks-over-threshold model to capture extreme events in a multivariate time series of returns. The random occurrence of extreme events exceeding a threshold is modeled by means of a multivariate dynamic intensity model allowing for feedback effects between the individual processes. We propose alternative specifications of the multivariate intensity process using autoregressive conditional intensity and Hawkes-type specifications. Likewise, temporal clustering of the size of exceedances is captured by an autoregressive multiplicative error model based on a generalized Pareto distribution. We allow for spillovers between both the intensity processes and the process of marks. The model is applied to jointly model extreme events in the daily returns of three major stock indexes. We find strong empirical support for a temporal clustering of both the occurrence of extremes and the size of exceedances. Moreover, significant feedback effects between both types of processes are observed. Backtesting Value-at-Risk (VaR) and Expected Shortfall (ES) forecasts shows that the proposed model not only produces a good in-sample fit but also yields reliable out-of-sample predictions. We show that the inclusion of temporal clustering of the size of exceedances, and of feedback with the intensity thereof, results in better forecasts of VaR and ES.
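As a rough illustration of the peaks-over-threshold building block used above, the following sketch fits a static generalized Pareto distribution to loss exceedances and computes a VaR. The paper's dynamic intensity and mark processes are not modeled here; the function name and default parameters are hypothetical.

```python
import numpy as np
from scipy import stats

def pot_var(returns, threshold_q=0.95, alpha=0.99):
    """Sketch: unconditional peaks-over-threshold VaR.
    Losses above the empirical `threshold_q` quantile are fitted
    with a generalized Pareto distribution (GPD)."""
    losses = -returns
    u = np.quantile(losses, threshold_q)
    exceed = losses[losses > u] - u
    # fit a GPD to the exceedances, location pinned at zero
    xi, _, beta = stats.genpareto.fit(exceed, floc=0.0)
    p_exceed = len(exceed) / len(losses)
    # standard POT value-at-risk formula
    return u + (beta / xi) * (((1 - alpha) / p_exceed) ** (-xi) - 1)

rng = np.random.default_rng(2)
r = rng.standard_t(df=4, size=5000) * 0.01  # heavy-tailed toy returns
var99 = pot_var(r)
```

The dynamic model in the abstract replaces the constant exceedance probability and constant GPD parameters with intensity-driven and autoregressive counterparts.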
Bayesian learning provides the core concept of processing noisy information. In standard Bayesian frameworks, assessing the price impact of information requires perfect knowledge of news’ precision. In practice, however, precision is rarely disclosed. Therefore, we extend standard Bayesian learning, suggesting traders infer news’ precision from magnitudes of surprises and from external sources. We show that interactions of the different precision signals may result in highly nonlinear price responses. Empirical tests based on intra-day T-bond futures price reactions to employment releases confirm the model’s predictions and show that the effects are statistically and economically significant.
We study the impact of the arrival of macroeconomic news on the informational and noise-driven components in high-frequency quote processes and their conditional variances. Bid and ask returns are decomposed into a common ("efficient return") factor and two market-side-specific components capturing market microstructure effects. The corresponding variance components reflect information-driven and noise-induced volatilities. We find that all volatility components reveal distinct dynamics and are positively influenced by news. The proportion of noise-induced variances is highest before announcements and significantly declines thereafter. Moreover, news-affected responses in all volatility components are influenced by order flow imbalances. JEL Classification: C32, G14, E44
Exploiting NASDAQ order book data and difference-in-differences methodology, we identify the distinct effects of trading pause mechanisms introduced on U.S. stock exchanges after May 2010. We show that the mere existence of such a regulation constitutes a safeguard which makes market participants behave differently in anticipation of a pause. Pauses tend to break local price trends, make liquidity suppliers revise positions, and enhance price discovery. In contrast, pauses do not have a “cool off” effect on markets, but rather amplify volatility and widen bid-ask spreads. This implies a regulatory trade-off between the protective role of trading pauses and their adverse effects on market quality.
Trading under limited pre-trade transparency is becoming increasingly popular in financial markets. We provide first evidence on traders’ use of (completely) hidden orders which might be placed even inside the (displayed) bid-ask spread. Employing TotalView-ITCH data on order messages at NASDAQ, we propose a simple method to conduct statistical inference on the location of hidden depth and to test economic hypotheses. Analyzing a wide cross-section of stocks, we show that market conditions reflected by the (visible) bid-ask spread, (visible) depth, recent price movements and trading signals significantly affect the aggressiveness of ’dark’ liquidity supply and thus the ’hidden spread’. Our evidence suggests that traders balance hidden order placements to (i) compete for the provision of (hidden) liquidity and (ii) protect themselves against adverse selection, front-running as well as ’hidden order detection strategies’ used by high-frequency traders. Accordingly, our results show that hidden liquidity locations are predictable given the observable state of the market.
Despite their importance in modern electronic trading, virtually no systematic empirical evidence on the market impact of incoming orders exists. We quantify the short-run and long-run price effect of posting a limit order by proposing a high-frequency cointegrated VAR model for ask and bid quotes and several levels of order book depth. Price impacts are estimated by means of appropriate impulse response functions. Analyzing order book data of 30 stocks traded at Euronext Amsterdam, we show that limit orders have significant market impacts and cause a dynamic (and typically asymmetric) rebalancing of the book. The strength and direction of quote and spread responses depend on the incoming orders’ aggressiveness, their size and the state of the book. We show that the effects are qualitatively quite stable across the market. Cross-sectional variations in the magnitudes of price impacts are well explained by the underlying trading frequency and relative tick size.
This paper addresses the open debate about the usefulness of high-frequency (HF) data in large-scale portfolio allocation. Daily covariances are estimated based on HF data of the S&P 500 universe employing a blocked realized kernel estimator. We propose forecasting covariance matrices using a multi-scale spectral decomposition where volatilities, correlation eigenvalues and eigenvectors evolve on different frequencies. In an extensive out-of-sample forecasting study, we show that the proposed approach yields less risky and more diversified portfolio allocations than prevailing methods employing daily data. These performance gains hold over longer horizons than previous studies have shown.
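The decomposition above splits a covariance forecast into volatilities, correlation eigenvalues and eigenvectors. The recombination step can be sketched as follows; the component forecasting models themselves are omitted, and the function name is hypothetical. For illustration, the components are decomposed and recombined unchanged, which must reproduce the original matrix.

```python
import numpy as np

def recombine_forecast(vols, eigvals, eigvecs):
    """Sketch: rebuild a covariance matrix from separately modeled
    components: volatilities plus the eigendecomposition of the
    correlation matrix."""
    # correlation matrix from its eigendecomposition
    corr = eigvecs @ np.diag(eigvals) @ eigvecs.T
    # rescale so the diagonal is exactly one
    d = np.sqrt(np.diag(corr))
    corr = corr / np.outer(d, d)
    # covariance = D * R * D with D = diag(vols)
    return np.outer(vols, vols) * corr

# round-trip check: decompose a sample covariance, recombine unchanged
rng = np.random.default_rng(3)
x = rng.normal(size=(500, 4))
cov = np.cov(x, rowvar=False)
s = np.sqrt(np.diag(cov))
R = cov / np.outer(s, s)
w, V = np.linalg.eigh(R)
cov_rebuilt = recombine_forecast(s, w, V)
```

Forecasting each component on its own frequency, then recombining, is what lets slow-moving eigenvectors coexist with faster-moving volatilities.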
We introduce a regularization and blocking estimator for well-conditioned high-dimensional daily covariances using high-frequency data. Using the Barndorff-Nielsen, Hansen, Lunde, and Shephard (2008a) kernel estimator, we estimate the covariance matrix block-wise and regularize it. A data-driven grouping of assets of similar trading frequency ensures the reduction of data loss due to refresh time sampling. In an extensive simulation study mimicking the empirical features of the S&P 1500 universe we show that the ’RnB’ estimator yields efficiency gains and outperforms competing kernel estimators for varying liquidity settings, noise-to-signal ratios, and dimensions. An empirical application of forecasting daily covariances of the S&P 500 index confirms the simulation results.
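Refresh-time sampling, whose data loss the grouping scheme above is designed to limit, works as follows: a refresh time is the first instant at which every asset in the group has traded at least once since the previous refresh time. A minimal sketch (function name hypothetical):

```python
def refresh_times(timestamp_lists):
    """Sketch: refresh-time sampling for asynchronously observed
    assets. Takes one sorted timestamp list per asset and returns
    the refresh times. Grouping assets of similar trading frequency
    limits the data this sampling discards."""
    iters = [iter(ts) for ts in timestamp_lists]
    current = [next(it) for it in iters]
    out = []
    while True:
        t = max(current)  # every asset has traded by time t
        out.append(t)
        try:
            # advance each asset strictly past t
            for i, it in enumerate(iters):
                while current[i] <= t:
                    current[i] = next(it)
        except StopIteration:
            return out

a = [1, 2, 5, 9, 12]
b = [3, 4, 6, 7, 13]
rt = refresh_times([a, b])  # rt == [3, 5, 9, 13]
```

The slowest asset in a group dictates the refresh times, which is why mixing a very illiquid asset with liquid ones throws away most of the liquid assets' observations.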
Capturing the zero: a new class of zero-augmented distributions and multiplicative error processes
(2011)
We propose a novel approach to model serially dependent positive-valued variables which realize a non-trivial proportion of zero outcomes. This is a typical phenomenon in financial time series observed at high frequencies, such as cumulated trading volumes. We introduce a flexible point-mass mixture distribution and develop a semiparametric specification test explicitly tailored for such distributions. Moreover, we propose a new type of multiplicative error model (MEM) based on a zero-augmented distribution, which incorporates an autoregressive binary choice component and thus captures the (potentially different) dynamics of both zero occurrences and of strictly positive realizations. Applying the proposed model to high-frequency cumulated trading volumes of both liquid and illiquid NYSE stocks, we show that the model captures the dynamic and distributional properties of the data well and is able to correctly predict future distributions.
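The static version of the point-mass mixture described above can be sketched as follows: a zero outcome with probability π, otherwise a draw from a strictly positive distribution. A gamma stands in for the paper's positive-part distribution, the autoregressive dynamics are not modeled, and the function names are hypothetical.

```python
import numpy as np
from scipy import stats

def sample_zero_augmented(pi_zero, shape, scale, size, rng):
    """Sketch: draw from a point-mass mixture: zero with probability
    `pi_zero`, otherwise a gamma draw (stand-in positive part)."""
    is_zero = rng.random(size) < pi_zero
    x = rng.gamma(shape, scale, size)
    x[is_zero] = 0.0
    return x

def zero_augmented_loglik(x, pi_zero, shape, scale):
    """Log-likelihood of the static point-mass mixture."""
    zeros = x == 0.0
    pos = x[~zeros]
    ll = np.log(pi_zero) * zeros.sum()
    ll += np.log1p(-pi_zero) * pos.size
    ll += stats.gamma.logpdf(pos, shape, scale=scale).sum()
    return ll

rng = np.random.default_rng(4)
v = sample_zero_augmented(0.3, 2.0, 1.0, 10000, rng)
```

The dynamic MEM in the abstract makes both the zero probability and the conditional mean of the positive part time-varying, via the autoregressive binary choice component and the multiplicative error recursion respectively.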
Capturing the zero: a new class of zero-augmented distributions and multiplicative error processes
(2010)
We propose a novel approach to model serially dependent positive-valued variables which realize a non-trivial proportion of zero outcomes. This is a typical phenomenon in financial time series observed at high frequencies, such as cumulated trading volumes or the time between potentially simultaneously occurring market events. We introduce a flexible point-mass mixture distribution and develop a semiparametric specification test explicitly tailored for such distributions. Moreover, we propose a new type of multiplicative error model (MEM) based on a zero-augmented distribution, which incorporates an autoregressive binary choice component and thus captures the (potentially different) dynamics of both zero occurrences and of strictly positive realizations. Applying the proposed model to high-frequency cumulated trading volumes of liquid NYSE stocks, we show that the model captures both the dynamic and distributional properties of the data very well and is able to correctly predict future distributions. Keywords: High-frequency Data, Point-mass Mixture, Multiplicative Error Model, Excess Zeros, Semiparametric Specification Test, Market Microstructure JEL Classification: C22, C25, C14, C16, C51
We show an ambivalent role of high-frequency traders (HFTs) in the Eurex Bund Futures market around high-impact macroeconomic announcements and extreme events. Around macroeconomic announcements, HFTs serve as market makers, post competitive spreads, and earn most of their profits through liquidity supply. Right before the announcement, however, HFTs significantly widen spreads and cause a rapid but short-lived drying-out of liquidity. In turbulent periods, such as after the U.K. Brexit announcement, HFTs shift their focus from market making activities to aggressive (but not necessarily profitable) directional strategies. Then, HFT activity becomes dominant and market quality can degrade.