Working Papers, Center for Financial Studies (CFS)
We revisit the role of time in measuring the price impact of trades, using a new empirical method that combines spread decomposition and dynamic duration modeling. Previous studies, which address the issue in a vector-autoregressive framework, conclude that times when markets are most active are times with an increased presence of informed trading. Our empirical analysis, based on recent European and U.S. data, offers challenging new evidence: as trade intensity increases, the informativeness of trades tends to decrease. This result is consistent with the predictions of Admati and Pfleiderer’s (1988) rational expectations model, and also with dynamic trading models such as those of Parlour (1998) and Foucault (1999). Our results cast doubt on the common wisdom that fast markets bear particularly high adverse selection risks for uninformed market participants.
JEL Classification: G10, C32
Keywords: Price Impact of Trades, Trading Intensity, Dynamic Duration Models, Spread Decomposition Models, Adverse Selection Risk
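A price-impact measure in the spirit of the vector-autoregressive literature referenced here (Hasbrouck, 1991) can be sketched numerically. The bivariate VAR(1) in midquote returns and signed trades, and all parameter values below, are illustrative assumptions, not the paper's model: the permanent impact of a trade shock is read off the cumulative impulse response of returns.

```python
import numpy as np

# Illustrative Hasbrouck (1991)-style sketch: fit a bivariate VAR(1) in
# (midquote return r_t, signed trade x_t) by OLS and read the long-run
# (cumulative) response of prices to a trade innovation. The VAR(1)
# restriction and the simulated data are assumptions for exposition.
rng = np.random.default_rng(0)
T = 5000
A = np.array([[0.1, 0.4],    # returns respond to lagged signed trades
              [0.0, 0.3]])   # signed trades are persistent
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=[0.01, 1.0])

# OLS estimation of the VAR(1): y_t = A_hat y_{t-1} + e_t
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Cumulative impulse response: sum_k A^k = (I - A)^{-1}.
# The permanent price impact of a unit trade shock is the (return, trade) entry.
long_run = np.linalg.inv(np.eye(2) - A_hat)
price_impact = long_run[0, 1]
```

With the true parameters above, the implied permanent impact is (I - A)^{-1}[0, 1] ≈ 0.63, and the OLS estimate recovers it closely at this sample size.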
This paper studies the market quality of an internalization system which is designed as part of an open limit order book (the Xetra system operated by Deutsche Börse AG). The internalization system (Xetra BEST) guarantees a price improvement over the inside spread in the Xetra order book. We develop a structural model of this unique dual market environment and show that, while adverse selection costs of internalized trades are significantly lower than those of regular order book trades, the realized spreads (the revenue earned by the suppliers of liquidity) are significantly larger. The cost savings of the internalizer exceed the mandatory price improvement. This suggests that internalization can be profitable both for the customer and the internalizer.
JEL Classification: G10
This paper addresses and resolves the issue of microstructure noise when measuring the relative importance of the home and U.S. markets in the price discovery process of Canadian interlisted stocks. In order to avoid large bounds for information shares, previous studies applying the Cholesky decomposition within the Hasbrouck (1995) framework had to rely on high-frequency data. However, due to the considerable amount of microstructure noise inherent in return data at very high frequencies, these estimators are distorted. We offer a modified approach that identifies unique information shares based on distributional assumptions and thereby enables us to control for microstructure noise. Our results indicate that the role of the U.S. market in the price discovery process of Canadian interlisted stocks has so far been underestimated. Moreover, we suggest that market characteristics, rather than stock-specific factors, determine information shares.
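The Cholesky-based information-share bounds discussed here can be illustrated with a small numerical sketch. The common-factor weights `psi` and the innovation covariance `Omega` below are made-up numbers, not estimates from the paper; the point is that correlated innovations make the two orderings disagree, producing the wide bounds the abstract refers to.

```python
import numpy as np

# Hasbrouck (1995) information-share bounds for two markets (illustrative).
psi = np.array([0.6, 0.4])              # long-run impact (common-factor) weights
Omega = np.array([[1.0, 0.7],
                  [0.7, 1.2]])          # correlated price innovations

def information_shares(psi, Omega, order):
    """IS under one Cholesky ordering: the first market gets the shared variance."""
    idx = list(order)
    F = np.linalg.cholesky(Omega[np.ix_(idx, idx)])   # lower-triangular factor
    contrib = (psi[idx] @ F) ** 2                     # squared contributions
    shares = contrib / contrib.sum()                  # normalize by psi' Omega psi
    out = np.empty(2)
    out[idx] = shares
    return out

# Bounds: evaluate both orderings, take elementwise min/max per market.
is_m1_first = information_shares(psi, Omega, (0, 1))
is_m2_first = information_shares(psi, Omega, (1, 0))
lower = np.minimum(is_m1_first, is_m2_first)
upper = np.maximum(is_m1_first, is_m2_first)
```

With these numbers, market 1's share ranges from roughly 0.24 to 0.87 depending on the ordering, illustrating why unique identification (as proposed in the paper) is valuable.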
In the microstructure literature, information asymmetry is an important determinant of market liquidity. The classic setting is that uninformed dedicated liquidity suppliers charge price concessions when incoming market orders are likely to be informationally motivated. In limit order book markets, however, this relationship is less clear, as market participants can switch roles and freely choose to immediately demand or patiently supply liquidity by submitting either market or limit orders. We study the importance of information asymmetry in limit order books based on a recent sample of thirty German DAX stocks. We find that Hasbrouck’s (1991) measure of trade informativeness Granger-causes book liquidity, in particular that required to fill large market orders. Picking-off risk due to public-news-induced volatility is more important for top-of-the-book liquidity supply. In our multivariate analysis we control for volatility, trading volume, trading intensity and order imbalance to isolate the effect of trade informativeness on book liquidity.
JEL Classification: G14
Keywords: Price Impact of Trades, Trading Intensity, Dynamic Duration Models, Spread Decomposition Models, Adverse Selection Risk
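A minimal version of such a Granger-causality check can be sketched as follows. This is not the paper's specification: the lag order, the F-test and the simulated series standing in for trade informativeness (x) and book liquidity (y) are all illustrative assumptions.

```python
import numpy as np

# Granger causality sketch: do lags of x improve the prediction of y
# beyond y's own lags? Compare restricted vs. unrestricted OLS via F-test.
rng = np.random.default_rng(1)
T, p = 2000, 2                               # sample size and lag order (assumed)
x = rng.normal(size=T)                       # "trade informativeness"
y = np.zeros(T)                              # "book liquidity"
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.3 * x[t - 1] + rng.normal()

def lags(z, p):
    """Matrix whose columns are z lagged 1..p, aligned with z[p:]."""
    return np.column_stack([z[p - k:len(z) - k] for k in range(1, p + 1)])

Y = y[p:]
Xr = np.column_stack([np.ones(T - p), lags(y, p)])   # restricted: own lags only
Xu = np.column_stack([Xr, lags(x, p)])               # unrestricted: + p lags of x

def rss(X, Y):
    beta = np.linalg.lstsq(X, Y, rcond=None)[0]
    e = Y - X @ beta
    return e @ e

# F-statistic for the p zero restrictions on the lags of x
F = ((rss(Xr, Y) - rss(Xu, Y)) / p) / (rss(Xu, Y) / (T - p - Xu.shape[1]))
```

A large F rejects the null that x does not Granger-cause y; in the simulated example the true lagged effect of x makes the statistic far exceed conventional critical values.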
The long-run consumption risk model provides a theoretically appealing explanation for prominent asset pricing puzzles, but its intricate structure presents a challenge for econometric analysis. This paper proposes a two-step indirect inference approach that disentangles the estimation of the model's macroeconomic dynamics and the investor's preference parameters. A Monte Carlo study explores the feasibility and efficiency of the estimation strategy. We apply the method to recent U.S. data and provide a critical re-assessment of the long-run risk model's ability to reconcile the real economy and financial markets. This two-step indirect inference approach is potentially useful for the econometric analysis of other prominent consumption-based asset pricing models that are equally difficult to estimate.
Consumption-based asset pricing with rare disaster risk: a simulated method of moments approach
(2014)
The rare disaster hypothesis suggests that the extraordinarily high postwar U.S. equity premium arose because investors ex ante demanded compensation for unlikely but calamitous risks that they happened not to incur. Although convincing in theory, empirical tests of the rare disaster explanation are scarce. We estimate a disaster-including consumption-based asset pricing model (CBM) using a combination of the simulated method of moments and bootstrapping. We consider several methodological alternatives that differ in the moment matches and the way disasters are accounted for in the simulated consumption growth and return series. Whichever specification is used, the estimated preference parameters are of an economically plausible size, and the estimation precision is much higher than in previous studies that use the canonical CBM. Our results thus provide empirical support for the rare disaster hypothesis and help affirm the nexus between the real economy and financial markets implied by the consumption-based asset pricing paradigm.
The long-run consumption risk (LRR) model is a promising approach to resolving prominent asset pricing puzzles. The simulated method of moments (SMM) provides a natural framework to estimate its deep parameters, but caveats concern model solvability and weak identification. We propose a two-step estimation strategy that combines GMM and SMM, for which we elicit informative macroeconomic and financial moment matches from the LRR model structure. In particular, we exploit the persistent serial correlation of consumption and dividend growth and the equilibrium conditions for the market return and risk-free rate, as well as the model-implied predictability of the risk-free rate. We match analytical moments when possible and simulated moments when necessary, and determine the crucial factors required for both identification and reasonable estimation precision. A simulation study, the first in the context of long-run risk modeling, delineates the pitfalls associated with SMM estimation of a non-linear dynamic asset pricing model. Our study provides a blueprint for successful estimation of the LRR model.
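The mechanics of an SMM estimation of this kind can be sketched on a toy problem. Everything below (the AR(1) consumption-growth DGP, the two moments, the grid search) is an illustrative assumption, not the paper's estimator; it only demonstrates the match-simulated-to-data-moments logic.

```python
import numpy as np

# Toy SMM sketch: recover the persistence rho of an AR(1) "consumption
# growth" process by matching the simulated variance and first-order
# autocovariance to their data counterparts.
def simulate(rho, T, seed):
    r = np.random.default_rng(seed)
    eps = r.normal(size=T)          # same seed => same shocks for every rho
    g = np.zeros(T)
    for t in range(1, T):
        g[t] = rho * g[t - 1] + eps[t]
    return g

def moments(g):
    return np.array([g.var(), np.cov(g[1:], g[:-1])[0, 1]])

m_data = moments(simulate(0.8, 2000, seed=2))   # pretend this is observed data

# Minimize the moment distance over a coarse grid; reusing one simulation
# seed keeps the objective smooth in rho, a standard SMM device.
grid = np.arange(0.0, 0.96, 0.05)
obj = [np.sum((moments(simulate(r, 10000, seed=3)) - m_data) ** 2)
       for r in grid]
rho_hat = grid[int(np.argmin(obj))]
```

In a real application the grid search would be replaced by a proper optimizer with a weighting matrix, but the estimate here already lands close to the true persistence of 0.8.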
Non-standard errors
(2021)
In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in sample estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: non-standard errors. To study them, we let 164 teams test six hypotheses on the same sample. We find that non-standard errors are sizeable, on par with standard errors. Their size (i) co-varies only weakly with team merits, reproducibility, or peer rating, (ii) declines significantly after peer-feedback, and (iii) is underestimated by participants.
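In this terminology, a rough numerical sketch (with made-up numbers) compares the across-team dispersion of point estimates, the non-standard error, with the average within-team standard error:

```python
import numpy as np

# Illustrative comparison of standard vs. non-standard errors. The eight
# "teams", their point estimates and their standard errors are invented.
estimates = np.array([0.12, 0.08, 0.15, 0.05, 0.11, 0.09, 0.14, 0.07])
std_errors = np.array([0.03, 0.04, 0.03, 0.05, 0.03, 0.04, 0.03, 0.04])

non_standard_error = estimates.std(ddof=1)   # dispersion across teams (EGP variation)
mean_standard_error = std_errors.mean()      # average within-team sampling uncertainty
comparable = 0.5 < non_standard_error / mean_standard_error < 2.0
```

In this fabricated example the two sources of uncertainty are of similar magnitude, mirroring the paper's finding that non-standard errors are "on par with standard errors".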