We revisit the role of time in measuring the price impact of trades, using a new empirical method that combines spread decomposition and dynamic duration modeling. Previous studies that address the issue in a vector-autoregressive framework conclude that times when markets are most active are times when there is an increased presence of informed trading. Our empirical analysis, based on recent European and U.S. data, offers challenging new evidence: we find that as trade intensity increases, the informativeness of trades tends to decrease. This result is consistent with the predictions of Admati and Pfleiderer's (1988) rational expectations model, and also with models of dynamic trading such as those proposed by Parlour (1998) and Foucault (1999). Our results cast doubt on the common wisdom that fast markets bear particularly high adverse selection risks for uninformed market participants.
JEL Classification: G10, C32
Keywords: Price Impact of Trades, Trading Intensity, Dynamic Duration Models, Spread Decomposition Models, Adverse Selection Risk
This paper studies the market quality of an internalization system that is designed as part of an open limit order book (the Xetra system operated by Deutsche Börse AG). The internalization system (Xetra BEST) guarantees a price improvement over the inside spread in the Xetra order book. We develop a structural model of this unique dual market environment and show that, while adverse selection costs of internalized trades are significantly lower than those of regular order book trades, the realized spreads (the revenue earned by the suppliers of liquidity) are significantly larger. The cost savings of the internalizer are larger than the mandatory price improvement. This suggests that internalization can be profitable both for the customer and for the internalizer.
JEL Classification: G10
This paper addresses and resolves the issue of microstructure noise when measuring the relative importance of the home and U.S. markets in the price discovery process of Canadian interlisted stocks. In order to avoid large bounds for information shares, previous studies applying the Cholesky decomposition within the Hasbrouck (1995) framework had to rely on high-frequency data. However, due to the considerable amount of microstructure noise inherent in return data at very high frequencies, these estimators are distorted. We offer a modified approach that identifies unique information shares based on distributional assumptions and thereby enables us to control for microstructure noise. Our results indicate that the role of the U.S. market in the price discovery process of Canadian interlisted stocks has so far been underestimated. Moreover, we suggest that information shares are determined by market characteristics rather than by stock-specific factors.
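For illustration, the ordering dependence behind those "large bounds" can be reproduced in a few lines. A minimal sketch with hypothetical inputs (a two-market long-run impact vector and innovation covariance matrix; this is the standard Cholesky-based calculation, not the authors' modified estimator): each Cholesky ordering yields one set of Hasbrouck information shares, and the orderings bracket each market's share.

```python
import numpy as np

# Hypothetical inputs: psi holds each market's long-run impact on the
# efficient price, Omega the VECM innovation covariance matrix.
psi = np.array([0.6, 0.4])
Omega = np.array([[1.0, 0.7],
                  [0.7, 1.2]])

def info_shares(psi, Omega, order):
    """Hasbrouck (1995) information shares for one Cholesky ordering."""
    o = list(order)
    F = np.linalg.cholesky(Omega[np.ix_(o, o)])  # lower-triangular factor
    contrib = (psi[o] @ F) ** 2                  # squared orthogonalized impacts
    shares = np.empty_like(contrib)
    shares[o] = contrib / (psi @ Omega @ psi)    # normalize, map back
    return shares

# With strongly correlated innovations (microstructure noise), the two
# orderings produce wide upper/lower bounds for each market's share.
print(info_shares(psi, Omega, (0, 1)))
print(info_shares(psi, Omega, (1, 0)))
```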
This paper provides an empirical assessment of hypotheses that identify causes of demand-side constraints on individual labour supply. In a comparative study for the USA and the FRG, we focus on analysing the effect of productivity gaps (industry wage growth beyond productivity growth), industry investment intensity and regional labour market conditions on individual employment probabilities. Furthermore, we investigate whether demand-side constraints on labour supply can be caused by a spillover from commodity markets. Efficiency wage theory and the theory of inter-industry wage differentials are utilised to derive identifying restrictions that are applicable to the labour supply models for both countries. The econometric contribution of the paper is the derivation and application of a two-step estimation method for the class of simultaneous random effects double hurdle models, of which the labour supply model employed in this paper is a special case. To provide the empirical basis for the comparative study, the Panel Study of Income Dynamics and the German Socio-Economic Panel are linked to the OECD's International Sectoral Database.
JEL classification: C33, C34, J64, O57
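In generic notation (mine, and stripped down to a single cross-section without the random effects and simultaneity the paper handles), a double hurdle labour supply model has the form

\begin{align*}
d_i^* &= z_i'\gamma + v_i, && \text{(demand-side hurdle)}\\
y_i^* &= x_i'\beta + u_i, && \text{(latent labour supply)}\\
y_i &= \begin{cases} y_i^* & \text{if } d_i^* > 0 \text{ and } y_i^* > 0,\\ 0 & \text{otherwise,} \end{cases}
\end{align*}

with $(u_i, v_i)$ jointly normal, so that observed employment requires passing both the demand-side and the supply-side hurdle.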
Modelling consumer behaviour in a profile design using a three-equation generalised Tobit model
(1997)
We propose the application of a three-equation generalised Tobit model to capture different aspects of consumer behaviour in a full profile study design. The model takes into account that consumer behaviour can be measured by preference scores, purchase probability and purchase volume. We aim to avoid the drawbacks of traditional conjoint analysis, where the latter two aspects are disregarded. Starting from a full profile design, we develop the appropriate questionnaire layout, the econometric model, the likelihood function and tests. The model is applied in a market entry study for an innovative medicament after the 1993-1994 reform of Germany's public health system.
JEL Classification: C35, M31, L65
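Schematically (notation mine; the paper's exact parameterisation may differ), the three equations link a continuous preference score, a binary purchase decision, and a volume observed only for purchasers:

\begin{align*}
y_{1i} &= x_i'\beta_1 + \varepsilon_{1i} && \text{(preference score, always observed)}\\
y_{2i} &= \mathbf{1}\{\, x_i'\beta_2 + \varepsilon_{2i} > 0 \,\} && \text{(purchase decision)}\\
y_{3i} &= x_i'\beta_3 + \varepsilon_{3i}, \ \text{observed iff } y_{2i} = 1 && \text{(purchase volume)}
\end{align*}

with jointly normal errors, so the likelihood combines a linear regression, a probit, and a selection equation.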
In the microstructure literature, information asymmetry is an important determinant of market liquidity. The classic setting is that uninformed dedicated liquidity suppliers charge price concessions when incoming market orders are likely to be informationally motivated. In limit order book markets, however, this relationship is less clear, as market participants can switch roles and freely choose to immediately demand or patiently supply liquidity by submitting either market or limit orders. We study the importance of information asymmetry in limit order books based on a recent sample of thirty German DAX stocks. We find that Hasbrouck's (1991) measure of trade informativeness Granger-causes book liquidity, in particular that required to fill large market orders. Picking-off risk due to public-news-induced volatility is more important for top-of-the-book liquidity supply. In our multivariate analysis we control for volatility, trading volume, trading intensity and order imbalance to isolate the effect of trade informativeness on book liquidity.
JEL Classification: G14
Keywords: Price Impact of Trades, Trading Intensity, Dynamic Duration Models, Spread Decomposition Models, Adverse Selection Risk
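The Granger-causality step can be reproduced with standard tools. A minimal sketch, assuming a table with one row per intraday interval and hypothetical column names `depth` (a book liquidity measure) and `trade_info` (a Hasbrouck-style trade informativeness measure); statsmodels regresses liquidity on its own lags with and without lags of informativeness:

```python
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Hypothetical input file and column names (assumptions, not the
# authors' dataset): one row per intraday interval.
panel = pd.read_csv("dax_intervals.csv")

# Does trade informativeness Granger-cause book liquidity?
# grangercausalitytests checks whether the SECOND column helps predict
# the FIRST, so order the columns accordingly.
res = grangercausalitytests(panel[["depth", "trade_info"]], maxlag=5)
```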
Price changes on stock markets can be information-induced, driven by new information that the market has to process, or liquidity-induced, driven by short-term excess supply or demand. These two very differently caused price reactions are hard to disentangle in empirical work. The model of Easley, Kiefer, O'Hara and Paperman (1996) provides a theoretical basis for separating liquidity-motivated from information-based trading and, beyond that, opens a route to quantifying these components empirically.
In this study we use this approach to analyze the trading of German stocks on the IBIS electronic trading system. We find that, within the DAX stocks, information events occur no more frequently for the most heavily traded stocks than for less frequently traded ones. The differences in trading volume are attributable to differing levels of trading activity of both informed and uninformed market participants. Moreover, the risk of trading with informed market participants turns out to be lowest for the stocks with the highest turnover.
Consistent with the so-called Monday effect, the probability of negative information events is particularly high at the beginning of the week. This finding could be explained by a tendency of managers to release negative information on Fridays after the market closes. A separate analysis of low- and high-volatility trading days shows that on days with higher volatility the trading intensity of both informed and uninformed investors is greater. The probability of trading with better-informed market participants also rises on such days, although this increase is not statistically significant.
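The quantity behind "the risk of trading with informed market participants" is commonly summarized as the probability of informed trading (PIN). In the standard Easley, Kiefer, O'Hara and Paperman parameterisation,

\[
\mathrm{PIN} = \frac{\alpha\,\mu}{\alpha\,\mu + \varepsilon_b + \varepsilon_s},
\]

where $\alpha$ is the probability of an information event, $\mu$ the arrival rate of informed traders, and $\varepsilon_b$, $\varepsilon_s$ the arrival rates of uninformed buyers and sellers; with symmetric uninformed arrivals ($\varepsilon_b = \varepsilon_s = \varepsilon$) this reduces to $\alpha\mu/(\alpha\mu + 2\varepsilon)$.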
The long-run consumption risk model provides a theoretically appealing explanation for prominent asset pricing puzzles, but its intricate structure presents a challenge for econometric analysis. This paper proposes a two-step indirect inference approach that disentangles the estimation of the model's macroeconomic dynamics and the investor's preference parameters. A Monte Carlo study explores the feasibility and efficiency of the estimation strategy. We apply the method to recent U.S. data and provide a critical re-assessment of the long-run risk model's ability to reconcile the real economy and financial markets. This two-step indirect inference approach is potentially useful for the econometric analysis of other prominent consumption-based asset pricing models that are equally difficult to estimate.
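The indirect inference idea can be sketched independently of the paper's specific moments and estimator (all names and the stand-in model below are mine): fit an auxiliary model to the data, then search for structural parameters whose simulated data reproduce the auxiliary estimates.

```python
import numpy as np
from scipy.optimize import minimize

def aux_stats(x):
    """Auxiliary model: mean, variance, first-order autocorrelation."""
    return np.array([x.mean(), x.var(),
                     np.corrcoef(x[:-1], x[1:])[0, 1]])

def simulate(theta, shocks):
    """Stand-in structural model: an AR(1), theta = (mu, rho, sigma)."""
    mu, rho, sigma = theta
    x = np.empty(len(shocks))
    x[0] = mu
    for t in range(1, len(shocks)):
        x[t] = mu + rho * (x[t - 1] - mu) + sigma * shocks[t]
    return x

rng = np.random.default_rng(0)
data = simulate((0.02, 0.9, 0.1), rng.standard_normal(500))  # "observed"
shocks = rng.standard_normal(5000)  # fixed draws keep the objective smooth

def ii_objective(theta):
    """Distance between auxiliary stats on real and simulated data."""
    diff = aux_stats(data) - aux_stats(simulate(theta, shocks))
    return diff @ diff              # identity weighting for simplicity

fit = minimize(ii_objective, x0=np.array([0.0, 0.5, 0.2]),
               method="Nelder-Mead")
```

The paper's two-step variant separates this search into macro-dynamics and preference-parameter stages; the sketch shows only the generic binding-function logic.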
Consumption-based asset pricing with rare disaster risk: a simulated method of moments approach
(2014)
The rare disaster hypothesis suggests that the extraordinarily high postwar U.S. equity premium arose because investors ex ante demanded compensation for unlikely but calamitous risks that they happened not to incur. Although convincing in theory, empirical tests of the rare disaster explanation are scarce. We estimate a disaster-including consumption-based asset pricing model (CBM) using a combination of the simulated method of moments and bootstrapping. We consider several methodological alternatives that differ in the moment matches and in the way disasters are accounted for in the simulated consumption growth and return series. Whichever specification is used, the estimated preference parameters are of an economically plausible size, and the estimation precision is much higher than in previous studies that use the canonical CBM. Our results thus provide empirical support for the rare disaster hypothesis and help substantiate the nexus between the real economy and financial markets implied by the consumption-based asset pricing paradigm.
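In generic notation (not the paper's exact moment set), the SMM estimator solves

\[
\hat{\theta} = \arg\min_{\theta}\; \big[\, m_T - \tilde{m}_S(\theta) \,\big]' \, W \, \big[\, m_T - \tilde{m}_S(\theta) \,\big],
\]

where $m_T$ collects sample moments such as the mean equity premium and risk-free rate, $\tilde{m}_S(\theta)$ their counterparts computed from consumption growth and return series simulated under $\theta$ (including draws of disasters), and $W$ is a weighting matrix; the bootstrapping mentioned in the abstract enters in assessing the sampling variability of this match.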
The long-run consumption risk (LRR) model is a promising approach to resolving prominent asset pricing puzzles. The simulated method of moments (SMM) provides a natural framework to estimate its deep parameters, but caveats concern model solvability and weak identification. We propose a two-step estimation strategy that combines GMM and SMM, for which we elicit informative macroeconomic and financial moment matches from the LRR model structure. In particular, we exploit the persistent serial correlation of consumption and dividend growth and the equilibrium conditions for the market return and risk-free rate, as well as the model-implied predictability of the risk-free rate. We match analytical moments when possible and simulated moments when necessary, and determine the crucial factors required for both identification and reasonable estimation precision. A simulation study (the first in the context of long-run risk modeling) delineates the pitfalls associated with SMM estimation of a non-linear dynamic asset pricing model. Our study provides a blueprint for successful estimation of the LRR model.
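For reference, the persistence being exploited is that of the small predictable growth component in the canonical Bansal and Yaron (2004) LRR dynamics (standard notation; the paper's exact specification may differ in details):

\begin{align*}
\Delta c_{t+1} &= \mu_c + x_t + \sigma_t \eta_{t+1},\\
x_{t+1} &= \rho x_t + \varphi_e \sigma_t e_{t+1},\\
\sigma_{t+1}^2 &= \bar{\sigma}^2 + \nu\,(\sigma_t^2 - \bar{\sigma}^2) + \sigma_w w_{t+1},\\
\Delta d_{t+1} &= \mu_d + \phi x_t + \varphi_d \sigma_t u_{t+1},
\end{align*}

where $x_t$ is the highly persistent long-run risk component ($\rho$ close to one) and $\sigma_t^2$ captures time-varying economic uncertainty.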
We use a new specification from the family of Autoregressive Conditional Duration (ACD) models, based on the Burr distribution, for the econometric analysis of transaction intensities during the initial public offering (IPO) of the Deutsche Telekom share. In this case study, the performance of the newly developed Burr-ACD model is compared with the standard models of Engle and Russell, which are nested as special cases in the Burr-ACD model. We also discuss alternative ways of accounting for intraday seasonalities of trading intensity in ACD models.
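A minimal sketch of the model class (standard Engle-Russell notation): durations $x_i$ between transactions factor into a conditional mean $\psi_i$ and an i.i.d. error, here Burr-distributed rather than exponential or Weibull:

\begin{align*}
x_i &= \psi_i\,\varepsilon_i, \qquad \varepsilon_i \sim \text{i.i.d. Burr},\\
\psi_i &= \omega + \alpha\, x_{i-1} + \beta\, \psi_{i-1},
\end{align*}

so the Engle-Russell specifications emerge as the error distribution's special or limiting cases.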
We analyze exchange rates along with equity quotes for three German firms from New York (NYSE) and Frankfurt (XETRA) during overlapping trading hours to determine where price discovery occurs and how stock prices adjust to an exchange rate shock. Our findings include: (a) the exchange rate is exogenous with respect to the stock prices; (b) exchange rate innovations are more important in understanding the evolution of NYSE prices than of XETRA prices; and (c) most (but not all) of the fundamental or random walk component of firm value is determined in Frankfurt.
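A minimal sketch of the kind of cointegrated system used for such price-discovery questions, with statsmodels' VECM on hypothetical series (file and column names are assumptions; the paper's identification details are not reproduced here):

```python
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

# Hypothetical input: synchronized intraday observations of the
# cross-listed stock's log quotes on NYSE and XETRA (in a common
# currency) plus the log exchange rate.
prices = pd.read_csv("quotes.csv")[["nyse", "xetra", "fx"]]

# One common efficient price implies cointegration among the quotes;
# coint_rank and lag order here are illustrative choices.
model = VECM(prices, k_ar_diff=2, coint_rank=2, deterministic="n")
res = model.fit()
print(res.alpha)  # adjustment loadings: a near-zero row flags the
                  # (weakly) exogenous variable, e.g. the exchange rate
```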
Non-standard errors
(2021)
In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in sample estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: non-standard errors. To study them, we let 164 teams test six hypotheses on the same sample. We find that non-standard errors are sizeable, on par with standard errors. Their size (i) co-varies only weakly with team merits, reproducibility, or peer rating, (ii) declines significantly after peer feedback, and (iii) is underestimated by participants.
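In the paper's terms, the non-standard error is the dispersion of point estimates across teams analyzing the same sample, to be compared with the conventional standard errors the teams report. A minimal sketch with hypothetical numbers:

```python
import numpy as np

# Hypothetical point estimates of one hypothesis from several teams,
# each with its own conventional standard error (illustrative values).
estimates = np.array([0.12, 0.08, 0.15, 0.02, 0.11, 0.09, 0.20, 0.05])
std_errors = np.array([0.04, 0.05, 0.03, 0.06, 0.04, 0.05, 0.04, 0.05])

non_standard_error = estimates.std(ddof=1)  # dispersion across teams (EGP)
mean_standard_error = std_errors.mean()     # average within-team uncertainty (DGP)

# "On par" in the abstract means these two magnitudes are comparable.
print(non_standard_error, mean_standard_error)
```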