Non-standard errors
(2021)
In statistics, samples are drawn from a population in a data-generating process (DGP). Standard errors measure the uncertainty in sample estimates of population parameters. In science, evidence is generated to test hypotheses in an evidence-generating process (EGP). We claim that EGP variation across researchers adds uncertainty: non-standard errors. To study them, we let 164 teams test six hypotheses on the same sample. We find that non-standard errors are sizeable, on par with standard errors. Their size (i) co-varies only weakly with team merits, reproducibility, or peer rating, (ii) declines significantly after peer-feedback, and (iii) is underestimated by participants.
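The contrast between the two error types can be illustrated with a stylized sketch. Here a single shared sample stands in for the DGP output, and different "teams" are mimicked by different defensible outlier-trimming choices on that same sample; the sample, the trimming thresholds, and all numbers are invented for illustration and are not the study's actual protocol:

```python
import numpy as np

rng = np.random.default_rng(1)

# One shared sample drawn from the population (the DGP side).
sample = rng.normal(loc=0.5, scale=2.0, size=400)

# Standard error: sampling uncertainty of the mean estimate.
standard_error = sample.std(ddof=1) / np.sqrt(len(sample))

# Non-standard error (stylized): teams analyze the SAME sample but make
# different defensible choices (here, outlier-trimming quantiles); the
# dispersion of their estimates reflects EGP variation.
def team_estimate(x, trim_quantile):
    lo, hi = np.quantile(x, [trim_quantile, 1 - trim_quantile])
    return x[(x >= lo) & (x <= hi)].mean()

team_choices = [0.0, 0.01, 0.025, 0.05, 0.10]
estimates = [team_estimate(sample, q) for q in team_choices]
non_standard_error = np.std(estimates, ddof=1)
```

Both quantities are standard deviations of an estimate, but across different sources of variation: resampling from the population versus re-analyzing the same sample.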
The long-run consumption risk model provides a theoretically appealing explanation for prominent asset pricing puzzles, but its intricate structure presents a challenge for econometric analysis. This paper proposes a two-step indirect inference approach that disentangles the estimation of the model's macroeconomic dynamics and the investor's preference parameters. A Monte Carlo study explores the feasibility and efficiency of the estimation strategy. We apply the method to recent U.S. data and provide a critical re-assessment of the long-run risk model's ability to reconcile the real economy and financial markets. This two-step indirect inference approach is potentially useful for the econometric analysis of other prominent consumption-based asset pricing models that are equally difficult to estimate.
Consumption-based asset pricing with rare disaster risk : a simulated method of moments approach
(2014)
The rare disaster hypothesis suggests that the extraordinarily high postwar U.S. equity premium resulted because investors ex ante demanded compensation for unlikely but calamitous risks that they happened not to incur. Although convincing in theory, empirical tests of the rare disaster explanation are scarce. We estimate a disaster-including consumption-based asset pricing model (CBM) using a combination of the simulated method of moments and bootstrapping. We consider several methodological alternatives that differ in the moment matches and the way disasters are accounted for in the simulated consumption growth and return series. Whichever specification is used, the estimated preference parameters are of an economically plausible size, and the estimation precision is much higher than in previous studies that use the canonical CBM. Our results thus provide empirical support for the rare disaster hypothesis, and help reconcile the nexus between the real economy and financial markets implied by the consumption-based asset pricing paradigm.
The long-run consumption risk (LRR) model is a promising approach to resolve prominent asset pricing puzzles. The simulated method of moments (SMM) provides a natural framework to estimate its deep parameters, but caveats concern model solvability and weak identification. We propose a two-step estimation strategy that combines GMM and SMM, and for which we elicit informative macroeconomic and financial moment matches from the LRR model structure. In particular, we exploit the persistent serial correlation of consumption and dividend growth and the equilibrium conditions for market return and risk-free rate, as well as the model-implied predictability of the risk-free rate. We match analytical moments when possible and simulated moments when necessary and determine the crucial factors required for both identification and reasonable estimation precision. A simulation study – the first in the context of long-run risk modeling – delineates the pitfalls associated with SMM estimation of a non-linear dynamic asset pricing model. Our study provides a blueprint for successful estimation of the LRR model.
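The simulated-method-of-moments logic running through the abstracts above can be sketched in a toy setting. The disaster-including growth process, the choice of moments, the identity weighting matrix, and all parameter values below are illustrative assumptions, not the papers' actual specifications:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical DGP: consumption growth with rare disasters that cut growth.
def simulate_growth(mu, sigma, p_disaster, size, rng):
    normal = rng.normal(mu, sigma, size)
    disaster = rng.random(size) < p_disaster
    return np.where(disaster, normal - 0.15, normal)  # disaster: -15% shock

# "Observed" data generated once from assumed true parameters.
data = simulate_growth(0.02, 0.02, 0.017, 500, rng)

# Moment vector: mean, volatility, third central moment (skewness proxy).
def moments(x):
    return np.array([x.mean(), x.std(), ((x - x.mean()) ** 3).mean()])

m_data = moments(data)

# SMM objective: distance between data moments and simulated moments,
# with an identity weighting matrix for simplicity.
def smm_objective(theta, n_sim=10):
    mu, sigma, p = theta
    if sigma <= 0 or not (0 <= p <= 1):
        return 1e6  # penalize infeasible parameters
    sim_rng = np.random.default_rng(42)  # fix draws across evaluations
    sims = simulate_growth(mu, sigma, p, n_sim * len(data), sim_rng)
    diff = moments(sims) - m_data
    return diff @ diff

res = minimize(smm_objective, x0=[0.0, 0.05, 0.05], method="Nelder-Mead")
```

Holding the simulation draws fixed across objective evaluations (common random numbers) keeps the objective smooth in the parameters, which is what makes the minimization practical.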
This paper studies the market quality of an internalization system which is designed as part of an open limit order book (the Xetra system operated by Deutsche Börse AG). The internalization system (Xetra BEST) guarantees a price improvement over the inside spread in the Xetra order book. We develop a structural model of this unique dual market environment and show that, while adverse selection costs of internalized trades are significantly lower than those of regular order book trades, the realized spreads (the revenue earned by the suppliers of liquidity) are significantly larger. The cost savings of the internalizer are larger than the mandatory price improvement. This suggests that internalization can be profitable both for the customer and the internalizer. JEL Classification: G10
We analyze exchange rates along with equity quotes for 3 German firms from New York (NYSE) and Frankfurt (XETRA) during overlapping trading hours to see where price discovery occurs and how stock prices adjust to an exchange rate shock. Findings include: (a) the exchange rate is exogenous with respect to the stock prices; (b) exchange rate innovations are more important in understanding the evolution of NYSE prices than XETRA prices; and (c) most (but not all) of the fundamental or random walk component of firm value is determined in Frankfurt.
This paper provides an empirical assessment of hypotheses that identify causes of demand-side constraints of individual labour supply. In a comparative study for the USA and the FRG we focus on analysing the effect of productivity gaps (industry wage growth beyond productivity growth), industry investment intensity and regional labour market conditions on individual employment probabilities. Furthermore, we investigate whether demand-side constraints of labour supply can be caused by a spillover from commodity markets. Efficiency wage theory and the theory of inter-industry wage differentials are utilised to derive identifying restrictions that are applicable to the labour supply models for both countries. The econometric contribution of the paper is the derivation and application of a two-step estimation method for the class of simultaneous random effects double hurdle models, of which the labour supply model employed in this paper is a special case. To provide the empirical basis for the comparative study, the Panel Study of Income Dynamics and the German Socio-Economic Panel are linked to the OECD’s International Sectoral Database. JEL classification: C33, C34, J64, O57
Price changes on stock markets can be information-induced, driven by newly arriving information that must be processed, or liquidity-induced, driven by short-term excess supply or demand. These two distinctly caused price reactions are difficult to separate in empirical studies. The model of Easley, Kiefer, O’Hara and Paperman (1996) provides a theoretical basis for capturing liquidity-motivated and information-based trading separately, and moreover opens a way to quantify these magnitudes empirically.
In the present study we use this approach to analyze the trading of German stocks on the IBIS electronic trading system. We find that, among the DAX constituents, information events do not occur more frequently for the most heavily traded stocks than for less frequently traded ones. Differences in trading volume are attributable to differing levels of trading activity by both informed and uninformed market participants. Furthermore, the risk of trading against informed market participants turns out to be lowest for the stocks with the highest turnover.
Consistent with the so-called Monday effect, the probability of negative information events is particularly high at the beginning of the week. This finding could be explained by a tendency of managers to release negative news on Fridays after the close of trading. A separate analysis of low- and high-volatility trading days shows that on days with higher volatility the trading intensity of both informed and uninformed investors is greater. The probability of trading against better-informed market participants also rises on such days, although this increase is not statistically significant.