"Predictions are difficult, especially when they concern the future," as the well-known saying goes. The last financial crisis is a good example: hardly any analysts or economic experts saw it coming. Since financial crises are fortunately rare, however, it is difficult to develop models that warn of a crash in time.
Volatility is a widely recognized measure of market risk. As volatility is not directly observed, it has to be estimated from market prices, e.g., as the implied volatility from option prices. The volatility index VIX, which makes volatility a tradeable asset in its own right, is computed from near- and next-term put and call options on the S&P 500 with more than 23 and less than 37 days to expiration and non-vanishing bid. In the present paper we quantify the information content of the constituents of the VIX about the volatility of the S&P 500 in terms of the Fisher information matrix. Assuming that observed option prices are centered on the theoretical price provided by Heston's model and perturbed by additive Gaussian noise, we relate their Fisher information matrix to the Greeks in the Heston model. We find that the prices of options contained in the VIX basket allow for reliable estimates of the volatility of the S&P 500 with negligible uncertainty as long as volatility is large enough. Interestingly, if volatility drops below a critical value of roughly 3%, inferences from option prices become imprecise because Vega, the derivative of a European option price with respect to volatility, and thereby the Fisher information, nearly vanish.
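The mechanism behind this breakdown can be sketched numerically. The following is not the paper's Heston computation but a simplified stand-in using the Black-Scholes Vega (whose closed form is elementary), under the same additive-Gaussian-noise assumption: the Fisher information about volatility from one noisy option price is then (Vega / noise standard deviation)². The strike, maturity, and noise level below are illustrative choices, not values from the paper.

```python
import math

def bs_vega(S, K, T, r, sigma):
    """Black-Scholes Vega: derivative of a European option price w.r.t. sigma."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    phi = math.exp(-0.5 * d1**2) / math.sqrt(2.0 * math.pi)  # standard normal pdf
    return S * math.sqrt(T) * phi

def fisher_info(S, K, T, r, sigma, noise_sd):
    """Fisher information about sigma from one observed option price,
    assuming observed price = model price + Gaussian noise with sd noise_sd."""
    return (bs_vega(S, K, T, r, sigma) / noise_sd) ** 2

# An out-of-the-money put, as in the VIX basket (hypothetical parameters):
# at moderate volatility the price is highly informative about sigma,
# but as sigma shrinks, Vega -- and with it the Fisher information -- collapses.
for sigma in (0.20, 0.10, 0.03):
    info = fisher_info(S=100.0, K=90.0, T=30 / 365, r=0.0,
                       sigma=sigma, noise_sd=0.05)
    print(f"sigma = {sigma:.2f}: Fisher information = {info:.3e}")
```

For options away from the money, d1 diverges as sigma shrinks, so the Gaussian density term drives Vega toward zero; this is the simplified analogue of the low-volatility regime in which the paper finds option-price inferences become imprecise.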
The formulation of the Partial Information Decomposition (PID) framework by Williams and Beer in 2010 attracted a significant amount of attention to the problem of defining the redundant (or shared), unique, and synergistic (or complementary) components of the mutual information that a set of source variables provides about a target. This attention resulted in a number of measures proposed to capture these concepts, theoretical investigations into such measures, and applications to empirical data (in particular to datasets from neuroscience). In this Special Issue on "Information Decomposition of Target Effects from Multi-Source Interactions" in Entropy, we have gathered current work on such information decomposition approaches from many of the leading research groups in the field. We begin our editorial by providing the reader with a review of previous information decomposition research, including an overview of the variety of measures proposed and how they have been interpreted and applied in empirical investigations. We then introduce the articles included in the special issue one by one, providing a similar categorisation of these articles into: i. proposals of new measures; ii. theoretical investigations into properties and interpretations of such approaches; and iii. applications of these measures in empirical studies. We finish by providing an outlook on the future of the field.
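The synergistic component that PID measures try to isolate is easiest to see in the XOR example commonly used in this literature. The sketch below (plain Python, standard library only; it is an illustration, not any particular PID measure from the issue) computes the classical mutual-information terms that every proposed decomposition must account for: for XOR, each source alone carries zero information about the target, yet the pair determines it completely, so the full bit is purely synergistic.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Mutual information I(A;B) in bits, estimated from a list of (a, b)
    samples that are assumed to enumerate the joint distribution uniformly."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    mi = 0.0
    for (a, b), count in p_ab.items():
        p_joint = count / n
        mi += p_joint * math.log2(p_joint / ((p_a[a] / n) * (p_b[b] / n)))
    return mi

# XOR target: Y = X1 xor X2, with X1, X2 uniform and independent.
samples = [((x1, x2), x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]

i_joint = mutual_information(samples)                           # I(X1,X2 ; Y)
i_x1 = mutual_information([(x1, y) for (x1, _), y in samples])  # I(X1 ; Y)
i_x2 = mutual_information([(x2, y) for (_, x2), y in samples])  # I(X2 ; Y)
print(i_joint, i_x1, i_x2)  # 1 bit jointly, 0 bits individually
```

Since neither source provides unique or redundant information here, any PID consistent with these marginal quantities must assign the entire bit of I(X1,X2;Y) to the synergistic component, which is why XOR serves as a canonical test case for candidate measures.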