CFS working paper series
https://gfk-cfs.de/working-papers/
2008, 46
This paper considers a trading game in which sequentially arriving liquidity traders either opt for a market order or for a limit order. One class of traders is considered to have an extended trading horizon, implying their impatience is linked to their trading orientation. More specifically, sellers are considered to have a trading horizon of two periods, whereas buyers only have a single-period trading scope (the extended buyer-horizon case is completely symmetric). Clearly, as the life span of their submitted limit orders is longer, this setting implies sellers are granted a natural advantage in supplying liquidity. This benefit is hampered, however, by the direct competition arising between consecutively arriving sellers. Closed-form characterizations for the order submission strategies are obtained when solving for the equilibrium of this dynamic game. These allow us to examine how these forces affect traders' order placement decisions. Further, the analysis yields insight into the dynamic process of price formation and into the market clearing process of a non-intermediated, order-driven market.
2008, 12
In this paper we consider the dynamics of spot and futures prices in the presence of arbitrage. We propose a partially linear error correction model where the adjustment coefficient is allowed to depend non-linearly on the lagged price difference. We estimate our model using data on the DAX index and the DAX futures contract. We find that the adjustment is indeed nonlinear. The linear alternative is rejected. The speed of price adjustment is increasing almost monotonically with the magnitude of the price difference.
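The nonlinear-adjustment idea can be sketched in a few lines. This is a toy simulation, not the authors' estimator or their DAX data; the tanh adjustment function and all parameters are illustrative assumptions. The spot price corrects toward the lagged futures price at a speed that grows with the size of the mispricing, and a simple binned regression recovers the faster adjustment for large deviations:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
spot = np.zeros(n)
fut = np.zeros(n)
for t in range(1, n):
    z = fut[t - 1] - spot[t - 1]              # lagged spot-futures price difference
    alpha = 0.1 + 0.4 * np.tanh(abs(z))       # adjustment speed grows with |z| (assumed form)
    spot[t] = spot[t - 1] + alpha * z + rng.normal(0, 0.1)
    fut[t] = fut[t - 1] + rng.normal(0, 0.1)

# recover the state-dependent adjustment by splitting on the size of the lagged basis
z = fut[:-1] - spot[:-1]
dspot = np.diff(spot)
small = np.abs(z) < np.median(np.abs(z))
a_small = np.sum(dspot[small] * z[small]) / np.sum(z[small] ** 2)
a_large = np.sum(dspot[~small] * z[~small]) / np.sum(z[~small] ** 2)
print(a_small, a_large)   # the large-deviation bin shows the faster adjustment
```

A linear error correction model would force `a_small` and `a_large` to coincide; the gap between the two bins is the kind of nonlinearity the partially linear specification is designed to capture.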
2008, 49
Innovative automated execution strategies like Algorithmic Trading gain significant market share on electronic market venues worldwide, although their impact on market outcome has not yet been investigated in depth. In order to assess the impact of such concepts, e.g. effects on price formation or the volatility of prices, a simulation environment is presented that provides stylized implementations of algorithmic trading behavior and allows for modeling latency. As simulations can reproduce exactly the same basic situation, the impact of algorithmic trading models can be assessed by comparing simulation runs that include and exclude a trader following an algorithmic trading model, which reveals the impact of Algorithmic Trading on different characteristics of market outcome. The results indicate that large volumes executed by the algorithmic trader increase its impact on market prices. On the other hand, lower latency appears to lower market volatility.
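The run-comparison logic can be illustrated with a toy model (purely illustrative; the paper's simulator, latency modeling, and order book are far richer). Fixing the random seed reproduces "exactly the same basic situation", so differencing a run with and without the algorithmic trader isolates its price impact; the linear impact coefficient and all parameters here are invented for the sketch:

```python
import numpy as np

def run_market(algo_volume, seed=0):
    """Toy price path: common random noise plus a linear impact of the algo's child orders."""
    rng = np.random.default_rng(seed)
    steps = 1000
    impact = 0.001                        # assumed price impact per unit of volume
    child = algo_volume / steps           # parent volume split evenly across the run
    price = 100.0
    path = np.empty(steps)
    for i in range(steps):
        price += rng.normal(0, 0.01) + impact * child
        path[i] = price
    return path

base = run_market(algo_volume=0)          # run excluding the algorithmic trader
big = run_market(algo_volume=5000)        # run including it, same seed
diff = big[-1] - base[-1]
print(diff)                               # the noise cancels; only the algo's impact remains
```

Because both runs draw identical noise, the final-price difference equals the accumulated impact term exactly, which is the clean counterfactual comparison the simulation approach relies on.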
2008, 07
An asymmetric multivariate generalization of the recently proposed class of normal mixture GARCH models is developed. Issues of parametrization and estimation are discussed. Conditions for covariance stationarity and the existence of the fourth moment are derived, and expressions for the dynamic correlation structure of the process are provided. In an application to stock market returns, it is shown that the disaggregation of the conditional (co)variance process generated by the model provides substantial intuition. Moreover, the model exhibits a strong performance in calculating out-of-sample Value-at-Risk measures.
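A minimal univariate two-component normal mixture GARCH(1,1) simulation conveys the core mechanism (the paper develops the asymmetric multivariate case; the parameter values below are illustrative assumptions). Each return is drawn from one of two normal components, and both component variances update off the same realized return, which generates the fat tails the model is designed to capture:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 20000
p = 0.7                                     # weight of the "calm" component
w = np.array([0.05, 0.20])                  # per-component GARCH intercepts
a = np.array([0.05, 0.10])                  # ARCH coefficients
b = np.array([0.85, 0.80])                  # GARCH coefficients
mu = np.array([0.05, -0.05 * p / (1 - p)])  # means chosen so the mixture mean is zero

h = np.array([1.0, 1.0])                    # component conditional variances
r = np.empty(T)
for t in range(T):
    k = 0 if rng.random() < p else 1              # draw the mixture component
    r[t] = mu[k] + np.sqrt(h[k]) * rng.standard_normal()
    h = w + a * r[t] ** 2 + b * h                 # both variances react to the same return

# the mixture-of-variances structure produces excess kurtosis
m2 = np.mean((r - r.mean()) ** 2)
m4 = np.mean((r - r.mean()) ** 4)
print(m4 / m2 ** 2)                         # sample kurtosis, above the normal value of 3
```

The disaggregation mentioned in the abstract corresponds to tracking the two component variance paths `h` separately, e.g. interpreting one as a tranquil regime and the other as a turbulent one.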
2008, 25
Research with Keynesian-style models has emphasized the importance of the output gap for policies aimed at controlling inflation while declaring monetary aggregates largely irrelevant. Critics, however, have argued that these models need to be modified to account for observed money growth and inflation trends, and that monetary trends may serve as a useful cross-check for monetary policy. We identify an important source of monetary trends in form of persistent central bank misperceptions regarding potential output. Simulations with historical output gap estimates indicate that such misperceptions may induce persistent errors in monetary policy and sustained trends in money growth and inflation. If interest rate prescriptions derived from Keynesian-style models are augmented with a cross-check against money-based estimates of trend inflation, inflation control is improved substantially.
2008, 42
Central counterparties
Central counterparties (CCPs) have increasingly become a cornerstone of financial markets infrastructure. We present a model where trades are time-critical, liquidity is limited and there is limited enforcement of trades. We show that a CCP novating trades implements efficient trading behaviour. It is optimal for the CCP to face default losses to achieve the efficient level of trade. To cover these losses, the CCP optimally uses margin calls and, as the default problem becomes more severe, also requires default funds and then imposes position limits.
2008, 35
We study the relation between cognitive abilities and stockholding using the recent Survey of Health, Ageing and Retirement in Europe (SHARE), which has detailed data on wealth and portfolio composition of individuals aged 50+ in 11 European countries and three indicators of cognitive abilities: mathematical, verbal fluency, and recall skills. We find that the propensity to invest in stocks is strongly associated with cognitive abilities, for both direct stock market participation and indirect participation through mutual funds and retirement accounts. Since the decision to invest in less information-intensive assets (such as bonds) is less strongly related to cognitive abilities, we conclude that the association between cognitive abilities and stockholding is driven by information constraints, rather than by features of preferences or psychological traits.
2008, 24
Modern macroeconomics empirically addresses economy-wide incentives behind economic actions by using insights from the way a single representative household would behave. This analytical approach requires that incentives of the poor and the rich are strictly aligned. A challenging complication in empirical analysis is that consumption and income data are typically available at the household level, and individuals living in multimember households have the potential to share goods within the household. The analytical approach of modern macroeconomics would require that intra-household sharing is also strictly aligned across the rich and the poor. Here we have designed a survey method that allows the testing of this stringent property of intra-household sharing and find that it holds: once expenditures for basic needs are subtracted from disposable household income, the household-size economies implied by the remaining household income are the same for the rich and the poor.
2008, 11
This study develops a novel two-step hedonic approach, which is used to construct a price index for German paintings. This approach enables the researcher to use every single auction record, instead of only those records that belong to a sub-sample of selected artists. This results in a substantially larger sample available for research and lowers the selection bias inherent in the traditional hedonic and repeat-sales methodologies. Using a unique sample of 61,135 auction records for German artworks created by 5,115 different artists over the period 1985 to 2007, we find that the geometric annual return on German art is just 3.8 percent, with a standard deviation of 17.87 percent. Although our results indicate that art underperforms the market portfolio and is not proportionally rewarded for downside risk, under some circumstances art should be included in an optimal portfolio for diversification purposes.
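The time-dummy hedonic regression behind such an index, and the geometric annual return read off it, can be sketched on hypothetical data (the regression form and geometric-return formula are standard, but every variable, coefficient, and sample size below is invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
n, T = 2000, 5                              # hypothetical sales spread over 5 years
year = rng.integers(0, T, n)
size = rng.normal(0.0, 1.0, n)              # one quality characteristic, e.g. log canvas area
true_idx = np.array([0.0, 0.05, 0.02, 0.10, 0.15])   # assumed log index levels
logp = true_idx[year] + 0.3 * size + rng.normal(0, 0.2, n)

# OLS of log price on year dummies (first year is the base) plus the characteristic
X = np.column_stack([np.ones(n)]
                    + [(year == t).astype(float) for t in range(1, T)]
                    + [size])
beta, *_ = np.linalg.lstsq(X, logp, rcond=None)

index = np.exp(np.concatenate([[0.0], beta[1:T]]))   # price index, base year = 1
geo_annual = index[-1] ** (1 / (T - 1)) - 1          # geometric (compound) annual return
print(index, geo_annual)
```

The time-dummy coefficients strip out quality differences across the paintings sold, so the exponentiated dummies trace a constant-quality price index, and compounding its endpoints gives the geometric annual return the abstract reports for the real data.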
2008, 54
The paper provides novel insights on the effect of a firm’s risk management objective on the optimal design of risk transfer instruments. I analyze the interrelation between the structure of the optimal insurance contract and the firm’s objective to minimize the required equity it has to hold to accommodate losses in the presence of multiple risks and moral hazard. In contrast to the case of risk aversion and moral hazard, the optimal insurance contract involves a joint deductible on aggregate losses in the present setting.