Motivated by the prominent role of electronic limit order book (LOB) markets in today's stock market environment, this paper provides the basis for understanding, reconstructing and adopting the methodology of Hollifield, Miller, Sandas, and Slive (2006) (henceforth HMSS) for estimating the gains from trade in the Xetra LOB market at the Frankfurt Stock Exchange (FSE), in order to evaluate the market's performance in this respect. To this end, the paper examines HMSS's base model in detail and provides a structured recipe for the planned implementation with Xetra LOB data. The contribution of this paper lies in the modification of HMSS's methodology with respect to particularities of the Xetra trading system that are not considered in HMSS's base model. These modifications, expressed as empirical caveats, are essential for ultimately deriving unbiased market-efficiency measures for Xetra.
We explore the pattern of elderly homeownership using microeconomic surveys of 15 OECD countries, merging 60 national household surveys on about 300,000 individuals. In all countries the survey is repeated over time, permitting construction of an international dataset of repeated cross-sectional data. We find that ownership rates decline considerably after age 60 in all countries. However, a large part of the decline depends on cohort effects. Adjusting for them, we find that ownership rates start falling after age 70 and decline by about one percentage point per year after age 75. We find that differences in ownership trajectories across countries are correlated with indicators measuring the degree of market regulation.
This paper introduces adaptive learning and endogenous indexation in the New-Keynesian Phillips curve and studies disinflation under inflation targeting policies. The analysis is motivated by the disinflation performance of many inflation-targeting countries, in particular the gradual Chilean disinflation with temporary annual targets. At the start of the disinflation episode, price-setting firms expect inflation to be highly persistent and opt for backward-looking indexation. As the central bank acts to bring inflation under control, price-setting firms revise their estimates of the degree of persistence. Such adaptive learning lowers the cost of disinflation. This reduction can be exploited by a gradual approach to disinflation. Firms that choose the rate for indexation also re-assess the likelihood that announced inflation targets determine steady-state inflation and adjust indexation of contracts accordingly. A strategy of announcing and pursuing short-term targets for inflation is found to influence the likelihood that firms switch from backward-looking indexation to the central bank's targets. As firms abandon backward-looking indexation, the costs of disinflation decline further. We show that an inflation targeting strategy that employs temporary targets can benefit from lower disinflation costs due to the reduction in backward-looking indexation.
Monetary policy analysts often rely on rules-of-thumb, such as the Taylor rule, to describe historical monetary policy decisions and to compare current policy to historical norms. Analysis along these lines also permits evaluation of episodes where policy may have deviated from a simple rule and examination of the reasons behind such deviations. One interesting question is whether such rules-of-thumb should draw on policymakers' forecasts of key variables such as inflation and unemployment or on observed outcomes. Importantly, deviations of the policy from the prescriptions of a Taylor rule that relies on outcomes may be due to systematic responses to information captured in policymakers' own projections. We investigate this proposition in the context of FOMC policy decisions over the past 20 years using publicly available FOMC projections from the biannual monetary policy reports to the Congress (Humphrey-Hawkins reports). Our results indicate that FOMC decisions can indeed be predominantly explained in terms of the FOMC's own projections rather than observed outcomes. Thus, a forecast-based rule-of-thumb better characterizes FOMC decision-making. We also confirm that many of the apparent deviations of the federal funds rate from an outcome-based Taylor-style rule may be considered systematic responses to information contained in FOMC projections.
Risk transfer with CDOs
(2008)
Modern bank management comprises both classical lending business and transfer of asset risk to capital markets through securitization. Sound knowledge of the risks involved in securitization transactions is a prerequisite for solid risk management. This paper aims to resolve a part of the opaqueness surrounding credit-risk allocation to tranches that represent claims of different seniority on a reference portfolio. In particular, this paper analyzes the allocation of credit risk to different tranches of a CDO transaction when the underlying asset returns are driven by a common macro factor and an idiosyncratic component. Junior and senior tranches are found to be nearly orthogonal, motivating a search for the whereabouts of systematic risk in CDO transactions. We propose a metric for capturing the allocation of systematic risk to tranches. First, in contrast to a widely-held claim, we show that (extreme) tail risk in standard CDO transactions is held by all tranches. While junior tranches take on all types of systematic risk, senior tranches take on almost no non-tail risk. This is in stark contrast to an untranched bond portfolio of the same rating quality, which on average suffers substantial losses for all realizations of the macro factor. Second, given tranching, a shock to the risk of the underlying asset portfolio (e.g. a rise in asset correlation or in mean portfolio loss) has the strongest impact, in relative terms, on the exposure of senior tranche CDO-investors. Our findings can be used to explain major stylized facts observed in credit markets.
We show that the use of correlations for modeling dependencies may lead to counterintuitive behavior of risk measures, such as Value-at-Risk (VaR) and Expected Shortfall (ES), when the risk of very rare events is assessed via Monte-Carlo techniques. The phenomenon is demonstrated for mixture models adapted from credit risk analysis as well as for common Poisson-shock models used in reliability theory. An obvious implication of this finding pertains to the analysis of operational risk. The alleged incentive suggested by the New Basel Capital Accord (Basel II), namely decreasing minimum capital requirements by allowing for less than perfect correlation, may not necessarily be attainable.
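The mechanics of assessing VaR and ES for rare events via Monte-Carlo simulation can be sketched as follows. This is a minimal illustration, not the authors' actual specification: the one-factor mixture model, the parameter values, and the function names are all assumptions made for the example.

```python
import math
import random
from statistics import NormalDist

def simulate_losses(n_sims, n_obligors, rho, pd, seed=42):
    """Simulate portfolio loss fractions under a simple one-factor mixture:
    obligor i defaults when sqrt(rho)*Z + sqrt(1-rho)*eps_i falls below a
    threshold calibrated to the default probability pd. (Illustrative model.)"""
    rng = random.Random(seed)
    threshold = NormalDist().inv_cdf(pd)
    losses = []
    for _ in range(n_sims):
        z = rng.gauss(0.0, 1.0)  # common (macro) factor, shared by all obligors
        defaults = sum(
            1 for _ in range(n_obligors)
            if math.sqrt(rho) * z + math.sqrt(1.0 - rho) * rng.gauss(0.0, 1.0) < threshold
        )
        losses.append(defaults / n_obligors)
    return losses

def var_es(losses, alpha=0.99):
    """Empirical Value-at-Risk and Expected Shortfall at confidence level alpha."""
    s = sorted(losses)
    k = min(int(alpha * len(s)), len(s) - 1)
    var = s[k]
    tail = s[k:]  # losses at or beyond the VaR quantile
    return var, sum(tail) / len(tail)
```

Raising `rho` concentrates defaults in bad draws of the common factor, which is precisely the channel through which correlation assumptions drive the tail-risk estimates discussed above.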
The paper proposes a panel cointegration analysis of the joint development of government expenditures and economic growth in 23 OECD countries. The empirical evidence provides indication of a structural positive correlation between public spending and per-capita GDP which is consistent with the so-called Wagner's law. A long-run elasticity larger than one suggests a more than proportional increase of government expenditures with respect to economic activity. In addition, in the spirit of the law, we find that the correlation is usually higher in countries with lower per-capita GDP, suggesting that the catching-up period is characterized by a stronger development of government activities than in economies at a more advanced stage of development.
Do we measure what we get?
(2008)
Performance measures are intended to enhance the performance of companies by directing the attention of decision makers towards the achievement of organizational goals. Therefore, goal congruence is regarded in the literature as a major factor in the quality of such measures. As reality is affected by many variables, practitioners have tried to achieve a high degree of goal congruence by incorporating an increasing number of these variables into performance measures. However, a goal congruent measure does not automatically lead to superior decisions, because decision makers' restricted cognitive abilities can counteract the intended effects. This paper addresses the interplay between goal congruence and complexity of performance measures, considering cognitively restricted decision makers. Two types of decision quality are derived which allow a differentiated view on the influence of this interplay on decision quality and learning. The simulation experiments based on this differentiation provide results which allow a critical reflection on the costs and benefits of goal congruence and on the assumptions regarding the goal congruence of incentive systems.
A new global crop water model was developed to compute blue (irrigation) water requirements and crop evapotranspiration from green (precipitation) water at a spatial resolution of 5 arc minutes by 5 arc minutes for 26 different crop classes. The model is based on soil water balances performed for each crop and each grid cell. For the first time, a new global data set was applied consisting of monthly growing areas of irrigated crops and related cropping calendars. Crop water use was computed for irrigated land and the period 1998-2002. In this documentation report, the data sets used as model input and the methods used in the model calculations are described, followed by a presentation of the first results for blue and green water use at the global scale, for countries and for specific crops. Additionally, the simulated seasonal distribution of water use on irrigated land is presented. The computed model results are compared to census-based statistical information on irrigation water use and to results of another crop water model developed at FAO.
A data set of monthly growing areas of 26 irrigated crops (MGAG-I) and related crop calendars (CC-I) was compiled for 402 spatial entities. The selection of crops comprised all major food crops including regionally important ones (wheat, rice, maize, barley, rye, millet, sorghum, soybeans, sunflower, potatoes, cassava, sugar cane, sugar beets, oil palm, rapeseed/canola, groundnuts/peanuts, pulses, citrus, date palm, grapes/vine, cocoa, coffee), major water-consuming crops (cotton), and unspecified other crops (other perennial crops, other annual crops, managed grassland). The data set refers to the time period 1998-2002 and has a spatial resolution of 5 arc minutes by 5 arc minutes, which is 8 km by 8 km at the equator. This is the first time that a data set of cell-specific growing areas of irrigated crops with this spatial resolution has been created. The data set is consistent with the irrigated area and water use statistics of the AQUASTAT programme of the Food and Agriculture Organization of the United Nations (FAO) (http://www.fao.org/ag/agl/aglw/aquastat/main/index.stm) and the Global Map of Irrigation Areas (GMIA) (http://www.fao.org/ag/agl/aglw/aquastat/irrigationmap/index.stm). At the cell level, consistency with the cropland extent and cropland harvested area from the Department of Geography and Earth System Science Program of McGill University at Montreal, Quebec, Canada and the Center for Sustainability and the Global Environment (SAGE) of the University of Wisconsin at Madison, USA (http://www.geog.mcgill.ca/~nramankutty/Datasets/Datasets.html and http://geomatics.geog.mcgill.ca/~navin/pub/Data/175crops2000/) was maximised as far as possible. The consistency between the grid product and the input data was quantified. MGAG-I and CC-I are fully consistent with each other at the entity level. For input data other than CC-I, the consistency of MGAG-I at the cell level was calculated.
The consistency of MGAG-I with respect to the area equipped for irrigation (AEI) of GMIA and to the cropland extent of SAGE was characterised by the sum of the cell-specific maximum difference between the MGAG-I monthly total irrigated area and the reference area when the latter was exceeded in the grid cell. The consistency of the harvested area contained in MGAG-I with respect to the SAGE harvested area was characterised by the crop-specific sum of the cell-specific difference between the MGAG-I harvested area and the SAGE harvested area when the latter was exceeded in the grid cell. In all three cases, the sums are the excess areas that should not have been distributed under the assumption that the input data were correct. Globally, this cell-level excess of MGAG-I as compared to AEI is 331,304 ha, or only about 0.12 % of the global AEI of 278.9 Mha found in the original grid. The respective cell-level excess of MGAG-I as compared to the SAGE cropland extent is 32.2 Mha, corresponding to about 2.2 % of the total cropland area. The respective cell-level excess of MGAG-I as compared to the SAGE harvested area is 27 % of the irrigated harvested area, or 11.5 % of the AEI. In a further step, to be published separately, rainfed areas were also compiled in order to form the global data set of monthly irrigated and rainfed crop areas around the year 2000 (MIRCA2000). The data set can be used for global and continental-scale studies on food security and water use. In the future, it will be improved, e.g. with a better spatial resolution of crop calendars and an improved crop distribution algorithm. The MIRCA2000 data set and its full documentation, together with future updates, will be freely available through the following long-term internet site: http://www.geo.uni-frankfurt.de/ipg/ag/dl/forschung/MIRCA/index.html.
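The excess-area metric described above (sum, over grid cells, of the amount by which the maximum monthly irrigated area exceeds a reference area) can be written down compactly. This is a minimal sketch with made-up cell values; the function names are illustrative, not part of the MIRCA2000 codebase.

```python
def cell_excess(monthly_irrigated_areas, reference_area):
    """Excess of the maximum monthly total irrigated area in one grid cell
    over a reference area (e.g. AEI or cropland extent); zero when the
    reference is never exceeded in any month."""
    return max(0.0, max(monthly_irrigated_areas) - reference_area)

def total_excess(cells):
    """Sum of cell-specific excess areas over all grid cells.
    `cells` is an iterable of (monthly_areas, reference_area) pairs."""
    return sum(cell_excess(m, r) for m, r in cells)
```

Under the assumption that the reference data are correct, `total_excess` is the area that should not have been distributed, i.e. the inconsistency figure reported above (e.g. 331,304 ha against AEI).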
The research presented here was funded by the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) within the framework of the research project entitled "Consistent assessment of global green, blue and virtual water fluxes in the context of food production: regional stresses and worldwide teleconnections". The authors thank Navin Ramankutty and Chad Monfreda for making available the current SAGE datasets on cropland extent (Ramankutty et al., 2008) and harvested area (Monfreda et al., 2008) prior to their publication.
The introduction of a common currency as well as the harmonization of rules and regulations in Europe has significantly reduced distance in all its guises. The reduced cost of overcoming space strengthens centripetal forces and should foster consolidation of financial activity. In a national context, as a rule, this led to the emergence of one financial center. Hence, Europeanization of financial and monetary affairs could foretell the relegation of some European financial hubs such as Frankfurt and Paris to third-rank status. Frankfurt's financial history is interesting insofar as it has lost (in the 1870s) and regained (mainly in the 1980s) its preeminent place in the German context. Because Europe is still characterized by local pockets of information-sensitive assets as well as a demand for variety, the national analogy probably does not hold. There is room in Europe for a number of financial hubs of an international dimension, including Frankfurt.
In this paper we consider the dynamics of spot and futures prices in the presence of arbitrage. We propose a partially linear error correction model where the adjustment coefficient is allowed to depend non-linearly on the lagged price difference. We estimate our model using data on the DAX index and the DAX futures contract. We find that the adjustment is indeed nonlinear. The linear alternative is rejected. The speed of price adjustment is increasing almost monotonically with the magnitude of the price difference.
This study develops a novel two-step hedonic approach, which is used to construct a price index for German paintings. This approach enables the researcher to use every single auction record, instead of only those belonging to a sub-sample of selected artists. This results in a substantially larger sample available for research and lowers the selection bias that is inherent in the traditional hedonic and repeat-sales methodologies. Using a unique sample of 61,135 auction records for German artworks created by 5,115 different artists over the period 1985 to 2007, we find that the geometric annual return on German art is just 3.8 percent, with a standard deviation of 17.87 percent. Although our results indicate that art underperforms the market portfolio and is not proportionally rewarded for downside risk, under some circumstances art should be included in an optimal portfolio for diversification purposes.
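For reference, the geometric annual return reported by studies of this kind is the compound growth rate implied by the start and end values of the price index. The sketch below is generic; the index values in the test are illustrative placeholders, not the study's data.

```python
def geometric_annual_return(index_start, index_end, years):
    """Compound (geometric) annual return implied by the start and end
    values of a price index over a holding period of `years` years."""
    return (index_end / index_start) ** (1.0 / years) - 1.0
```

Unlike an arithmetic mean of yearly returns, this measure is unaffected by the path the index takes between the two endpoints.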
While companies have emerged as very proactive donors in the wake of recent major disasters like Hurricane Katrina, it remains unclear whether that corporate generosity generates benefits to the firms themselves. The literature on strategic philanthropy suggests that such philanthropic behavior may be valuable because it can generate direct and indirect benefits to the firm, yet it is not known whether investors interpret donations in this way. We develop hypotheses linking the strategic character of donations to positive abnormal returns. Using event study methodology, we investigate stock market reactions to corporate donation announcements made by 108 US firms in response to Hurricane Katrina. We then use regression analysis to examine whether our hypothesized predictors are associated with positive abnormal returns. Our results show that, overall, corporate donations were linked to neither positive nor negative abnormal returns. We do, however, see that a number of factors moderate the relationship between donation announcements and abnormal stock returns. Implications for theory and practice are discussed.
We estimate the degree of 'stickiness' in aggregate consumption growth (sometimes interpreted as reflecting consumption habits) for thirteen advanced economies. We find that, after controlling for measurement error, consumption growth has a high degree of autocorrelation, with a stickiness parameter of about 0.7 on average across countries. The sticky-consumption-growth model outperforms the random walk model of Hall (1978), and typically fits the data better than the popular Campbell and Mankiw (1989) model. In several countries, the sticky-consumption-growth and Campbell-Mankiw models work about equally well.
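The stickiness parameter described above is, in essence, the slope from regressing consumption growth on its own lag. A minimal OLS sketch follows; it ignores the measurement-error correction the authors apply, and the input series in the test is an artificial placeholder.

```python
def ar1_slope(growth):
    """OLS slope of growth_t on growth_{t-1}: the 'stickiness' parameter
    of the sticky-consumption-growth model (no measurement-error correction)."""
    x, y = growth[:-1], growth[1:]   # lagged and current observations
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var
```

A pure random walk in the level of consumption (Hall, 1978) implies growth that is serially uncorrelated, i.e. a slope near zero; the paper's estimate of about 0.7 is what rejects that benchmark.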
Both the diversification and the focusing of corporate activities are frequently justified by the goal of maximizing firm value. We examine the stock-price effects of 184 acquisitions and 139 divestitures by German corporate groups over the period 1996-2005. Contrary to frequently voiced criticism, corporate diversification does not exert a significantly negative influence on market value. Focusing acquisitions, by contrast, are associated with a significant value premium. The sale of business units generally leads to an increase in market value. Divestitures outside the core business produce a higher (albeit statistically insignificant) value increase than divestitures of core-business activities. Instead of a systematic diversification discount, we thus find a "focus premium" for the German market.
Generally, information provision and certification have been identified as the major economic functions of rating agencies. This paper analyzes whether the "watchlist" (rating review) instrument has extended the agencies' role towards a monitoring position, as proposed by Boot, Milbourn, and Schmeits (2006). Using a data set of Moody's rating history between 1982 and 2004, we find that the overall information content of rating actions has indeed increased since the introduction of the watchlist procedure. Our findings suggest that rating reviews help to establish implicit monitoring contracts between agencies and borrowers and as such enable a finer partition of rating information, thereby contributing to higher information quality.