Center for Financial Studies (CFS)
Against the background of the European debt crisis, the Research Center SAFE issued a call for papers in the fall of 2013 on the topic “Austerity and Economic Growth: Concepts for Europe”, with the objective of soliciting research proposals on the relationship between austerity, debt sustainability, and growth. Each of the five funded projects produced an academic paper and a shortened, non-technical policy brief. These policy papers are collected here in a volume of policy letters edited by Alfons Weichenrieder.
The first paper, by Alberto Alesina, Carlo Favero and Francesco Giavazzi, looks into the question of how fiscal consolidations influence the real economy. Harris Dellas and Dirk Niepelt emphasize that fiscal austerity is a signal that investors use to tell apart governments with high and low default costs, and accordingly with a high or low probability of repayment. The paper by Benjamin Born, Gernot Müller and Johannes Pfeiffer looks at the impact of austerity measures on government bond spreads. Oscar Jorda and Alan M. Taylor, in the fourth contribution, question whether the narrative records of fiscal consolidation plans are really exogenous. The final study, by Enrique Mendoza, Linda Tesar and Jing Zhang, suggests that fiscal consolidation should rely largely on expenditure cuts rather than on tax increases, which may fail when fiscal space is exhausted.
Austerity
(2014)
We shed light on the function, properties and optimal size of austerity using the standard sovereign model augmented to include incomplete information about credit risk. Austerity is defined as the shortfall of consumption from the level desired by a country and supported by its repayment capacity. We find that austerity serves as a tool for securing a more favorable loan package; that it is associated with over-investment even when investment does not create collateral; and that low-risk borrowers may prefer more severe austerity to less severe austerity. These findings imply that the amount of fresh funds obtained by a sovereign is not a reliable measure of the austerity suffered, and that austerity may actually be associated with higher growth. Our analysis accommodates costly signalling for gaining credibility and also assigns a novel role to spending multipliers in the determination of optimal austerity.
Asymmetric social norms
(2017)
Studies of cooperation in infinitely repeated matching games focus on homogeneous economies, where full cooperation is efficient and any defection is collectively sanctioned. Here we study heterogeneous economies where occasional defections are part of efficient play, and show how to support those outcomes through contagious punishments.
An asymmetric multivariate generalization of the recently proposed class of normal mixture GARCH models is developed. Issues of parametrization and estimation are discussed. Conditions for covariance stationarity and the existence of the fourth moment are derived, and expressions for the dynamic correlation structure of the process are provided. In an application to stock market returns, it is shown that the disaggregation of the conditional (co)variance process generated by the model provides substantial intuition. Moreover, the model exhibits a strong performance in calculating out-of-sample Value-at-Risk measures.
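For intuition on the Value-at-Risk step mentioned above: once the conditional return distribution is a finite normal mixture, a one-step VaR is simply a quantile of that mixture, which can be found by inverting the mixture CDF numerically. A minimal sketch with illustrative (not estimated) parameters for a two-component mixture — the weights, means, and volatilities below are made up for the example, not taken from the paper:

```python
import math

def mixture_cdf(x, weights, means, sigmas):
    """CDF of a finite normal mixture evaluated at x."""
    return sum(w * 0.5 * (1.0 + math.erf((x - m) / (s * math.sqrt(2.0))))
               for w, m, s in zip(weights, means, sigmas))

def mixture_var(alpha, weights, means, sigmas, lo=-50.0, hi=50.0, tol=1e-10):
    """alpha-quantile of the mixture, found by bisection on the CDF.
    For returns, the 1% quantile is the (negative) 99% VaR level."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mixture_cdf(mid, weights, means, sigmas) < alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative two-component mixture: a calm and a turbulent regime.
weights, means, sigmas = [0.8, 0.2], [0.05, -0.2], [1.0, 3.0]
var_1pct = mixture_var(0.01, weights, means, sigmas)
```

The asymmetry of the mixture (the turbulent component has a negative mean and a large volatility) is what lets such models capture fat left tails better than a single Gaussian.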
Based on a unique data set of driving behavior we find direct evidence that private information has significant effects on contract choice and risk in automobile insurance. The number of car rides and the relative distance driven on weekends are significant risk factors. While the number of car rides and average speeding are negatively related to the level of liability coverage, the number of car rides and the relative distance driven at night are positively related to the level of first-party coverage. These results indicate multiple and counteracting effects of private information based on risk preferences and driving behavior.
We analyze the equilibrium in a two-tree (sector) economy with two regimes. The output of each tree is driven by a jump-diffusion process, and a downward jump in one sector of the economy can (but need not) trigger a shift to a regime where the likelihood of future jumps is generally higher. Furthermore, the true regime is unobservable, so that the representative Epstein-Zin investor has to extract the probability of being in a certain regime from the data. These two channels help us to match the stylized facts of countercyclical and excessive return volatilities and correlations between sectors. Moreover, the model reproduces the predictability of stock returns in the data without generating consumption growth predictability. The uncertainty about the state also reduces the slope of the term structure of equity. We document that heterogeneity between the two sectors with respect to shock propagation risk can lead to highly persistent aggregate price-dividend ratios. Finally, the possibility of jumps in one sector triggering higher overall jump probabilities boosts jump risk premia while uncertainty about the regime is the reason for sizeable diffusive risk premia.
This paper analyzes how the combination of borrowing constraints and idiosyncratic risk affects the equity premium in an overlapping generations economy. I find that introducing a zero-borrowing constraint in an economy without idiosyncratic risk increases the equity premium by 70 percent, which means that the mechanism described in Constantinides, Donaldson, and Mehra (2002) is dampened because of the large number of generations and production. With social security the effect of the zero-borrowing constraint is a lot weaker. More surprisingly, when I introduce idiosyncratic labor income risk in an economy without a zero-borrowing constraint, the equity premium increases by 50 percent, even though the income shocks are independent of aggregate risk and are not permanent. The reason is that idiosyncratic risk makes the endogenous natural borrowing limits much tighter, so that they have a similar effect to an exogenously imposed zero-borrowing constraint. This intuition is confirmed when I add idiosyncratic risk in an economy with a zero-borrowing constraint: neither the equity premium nor the Sharpe ratio change, because the zero-borrowing constraint is already tighter than the natural borrowing limits that result when idiosyncratic risk is added.
We study consumption-portfolio and asset pricing frameworks with recursive preferences and unspanned risk. We show that in both cases, portfolio choice and asset pricing, the value function of the investor/representative agent can be characterized by a specific semilinear partial differential equation. To date, the solution to this equation has mostly been approximated by Campbell-Shiller techniques, without addressing general issues of existence and uniqueness. We develop a novel approach that rigorously constructs the solution by a fixed point argument. We prove that under regularity conditions a solution exists and establish a fast and accurate numerical method to solve consumption-portfolio and asset pricing problems with recursive preferences and unspanned risk. Our setting is not restricted to affine asset price dynamics. Numerical examples illustrate our approach.
In this paper, we study the effect of proportional transaction costs on consumption-portfolio decisions and asset prices in a dynamic general equilibrium economy with a financial market that has a single-period bond and two risky stocks, one of which incurs the transaction cost. Our model has multiple investors with stochastic labor income, heterogeneous beliefs, and heterogeneous Epstein-Zin-Weil utility functions. The transaction cost gives rise to endogenous variations in liquidity. We show how equilibrium in this incomplete-markets economy can be characterized and solved for in a recursive fashion. We have three main findings. One, costs for trading a stock lead to a substantial reduction in the trading volume of that stock, but have only a small effect on the trading volume of the other stock and the bond. Two, even in the presence of stochastic labor income and heterogeneous beliefs, transaction costs have only a small effect on the consumption decisions of investors, and hence, on equity risk premia and the liquidity premium. Three, the effects of transaction costs on quantities such as the liquidity premium are overestimated in partial equilibrium relative to general equilibrium.
We study the life cycle of portfolio allocation by following, for 15 years, a large random sample of Norwegian households, using error-free data on all components of households’ investments drawn from the Tax Registry. Both participation in the stock market and the portfolio share in stocks have important life cycle patterns. Participation is limited at all ages but follows a hump-shaped profile which peaks around retirement; the share invested in stocks among participants is high and flat for the young, but investors start reducing it as retirement comes into sight. Our data suggest a double adjustment as people age: a rebalancing of the portfolio away from stocks as they approach retirement, and stock market exit after retirement. Existing calibrated life cycle models can account for the first behavior but not the second. We show that incorporating a reasonable per-period participation cost into these models can generate limited participation among the young but not enough exit from the stock market among the elderly. Also adding a small probability of a large loss when investing in stocks produces a joint pattern of participation and of the risky asset share that resembles the one observed in the data. A structural estimation of the relevant parameters that simultaneously targets the portfolio, participation and asset accumulation age profiles of the model reveals that the parameter combination that fits the data best is one with relatively large risk aversion, a small participation cost, and a yearly large-loss probability in line with the frequency of stock market crashes in Norway.
Innovative automated execution strategies like algorithmic trading are gaining significant market share on electronic market venues worldwide, although their impact on market outcomes has not yet been investigated in depth. In order to assess the impact of such concepts, e.g. effects on price formation or the volatility of prices, a simulation environment is presented that provides stylized implementations of algorithmic trading behavior and allows for modeling latency. As simulations allow for reproducing exactly the same basic situation, the impact of algorithmic trading models can be assessed by comparing simulation runs that include or exclude a trader following an algorithmic trading model. By this means the impact of algorithmic trading on different characteristics of market outcomes can be assessed. The results indicate that large volumes to be executed by the algorithmic trader have an increasing impact on market prices. On the other hand, lower latency appears to lower market volatility.
We outline a procedure for consistent estimation of marginal and joint default risk in the euro area financial system. We interpret the latter risk as the intrinsic financial system fragility and derive several systemic fragility indicators for euro area banks and sovereigns, based on CDS prices. Our analysis documents that although the fragility of the euro area banking system had started to deteriorate before Lehman Brothers' bankruptcy filing, investors did not expect the crisis to affect euro area sovereigns' solvency until September 2008. Since then, and especially after November 2009, joint sovereign default risk has outpaced the rise of systemic risk within the banking system.
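As a back-of-the-envelope companion to the CDS-based indicators described above: under the standard "credit triangle" approximation (a textbook rule of thumb, not the paper's consistent joint estimation procedure), a marginal risk-neutral default intensity can be read off a CDS spread as spread divided by one minus the recovery rate. A sketch with illustrative numbers:

```python
def implied_default_intensity(cds_spread_bps, recovery_rate):
    """Credit-triangle approximation: risk-neutral default intensity
    ~ spread / (1 - recovery). Spread is given in basis points."""
    spread = cds_spread_bps / 10_000.0
    return spread / (1.0 - recovery_rate)

# Illustrative numbers (not from the paper): a 200 bp sovereign CDS, 40% recovery.
lam = implied_default_intensity(200.0, 0.40)
```

Joint default risk is then the harder part: marginal intensities alone say nothing about dependence, which is why the paper's consistent estimation of the joint distribution matters.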
Financial markets embed expectations of central bank policy into asset prices. This paper compares two approaches that extract a probability density of market beliefs. The first is a simulated-moments estimator for option volatilities described in Mizrach (2002); the second is a new approach developed by Haas, Mittnik and Paolella (2004a) for fat-tailed conditionally heteroskedastic time series. In an application to the 1992-93 European Exchange Rate Mechanism crises, we find that both the options and the underlying exchange rates provide useful information for policy makers. JEL Classification: G12, G14, F31.
In more and more situations, artificially intelligent algorithms have to model humans’ (social) preferences on whose behalf they increasingly make decisions. They can learn these preferences through the repeated observation of human behavior in social encounters. In such a context, do individuals adjust the selfishness or prosociality of their behavior when it is common knowledge that their actions produce various externalities through the training of an algorithm? In an online experiment, we let participants’ choices in dictator games train an algorithm. Thereby, they create an externality on the future decision making of an intelligent system that affects future participants. We show that individuals who are aware of the consequences of their training on the payoffs of a future generation behave more prosocially, but only when they bear the risk of being harmed themselves by future algorithmic choices. In that case, the externality of artificial intelligence training induces a significantly higher share of egalitarian decisions in the present.
With Big Data, decisions made by machine learning algorithms depend on training data generated by many individuals. In an experiment, we identify the effect of varying individual responsibility for the moral choices of an artificially intelligent algorithm. Across treatments, we manipulated the sources of training data and thus the impact of each individual’s decisions on the algorithm. Diffusing such individual pivotality for algorithmic choices increased the share of selfish decisions and weakened revealed prosocial preferences. This does not result from a change in the structure of incentives. Rather, our results show that Big Data offers an excuse for selfish behavior through lower responsibility for one’s and others’ fate.
We provide a novel benefit of "Alternative Risk Transfer" (ART) products with parametric or index triggers. When a reinsurer has private information about his client's risk, outside reinsurers will price their reinsurance offer less aggressively. Outsiders are subject to adverse selection, as only a high-risk insurer might find it optimal to change reinsurers. This creates a hold-up problem that allows the incumbent to extract an information rent. An information-insensitive ART product with a parametric or index trigger is not subject to adverse selection. It can therefore be used to compete against an informed reinsurer, thereby reducing the premium that a low-risk insurer has to pay for the indemnity contract. However, ART products have an interesting fate in our model: they are useful, but not used in equilibrium, because of basis risk. JEL Classification: D82, G22
Employing the art-collection records of Burton and Emily Hall Tremaine, we consider whether early-stage art investors can be understood as venture capitalists. Because the Tremaines bought artists’ work very close to an artwork’s creation, with 69% of the works in our study purchased within one year of their creation, their collecting practice can best be framed as venture-capital investment in art. The Tremaines also illustrate art collecting as social-impact investment, owing to their combined strategy of art sales and museum donations, for which the collectors received a tax credit under US rules. Because the Tremaines’ museum donations took place at a time when US marginal tax rates ranged from 70% to 91%, donations achieved near “donation parity” with market sales, creating a parallel to ESG investment in the management of multiple forms of value.
This chapter analyzes the risk and return characteristics of investments in artists from the Middle East and Northern Africa (MENA) region over the sample period 2000 to 2012. Using hedonic regression modeling, we create an annual index based on 3,544 paintings created by 663 MENA artists. Our empirical results show that investing in such a hypothetical index provides strong financial returns. While the results show exponential growth in sales since 2006, the geometric annual return of the MENA art index is a stable 13.9 percent over the whole period. We conclude that investing in MENA paintings would have been profitable, but also note that we examined the performance of an emerging art market that has so far seen only an upward trend without any correction.
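The headline figure above is a compound (geometric) annual growth rate of the index. A minimal sketch of that calculation, using made-up index levels rather than the paper's data (one level per year, 2000-2012):

```python
def geometric_annual_return(index_levels):
    """Compound annual growth rate implied by an index series
    with one level per year: (end/start)^(1/years) - 1."""
    years = len(index_levels) - 1
    return (index_levels[-1] / index_levels[0]) ** (1.0 / years) - 1.0

# Illustrative series: an index growing 13.9% per year over 12 years.
levels = [100.0 * 1.139 ** t for t in range(13)]
cagr = geometric_annual_return(levels)
```

Note that the geometric mean smooths over the path: the same 13.9 percent is consistent with the flat early years and explosive post-2006 growth the abstract describes.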
The pressure on tax haven countries to engage in tax information exchange shows first effects on capital markets. Empirical research suggests that investors do react to information exchange and partially withdraw from previous secrecy jurisdictions that open up to information exchange. While some of the economic literature emphasizes possible positive effects of tax havens, the present paper argues that proponents of positive effects may have started from questionable premises, in particular when it comes to the effects that tax havens have for emerging markets like China and India.
The issuance of sustainability-linked loans (SLLs) has grown exponentially in recent years. Using a scoring methodology, we examine the underlying key performance indicators of a large sample of SLLs and analyze whether their design creates effective incentives for improving corporate sustainability performance. We demonstrate that the majority of loans fails to meet key requirements that would make them credible instruments for generating effective sustainability incentives. These findings call into question the actual sustainability impact that may be achieved through the issuance of ESG-linked debt.
Earlier studies of the seigniorage inflation model have found that the high-inflation steady state is not stable under adaptive learning. We reconsider this issue and analyze the full set of solutions for the linearized model. Our main focus is on stationary hyperinflationary paths near the high-inflation steady state. The hyperinflationary paths are stable under learning if agents can utilize contemporaneous data. However, in an economy populated by a mixture of agents, some of whom only have access to lagged data, stable inflationary paths emerge only if the proportion of agents with access to contemporaneous data is sufficiently high. JEL Classification: C62, D83, D84, E31
Are rules and boundaries sufficient to limit harmful central bank discretion? Lessons from Europe
(2014)
Marvin Goodfriend’s (2014) insightful, informative and provocative work explains concisely and convincingly why the Fed needs rules and boundaries. This paper reviews the broader institutional design problem regarding the effectiveness of the central bank in practice and confirms the need for rules and boundaries. The framework proposed for improving the Fed incorporates key elements that have already been adopted in the European Union. The case of ELA provision by the ECB and the Central Bank of Cyprus to Marfin-Laiki Bank during the crisis, however, suggests that the existence of rules and boundaries may not be enough to limit harmful discretion. During a crisis, novel interpretations of the legal authority of the central bank may be introduced to create a grey area that might be exploited to justify harmful discretionary decisions even in the presence of rules and boundaries. This raises the question of how to ensure that rules and boundaries are respected in practice.
This paper investigates whether preference interactions can explain why risk preferences change over time and across contexts. We conduct an experiment in which subjects accept or reject gambles involving real money gains and losses. We introduce within-subject variation by alternating subjectively liked music and disliked music in the background. We find that favourite music increases risk-taking, and disliked music suppresses risk-taking, compared to a baseline of no music. Several theories in psychology propose mechanisms by which mood affects risk-taking, but none of them fully explains our results. The results are, however, consistent with preference complementarities that extend to risk preference.
Are product spreads useful for forecasting? An empirical evaluation of the Verleger hypothesis
(2013)
Notwithstanding a resurgence in research on out-of-sample forecasts of the price of oil in recent years, there is one important approach to forecasting the real price of oil which has not been studied systematically to date. This approach is based on the premise that demand for crude oil derives from the demand for refined products such as gasoline or heating oil. Oil industry analysts such as Philip Verleger and financial analysts widely believe that there is predictive power in the product spread, defined as the difference between suitably weighted refined product market prices and the price of crude oil. Our objective is to evaluate this proposition. We derive from first principles a number of alternative forecasting model specifications involving product spreads and compare these models to the no-change forecast of the real price of oil. We show that not all product spread models are useful for out-of-sample forecasting, but some models are, even at horizons between one and two years. The most accurate model is a time-varying parameter model of gasoline and heating oil spot spreads that allows the marginal product market to change over time. We document MSPE reductions as high as 20% and directional accuracy as high as 63% at the two-year horizon, making product spread models a good complement to forecasting models based on economic fundamentals, which work best at short horizons.
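The forecast comparison described above comes down to computing the mean squared prediction error (MSPE) of a candidate model against the no-change (random walk) benchmark. A stylized sketch with synthetic data — the simple linear model form and the pass-through coefficient `beta` are illustrative assumptions, not the paper's time-varying parameter specification:

```python
def mspe(forecasts, actuals):
    """Mean squared prediction error."""
    return sum((f - a) ** 2 for f, a in zip(forecasts, actuals)) / len(actuals)

def no_change_forecast(prices, h):
    """Random-walk benchmark: the h-step-ahead forecast is today's price."""
    return prices[:-h]

def spread_forecast(prices, spreads, h, beta):
    """Stylized spread model: the oil price is predicted to move toward
    product prices in proportion to the current product spread."""
    return [p + beta * s for p, s in zip(prices[:-h], spreads[:-h])]

# Synthetic illustration in which the spread perfectly predicts the change,
# so the spread model's MSPE drops to zero while the random walk's does not.
spreads = [2.0, -1.0, 3.0, 0.5, -2.0, 1.0, 2.5, -0.5]
prices = [50.0]
for s in spreads[:-1]:
    prices.append(prices[-1] + 0.5 * s)
h = 1
actuals = prices[h:]
mspe_rw = mspe(no_change_forecast(prices, h), actuals)
mspe_sp = mspe(spread_forecast(prices, spreads, h, beta=0.5), actuals)
```

The paper's headline "MSPE reductions as high as 20%" corresponds to a ratio mspe_sp / mspe_rw of about 0.8 at the two-year horizon; the synthetic data here are rigged to make the mechanism transparent rather than realistic.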
The objective of this study is to determine whether specific industries across countries or within countries are more likely to reach a stage of profitability and make a successful exit. In particular, we assess whether firms in certain industries are more prone to exit via IPO, be acquired, or exit through a leveraged buy-out. We are also interested in analyzing whether substantial differences across industries and countries arise when looking separately at the success rates of firms that received venture funding at the early seed and start-up stages, vis-à-vis firms that received funding at later stages. Our results suggest that, while some of the differences in performance can be explained by country-specific factors, there are also important idiosyncratic differences across industries: in particular, firms in the biotech and the medical / health / life science sectors tend to be significantly more likely to have a successful exit via IPO, while firms in the computer industry and communications and media are more prone to exit via merger or acquisition. Key differences across industries also emerge when considering infant versus mature firms, and their preferred exit. JEL Classification: G24, G3
This paper aims to analyze the impact of different types of venture capitalists on the performance of their portfolio firms around and after the IPO. We thereby investigate the hypothesis that the different governance structures, objectives and track records of different types of VCs have a significant impact on their respective IPOs. We explore this hypothesis by using a data set embracing all IPOs which occurred on Germany's Neuer Markt. Our main finding is that significant differences among the different VCs exist. Firms backed by independent VCs perform significantly better two years after the IPO compared to all other IPOs, and their share prices fluctuate less than those of their counterparts in this period of time. Evidently, independent VCs, which concentrated mainly on growth stocks (low book-to-market ratio) and large firms (high market value), were able to add value by producing less post-IPO idiosyncratic risk and more return (after controlling for all other effects). On the contrary, firms backed by public VCs (being small and having a high book-to-market ratio) showed relative underperformance. JEL Classification: G10, G14, G24. (29 January 2004)
This paper sets out to analyze the influence of different types of venture capitalists on the performance of their portfolio firms around and after IPO. We investigate the hypothesis that different governance structures, objectives, and track records of different types of VCs have a significant impact on their respective IPOs. We explore this hypothesis using a data set embracing all IPOs that have occurred on Germany's Neuer Markt. Our main finding is that significant differences among the different VCs exist. Firms backed by independent VCs perform significantly better two years after IPO as compared to all other IPOs, and their share prices fluctuate less than those of their counterparts in this period of time. On the contrary, firms backed by public VCs show relative underperformance. The fact that this could occur implies that market participants did not correctly assess the role played by different types of VCs.
Traditional least squares estimates of the responsiveness of gasoline consumption to changes in gasoline prices are biased toward zero, given the endogeneity of gasoline prices. A seemingly natural solution to this problem is to instrument for gasoline prices using gasoline taxes, but this approach tends to yield implausibly large price elasticities. We demonstrate that anticipatory behavior provides an important explanation for this result. We provide evidence that gasoline buyers increase gasoline purchases before tax increases and delay gasoline purchases before tax decreases. This intertemporal substitution renders the tax instrument endogenous, invalidating conventional IV analysis. We show that including suitable leads and lags in the regression restores the validity of the IV estimator, resulting in much lower and more plausible elasticity estimates. Our analysis has implications more broadly for the IV analysis of markets in which buyers may store purchases for future consumption.
Exploratory study commissioned by the Bundesministerium für Bildung und Forschung (Federal Ministry of Education and Research): The recent financial crisis and the subsequent sovereign debt crisis have left deep marks, both economically and socially. They have also exposed very clear gaps in research. Building on the current state of research, the aim of this study is to identify further research needs in the areas essentially connected with financial crises. Five research areas, each with subtopics, are proposed. These five research areas follow directly from the structure and mechanisms of the financial and sovereign debt crisis. Particular attention is paid to the economic and regulatory policy relevance of the topics and to the fact that answering many of the questions requires interdisciplinary collaboration.
Financial crises are inherently tied to the banking model. Because banks are interconnected, problems at individual institutions can spread to others. These systemic risks can destabilize the entire financial system. Through lending and the provision of transaction systems, the financial system occupies a pivotal position in an economy, which can make stabilizing policy interventions necessary. Interventions to restore stability can be very costly and, as recently and strikingly demonstrated, can destabilize the stabilizing states themselves. The alternative ex-ante interventions concern, besides monetary policy, above all regulatory measures. Of particular importance are the corporate governance of financial institutions and the provision of information and transparency within the financial sector. In recent years, against the backdrop of regulation, a parallel shadow banking system has moreover grown up that is only marginally less significant than the traditional banking system.
While the broad relationships and effects in the individual areas are known, further research is indispensable for the deep understanding needed as a basis for preventing or containing future crises and for assessing the consequences of regulation.
The study examines whether the legislature achieved the goals it pursued with the ARUG reform of the law governing challenges to resolutions of the general meeting (HV-Beschlüsse). Beyond that, it traces the development of actions challenging defective resolutions since the authors' previous study on the subject. Our study shows that since the ARUG entered into force there has been a marked decline in actions challenging resolutions and in release proceedings (Freigabeverfahren). By contrast, the share of actions and side interventions brought by "professional plaintiffs" has remained unchanged, while the number of persons in the group of "professional plaintiffs" has grown further. In this respect, the ARUG has had no discernible effect...
We investigate the relationship between anchoring and the emergence of bubbles in experimental asset markets. We show that setting a visual anchor at the fundamental value (FV) in the first period only is sufficient to eliminate or to significantly reduce bubbles in laboratory asset markets. If no FV-anchor is set, bubble-crash patterns emerge. Our results indicate that bubbles in laboratory environments are primarily sparked in the first period. If prices are initiated around the FV, they stay close to the FV over the entire trading horizon. Our insights can be related to initial public offerings and the interaction between prices set on pre-opening markets and subsequent intra-day price dynamics.
This paper constructs a dynamic model of health insurance to evaluate the short- and long-run effects of policies that prevent firms from conditioning wages on the health conditions of their workers, and that prevent health insurance companies from charging individuals with adverse health conditions higher insurance premia. Our study is motivated by recent US legislation that has tightened regulations on wage discrimination against workers with poorer health status (the Americans with Disabilities Act, ADA, and the ADA Amendments Act of 2008, ADAAA) and that will prohibit health insurance companies from charging different premiums for workers of different health status starting in 2014 (Patient Protection and Affordable Care Act, PPACA). In the model, a trade-off arises between the static gains from better insurance against poor health induced by these policies and their adverse dynamic incentive effects on household efforts to lead a healthy life. Using household panel data from the PSID we estimate and calibrate the model and then use it to evaluate the static and dynamic consequences of no-wage discrimination and no-prior conditions laws for the evolution of the cross-sectional health and consumption distribution of a cohort of households, as well as ex-ante lifetime utility of a typical member of this cohort. In our quantitative analysis we find that although a combination of both policies is effective in providing full consumption insurance period by period, it is suboptimal to introduce both policies jointly since such a policy innovation induces a more rapid deterioration of the cohort health distribution over time. This is due to the fact that the combination of both laws severely undermines the incentives to lead healthier lives.
The resulting negative effects on health outcomes in society more than offset the static gains from better consumption insurance so that expected discounted lifetime utility is lower under both policies, relative to only implementing wage nondiscrimination legislation.
Analyzing interest rate risk: stochastic volatility in the term structure of government bond yields
(2009)
We propose a Nelson-Siegel type interest rate term structure model in which the underlying yield factors follow autoregressive processes with stochastic volatility. The factor volatilities parsimoniously capture risk inherent to the term structure and are associated with the time-varying uncertainty of the yield curve’s level, slope and curvature. Estimating the model on U.S. government bond yields with Markov chain Monte Carlo techniques, we find that the factor volatilities follow highly persistent processes. We show that slope and curvature risk have explanatory power for bond excess returns and illustrate that the yield and volatility factors are closely related to industrial capacity utilization, inflation, monetary policy and employment growth. JEL Classification: C5, E4, G1
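For reference, the static Nelson-Siegel yield curve on which such factor models build can be sketched in a few lines; the parameter values below are purely illustrative, not estimates from the paper:

```python
from math import exp

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel yield for maturity tau (in years).

    beta0 = level, beta1 = slope, beta2 = curvature, lam = decay rate.
    """
    x = lam * tau
    slope_load = (1 - exp(-x)) / x      # loading on the slope factor
    curv_load = slope_load - exp(-x)    # loading on the curvature factor
    return beta0 + beta1 * slope_load + beta2 * curv_load

# Illustrative parameters (not estimates from the paper):
curve = [nelson_siegel(t, 0.05, -0.02, 0.01, 0.6) for t in (0.25, 1, 5, 10)]
```

With a negative slope coefficient the curve rises from the short end toward the long-run level beta0; in the paper's dynamic version the three betas follow autoregressive processes with stochastic volatility.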
This paper proposes the Shannon entropy as an appropriate one-dimensional measure of behavioural trading patterns in financial markets. The concept is applied to the illustrative example of algorithmic vs. non-algorithmic trading and empirical data from Deutsche Börse's electronic cash equity trading system, Xetra. The results reveal pronounced differences between algorithmic and non-algorithmic traders. In particular, trading patterns of algorithmic traders exhibit a medium degree of regularity while non-algorithmic trading tends towards either very regular or very irregular trading patterns. JEL Classification: C40, D0, G14, G15, G20
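The entropy measure can be illustrated with a toy sketch; the action labels and sequences below are hypothetical, and the k-gram construction is one simple way to turn an order-flow sequence into a distribution over patterns (the paper's exact construction may differ):

```python
from collections import Counter
from math import log2

def pattern_entropy(events, k=2):
    """Shannon entropy (in bits) of the distribution of k-grams in a
    sequence of discrete events, e.g. a trader's order actions."""
    grams = [tuple(events[i:i + k]) for i in range(len(events) - k + 1)]
    counts = Counter(grams)
    n = len(grams)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical action sequences (not Xetra data):
regular = ["submit", "cancel"] * 10     # strictly alternating pattern
irregular = ["submit", "cancel", "modify", "submit", "cancel",
             "modify", "submit", "modify", "cancel", "submit"]
```

A strictly alternating sequence yields a low pattern entropy and an erratic one a high value; the paper's finding is that algorithmic traders sit in a medium range between these two extremes.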
In this exploratory article, we consider the future of Deutsche Bank and Commerzbank and develop a new approach to the topic: instead of a merger of DB and CB we propose to consider a partial merger of the IT and related back office functions in order to create the basis for an Open Banking platform in Germany. Such a platform would act as a cross-institutional infrastructure company in which the participating banks develop a common data and IT platform (while respecting the data protection regulations). Significant parts of the transaction processes would be pooled by the institutions and executed by the Open Banking platform. Moreover, the institutions remain legally independent and compete with each other at the level of products and services that are developed and produced using just this common data and IT platform – “national champions” would not be created.
Such an Open Banking platform could even become the nucleus of a European banking platform able to compete with the existing global data platforms from the USA and China, which already offer financial services and are likely to expand their offerings in the foreseeable future. The proposed model of an open data platform for banks prevents the emergence of national champions and supports the main goal of the banking union: the creation of a financial system in which individual banks can be resolved without provoking a systemic crisis and forcing taxpayers to finance bailouts.
Broad, long-term financial and economic datasets are a scarce resource, in particular in the European context. In this paper, we present an approach to an extensible data model, i.e. one adaptable to future changes in technologies and sources, that may constitute a basis for digitized and structured long-term historical datasets. The data model covers specific peculiarities of historical financial and economic data and is flexible enough to accommodate data of different types (quantitative as well as qualitative) from different historical sources, hence achieving extensibility. Furthermore, based on historical German company and stock market data, we discuss a relational implementation of this approach.
We study the behavioral underpinnings of adopting cash versus electronic payments in retail transactions. A novel theoretical and experimental framework is developed to primarily assess the impact of sellers’ service fees and buyers’ rewards from using electronic payments. Buyers and sellers face a coordination problem, independently choosing a payment method before trading. In the experiment, sellers readily adopt electronic payments but buyers do not. Eliminating service fees or introducing rewards significantly boosts the adoption of electronic payments. Hence, buyers’ incentives play a pivotal role in the diffusion of electronic payments but monetary incentives cannot fully explain their adoption choices. Findings from this experiment complement empirical findings based on surveys and field data.
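The coordination problem on which the experiment builds can be made concrete with a small two-by-two game; the payoff numbers, the reward r and the fee f below are hypothetical and only illustrate why eliminating fees or adding rewards changes which outcome buyers and sellers coordinate on:

```python
def pure_nash(buyer_payoff, seller_payoff, actions=("cash", "card")):
    """Pure-strategy Nash equilibria of a 2x2 payment-choice game in
    which each side commits to an instrument before trading."""
    eqs = []
    for b in actions:
        for s in actions:
            # neither side can gain by unilaterally switching instrument
            buyer_ok = all(buyer_payoff[b][s] >= buyer_payoff[d][s] for d in actions)
            seller_ok = all(seller_payoff[b][s] >= seller_payoff[b][d] for d in actions)
            if buyer_ok and seller_ok:
                eqs.append((b, s))
    return eqs

# Hypothetical payoffs: a mismatch means no trade (payoff 0); the card
# gives the buyer a reward r but costs the seller a service fee f.
r, f = 0.2, 0.3
buyer = {"cash": {"cash": 1.0, "card": 0.0},
         "card": {"cash": 0.0, "card": 1.0 + r}}
seller = {"cash": {"cash": 1.0, "card": 0.0},
          "card": {"cash": 0.0, "card": 1.0 - f}}
```

Both all-cash and all-card coordination are equilibria here (as long as f < 1), which is why fees and rewards matter: they shift which equilibrium the two sides find attractive rather than removing the coordination problem itself.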
We relate time-varying aggregate ambiguity (V-VSTOXX) to individual investor trading. We use the trading records of more than 100,000 individual investors from a large German online brokerage from March 2010 to December 2015. We find that an increase in ambiguity is associated with increased investor activity. It also leads to a reduction in risk-taking which does not reverse over the following days. When ambiguity is high, the effect of sentiment looms larger. Survey evidence reveals that ambiguity averse investors are more prone to ambiguity shocks. Our results are robust to alternative survey-, newspaper- or market-based ambiguity measures.
In this study we investigate which economic ideas were prevalent in the post-crisis macroprudential discourse in order to understand the availability of ideas for reform-minded agents. We base our analysis on new findings in the field of ideational shifts and regulatory science, which posit that change-agents engage with new ideas pragmatically and strategically in their effort to have their economic ideas institutionalized. We argue that in these epistemic battles over new regulation, scientific backing by academia is the key resource determining the outcome. We show that the reforms implemented internationally to date follow this pattern. In our analysis we contrast the entire discourse on systemic risk and macroprudential regulation with Borio’s initial 2003 proposal for a macroprudential framework. We find that mostly cross-sectional measures aimed at increasing the resilience of the financial system, rather than inter-temporal measures dampening the financial cycle, have been implemented. We provide evidence for the lack of support for new macroprudential thinking within academia and argue that this is partially responsible for the absence of anti-cyclical macroprudential regulation. Most worryingly, the financial cycle is largely absent from the academic discourse and is only tacitly assumed rather than fully fleshed out in technocratic discourses, pointing to the possibility that no anti-cyclical measures will be forthcoming.
Algorithmic trading engines versus human traders – do they behave different in securities markets?
(2009)
After exchanges and alternative trading venues introduced electronic execution mechanisms worldwide, the focus of the securities trading industry shifted to the use of fully electronic trading engines by banks, brokers and their institutional customers. These Algorithmic Trading engines enable order submissions without human intervention, based on quantitative models applying historical and real-time market data. Although there is a widespread discussion on the pros and cons of Algorithmic Trading and on its impact on market volatility and market quality, little is known about how algorithms actually place their orders in the market and whether and in which respects this differs from other order submissions. Based on a dataset that – for the first time – includes a specific flag to identify orders submitted by Algorithmic Trading engines, the paper investigates the extent of Algorithmic Trading activity, and specifically the engines' order placement strategies in comparison to human traders, in the Xetra trading system. It is shown that Algorithmic Trading has become a relevant part of overall market activity and that Algorithmic Trading engines fundamentally differ from human traders in their order submission, modification and deletion behavior, as they exploit real-time market data and the latest market movements.
Projected demographic changes in industrialized and developing countries vary in extent and timing but will reduce the share of the population of working age everywhere. Conventional wisdom suggests that this will increase capital intensity, with falling rates of return to capital and increasing wages, which decreases welfare for middle-aged, asset-rich households. This paper takes the perspective of the three demographically oldest European nations (France, Germany and Italy) to address three important adjustment channels that dampen these detrimental effects of aging: investing abroad, endogenous human capital formation and increasing the retirement age. Our quantitative finding is that endogenous human capital formation in combination with an increase in the retirement age has strong implications for economic aggregates and welfare, in particular in the open economy. These adjustments reduce the maximum welfare losses of demographic change for households alive in 2010 by about 2.2 percentage points in terms of a consumption equivalent variation.
The importance of agile methods has increased in recent years, not only to manage software development processes but also to establish flexible and adaptive organisational structures, which are essential to deal with disruptive changes and build successful digital business strategies. This paper takes an industry-specific perspective by analysing the dissemination, objectives and relative popularity of agile frameworks in the German banking sector. The data provides insights into expectations and experiences associated with agile methods and indicates possible implementation hurdles and success factors. Our research provides the first comprehensive analysis of agile methods in the German banking sector. The comparison with a selected number of fintechs has revealed some differences between banks and fintechs. We found that almost all banks and fintechs apply agile methods in IT-related projects. However, fintechs have relatively more experience with agile methods than banks and use them more intensively. Scrum is the most relevant framework used in practice. Scaled agile frameworks are so far negligible in the German banking sector. Acceleration of projects is apparently the most important objective of deploying agile methods. In addition, agile methods can contribute to cost savings and lead to improved quality and innovation performance, though for banks it is evidently more challenging to reach their respective targets than for fintechs. Overall our findings suggest that German banks are still in a maturing process of becoming more agile and that there is room for an accelerated adoption of agile methods in general and scaled agile frameworks in particular.
We analyze the macroeconomic implications of increasing the top marginal income tax rate using a dynamic general equilibrium framework with heterogeneous agents and a fiscal structure resembling the actual U.S. tax system. The wealth and income distributions generated by our model replicate the empirical ones. In two policy experiments, we increase the statutory top marginal tax rate from 35 to 70 percent and redistribute the additional tax revenue among households, either by decreasing all other marginal tax rates or by paying out a lump-sum transfer to all households. We find that increasing the top marginal tax rate decreases inequality in both wealth and income but also leads to a contraction of the aggregate economy. This is primarily driven by the negative effects that the tax change has on top income earners. The aggregate gain in welfare is sizable in both experiments mainly due to a higher degree of distributional equality.
We investigate consumption patterns in Europe with supervised machine learning methods and reveal differences in age and wealth impact across countries. Using data from the third wave (2017) of the Eurosystem’s Household Finance and Consumption Survey (HFCS), we assess how age and (liquid) wealth affect the marginal propensity to consume (MPC) in the Netherlands, Germany, France, and Italy. Our regression analysis takes the specification by Christelis et al. (2019) as a starting point. Decision trees are used to suggest alternative variable splits to create categorical variables for customized regression specifications. The results suggest an impact of differing wealth distributions and retirement systems across the studied Eurozone members and are relevant to European policy makers due to joint Eurozone monetary policy and increasing supranational fiscal authority of the EU. The analysis is further substantiated by a supervised machine learning analysis using a random forest and XGBoost algorithm.
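The way a decision tree "suggests variable splits" can be sketched with a one-level tree (a stump) that searches for the covariate threshold minimizing within-group variance; the data below are invented for illustration, and the paper itself uses standard tree and ensemble implementations:

```python
def best_split(x, y):
    """Threshold on covariate x (e.g. age) that minimizes the summed
    within-group variance of outcome y (e.g. a household's MPC):
    a one-level decision tree, written out in plain Python."""
    def sse(vals):
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    pairs = sorted(zip(x, y))
    best_err, best_thr = float("inf"), None
    for i in range(1, len(pairs)):
        if pairs[i][0] == pairs[i - 1][0]:
            continue  # no threshold fits between equal x-values
        thr = (pairs[i][0] + pairs[i - 1][0]) / 2
        err = sse([p[1] for p in pairs[:i]]) + sse([p[1] for p in pairs[i:]])
        if err < best_err:
            best_err, best_thr = err, thr
    return best_thr

# Invented data: the MPC drops for households older than about 50.
ages = [30, 35, 40, 45, 55, 60, 65, 70]
mpcs = [0.60, 0.62, 0.58, 0.61, 0.30, 0.28, 0.33, 0.31]
```

The recovered threshold (between 45 and 55 in this toy example) would then serve as the cut point of a categorical age variable in a customized regression specification.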
Event studies have become increasingly important in securities fraud litigation after the Supreme Court’s decision in Halliburton II. Litigants have used event study methodology, which empirically analyzes the relationship between the disclosure of corporate information and the issuer’s stock price, to provide evidence in the evaluation of key elements of federal securities fraud, including materiality, reliance, causation, and damages. As the use of event studies grows and they increasingly serve a gatekeeping function in determining whether litigation will proceed beyond a preliminary stage, it will be critical for courts to use them correctly.
This Article explores an array of considerations related to the use of event studies in securities fraud litigation. It starts by describing the basic function of the event study: to determine whether a highly unusual price movement has occurred and the traditional statistical approach to making that determination. The Article goes on to identify special features of securities fraud litigation that distinguish litigation from the scholarly context in which event studies were developed. The Article highlights the fact that the standard approach can lead to the wrong conclusion and describes the adjustments necessary to address the litigation context. We use the example of six dates in the Halliburton litigation to illustrate these points.
Finally, the Article highlights the limitations of event studies – what they can and cannot prove – and explains how those limitations relate to the legal issues for which they are introduced. These limitations bear upon important normative questions about the role event studies should play in securities fraud litigation.
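The basic event-study computation the Article describes, fitting a market model over an estimation window and then asking whether the event-day abnormal return is unusual relative to residual volatility, can be sketched as follows. This is a minimal one-day illustration with invented returns, not the adjusted procedure the Article recommends for the litigation context:

```python
def market_model_event_test(stock, market, event_return, market_event):
    """One-day event study: fit r_stock = a + b * r_market by OLS over an
    estimation window, then compare the event-day abnormal return with
    the residual volatility of the fitted model."""
    n = len(stock)
    mx, my = sum(market) / n, sum(stock) / n
    b = (sum((m - mx) * (s - my) for m, s in zip(market, stock))
         / sum((m - mx) ** 2 for m in market))
    a = my - b * mx
    resid = [s - (a + b * m) for s, m in zip(stock, market)]
    sigma = (sum(e ** 2 for e in resid) / (n - 2)) ** 0.5
    ar = event_return - (a + b * market_event)   # abnormal return
    return ar, ar / sigma                        # and its t-like statistic

# Invented daily returns for a short estimation window:
market = [0.01, -0.02, 0.015, 0.0, 0.005, -0.01]
stock = [m + e for m, e in zip(market, [0.001, -0.001, 0.001,
                                        -0.001, 0.001, -0.001])]
```

A large statistic flags a "highly unusual" event-day price movement; the Article's point is precisely that this standard scholarly version of the test needs adjustments before it can support legal conclusions.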
Advertising arbitrage
(2014)
Speculators often advertise arbitrage opportunities in order to persuade other investors and thus accelerate the correction of mispricing. We show that, in order to minimize the risk and the cost of arbitrage, an investor who identifies several mispriced assets optimally advertises only one of them, and overweights it in his portfolio; a risk-neutral arbitrageur invests only in this asset. The choice of the asset to be advertised depends not only on mispricing but also on its "advertisability" and the accuracy of future news about it. When several arbitrageurs identify the same arbitrage opportunities, their decisions are strategic complements: they invest in the same asset and advertise it. Multiple equilibria may then arise, some of which are inefficient: arbitrageurs may correct small mispricings while failing to eliminate large ones. Finally, prices react more strongly to the ads of arbitrageurs with a successful track record, and reputation-building induces high-skill arbitrageurs to advertise more than others.
Advertising arbitrage
(2020)
Arbitrageurs with a short investment horizon gain from accelerating price discovery by advertising their private information. However, advertising many assets may overload investors' attention, reducing the number of informed traders per asset and slowing price discovery. So arbitrageurs optimally concentrate advertising on just a few assets, which they overweight in their portfolios. Unlike classic insiders, advertisers prefer assets with the least noise trading. If several arbitrageurs share information about the same assets, inefficient equilibria can arise, where investors' attention is overloaded and substantial mispricing persists. When they do not share, the overloading of investors' attention is maximal.
This paper explores the consequences of consumer education on prices and welfare in retail financial markets when some consumers are naive about shrouded add-on prices and firms try to exploit this. Allowing for different information and pricing strategies, we show that education is unlikely to push firms to disclose prices to all consumers, which would be socially efficient. Instead, price discrimination emerges as a new equilibrium. Further, due to a feedback effect on prices, education that is good for consumers who become sophisticated may be bad for consumers who stay naive, and even for the group of all consumers as a whole.
This study examines the role of actual and perceived financial sophistication (i.e., financial literacy and confidence) for individuals' wealth accumulation. Using survey data from the German SAVE initiative, we find strong gender- and education-related differences in the distribution of the two variables and their effects on wealth: financial literacy rises with formal education, whereas confidence increases with education for men but decreases for women, so that women become strongly underconfident with higher education while men remain overconfident. Regarding wealth accumulation, we show that financial literacy has a positive effect that is stronger for women than for men and that is increasing (decreasing) in education for women (men). Confidence, however, supports only highly educated men's wealth. When considering different channels for wealth accumulation, we observe that financial literacy is more important for current financial market participation, whereas confidence is more strongly associated with future-oriented financial planning. Overall, we demonstrate that highly educated men's wealth levels benefit from their overconfidence via all financial decisions considered, whereas highly educated women's financial planning suffers from their underconfidence. This may impair their wealth levels in old age.
A number of recent studies have suggested that activist stabilization policy rules responding to inflation and the output gap can simultaneously attain a low and stable rate of inflation and a high degree of economic stability. The foremost example of such a strategy is the policy rule proposed by Taylor (1993). In this paper, I demonstrate that the policy settings that would have been suggested by this rule during the 1970s, based on real-time data published by the U.S. Commerce Department, do not greatly differ from actual policy during this period. To the extent macroeconomic outcomes during this period are considered unfavorable, this raises questions regarding the usefulness of this strategy for monetary policy. To the extent the Taylor rule is believed to provide a reasonable guide to monetary policy, this finding raises questions regarding earlier critiques of monetary policy during the 1970s.
A resampling method based on the bootstrap and a bias-correction step is developed for improving the Value-at-Risk (VaR) forecasting ability of the normal-GARCH model. Compared to the use of more sophisticated GARCH models, the new method is fast, easy to implement, numerically reliable, and, except for having to choose a window length L for the bias-correction step, fully data driven. The results for several different financial asset returns over a long out-of-sample forecasting period, as well as use of simulated data, strongly support use of the new method, and the performance is not sensitive to the choice of L. JEL Classification: C22, C53, C63, G12
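The general bootstrap-with-bias-correction idea can be sketched as follows; this is a generic illustration of bias-correcting a plug-in VaR estimate, not the paper's normal-GARCH procedure or its window-length-L correction step:

```python
import random
from statistics import NormalDist, mean, stdev

def normal_var(returns, alpha=0.01):
    """Plug-in VaR under normality: the alpha-quantile of fitted N(mu, sd)."""
    return mean(returns) + NormalDist().inv_cdf(alpha) * stdev(returns)

def bias_corrected_var(returns, alpha=0.01, n_boot=500, seed=0):
    """Bootstrap bias correction: resample the returns, re-estimate VaR on
    each resample, and subtract the estimated bias from the original."""
    rng = random.Random(seed)
    est = normal_var(returns, alpha)
    boot = [normal_var([rng.choice(returns) for _ in returns], alpha)
            for _ in range(n_boot)]
    return 2 * est - mean(boot)  # est - (mean(boot) - est)
```

In the paper the resampling is applied to GARCH residuals and the bias correction uses a rolling window of length L; the two-step structure (resample, then correct the plug-in estimate) is the same.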
This chapter outlines the conditions under which accounting-based smoothing can be beneficial for policyholders who hold with-profit or participating payout life annuities (PLAs). We use a realistically-calibrated model of PLAs to explore how alternative accounting techniques influence policyholder welfare as well as insurer profitability and stability. We find that accounting smoothing of participating life annuities is favorable to consumers and insurers, as it mitigates the impact of short-term volatility and enhances the utility of these long-term annuity contracts.
Accounting for financial stability: Bank disclosure and loss recognition in the financial crisis
(2020)
This paper examines banks’ disclosures and loss recognition in the financial crisis and identifies several core issues for the link between accounting and financial stability. Our analysis suggests that, going into the financial crisis, banks’ disclosures about relevant risk exposures were relatively sparse. Such disclosures came later after major concerns about banks’ exposures had arisen in markets. Similarly, the recognition of loan losses was relatively slow and delayed relative to prevailing market expectations. Among the possible explanations for this evidence, our analysis suggests that banks’ reporting incentives played a key role, which has important implications for bank supervision and the new expected loss model for loan accounting. We also provide evidence that shielding regulatory capital from accounting losses through prudential filters can dampen banks’ incentives for corrective actions. Overall, our analysis reveals several important challenges if accounting and financial reporting are to contribute to financial stability.
This paper investigates what we can learn from the financial crisis about the link between accounting and financial stability. The picture that emerges ten years after the crisis is substantially different from the picture that dominated the accounting debate during and shortly after the crisis. Widespread claims about the role of fair-value (or mark-to-market) accounting in the crisis have been debunked. However, we identify several other core issues for the link between accounting and financial stability. Our analysis suggests that, going into the financial crisis, banks’ disclosures about relevant risk exposures were relatively sparse. Such disclosures came later after major concerns about banks’ exposures had arisen in markets. Similarly, banks delayed the recognition of loan losses. Banks’ incentives seem to drive this evidence, suggesting that reporting discretion and enforcement deserve careful consideration. In addition, bank regulation through its interlinkage with financial accounting may have dampened banks’ incentives for corrective actions. Our analysis illustrates that a number of serious challenges remain if accounting and financial reporting are to contribute to financial stability.
Accounting for financial instruments in the banking industry: conclusions from a simulation model
(2003)
The paper analyses the effects of three sets of accounting rules for financial instruments - Old IAS before IAS 39 became effective, Current IAS or US GAAP, and the Full Fair Value (FFV) model proposed by the Joint Working Group (JWG) - on the financial statements of banks. We develop a simulation model that captures the essential characteristics of a modern universal bank with investment banking and commercial banking activities. We run simulations for different strategies (fully hedged, partially hedged) using historical data from periods with rising and falling interest rates. We show that under Old IAS a fully hedged bank can portray its zero economic earnings in its financial statements. As Old IAS offer much discretion, this bank may also present income that is either positive or negative. We further show that because of the restrictive hedge accounting rules, banks cannot adequately portray their best practice risk management activities under Current IAS or US GAAP. We demonstrate that - contrary to assertions from the banking industry - mandatory FFV accounting adequately reflects the economics of banking activities. Our detailed analysis identifies, in addition, several critical issues of the accounting models that have not been covered in previous literature.
Returns to experience for U.S. workers have changed over the post-war period. This paper argues that a simple model goes a long way towards replicating these changes. The model features three well-known ingredients: (i) an aggregate production function with constant skill-biased technical change; (ii) cohort qualities that vary with average years of schooling; and crucially (iii) time-invariant age-efficiency profiles. The model quantitatively accounts for changes in longitudinal and cross-sectional returns to experience, as well as the differential evolution of the college wage premium for young and old workers.
We analyze the implications of the governance structure in academic faculties for their recruitment decisions when competing for new researchers. The value to individual members through social interaction within the faculty depends on the average status of their fellow members. In recruitment decisions, incumbent members trade off the effect of entry on average faculty status against alternative uses of the recruitment budget if no entry takes place. We show that the best candidates join the best faculties but that they receive lower wages than some lesser ranking candidates. We also study the allocation of surplus created by the entry of a new faculty member and show that faculties with symmetric status distributions maximize their joint surplus under majority voting.
We develop a model that endogenizes the manager's choice of firm risk and of inside debt investment strategy. Our model delivers two predictions. First, managers have an incentive to reduce the correlation between inside debt and company stock in bad times. Second, managers that reduce such a correlation take on more risk in bad times. Using a sample of U.S. public firms, we provide evidence consistent with the model's predictions. Our results suggest that the weaker link between inside debt and company stock in bad times does not translate into a mitigation of debt-equity conflicts.
The long-run consumption risk model provides a theoretically appealing explanation for prominent asset pricing puzzles, but its intricate structure presents a challenge for econometric analysis. This paper proposes a two-step indirect inference approach that disentangles the estimation of the model's macroeconomic dynamics and the investor's preference parameters. A Monte Carlo study explores the feasibility and efficiency of the estimation strategy. We apply the method to recent U.S. data and provide a critical re-assessment of the long-run risk model's ability to reconcile the real economy and financial markets. This two-step indirect inference approach is potentially useful for the econometric analysis of other prominent consumption-based asset pricing models that are equally difficult to estimate.
We model the motives for residents of a country to hold foreign assets, including the precautionary motive that has been omitted from much previous literature as intractable. Our model captures many of the principal insights from the existing specialized literature on the precautionary motive, deriving a convenient formula for the economy’s target value of assets. The target is the level of assets that balances impatience, prudence, risk, intertemporal substitution, and the rate of return. We use the model to shed light on two topical questions: the “upstream” flows of capital from developing countries to advanced countries, and the long-run impact of resorbing global financial imbalances.
We present a tractable model of the effects of nonfinancial risk on intertemporal choice. Our purpose is to provide a simple framework that can be adopted in fields like representative-agent macroeconomics, corporate finance, or political economy, where most modelers have chosen not to incorporate serious nonfinancial risk because available methods were too complex to yield transparent insights. Our model produces an intuitive analytical formula for target assets, and we show how to analyze transition dynamics using a familiar Ramsey-style phase diagram. Despite its starkness, our model captures most of the key implications of nonfinancial risk for intertemporal choice.
A theory of the boundaries of banks with implications for financial integration and regulation
(2015)
We offer a theory of the "boundary of the firm" that is tailored to banking, as it builds on a single inefficiency arising from risk-shifting and as it takes into account both interbank lending as an alternative to integration and the role of possibly insured deposit funding. Among other things, it explains why deeper economic integration should also cause greater financial integration through both bank mergers and interbank lending, albeit one that typically remains inefficiently incomplete, and why economic disintegration (or "desynchronization"), as currently witnessed in the European Union, should reduce interbank exposure. It also suggests that recent policy measures such as the preferential treatment of retail deposits, the extension of deposit insurance, or penalties on "connectedness" could all lead to substantial welfare losses.
A tale of one exchange and two order books : effects of fragmentation in the absence of competition
(2018)
Exchanges nowadays routinely operate multiple, almost identically structured limit order markets for the same security. We study the effects of such fragmentation on market performance using a dynamic model where agents trade strategically across two identically-organized limit order books. We show that fragmented markets, in equilibrium, offer higher welfare to intermediaries at the expense of investors with intrinsic trading motives, and lower liquidity than consolidated markets. Consistent with our theory, we document improvements in liquidity and lower profits for liquidity providers when Euronext, in 2009, consolidated its order flow for stocks traded across two country-specific and identically-organized order books into a single order book. Our results suggest that competition in market design, not fragmentation, drives previously documented improvements in market quality when new trading venues emerge; in the absence of such competition, market fragmentation is harmful.
Did the Federal Reserve’s Quantitative Easing (QE) in the aftermath of the financial crisis have macroeconomic effects? To answer this question, the authors estimate a large-scale DSGE model over the sample from 1998 to 2020, including data on the Fed’s balance sheet. The authors allow for QE to affect the economy via multiple channels that arise from several financial frictions. Their nonlinear Bayesian likelihood approach fully accounts for the zero lower bound on nominal interest rates. They find that between 2009 and 2015, QE increased output by about 1.2 percent. This reflects a net increase in investment of nearly 9 percent, accompanied by a 0.7 percent drop in aggregate consumption. Both government bond and capital asset purchases were effective in improving financing conditions. Capital asset purchases in particular significantly facilitated new investment and increased production capacity. Against the backdrop of a fall in consumption, supply-side effects dominated, leading to a mild disinflationary effect of about 0.25 percent annually.
A stochastic forward-looking model to assess the profitability and solvency of European insurers
(2016)
In this paper, we develop an analytical framework for conducting forward-looking assessments of profitability and solvency of the main euro area insurance sectors. We model the balance sheet of an insurance company encompassing both life and non-life business and we calibrate it using country-level data to make it representative of the major euro area insurance markets. Then, we project this representative balance sheet forward under stochastic capital markets, stochastic mortality developments and stochastic claims. The model highlights the potential threats to insurers’ solvency and profitability stemming from a sustained period of low interest rates, particularly in those markets which are largely exposed to reinvestment risk due to relatively high guarantees and generous profit participation schemes. The model also shows how the resilience of insurers to adverse financial developments heavily depends on the diversification of their business mix. Finally, the model identifies potential negative spillovers between life and non-life business through the redistribution of capital within groups.
In this paper we estimate a small model of the euro area to be used as a laboratory for evaluating the performance of alternative monetary policy strategies. We start with the relationship between output and inflation and investigate the fit of the nominal wage contracting model due to Taylor (1980) and three different versions of the relative real wage contracting model proposed by Buiter and Jewitt (1981) and estimated by Fuhrer and Moore (1995a) for the United States. While Fuhrer and Moore reject the nominal contracting model in favor of the relative contracting model, which induces more inflation persistence, we find that both models fit euro area data reasonably well. When considering France, Germany and Italy separately, however, we find that the nominal contracting model fits German data better, while the relative contracting model does quite well in countries which transitioned out of a high-inflation regime, such as France and Italy. We close the model by estimating an aggregate demand relationship and investigate the consequences of the different wage contracting specifications for the inflation-output variability tradeoff when interest rates are set according to Taylor's rule.
A safe core mandate
(2023)
Central banks have vastly expanded their footprint on capital markets. At a time of extraordinary pressure by many sides, a simple benchmark for the scale and scope of their core mandate of price and financial stability may be useful.
We make a case for a narrow mandate to maintain and safeguard the border between safe and quasi-safe assets. This ex-ante definition minimizes ambiguity, discourages risk creation and limits panic runs, primarily by separating market demand for reliable liquidity from risk-intolerant, price-insensitive demand for a safe store of value. The central bank may occasionally be forced to intervene beyond the safe core but should not be bound by any such ex-ante mandate, unless directed to specific goals set by legislation with explicit fiscal support.
We review distinct features of liquidity and safety demand, seeking a definition of the safety border, and discuss LOLR support for borderline safe assets such as MMF or uninsured deposits.
A safe core formulation is close to the historical focus on regulated entities, collateralized lending and attention to the public debt market, but its specific framing offers some context on controversial issues such as the extent of LOLR responsibilities. It also justifies a persistently large scale for central bank liabilities (Greenwood, Hanson and Stein 2016), as safety demand is related to financial wealth rather than GDP. Finally, it is consistent with an active central bank role in supporting liquidity, trading and clearing in government debt markets (Duffie 2020, 2021).
This study examines the recent literature on the expectations, beliefs and perceptions of investors who incorporate Environmental, Social, and Governance (ESG) considerations in investment decisions with the aim of generating superior performance while also making a societal impact. Through the lens of equilibrium models of agents with heterogeneous tastes for ESG investments, green assets are expected to generate lower returns in the long run than their non-ESG counterparts. In the short run, however, ESG investment can outperform non-ESG investment through various channels. Empirically, evidence on ESG outperformance is mixed. We find consensus in the literature that some investors have an ESG preference and that their actions can generate positive social impact. The shift towards more sustainable policies in firms is motivated by the increased market values and the lower cost of capital of green firms, driven by investors’ choices.
This paper analyzes how on-the-job search (OJS) by an agent impacts the moral hazard problem in a repeated principal-agent relationship. OJS is found to constitute a source of agency costs because efficient search incentives require that the agent receives all gains from trade. Further, the optimal incentive contract with OJS matches the design of empirically observed compensation contracts more accurately than models that ignore OJS. In particular, the optimal contract entails excessive performance pay plus efficiency wages. Efficiency wages reduce the opportunity costs of work effort and hence serve as a complement to bonuses. Thus, the model offers a novel explanation for the use of efficiency wages. When allowing for renegotiation, the model generates wage and turnover dynamics that are consistent with empirical evidence. I argue that the model contributes to explaining the concomitant rise in the use of performance pay and in competition for high-skill workers during the last three decades.
This paper examines the interaction of G7 real exchange rates with real output and interest rate differentials. Using cointegration methods, we generally find a link between the real exchange rate and the real interest differential. This finding contrasts with the majority of the extant research on the real exchange rate - real interest rate link. We identify a new measure of the equilibrium exchange rate in terms of the permanent component of the real exchange rate that is consistent with the dynamic equilibrium given by the cointegration relation. Furthermore, the presence of cointegration also allows us to identify real, nominal and transitory disturbances with only minimal identifying restrictions. Our findings suggest that persistent deviations of real exchange rates from their equilibrium value can have feedback effects on the underlying fundamentals, hence altering the equilibrium exchange rate itself. This has important implications for the persistence measures of real exchange rates that are reported elsewhere in the literature.
Using a novel dataset, we develop a structural model of the Very Large Crude Carrier (VLCC) market between the Arabian Gulf and the Far East. We study how fluctuations in oil tanker rates, oil exports, shipowner profits, and bunker fuel prices are determined by shocks to the supply and demand for oil tankers, to the utilization of tankers, and to the cost of operating tankers, including bunker fuel costs. Our analysis shows that time charter rates are largely unresponsive to tanker cost shocks. In response to higher costs, voyage profits decline, as cost shocks are only partially passed on to round-trip voyage rates. Oil exports from the Arabian Gulf also decline, reflecting lower demand for VLCCs. Positive utilization shocks are associated with higher profits, a slight increase in time charter rates and lower fuel prices and oil export volumes. Tanker supply and tanker demand shocks have persistent effects on time charter rates, round-trip voyage rates, the volume of oil exports, fuel prices, and profits with the expected sign.
Under a conventional policy rule, a central bank adjusts its policy rate linearly according to the gap between inflation and its target, and the gap between output and its potential. Under "the opportunistic approach to disinflation" a central bank controls inflation aggressively when inflation is far from its target, but concentrates more on output stabilization when inflation is close to its target, allowing supply shocks and unforeseen fluctuations in aggregate demand to move inflation within a certain band. We use stochastic simulations of a small-scale rational expectations model to contrast the behavior of output and inflation under opportunistic and linear rules. JEL Classification: E31, E52, E58, E61. July 2005.
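The asymmetry between the two rules described in this abstract can be illustrated with a small sketch. The reaction coefficients, the neutral rate, and the width of the inflation band below are hypothetical values chosen for exposition, not the paper's calibration.

```python
# Illustrative sketch: a linear Taylor-type rule vs. an "opportunistic" rule.
# All numbers (coefficients 1.5 and 0.5, neutral rate 4.0, band 1.0) are
# assumptions for exposition only.

def linear_rule(inflation, output_gap, target=2.0, neutral=4.0):
    """Conventional rule: react linearly to both gaps."""
    return neutral + 1.5 * (inflation - target) + 0.5 * output_gap

def opportunistic_rule(inflation, output_gap, target=2.0, neutral=4.0, band=1.0):
    """React to inflation only outside a band around the target; inside the
    band, concentrate on output stabilization and let inflation drift."""
    gap = inflation - target
    if abs(gap) <= band:
        return neutral + 0.5 * output_gap            # inflation gap tolerated
    excess = gap - band if gap > 0 else gap + band   # react to the excess only
    return neutral + 1.5 * excess + 0.5 * output_gap

print(linear_rule(2.5, 0.0))         # reacts to the 0.5pp inflation gap
print(opportunistic_rule(2.5, 0.0))  # inside the band: no inflation response
```

With inflation 0.5 points above target and a closed output gap, the linear rule tightens while the opportunistic rule leaves the rate at its neutral level, which is exactly the band-tolerance behavior the abstract describes.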
We raise some critical points against a naïve interpretation of “green finance” products and strategies. These critical insights are the background against which we take a closer look at instruments and policies that might allow green finance to become more impactful. In particular, we focus on the role of a taxonomy and investor activism. We also describe the interaction of government policies with green finance practice – an aspect, which has been mostly neglected in policy debates but needs to be taken into account. Finally, the special case of green government bonds is discussed.
We create an alternative version of the present utility value formula to show explicitly that every store-of-value in the economy bears utility-interest (non-pecuniary income) for its holder, regardless of possible interest earnings from financial markets. In addition, we generalize the well-known welfare measures of consumer and producer surplus as present value concepts and apply them not only to the production and usage of consumer goods and durables but also to money and other financial assets. This helps us, inter alia, to formalize the circumstances under which even a producer of legal tender might become insolvent. We also develop a new measure of seigniorage and demonstrate why the well-established concept of monetary seigniorage is flawed. Our framework also allows us to formulate the conditions for liability-issued money such as inside money and financial instruments such as debt certificates to become – somewhat paradoxically – net wealth of the society.
In this paper we consider the dynamics of spot and futures prices in the presence of arbitrage. We propose a partially linear error correction model where the adjustment coefficient is allowed to depend non-linearly on the lagged price difference. We estimate our model using data on the DAX index and the DAX futures contract. We find that the adjustment is indeed nonlinear. The linear alternative is rejected. The speed of price adjustment is increasing almost monotonically with the magnitude of the price difference.
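The state-dependent error correction described in this abstract can be sketched as follows. The exponential form of `alpha` and all parameter values are hypothetical stand-ins, since the paper estimates the adjustment function semi-parametrically from DAX data.

```python
# Sketch of an error-correction adjustment whose speed depends on the lagged
# basis z_{t-1} = F_{t-1} - S_{t-1} (futures minus spot). The functional form
# of alpha() below is an illustrative assumption, not the paper's estimate.

import numpy as np

def alpha(z, a_max=0.5, scale=0.5):
    """Adjustment speed, increasing in the magnitude of the mispricing."""
    return a_max * (1.0 - np.exp(-(z / scale) ** 2))

def simulate_basis(n=500, sigma=0.2, seed=0):
    """Simulate the basis under state-dependent correction:
    z_t = z_{t-1} - alpha(z_{t-1}) * z_{t-1} + noise."""
    rng = np.random.default_rng(seed)
    z = np.zeros(n)
    for t in range(1, n):
        z[t] = z[t - 1] - alpha(z[t - 1]) * z[t - 1] + sigma * rng.standard_normal()
    return z

z = simulate_basis()
print(np.abs(z).max())  # large deviations are corrected quickly
```

The key feature matching the abstract is that `alpha` is near zero for small price differences (arbitrage is unprofitable net of transaction costs) and rises almost monotonically with the magnitude of the deviation.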
We build a novel leading indicator (LI) for the EU industrial production (IP). Differently from previous studies, the technique developed in this paper is able to produce an ex-ante LI that is immune to “overlapping information drawbacks”. In addition, the set of variables composing the LI relies on a dynamic and systematic criterion. This ensures that the choice of the variables is not driven by subjective views. Our LI anticipates swings (including the 2007-2008 crisis) in the EU industrial production – on average – by 2 to 3 months. The predictive power improves if the indicator is revised every five or ten years. In a forward-looking framework, via a general-to-specific procedure, we also show that our LI represents the most informative variable in approaching expectations on the EU IP growth.
Riley (1979)'s reactive equilibrium concept addresses problems of equilibrium existence in competitive markets with adverse selection. The game-theoretic interpretation of the reactive equilibrium concept in Engers and Fernandez (1987) yields the Rothschild-Stiglitz (1976)/Riley (1979) allocation as an equilibrium allocation; however, multiplicity of equilibria emerges. In this note we embed the reactive equilibrium's logic in a dynamic market context with active consumers. We show that the Riley/Rothschild-Stiglitz contracts constitute the unique equilibrium allocation in any pure strategy subgame perfect Nash equilibrium.
This note argues that in a situation of an inelastic natural gas supply a restrictive monetary policy in the euro zone could reduce the energy bill and therefore has additional merits. A more hawkish monetary policy may be able to indirectly use monopsony power on the gas market. The welfare benefits of such a policy are diluted to the extent that some of the supply (approximately 10 percent) comes from within the euro zone, which may give rise to distributional concerns.
We extend the important idea of range-based volatility estimation to the multivariate case. In particular, we propose a range-based covariance estimator that is motivated by financial economic considerations (the absence of arbitrage), in addition to statistical considerations. We show that, unlike other univariate and multivariate volatility estimators, the range-based estimator is highly efficient yet robust to market microstructure noise arising from bid-ask bounce and asynchronous trading. Finally, we provide an empirical example illustrating the value of the high-frequency sample path information contained in the range-based estimates in a multivariate GARCH framework.
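A minimal sketch of the building blocks behind this abstract: the classical Parkinson (1980) range-based variance estimator, and the polarization identity that recovers a covariance from the variances of sum and difference series (in the no-arbitrage FX application, those series correspond to observable cross rates). The input data below are made up for illustration.

```python
# Range-based variance (Parkinson 1980) and a covariance recovered via the
# identity Cov(x, y) = [Var(x + y) - Var(x - y)] / 4. Inputs are intraday
# highs and lows of (log) prices; the numbers here are illustrative only.

from math import log

def parkinson_var(high, low):
    """Daily variance estimate from the high-low range of the price."""
    return log(high / low) ** 2 / (4.0 * log(2.0))

def range_covariance(var_sum, var_diff):
    """Covariance via the polarization identity applied to the variance
    estimates of the sum and difference series."""
    return (var_sum - var_diff) / 4.0

# Hypothetical daily highs/lows of the "sum" and "difference" series
# (observable directly, e.g., as cross rates in the FX application).
v_sum = parkinson_var(1.05, 0.97)
v_diff = parkinson_var(1.02, 0.99)
print(range_covariance(v_sum, v_diff))
```

The appeal noted in the abstract is that only a high and a low per interval are needed, which is what makes the estimator robust to bid-ask bounce and asynchronous trading relative to estimators built from every tick.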
We develop a utility-based model of fluctuations with nominal rigidities and unemployment. In doing so, we combine two strands of research: the New Keynesian model, with its focus on nominal rigidities, and the Diamond-Mortensen-Pissarides model, with its focus on labor market frictions and unemployment. In developing this model, we proceed in two steps. We first leave nominal rigidities aside. We show that, under a standard utility specification, productivity shocks have no effect on unemployment in the constrained efficient allocation. We then focus on the implications of alternative real wage setting mechanisms for fluctuations in unemployment. We then introduce nominal rigidities in the form of staggered price setting by firms. We derive the relation between inflation and unemployment and discuss how it is influenced by the presence of real wage rigidities. We show the nature of the tradeoff between inflation and unemployment stabilization, and we draw the implications for optimal monetary policy. JEL Classification: E32, E50
A new governance architecture for European financial markets? Towards a European supervision of CCPs
(2018)
Does the new European outlook on financial markets, as voiced by the EU Commission since the beginning of the Capital Markets Union, imply a movement of the EU towards an alignment of market integration and direct supervision of common rules? This paper sets out to answer this question for the case of common supervision of Central Counterparties (CCPs) in the European Union. These entities gained crucial importance post-crisis due to new regulation requiring the mandatory clearing of standardized derivative contracts, transforming clearing houses into central nodes for cross-border financial transactions. While the EU-wide regulatory framework EMIR, enacted in 2012, stipulates common regulatory requirements, the framework still relies on home-country supervision of those rules, arguably leading to regulatory as well as supervisory arbitrage. The regulatory reform to stabilize the OTC derivatives market therefore replicated at its center a governance flaw that had been identified as one of the major causes of the gravity of the financial crisis in the EU: the coupling of intense competition based on private risk management systems with national supervision of European rules. This paper traces the history of this problem awareness and inquires which factors account for the fact that only in 2017 did serious negotiations at the EU level ensue that envisioned a common supervision of CCPs to fix the flawed system of governance. Analyzing this shift in the European governance architecture, we argue that Brexit has opened a window of opportunity for a centralization of supervision of CCPs. Brexit aligns the urgency of the problem with the material interests of crucial political stakeholders, in particular Germany and France, providing the possibility for a grand European bargain.
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis has come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the data base and compare the effects of novel extensions to established benchmarks thereby fostering a comparative instead of insular approach to model development.
Central banks have faced a succession of crises over the past years as well as a number of structural factors such as a transition to a greener economy, demographic developments, digitalisation and possibly increased onshoring. These suggest that the future inflation environment will be different from the one we know. Thus uncertainty about important macroeconomic variables and, in particular, inflation dynamics will likely remain high.
We focus on the role of social media as a high-frequency, unfiltered mass information transmission channel and how its use for government communication affects the aggregate stock markets. To measure this effect, we concentrate on one of the most prominent Twitter users, the 45th President of the United States, Donald J. Trump. We analyze around 1,400 of his tweets related to the US economy and classify them by topic and textual sentiment using machine learning algorithms. We investigate whether the tweets contain relevant information for financial markets, i.e. whether they affect market returns, volatility, and trading volumes. Using high-frequency data, we find that Trump’s tweets are most often a reaction to pre-existing market trends and therefore do not provide material new information that would influence prices or trading. We show that past market information can help predict Trump’s decision to tweet about the economy.
This paper solves a dynamic model of households' mortgage decisions incorporating labor income, house price, inflation, and interest rate risk. It uses a zero-profit condition for mortgage lenders to solve for equilibrium mortgage rates given borrower characteristics and optimal decisions. The model quantifies the effects of adjustable vs. fixed mortgage rates, loan-to-value ratios, and mortgage affordability measures on mortgage premia and default. Heterogeneity in borrowers' labor income risk is important for explaining the higher default rates on adjustable-rate mortgages during the recent US housing downturn, and the variation in mortgage premia with the level of interest rates.
On average, "young" people underestimate whereas "old" people overestimate their chances to survive into the future. We adopt a Bayesian learning model of ambiguous survival beliefs which replicates these patterns. The model is embedded within a non-expected utility model of life-cycle consumption and saving. Our analysis shows that agents with ambiguous survival beliefs (i) save less than originally planned, (ii) exhibit undersaving at younger ages, and (iii) hold larger amounts of assets in old age than their rational expectations counterparts who correctly assess their survival probabilities. Our ambiguity-driven model therefore simultaneously accounts for three important empirical findings on household saving behavior.
Based on a cognitive notion of neo-additive capacities reflecting likelihood insensitivity with respect to survival chances, we construct a Choquet Bayesian learning model over the life-cycle that generates a motivational notion of neo-additive survival beliefs expressing ambiguity attitudes. We embed these neo-additive survival beliefs as decision weights in a Choquet expected utility life-cycle consumption model and calibrate it with data on subjective survival beliefs from the Health and Retirement Study. Our quantitative analysis shows that agents with calibrated neo-additive survival beliefs (i) save less than originally planned, (ii) exhibit undersaving at younger ages, and (iii) hold larger amounts of assets in old age than their rational expectations counterparts who correctly assess their survival chances. Our neo-additive life-cycle model can therefore simultaneously accommodate three important empirical findings on household saving behavior.
As part of the Next Generation EU (NGEU) program, the European Commission has pledged to issue up to EUR 250 billion of the NGEU bonds as green bonds, in order to confirm its commitment to sustainable finance and to support the transition towards a greener Europe. Thereby, the EU is not only entering the green bond market but is also set to become one of the biggest green bond issuers. Consequently, financial market participants are eager to know what to expect from the EU as a new green bond issuer and whether a negative green bond premium, a so-called Greenium, can be expected for the NGEU green bonds. This research paper formulates an expectation regarding a potential Greenium for the NGEU green bonds by conducting interviews with 15 sustainable finance experts and analyzing the public green bond market from September 2014 until June 2021 with respect to a potential green bond premium and its underlying drivers. The regression results confirm the existence of a significant Greenium (-0.7 bps) in the public green bond market and show that the Greenium increases for supranational issuers with AAA rating, such as the EU. Moreover, the green bond premium is influenced by issuer sector and credit rating, but issue size and modified duration have no significant effect. Overall, the evaluated expert interviews and regression analysis lead to an expected Greenium for the NGEU green bonds of up to -4 bps, with the potential to increase further in the secondary market.
We examine how U.S. monetary policy affects the international activities of U.S. banks. We use a rarely studied U.S. bank-level dataset to assess at a quarterly frequency how changes in the U.S. Federal funds rate (before the crisis) and quantitative easing (after the onset of the crisis) affect changes in cross-border claims by U.S. banks across countries, maturities and sectors, as well as changes in claims by their foreign affiliates. We find robust evidence consistent with the existence of a potent global bank lending channel. In response to changes in U.S. monetary conditions, U.S. banks strongly adjust their cross-border claims in both the pre- and post-crisis periods. However, we also find that U.S. bank affiliate claims respond mainly to host country monetary conditions.
Futures markets are a potentially valuable source of information about market expectations. Exploiting this information has proved difficult in practice, because the presence of a time-varying risk premium often renders the futures price a poor measure of the market expectation of the price of the underlying asset. Even though the expectation in principle may be recovered by adjusting the futures price by the estimated risk premium, a common problem in applied work is that there are as many measures of market expectations as there are estimates of the risk premium. We propose a general solution to this problem that allows us to uniquely pin down the best possible estimate of the market expectation for any set of risk premium estimates. We illustrate this approach by solving the long-standing problem of how to recover the market expectation of the price of crude oil. We provide a new measure of oil price expectations that is considerably more accurate than the alternatives and more economically plausible. We discuss implications of our analysis for the estimation of economic models of energy-intensive durables, for the debate on speculation in oil markets, and for oil price forecasting.
We selectively survey, unify and extend the literature on realized volatility of financial asset returns. Rather than focusing exclusively on characterizing the properties of realized volatility, we progress by examining economically interesting functions of realized volatility, namely realized betas for equity portfolios, relating them both to their underlying realized variance and covariance parts and to underlying macroeconomic fundamentals.
This paper studies constrained portfolio problems that may involve constraints on the probability or the expected size of a shortfall of wealth or consumption. Our first contribution is that we solve the problems by dynamic programming, in contrast to the existing literature, which applies the martingale method. More precisely, we construct the non-separable value function by formalizing the optimal constrained terminal wealth as a (conjectured) contingent claim on the optimal non-constrained terminal wealth. This is relevant by itself, but also opens up the opportunity to derive new solutions to constrained problems. As a second contribution, we thus derive new results for non-strict constraints on the shortfall of intermediate wealth and/or consumption.
This paper considers a trading game in which sequentially arriving liquidity traders either opt for a market order or for a limit order. One class of traders is considered to have an extended trading horizon, implying their impatience is linked to their trading orientation. More specifically, sellers are considered to have a trading horizon of two periods, whereas buyers only have a single-period trading scope (the extended buyer-horizon case is completely symmetric). Clearly, as the life span of their submitted limit orders is longer, this setting implies sellers are granted a natural advantage in supplying liquidity. This benefit is hampered, however, by the direct competition arising between consecutively arriving sellers. Closed-form characterizations of the order submission strategies are obtained when solving for the equilibrium of this dynamic game. These allow us to examine how these forces affect traders' order placement decisions. Further, the analysis yields insight into the dynamic process of price formation and into the market clearing process of a non-intermediated, order-driven market.