This paper is the first to conduct an incentive-compatible experiment using real monetary payoffs to test the hypothesis of probabilistic insurance which states that willingness to pay for insurance decreases sharply in the presence of even small default probabilities as compared to a risk-free insurance contract. In our experiment, 181 participants state their willingness to pay for insurance contracts with different levels of default risk. We find that the willingness to pay sharply decreases with increasing default risk. Our results hence strongly support the hypothesis of probabilistic insurance. Furthermore, we study the impact of customer reaction to default risk on an insurer’s optimal solvency level using our experimentally obtained data on insurance demand. We show that an insurer should choose to be default-free rather than having even a very small default probability. This risk strategy is also optimal when assuming substantial transaction costs for risk management activities undertaken to achieve the maximum solvency level.
We analyze the risk premium on bank bonds at origination with a special focus on the role of implicit and explicit public guarantees and the systemic relevance of the issuing institutions. By looking at the asset swap spread on 5,500 bonds, we find that explicit guarantees and sovereign creditworthiness have a substantial effect on the risk premium. In addition, while large institutions still enjoy lower issuance costs linked to the TBTF framework, we find evidence of enhanced market discipline for systemically important banks, which have faced, since the onset of the financial crisis, an increased premium on bond placements.
One of the motivations for establishing a European banking union was the desire to break the ties between national regulators and domestic financial institutions in order to prevent regulatory capture. However, supervisory authority over the financial sector at the national level can also have valuable public benefits. The aim of this policy letter is to detail these public benefits in order to counter discussions that focus only on conflicts of interest. It is informed by an analysis of how financial institutions interacted with policy-makers in the design of national bank rescue schemes in response to the banking crisis of 2008. Using this information, it discusses the possible benefits of close cooperation between financial institutions and regulators and analyzes these in the wake of a European banking union.
The recent decline in euro area inflation has triggered new calls for additional monetary stimulus by the ECB in order to counter the threat of a self-reinforcing deflation and recession spiral. This note reviews the available evidence on inflation expectations, output gaps and other factors driving current inflation through the lens of the Phillips curve. It also draws a comparison to the Japanese experience with deflation in the late 1990s and the evidence from Japan concerning the output-inflation nexus at low trend inflation. The note concludes from this evidence that the risk of a self-reinforcing deflation remains very small. Thus, the ECB should best await the impact of the long-term refinancing operations decided in June, which have the potential to induce substantial monetary accommodation once implemented for the first time in September.
The predictive likelihood is of particular relevance in a Bayesian setting when the purpose is to rank models in a forecast comparison exercise. This paper discusses how the predictive likelihood can be estimated for any subset of the observable variables in linear Gaussian state-space models with Bayesian methods, and proposes to utilize a missing observations consistent Kalman filter in the process of achieving this objective. As an empirical application, we analyze euro area data and compare the density forecast performance of a DSGE model to DSGE-VARs and reduced-form linear Gaussian models.
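As a rough illustration of the filtering idea this abstract refers to, here is a minimal sketch (my own, for a univariate local level model; the paper treats general linear Gaussian state-space models and arbitrary subsets of observables) of a Kalman filter that simply skips the update step whenever an observation is missing:

```python
import numpy as np

def kalman_loglik_missing(y, a1, P1, sigma_eps2, sigma_eta2):
    """Log-likelihood of a local level model y_t = alpha_t + eps_t,
    alpha_{t+1} = alpha_t + eta_t. When an observation is missing
    (np.nan), only the prediction step is performed, so the filter
    stays consistent with the observed subset of the data."""
    a, P = a1, P1                      # predictive mean/variance for obs 0
    loglik = 0.0
    for obs in y:
        if np.isnan(obs):
            P = P + sigma_eta2         # missing: predict only, no update
            continue
        F = P + sigma_eps2             # prediction-error variance
        v = obs - a                    # prediction error
        loglik += -0.5 * (np.log(2 * np.pi * F) + v**2 / F)
        K = P / F                      # Kalman gain
        a = a + K * v                  # filtered state mean
        P = P * (1 - K)                # filtered state variance
        P = P + sigma_eta2             # predict next period
    return loglik
```

In a Bayesian forecast comparison, log-likelihood contributions of this kind (evaluated at posterior draws) are the building blocks of the predictive likelihood.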
This paper contrasts the recent European initiatives on regulating corporate groups with alternative approaches to the phenomenon. In doing so it pays particular regard to the German codified law on corporate groups as the polar opposite to the piecemeal approach favored by E.U. legislation.
It finds that the European Commission’s proposal to submit (significant) related party transactions to enhanced transparency, outside fairness review, and ex ante shareholder approval is both flawed in its design and based on contestable assumptions on informed voting of institutional investors. In particular, the contemplated exemption for transactions with wholly owned subsidiaries allows controlling shareholders to circumvent the rule extensively. Moreover, vesting voting rights with (institutional) investors will not lead to the informed assessment that is hoped for, because these investors will rationally abstain from active monitoring and rely on proxy advisory firms instead whose competency to analyze non-routine significant related party transactions is questionable.
The paper further delineates that the proposed recognition of an overriding interest of the group requires strong counterbalances to adequately protect minority shareholders and creditors. Hence, if the Commission chooses to go down this route it might end up with a comprehensive regulation that is akin to the unpopular Ninth Company Law Directive in spirit, though not in content. The latter prediction is corroborated by the pertinent parts of the proposal for a European Model Company Act.
How special are they? Targeting systemic risk by regulating shadow banking (October 5, 2014)
(2014)
This essay argues that at least some of the financial stability concerns associated with shadow banking can be addressed by an approach to financial regulation that imports its functional foundations more vigorously into the interpretation and implementation of existing rules. It shows that the general policy goals of prudential banking regulation remain constant over time despite dramatic transformations in the financial and technological landscape. Moreover, these overarching policy goals also legitimize intervention in the shadow banking sector. On these grounds, this essay encourages a more normative construction of available rules that potentially limits both the scope for regulatory arbitrage and the need for ever more rapid updates and a constant increase in the complexity of the regulatory framework. By tying the regulatory treatment of financial innovation closely to existing prudential rules and their underlying policy rationales, the proposed approach potentially ends the socially wasteful race between hare and tortoise that signifies the relation between regulators and a highly dynamic industry. In doing so, it does not generally hamper market participants’ efficient discoveries where disintermediation proves socially beneficial. Instead, it only weeds out rent-seeking circumventions of existing rules and standards.
We examine both the degree and the structural stability of inflation persistence at different quantiles of the conditional inflation distribution. Previous research focused exclusively on persistence at the conditional mean of the inflation rate. As economic theory provides reasons for inflation persistence to differ across conditional quantiles, this is a potentially severe constraint. Conventional studies of inflation persistence cannot identify changes in persistence at selected quantiles that leave persistence at the median of the distribution unchanged. Based on post-war US data we indeed find robust evidence for a structural break in persistence at all quantiles of the inflation process in the early 1980s. While prior to the 1980s inflation was not mean reverting, quantile autoregression based unit root tests suggest that since the end of the Volcker disinflation the unit root can be rejected at every quantile of the conditional inflation distribution.
Obstetrical care as a matter of time: ultrasound screening in anticipatory regimes of pregnancy
(2014)
This article explores the ways in which ultrasound screening influences the temporal dimensions of prevention in the obstetrical management of pregnancy. Drawing on praxeographic perspectives and empirically based on participant observation of ultrasound examinations in obstetricians’ offices, it asks how ultrasound scanning facilitates anticipatory modes of pregnancy management, and investigates the entanglement of different notions of time and temporality in the highly risk-oriented modes of prenatal care in Germany. Arguing that the paradoxical temporality of prevention – acting now in the name of the future – is intensified by ultrasound screening, I show how the attribution of risk regarding foetal growth in prenatal check-ups is based on the fragmentation of procreative time and ask how time standards come into play, how pregnancy is located in calendrical time, and how notions of foetal time and the everyday life times of pregnant women clash during negotiations between obstetricians and pregnant women about the determination of the due date. By analysing temporality as a practical accomplishment via technological devices such as ultrasound, the paper contributes to debates in feminist STS studies on the role of time in reproduction technologies and the management of pregnancy and birth in contemporary societies.
We examine the effects of credit default swaps (CDS), a major type of over-the-counter derivative, on the corporate liquidity management of the reference firms. CDS help firms to access the credit market since the lenders can hedge their credit risk more easily using these contracts. However, CDS-protected creditors can be tougher in debt renegotiations and less willing to support distressed borrowers, causing some firms to become more cautious. Consequently, we find that firms hold significantly more cash after the inception of CDS trading on their debt. The increase in cash holdings by CDS firms is more pronounced for financially constrained firms and firms facing higher refinancing risk. Moreover, bank relationships and outstanding credit facilities intensify the CDS effect on cash holding. Finally, firms with greater financial expertise hold more cash when their debt is referenced by CDS. These findings suggest that CDS, which are primarily a risk management tool for lenders, induce firms to adopt more conservative liquidity policies.
Robustness, validity, and significance of the ECB's asset quality review and stress test exercise
(2014)
As we are moving toward a eurozone banking union, the European Central Bank (ECB) is going to take over the regulatory oversight of 128 banks in November 2014. To that end, the ECB conducted a comprehensive assessment of these banks, which included an asset quality review (AQR) and a stress test. The fundamental question is how accurately the financial condition of these banks will have been assessed by the ECB when it commences its regulatory oversight. Can the comprehensive assessment lead to a full repair of banks’ balance sheets, so that the ECB takes over financially sound banks, and is the necessary regulation in place to facilitate this? Overall, the evidence presented in this paper, based on the design of the comprehensive assessment as well as our own stress test exercises, suggests that the ECB’s assessment might not comprehensively deal with the problems in the financial sector, and risks may remain that will pose substantial threats to financial stability in the eurozone.
Efforts to control bank risk address the wrong problem in the wrong way. They presume that the financial crisis was caused by CEOs who failed to supervise risk-taking employees. The responses focus on executive pay, believing that executives will bring non-executives into line—using incentives to manage risk-taking—once their own pay is regulated. What they overlook is the effect on non-executive pay of the competition for talent. Even if executive pay is regulated, and executives act in the bank’s best interests, they will still be trapped into providing incentives that encourage risk-taking by non-executives due to the negative externality that arises from that competition. Greater risk-taking can increase short-term profits and, in turn, the amount a non-executive receives, potentially at the expense of long-term bank value. Non-executives, therefore, have an incentive to incur significant risk upfront so long as they can depart for a new employer before any losses materialize. The result is an upward spiral in compensation—reducing an executive’s ability to set non-executive pay and the ability of any one bank to adjust compensation to reflect risk-taking and long-term outcomes. New regulation must address the tension between compensation and competition. Regulators should take account of the effect of competition on market-wide levels of pay, including by non-banks who compete for talent. The ability of non-executives to jump from a bank employer to another financial firm should also be limited. In addition, banks should be required to include a long-term equity component in non-executive pay, with subsequent employers being restricted from compensating a new employee for any losses she incurs related to her prior work.
Motivated by the question whether sound and expressive applicative similarities exist for program calculi with should-convergence, this paper investigates expressive applicative similarities for the untyped call-by-value lambda-calculus extended with McCarthy's ambiguous choice operator amb. Soundness of the applicative similarities w.r.t. contextual equivalence based on may- and should-convergence is proved by adapting Howe's method to should-convergence. As usual for nondeterministic calculi, similarity is not complete w.r.t. contextual equivalence, which requires a rather complex counterexample as a witness. The call-by-value lambda-calculus with the weaker nondeterministic construct erratic choice is also analyzed and sound applicative similarities are provided. This justifies the expectation that sound and powerful similarities for should-convergence also exist for more expressive and call-by-need higher-order calculi.
I analyze a critical illness insurance in a consumption-investment model over the life cycle. I solve a model with stochastic mortality risk and health shock risk numerically. These shocks are interpreted as critical illness and can negatively affect the expected remaining lifetime, the health expenses, and the income. In order to hedge the health expense effect of a shock, the agent has the possibility to contract a critical illness insurance. My results highlight that the critical illness insurance is strongly desired by the agents. With an insurance profit of 20%, nearly all agents contract the insurance in the working stage of the life cycle and more than 50% of the agents contract the insurance during retirement. With an insurance profit of 200%, still nearly all working agents contract the insurance, whereas there is little demand in the retirement stage.
I numerically solve realistically calibrated life cycle consumption-investment problems in continuous time featuring stochastic mortality risk driven by jumps, unspanned labor income as well as short-sale and liquidity constraints and a simple insurance. I compare models with deterministic and stochastic hazard rate of death to a model without mortality risk. Mortality risk has only minor effects on the optimal controls early in the life cycle but it becomes crucial in later years. A diffusive component in the hazard rate of death has no significant impact, whereas a jump component is desired by the agent and influences optimal controls and wealth evolution. The insurance is used to ensure optimal bequest such that there is no accidental bequest. In the absence of the insurance, the biggest part of bequest is accidental.
The pi-calculus is a well-analyzed model for mobile processes and mobile computations.
While a lot of other process and lambda calculi that are core languages of higher-order concurrent and/or functional programming languages use a contextual semantics observing the termination behavior of programs in all program contexts, traditional program equivalences in the pi-calculus are bisimulations and barbed testing equivalences, which observe the communication capabilities of processes under reduction and in contexts.
There is a distance between these two approaches to program equivalence which makes it hard to compare the pi-calculus with other languages. In this paper we contribute to bridging this gap by investigating a contextual semantics of the synchronous pi-calculus with replication and without sums.
To transfer contextual equivalence to the pi-calculus we add a process Stop as constant which indicates success and is used as the base to define and analyze the contextual equivalence which observes may- and should-convergence of processes.
We show as a main result that contextual equivalence in the pi-calculus with Stop conservatively extends barbed testing equivalence in the (Stop-free) pi-calculus. This implies that results on contextual equivalence can be directly transferred to the (Stop-free) pi-calculus with barbed testing equivalence.
We analyze the contextual ordering, prove some nontrivial process equivalences, and provide proof tools for showing contextual equivalences. Among them are a context lemma, and new notions of sound applicative similarities for may- and should-convergence.
We outline a procedure for consistent estimation of marginal and joint default risk in the euro area financial system. We interpret the latter risk as the intrinsic financial system fragility and derive several systemic fragility indicators for euro area banks and sovereigns, based on CDS prices. Our analysis documents that although the fragility of the euro area banking system had started to deteriorate before Lehman Brothers' file for bankruptcy, investors did not expect the crisis to affect euro area sovereigns' solvency until September 2008. Since then, and especially after November 2009, joint sovereign default risk has outpaced the rise of systemic risk within the banking system.
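The marginal-risk leg of such CDS-based exercises is often illustrated with the standard "credit triangle" approximation, which backs a risk-neutral default probability out of a quoted CDS spread. A minimal sketch follows (my own illustration; the paper's procedure for joint default risk is considerably more involved, and the recovery-rate assumption here is purely illustrative):

```python
import math

def cds_implied_default_prob(spread_bp, recovery=0.4, horizon=5.0):
    """Risk-neutral default probability over `horizon` years implied by
    an annual CDS spread (in basis points) via the credit-triangle
    approximation: constant hazard rate lambda ~ s / (1 - R)."""
    s = spread_bp / 10_000.0          # basis points -> decimal spread
    lam = s / (1.0 - recovery)        # constant hazard rate
    return 1.0 - math.exp(-lam * horizon)
```

For example, a 100 bp spread with 40% recovery implies a 5-year risk-neutral default probability of roughly 8%; the probability is increasing in the spread, which is the comparative static the indicators in the abstract exploit.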
This country report was prepared for the 19th World Congress of the International Academy of Comparative Law in Vienna in 2014. It is structured as a questionnaire and provides an overview of the legal framework for Free and Open Source Software (FOSS) and other alternative license models like (e.g.) Creative Commons under German law. The first set of questions addresses the applicable statutory provisions and the reported case law in this area. The second section concerns contractual issues, in particular with regard to the interpretation and validity of open content licenses. The third section deals with copyright aspects of open content models, for example regarding revocation rights and rights to equitable remuneration. The final set of questions pertains to patent, trademark and competition law issues of open content licenses.
This paper distils three lessons for bank regulation from the experience of the 2009-12 euro-area financial crisis. First, it highlights the key role that sovereign debt exposures of banks have played in the feedback loop between bank and fiscal distress, and inquires how the regulation of banks’ sovereign exposures in the euro area should be changed to mitigate this feedback loop in the future. Second, it explores the relationship between the forbearance of non-performing loans by European banks and the tendency of EU regulators to rescue rather than resolve distressed banks, and asks to what extent the new regulatory framework of the euro-area “banking union” can be expected to mitigate excessive forbearance and facilitate resolution of insolvent banks. Finally, the paper highlights that capital requirements based on the ratio of Tier-1 capital to banks’ risk-weighted assets were massively gamed by large banks, which engaged in various forms of regulatory arbitrage to minimize their capital charges while expanding leverage. This argues in favor of relying on a set of simpler and more robust indicators to determine banks’ capital shortfall, such as book and market leverage ratios.
Has economic research been helpful in dealing with the financial crises of the early 2000s? On the whole, the answer is negative, although there are bright spots. Economists largely failed to predict both crises, mainly because most of them were not analytically equipped to understand them, in spite of their recurrence in the last 25 years. In the pre-crisis period, however, there have been important exceptions – theoretical and empirical strands of research that largely laid out the basis for our current thinking about financial crises. Since 2008, a flurry of new studies offered several different interpretations of the US crisis: to some extent, they point to potentially complementary factors, but disagree on their relative importance, and therefore on policy recommendations. Research on the euro debt crisis has so far been much more limited: even Europe-based researchers – including CEPR ones – have often directed their attention more to the US crisis than to that occurring on their doorstep. In terms of impact on policy and regulatory reform, the record is uneven. On the one hand, the swift and massive liquidity provision by central banks in the wake of both crises is, at least partly, to be credited to previous research on the role of central banks as lenders of last resort in crises and on the real effects of bank lending and monetary policy. On the other hand, economists have had limited impact on the reform of prudential and security market regulation. In part, this is due to their neglect of important regulatory choices, which policy-makers are therefore left to take without the guidance of academic research-based analysis.
Especially in developing countries credit constraints are often perceived as one of the most important market frictions constraining firm innovation and growth. Huge amounts of public money are being devoted to the removal of such constraints but their effectiveness is still subject to an intense policy debate. This paper contributes to this debate by analysing the effects of the Brazilian Development Bank (BNDES) loans. It finds that, before receiving BNDES support, granted firms are indeed more credit constrained than comparable non-granted firms. It also finds that BNDES support allows granted firms to achieve the same level of performance as similar non-granted firms that are not credit constrained. However, it does not allow granted firms to outperform similar non-granted ones.
What would be the economic effects of the UK leaving the European Union on living standards of British people? We focus on the effects of trade on welfare net of lower fiscal transfers to the EU. We use a standard quantitative static general equilibrium trade model with multiple sectors, countries and intermediates, as in Costinot and Rodriguez-Clare (2013). Static losses range between 1.13% and 3.09% of GDP, depending on the assumptions used in our counterfactual scenarios. Including dynamic effects could more than double such losses.
This chapter discusses whether and how 'new quantitative trade models' (NQTMs) can be fruitfully applied to quantify the welfare effects of trade liberalization, thus shedding light on the trade-related effects of further European integration. On the one hand, it argues that NQTMs have indeed the potential of being used to supplement traditional 'computable general equilibrium' (CGE) analysis thanks to their tight connection between theory and data, appealing micro-theoretical foundations, and enhanced attention to the estimation of structural parameters. On the other hand, further work is still needed in order to fully exploit such potential.
What happened in Cyprus? The economic consequences of the last communist government in Europe
(2014)
This paper reviews developments in the Cypriot economy following the introduction of the euro on 1 January 2008 and leading to the economic collapse of the island five years later. The main cause of the collapse is identified with the election of a communist government in February 2008, within two months of the introduction of the euro, and its subsequent choices for action and inaction on economic policy matters. The government allowed a rapid deterioration of public finances, and despite repeated warnings, damaged the country's creditworthiness and lost market access in May 2011. The destruction of the island's largest power station in July 2011 subsequently threw the economy into recession. Together with the intensification of the euro area crisis in the summer and fall of 2011, these events weakened the banking system which was vulnerable due to its exposure in Greece. Rather than deal with its fiscal crisis, the government secured a loan from the Russian government that allowed it to postpone action until after the February 2013 election. Rather than protect the banking system, losses were imposed on banks and a campaign against them was coordinated and used as a platform by the communist party for the February 2013 election. The strategy succeeded in delaying resolution of the crisis and avoiding short-term political cost for the communist party before the election, but also in precipitating a catastrophe right after the election.
Are rules and boundaries sufficient to limit harmful central bank discretion? Lessons from Europe
(2014)
Marvin Goodfriend’s (2014) insightful, informative and provocative work explains concisely and convincingly why the Fed needs rules and boundaries. This paper reviews the broader institutional design problem regarding the effectiveness of the central bank in practice and confirms the need for rules and boundaries. The framework proposed for improving the Fed incorporates key elements that have already been adopted in the European Union. The case of ELA provision by the ECB and the Central Bank of Cyprus to Marfin-Laiki Bank during the crisis, however, suggests that the existence of rules and boundaries may not be enough to limit harmful discretion. During a crisis, novel interpretations of the legal authority of the central bank may be introduced to create a grey area that might be exploited to justify harmful discretionary decisions even in the presence of rules and boundaries. This raises the question of how to ensure that rules and boundaries are respected in practice.
We analyze the differential impact of domestic and foreign monetary policy on the local supply of bank credit in domestic and foreign currencies. We analyze a novel, supervisory dataset from Hungary that records all bank lending to firms including its currency denomination. Accounting for time-varying firm-specific heterogeneity in loan demand, we find that a lower domestic interest rate expands the supply of credit in the domestic but not in the foreign currency. A lower foreign interest rate on the other hand expands lending by lowly versus highly capitalized banks relatively more in the foreign than in the domestic currency.
Inflation differentials in the euro area have been persistent since the adoption of the single currency. This paper analyzes the impact of product and labor market regulation on inflation in a sample of 11 countries. The results show that, after the adoption of the euro, product market deregulation has a relevant and significant effect on the level of inflation, while higher labor market regulation increases the responsiveness of inflation to the output gap.
This paper investigates the role of monetary policy in the collapse in the long-term real interest rates in the decade before the onset of the financial crisis using a sample of five advanced economies (United States, United Kingdom, the euro area, Sweden and Canada). The results from an estimated panel VAR with monthly data show that, while monetary policy shocks had negligible effects on long-term real interest rates, shocks to the long-term real interest rates had a one-to-one effect on the short nominal rate.
Riley (1979)'s reactive equilibrium concept addresses problems of equilibrium existence in competitive markets with adverse selection. The game-theoretic interpretation of the reactive equilibrium concept in Engers and Fernandez (1987) yields the Rothschild-Stiglitz (1976)/Riley (1979) allocation as an equilibrium allocation, however multiplicity of equilibrium emerges. In this note we imbed the reactive equilibrium's logic in a dynamic market context with active consumers. We show that the Riley/Rothschild-Stiglitz contracts constitute the unique equilibrium allocation in any pure strategy subgame perfect Nash equilibrium.
Europe's debt crisis casts doubt on the effectiveness of fiscal austerity in highly-integrated economies. Closed-economy models overestimate its effectiveness, because they underestimate tax-base elasticities and ignore cross-country tax externalities. In contrast, we study tax responses to debt shocks in a two-country model with endogenous utilization that captures those externalities and matches the capital-tax-base elasticity. Quantitative results show that unilateral capital tax hikes cannot restore fiscal solvency in Europe, and have large negative (positive) effects at "home" ("abroad"). Restoring solvency via either Nash competition or Cooperation reduces (increases) capital (labor) taxes significantly, and leaves countries with larger debt shocks preferring autarky.
In my paper I take issue with proponents of ‘intersectionality’ who believe that a theoretical concept cannot or should not be detached from its original context of invention. Instead, I argue that the traveling of theory in a global context automatically involves appropriations, amendments and changes in response to the original meaning. However, I reject the idea that ‘intersectionality’ can be used as a free-floating signifier; on the contrary, it has to be embedded in the respective (historical, social, cultural) context in which it is used. I will start by mapping some of the current debates engaging with the pros and cons of the global implementation of the concept (the controversy about master categories, the dispute about the centrality of ‘race’, and the argument about the amendment of categories). I will then turn to my own use of ‘intersectionality’ as a methodological tool (elaborated in Lutz and Davis 2005). Here, we shifted attention from how structures of racism, class discrimination and sexism determine individuals’ identities and practices to how individuals continually and flexibly negotiate their multiple and converging identities in the context of everyday life. Introducing the term doing intersectionality, we explored how individuals creatively and often in surprising ways draw upon various aspects of their multiple identities as a resource to gain control over their lives.
In my paper I will show how ‘gender’ and ‘ethnicity’ are invariably linked to structures of domination, but can also be mobilized to deconstruct disempowering discourses and even to undermine and transform oppressive practices.
This paper investigates extensions of the method of endogenous gridpoints (ENDGM) introduced by Carroll (2006) to higher dimensions with more than one continuous endogenous state variable. We compare three categories of algorithms: (i) the conventional method with exogenous grids (EXOGM), (ii) the pure method of endogenous gridpoints (ENDGM) and (iii) a hybrid method (HYBGM). ENDGM relies on Delaunay interpolation on irregular grids. We compare the methods in terms of speed and accuracy. We find that HYBGM and ENDGM both dominate EXOGM. In an infinite horizon model, ENDGM also always dominates HYBGM. In a finite horizon model, the choice between HYBGM and ENDGM depends on the number of gridpoints in each dimension: with fewer than 150 gridpoints in each dimension ENDGM is faster than HYBGM, and vice versa. For a standard choice of 25 to 50 gridpoints in each dimension, ENDGM is 1.4 to 1.7 times faster than HYBGM in the finite horizon version and 2.4 to 2.5 times faster in the infinite horizon version of the model.
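The backward step of the endogenous gridpoint method can be sketched in one dimension. The snippet below is a minimal illustration in the spirit of Carroll (2006) for a consumption-savings problem with CRRA utility; all parameter values and the next-period policy are hypothetical, and the paper's actual setting has two continuous endogenous states and Delaunay interpolation, which are not shown here.

```python
import numpy as np

# Hypothetical parameters: risk aversion, discount factor, gross return.
gamma, beta, R = 2.0, 0.96, 1.03

def egm_step(a_grid, c_next):
    """One backward EGM iteration (one-dimensional sketch).

    a_grid : exogenous grid over end-of-period assets a'
    c_next : next-period consumption evaluated at cash-on-hand R * a'
    Returns the endogenous grid over current cash-on-hand and consumption.
    """
    # Euler equation u'(c) = beta * R * u'(c'), inverted analytically for
    # CRRA utility -- this avoids any root-finding, the key gain of EGM.
    c_today = (beta * R * c_next ** (-gamma)) ** (-1.0 / gamma)
    # Endogenous gridpoint: the cash-on-hand m that rationalizes saving a'.
    m_endog = c_today + a_grid
    return m_endog, c_today

a_grid = np.linspace(0.01, 10.0, 50)
c_next = 0.5 * (R * a_grid)          # hypothetical next-period policy
m, c = egm_step(a_grid, c_next)
```

In higher dimensions the endogenous gridpoints no longer form a rectangular grid, which is why the paper pairs ENDGM with interpolation on irregular (Delaunay-triangulated) grids.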
On 23 July 2014, the U.S. Securities and Exchange Commission (SEC) passed the “Money Market Reform: Amendments to Form PF,” designed to prevent investor runs on money market mutual funds such as those experienced in institutional prime funds following the bankruptcy of Lehman Brothers. The present article evaluates the reform choices in the U.S. and draws conclusions for the proposed EU regulation of money market funds.
This paper investigates the long-term impact of news media sentiment on financial market returns and volatility. We hypothesize that the way the media formulate and present news to the public produces different perceptions and thus induces different investor behavior. To analyze such framing effects, we distinguish between optimistic and pessimistic news frames. We construct a monthly media sentiment indicator by taking the ratio of the number of newspaper articles that contain predetermined negative words to the number of newspaper articles that contain predetermined positive words in the headline and/or the lead paragraph. Our results indicate that pessimistic news media sentiment is positively related to global market volatility and negatively related to global market returns 12 to 24 months in advance. We show that our media sentiment indicator closely tracks the financial market crises and pricing bubbles of the past 20 years.
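The indicator's construction (count the articles containing a negative word, divide by the count containing a positive word, per month) can be sketched as follows. The word lists and article fields below are hypothetical placeholders, not the paper's predetermined lists.

```python
# Hypothetical word lists; the paper uses predetermined negative/positive lists.
NEGATIVE = {"crisis", "recession", "crash"}
POSITIVE = {"rally", "boom", "recovery"}

def monthly_sentiment(articles):
    """Ratio of articles with a negative word to articles with a positive
    word in the headline or lead paragraph (sketch of the indicator)."""
    def hits(article, words):
        text = (article["headline"] + " " + article["lead"]).lower()
        return any(w in text for w in words)
    neg = sum(hits(a, NEGATIVE) for a in articles)
    pos = sum(hits(a, POSITIVE) for a in articles)
    return neg / pos if pos else float("nan")

articles = [
    {"headline": "Markets crash on fear", "lead": "A deep crisis looms."},
    {"headline": "Stocks rally", "lead": "Signs of recovery emerge."},
    {"headline": "Economic boom continues", "lead": "Growth accelerates."},
]
print(monthly_sentiment(articles))  # 1 negative vs. 2 positive articles -> 0.5
```

A value above one in a given month would indicate a pessimistic news frame dominating an optimistic one.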
The record-breaking prices observed in the art market over the last three years raise the question of whether we are experiencing a speculative bubble. Given the difficulty of determining the fundamental value of artworks, we apply a right-tailed unit root test with forward recursive regressions (SADF test) to detect explosive behavior directly in the time series of four art market segments (“Impressionist and Modern”, “Post-war and Contemporary”, “American”, and “Latin American”) for the period from 1970 to 2013. We identify two historical speculative bubbles and find an explosive movement in today’s “Post-war and Contemporary” and “American” fine art market segments.
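A minimal sketch of the SADF idea, the supremum of forward recursive right-tailed ADF statistics, is given below. It omits lag augmentation and critical values, and the simulated series are illustrative stand-ins, not art market data.

```python
import numpy as np

def adf_tstat(y):
    """t-statistic on rho in: dy_t = alpha + rho * y_{t-1} + e_t (no lags)."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def sadf(y, r0):
    """Sup-ADF statistic: forward recursive ADF t-stats on expanding
    windows y[:r], r >= r0; a large positive sup flags explosiveness."""
    return max(adf_tstat(y[:r]) for r in range(r0, len(y) + 1))

rng = np.random.default_rng(0)
rw = np.cumsum(rng.standard_normal(200))                       # unit root
bubble = np.cumprod(1.02 * np.ones(200)) + 0.1 * rng.standard_normal(200)
print(sadf(bubble, 40) > sadf(rw, 40))
```

In practice the sup statistic is compared against right-tail critical values (obtained by simulation) to date-stamp explosive episodes.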
This chapter analyzes the risk and return characteristics of investments in artists from the Middle East and Northern Africa (MENA) region over the sample period 2000 to 2012. Using hedonic regression modeling, we create an annual index based on 3,544 paintings created by 663 MENA artists. Our empirical results show that investing in such a hypothetical index provides strong financial returns. While the results show exponential growth in sales since 2006, the geometric annual return of the MENA art index is a stable 13.9 percent over the whole period. We conclude that investing in MENA paintings would have been profitable, but also note that we examined the performance of an emerging art market that has so far seen only an upward trend without any correction.
SAFE Professor Michalis Haliassos was a member of the National Council for Research and Technology (ESET) established by the Government of Greece for the period 2010-2013. The council, consisting of eleven scientists from a range of disciplines, has now published its communiqué "National Strategic Framework for Research and Innovation 2014-2020".
To promote the advancement of research, technology and innovation in Greece, the strategic plan proposed by the authors seeks to identify areas of existing research strength and excellence that can be further advanced to become engines for progress and growth in Greece, as well as flaws inherent in the present system. The authors stress the need to address current constraints on growth, which include the declining education system; the confusion and weaknesses of R&D governance and management; the discontinuities and inefficiencies of resource allocation and investment; the lack of adaptation to clearly defined national priorities; and the inadequate opportunities and funding for high-quality research and development to flourish. They stress the need for prioritisation and efficient allocation; stability of the policy framework; predictability of planning; provision of opportunity; recognition of excellence; and responsiveness to current and future needs.
This assessment concept paper provides a methodological approach for the formative and summative assessment of GIZ’s International Water Stewardship Programme (IWaSP) and its component partnerships. IWaSP promotes partnerships between the private sector (corporations and SMEs), the public sector and civil society to tackle shared water risks and to manage water equitably to meet competing demands. This evaluative assessment concept describes the generic approach of the assessment and the assessment cycles for partnerships, country coordination and the programme.
The overall goal of the assessment is to provide evidence for taxpayers in the donor countries and for citizens in the partnership countries. It also aims to examine the relevance of the programme’s approach, its underlying assumptions, and the heterogeneity of stakeholders and their specific interests. Since the assessment also provides formative feedback to GIZ and IWaSP stakeholders, it aims to guide the future implementation of the partnerships and the programme.
The assessment is guided by several generic principles: assessing for learning (formative assessment); assessment of learning (summative assessment); iteration; structuring complex problems; unblocking results; and conformity with the assessment criteria set out by the OECD Development Assistance Committee (DAC) and GIZ’s Capacity Works success factors (GTZ 2010).
These generic criteria are adapted to the three levels of the IWaSP structure. First, the assessment cycle for partnerships includes the validation of stakeholders (mapping), the analysis of secondary literature, face-to-face interviews and a process for feeding back the findings. Generic tools are provided to guide the assessment, such as a list of key documents and an interview guide. Partnerships will undergo a baseline, an interim assessment and a final assessment. As progress varies across individual IWaSP partnerships, the steps taken by each partnership to assess shared water risks and to prioritise and agree interventions are expected to differ slightly. In response to these differences, the sequencing and content of the assessment may need to be adapted for the different partnerships.
Second, the country-level assessment considers issues such as the coordination of partnerships within a country, scoping strategies, and the interaction between partnerships and the programme. Information gathered during the partnership assessment feeds into the country-level assessment.
Third, the assessment cycle for the programme involves an analysis of documents and the monitoring plan, as well as reflection on the different perspectives of the programme staff, country staff and external stakeholders.
The final section is concerned with reporting. Several annexes are provided relating to the organisation and preparation of the assessment, including question guidelines and analysis procedures.
This paper provides a systematic analysis of individual attitudes towards ambiguity, based on laboratory experiments. The design of the analysis allows us to capture individual behavior across various levels of ambiguity, ranging from low to high. Attitudes towards risk and attitudes towards ambiguity are disentangled, providing pure measures of ambiguity aversion. Ambiguity aversion is captured in several ways, namely as a discount factor net of a risk premium and as an estimated parameter in a generalized utility function. We find that ambiguity aversion varies across individuals and with the level of ambiguity, being most prominent for intermediate levels. Around one third of subjects show no aversion, one third show maximum aversion, and one third show intermediate levels of ambiguity aversion, while there is almost no ambiguity seeking. While most theoretical work on ambiguity builds on maxmin expected utility (MEU), our results provide evidence that MEU does not adequately capture individual attitudes towards ambiguity for the majority of individuals. Instead, our results support models that allow for intermediate levels of ambiguity aversion. Moreover, we find risk aversion to be statistically unrelated to ambiguity aversion on average. Taken together, the results support the view that ambiguity is an important and distinct argument in decision making under uncertainty.
A recent proposal by the Financial Stability Board (FSB) suggests a new risk capital buffer for globally operating systemically important financial institutions. The suggested metric, “Total Loss Absorbing Capacity” (TLAC), is composed of Tier-1 capital and loss-absorbing debt. In a crisis situation, “bail-in-able” debt is to be written down or converted into equity. Jan Krahnen argues that the credibility of bail-in, in the case of systemically important financial institutions, hinges crucially on the design of TLAC and the requirements that will be placed on loss-absorbing “bail-in-able” debt. The fear of direct systemic consequences through bail-in could be overcome if a holding ban were placed on the “bail-in bonds” of financial institutions. The holding ban would stipulate that these bonds cannot be held by other institutions within the banking sector.
We study consumption-portfolio and asset pricing frameworks with recursive preferences and unspanned risk. We show that in both cases, portfolio choice and asset pricing, the value function of the investor/representative agent can be characterized by a specific semilinear partial differential equation. To date, the solution to this equation has mostly been approximated by Campbell-Shiller techniques, without addressing general issues of existence and uniqueness. We develop a novel approach that rigorously constructs the solution by a fixed point argument. We prove that under regularity conditions a solution exists and establish a fast and accurate numerical method to solve consumption-portfolio and asset pricing problems with recursive preferences and unspanned risk. Our setting is not restricted to affine asset price dynamics. Numerical examples illustrate our approach.
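The fixed-point construction can be illustrated schematically: discretize the equation, then iterate the operator on the candidate value function until the iterates converge. The operator below is a toy contraction standing in for the paper's semilinear PDE operator (chosen so the exact fixed point is known); it is not the paper's method itself.

```python
import numpy as np

def solve_fixed_point(F, h0, tol=1e-10, max_iter=1000):
    """Iterate h <- F(h) until successive iterates are within tol.
    Banach's theorem guarantees convergence when F is a contraction."""
    h = h0
    for _ in range(max_iter):
        h_next = F(h)
        if np.max(np.abs(h_next - h)) < tol:
            return h_next
        h = h_next
    raise RuntimeError("no convergence")

# Toy stand-in operator on a grid: a contraction with modulus 1/2.
x = np.linspace(0.0, 1.0, 101)
F = lambda h: 0.5 * h + np.sin(x)
h_star = solve_fixed_point(F, np.zeros_like(x))
# The fixed point solves h = 0.5 * h + sin(x), i.e. h = 2 * sin(x).
assert np.allclose(h_star, 2 * np.sin(x), atol=1e-8)
```

The paper's contribution is to show that an analogous iteration applied to the semilinear PDE for the value function converges to a (unique) solution under regularity conditions, which also yields the numerical method.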
In this paper, we propose a novel approach to estimating systemic risk and identifying its key determinants. For all US financial companies with publicly traded equity options, we extract their option-implied value-at-risks (VaRs) and measure the spillover effects between individual company VaRs and the option-implied VaR of a US financial index. First, we study the spillover effect of increasing company risks on the financial sector. Second, we analyze which companies are most affected if the tail risk of the financial sector increases. We find that key accounting and market valuation metrics such as size, leverage, balance sheet composition, market-to-book ratio and earnings have a significant influence on the systemic risk profile of a financial institution. In contrast to earlier studies, the employed panel vector autoregression (PVAR) estimator allows for a causal interpretation of the results.
This paper studies the life cycle consumption-investment-insurance problem of a family. The wage earner faces the risk of a health shock that significantly increases his probability of dying. The family can buy term life insurance with realistic features. In particular, the available contracts are long term, so that decisions are sticky and can only be revised at significant costs. Furthermore, a revision is only possible as long as the insured person is healthy. A second important and realistic feature of our model is that the labor income of the wage earner is unspanned. We document that the combination of unspanned labor income and the stickiness of insurance decisions reduces the insurance demand significantly. This is because an income shock induces the need to reduce the insurance coverage, since premia become less affordable. Since such a reduction is costly and families anticipate these potential costs, they buy less protection at all ages. In particular, young families stay away from life insurance markets altogether.
The observed hump-shaped life-cycle pattern in individuals' consumption cannot be explained by the classical consumption-savings model. We explicitly solve a model with utility of both consumption and leisure and with educational decisions affecting future wages. We show that optimal consumption is hump shaped and determine the peak age. The hump results from consumption and leisure being substitutes and from the implicit price of leisure decreasing over time; more leisure means less education, which lowers future wages, and the present value of foregone wages decreases with age. Consumption is hump shaped whether the wage is hump shaped or increasing over life.
Advertising arbitrage
(2014)
Speculators often advertise arbitrage opportunities in order to persuade other investors and thus accelerate the correction of mispricing. We show that, in order to minimize the risk and the cost of arbitrage, an investor who identifies several mispriced assets optimally advertises only one of them and overweights it in his portfolio; a risk-neutral arbitrageur invests only in this asset. The choice of the asset to be advertised depends not only on its mispricing but also on its "advertisability" and the accuracy of future news about it. When several arbitrageurs identify the same arbitrage opportunities, their decisions are strategic complements: they invest in the same asset and advertise it. Multiple equilibria may then arise, some of which are inefficient: arbitrageurs may correct small mispricings while failing to eliminate large ones. Finally, prices react more strongly to the ads of arbitrageurs with a successful track record, and reputation-building induces high-skill arbitrageurs to advertise more than others.
Most simulated micro-founded macro models use solely consumer-demand aggregates to estimate deep economy-wide preference parameters, which are useful for policy evaluation. The underlying demand-aggregation properties that this approach requires should be easy to disprove empirically: since household consumption choices differ for households with more members, aggregation can be rejected if appropriate data violate an affine equation regarding how much individuals benefit from within-household sharing of goods. We develop a survey method that tests the validity of this equation without utility-estimation restrictions via models. Surprisingly, in six countries, this equation is not rejected, lending support to using consumer-demand aggregates.
This paper explores the consequences of consumer education on prices and welfare in retail financial markets when some consumers are naive about shrouded add-on prices and firms try to exploit this. Allowing for different information and pricing strategies, we show that education is unlikely to push firms to disclose prices to all consumers, which would be socially efficient. Instead, price discrimination emerges as a new equilibrium. Further, due to a feedback effect on prices, education that is good for consumers who become sophisticated may be bad for consumers who stay naive, and even for the group of all consumers as a whole.
We characterize optimal redistribution in a dynastic family model with human capital. We show how a government can improve the trade-off between equality and incentives by changing the amount of observable human capital. We provide an intuitive decomposition for the wedge between human-capital investment in the laissez faire and the social optimum. This wedge differs from the wedge for bequests because human capital carries risk: its returns depend on the non-diversifiable risk of children's ability. Thus, human capital investment is encouraged more than bequests in the social optimum if human capital is a bad hedge for consumption risk.