This paper reviews social network analysis (SNA) as a method for biographical research, which is a novel contribution. We argue that applying SNA in biography research, through standardized data collection and the visualization of networks, can open up participants’ interpretations of relations throughout their lives and allow a creative, innovative way of data collection that is responsive to participants’ own meanings and associations while enabling systematic data analysis. The paper discusses the analytical potential of SNA in biographical research and critically assesses the efficacy and limitations of the method in this context.
The long-run consumption risk model provides a theoretically appealing explanation for prominent asset pricing puzzles, but its intricate structure presents a challenge for econometric analysis. This paper proposes a two-step indirect inference approach that disentangles the estimation of the model's macroeconomic dynamics and the investor's preference parameters. A Monte Carlo study explores the feasibility and efficiency of the estimation strategy. We apply the method to recent U.S. data and provide a critical re-assessment of the long-run risk model's ability to reconcile the real economy and financial markets. This two-step indirect inference approach is potentially useful for the econometric analysis of other prominent consumption-based asset pricing models that are equally difficult to estimate.
What processes transform (im)mobile individuals into ‘migrants’ and geographic movements across political-territorial borders into ‘migration’? To address this question, the article develops the doing migration approach, which combines perspectives from social constructivism, praxeology and the sociologies of knowledge and culture. ‘Doing migration’ starts with the processes of social attribution that differentiate between ‘migrants’ and ‘non-migrants’. Embedded in institutional, organizational and interactional routines, these attributions generate unique social orders of migration. By illustrating these conceptual ideas, the article provides insights into the elements of the contemporary European order of ‘migration’. Its institutional routines contribute to the emergence of a European migration regime that involves narratives of economization, securitization and humanitarization. The organizational routines of the European migration order involve surveillance and diversity management, which have disciplining effects on those defined as ‘migrants’. The routines of everyday face-to-face interactions produce various micro-forms of doing ‘migration’ through stigmatization and othering, but they also provide opportunities to resist a social attribution as ‘migrant’.
Motivated by tools for automated deduction on functional programming languages and programs, we propose a formalism to symbolically represent $\alpha$-renamings for meta-expressions. The formalism is an extension of the usual higher-order meta-syntax which makes it possible to $\alpha$-rename all valid ground instances of a meta-expression so that they fulfill the distinct variable convention. The renaming mechanism may be helpful for several reasoning tasks in deduction systems. We present our approach for a meta-language which uses higher-order abstract syntax and a meta-notation for recursive let-bindings, contexts, and environments. It is used in the LRSX Tool -- a tool for reasoning about the correctness of program transformations in higher-order program calculi with respect to their operational semantics. Besides introducing a formalism to represent symbolic $\alpha$-renamings, we present and analyze algorithms for simplification of $\alpha$-renamings, matching, rewriting, and checking $\alpha$-equivalence of symbolically $\alpha$-renamed meta-expressions.
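As a ground-level illustration of what $\alpha$-equivalence checking means (for plain lambda terms, not the meta-expressions treated in the paper), one can compare nameless de Bruijn forms. The term representation below is a hypothetical sketch, not the LRSX syntax:

```python
# Minimal sketch of alpha-equivalence for plain lambda terms via de Bruijn
# conversion. Term encoding (hypothetical, not the paper's meta-language):
#   ('var', name), ('lam', name, body), ('app', fun, arg)

def to_de_bruijn(term, env=()):
    """Replace bound names by binder distances; free names stay as-is."""
    kind = term[0]
    if kind == 'var':
        name = term[1]
        return ('bound', env.index(name)) if name in env else ('free', name)
    if kind == 'lam':
        return ('lam', to_de_bruijn(term[2], (term[1],) + env))
    return ('app', to_de_bruijn(term[1], env), to_de_bruijn(term[2], env))

def alpha_eq(s, t):
    """Two terms are alpha-equivalent iff their nameless forms coincide."""
    return to_de_bruijn(s) == to_de_bruijn(t)

# \x. x y  is alpha-equivalent to  \z. z y  (y is free in both),
# but not to  \z. y z.
s = ('lam', 'x', ('app', ('var', 'x'), ('var', 'y')))
t = ('lam', 'z', ('app', ('var', 'z'), ('var', 'y')))
u = ('lam', 'z', ('app', ('var', 'y'), ('var', 'z')))
print(alpha_eq(s, t))  # True
print(alpha_eq(s, u))  # False
```

The paper's contribution is to carry this kind of check over to meta-expressions whose ground instances are not yet known, which is why $\alpha$-renamings must be represented symbolically.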
Asymmetric social norms
(2017)
Studies of cooperation in infinitely repeated matching games focus on homogeneous economies, where full cooperation is efficient and any defection is collectively sanctioned. Here we study heterogeneous economies where occasional defections are part of efficient play, and show how to support those outcomes through contagious punishments.
In 1983, Brian Henderson published an article that examined various types of narrative structure in film, including flashbacks and flashforwards. After analyzing a whole spectrum of techniques capable of effecting a transition between past and present – blurs, fades, dissolves, and so on – he concluded: "Our discussions indicate that cinema has not (yet) developed the complexity of tense structures found in literary works". His "yet" (in parentheses) was an instance of laudable caution, as very soon – in some ten–fifteen years – the situation would change drastically, and temporal twists would become a trademark of a new genre that has not (yet) acquired a standardized name: "modular narratives", "puzzle films", and "complex films" are among the labels used.
This paper presents new evidence on the expectation formation process of firms from a survey of the German manufacturing sector. It focuses on firms' expectations about their future business conditions, which enter the widely followed economic sentiment index and are an important determinant of their employment and investment decisions. We find that firms extrapolate too much from their own experience and make predictable forecasting errors. Moreover, firms do not seem to anticipate the upcoming reversals at business cycle peaks and troughs, which causes suboptimal adjustment of investment and employment and affects their inventories and profits. However, these expectation errors decrease with the size and age of the firm, as firms learn to reduce their extrapolation bias over time.
The level of capital tax gains has high explanatory power regarding the question of what drives economic inequality. On this basis, the authors develop a simple, yet micro-founded portfolio selection model to explain the dynamics of wealth inequality given empirical tax series in the US. The results emphasize that the level of wealth inequality and the speed of its transition depend crucially on the degree of capital taxation. The projections predict that – continuing on the present path of capital taxation in the US – the gap between rich and poor is expected to shrink, whereas “massive” tax cuts will further increase the degree of wealth concentration.
I propose a dynamic stochastic general equilibrium model in which the leverage of borrowers as well as banks and housing finance play a crucial role in the model dynamics. The model is used to evaluate the relative effectiveness of a policy to inject capital into banks versus a policy to relieve households of mortgage debt. In normal times, when the economy is near the steady state and policy rates are set according to a Taylor-type rule, capital injections to banks are more effective in stimulating the economy in the long run. However, in the middle of a housing debt crisis, when households are highly leveraged, the short-run output effects of the debt relief are more substantial. When the zero lower bound (ZLB) is additionally considered, the debt relief policy can be much more powerful in boosting the economy both in the short run and in the long run. Moreover, the output effects of the debt relief become increasingly larger, the longer the ZLB is binding.
We analyze the market reaction to the sentiment of the CEO speech at the Annual General Meeting (AGM). As the AGM is typically preceded by several information disclosures, the CEO speech may be expected to contribute only marginally to investors’ decision-making. Surprisingly, however, we observe from the transcripts of 338 CEO speeches of German corporates between 2008 and 2016 that their sentiment is significantly related to abnormal stock returns and trading volumes following the AGM. Using a novel business-specific German dictionary based on Loughran and McDonald (2011), we find a negative association of the post-AGM returns with the speeches’ negativity and a positive association with the speeches’ relative positivity (i.e. positivity relative to negativity). Relative positivity moreover corresponds with a lower trading volume in a short time window surrounding the AGM. Investors hence seem to perceive the sentiment of CEO speeches at AGMs as a valuable indicator of future firm performance.
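The dictionary-based scoring used in the paper can be illustrated in miniature. The word lists and speech below are made up, and the "relative positivity" formula is one plausible operationalization of "positivity relative to negativity", not necessarily the paper's exact definition:

```python
# Toy dictionary-based sentiment scoring in the spirit of Loughran-McDonald
# word lists. POSITIVE/NEGATIVE and the sample speech are illustrative only.

POSITIVE = {"growth", "profit", "strong", "success"}
NEGATIVE = {"loss", "risk", "decline", "weak"}

def sentiment(text):
    """Return (negativity, relative positivity) for a whitespace-split text."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    negativity = neg / len(words)
    # Positivity measured against negativity (assumed form): (pos-neg)/(pos+neg)
    rel_pos = (pos - neg) / (pos + neg) if pos + neg else 0.0
    return negativity, rel_pos

speech = "strong growth and solid profit despite one loss"
negativity, rel_pos = sentiment(speech)
print(negativity)  # 0.125  (1 negative word out of 8)
print(rel_pos)     # 0.5    (3 positive vs 1 negative)
```

In practice the paper applies a business-specific German dictionary to full AGM transcripts; the mechanics of counting matched words and forming ratios are the same.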
In the context of the upcoming Brexit, a relocation of the clearing of euro-OTC derivatives for EU-based firms is the subject of controversial discussion. The opponents of a relocation argue that a relocation would cause additional costs for market participants of up to USD 100 bn over a period of 5 years. This paper shows that this cost estimate is fairly unrealistic and that relocation costs would amount to approximately USD 0.6 bn p.a., which translates to cumulative costs of around USD 3.2 bn for a transition period of 5 years. In light of the strategic importance of systemically relevant CCPs for the financial stability of the eurozone, the potential relocation costs should not be a decision criterion.
The EU Collective Redress Recommendation has invited Member States to introduce collective redress mechanisms by 26 July 2015. The well-known reservations claim potentially abusive litigation and potential settlement of not well-founded claims resulting from controversial funding of cases by means of contingency fees and from ‘opt-out’ class action procedures. The paper posits that there may also be some fear that the European Commission may try to pursue the enforcement of its regulatory agenda in this way at the expense of individual claimants’ interests. Therefore a comparative analysis is carried out to see to what extent concerns about individual rights as opposed to regulatory goals are reflected in the different newly revised systems in place across Europe. As an interim result, the Dutch settlement procedure for mass damage claims, the English Group Litigation Order and the German test case procedure turn out to be relatively well-suited to deal with mass damage claims. At the same time, none of them can quite reach an optimal balance between individual rights and regulatory goals and therefore each of them is subject to criticism. That is why the further question is raised to what extent these procedures could complement each other, thus contributing to the enforcement of individual rights without overregulating markets in Europe.
Coming early to the party
(2017)
We examine the strategic behavior of High Frequency Traders (HFTs) during the pre-opening phase and the opening auction of the NYSE-Euronext Paris exchange. HFTs actively participate, and profitably extract information from the order flow. They also post "flash crash" orders, to gain time priority. They make profits on their last-second orders; however, so do others, suggesting that there is no speed advantage. HFTs lead price discovery, and neither harm nor improve liquidity. They "come early to the party", and enjoy it (make profits); however, they also help others enjoy the party (improve market quality) and do not have privileges (their speed advantage is not crucial).
Commodity connectedness
(2017)
We use variance decompositions from high-dimensional vector autoregressions to characterize connectedness in 19 key commodity return volatilities, 2011-2016. We study both static (full-sample) and dynamic (rolling-sample) connectedness. We summarize and visualize the results using tools from network analysis. The results reveal clear clustering of commodities into groups that match traditional industry groupings, but with some notable differences. The energy sector is most important in terms of sending shocks to others, and energy, industrial metals, and precious metals are themselves tightly connected.
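The connectedness measures built from variance decompositions can be sketched concretely. The matrix below is illustrative (three made-up commodities, not the paper's 19-variable estimates): entry `D[i][j]` is the share of variable i's forecast-error variance attributed to shocks in variable j, with rows summing to one:

```python
# Hedged sketch of a Diebold-Yilmaz-style connectedness table computed from
# a (here, invented) normalized forecast-error variance decomposition matrix.

D = [
    [0.70, 0.20, 0.10],   # e.g. crude oil
    [0.25, 0.60, 0.15],   # e.g. copper
    [0.10, 0.10, 0.80],   # e.g. gold
]

n = len(D)
# "From others": share of i's variance due to shocks other than its own.
from_others = [sum(D[i][j] for j in range(n) if j != i) for i in range(n)]
# "To others": variance that j's shocks contribute to the other variables.
to_others = [sum(D[i][j] for i in range(n) if i != j) for j in range(n)]
# Total connectedness: average cross-variable share.
total = sum(from_others) / n

print([round(v, 2) for v in from_others])  # [0.3, 0.4, 0.2]
print([round(v, 2) for v in to_others])    # [0.35, 0.3, 0.25]
print(round(total, 2))                     # 0.3
```

In the paper these tables are computed from high-dimensional VAR variance decompositions over both the full sample and rolling windows, and the pairwise entries define the edges of the commodity network.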
Monetary policy communication is particularly important during unconventional times, because high uncertainty about the economy, the introduction of new policy tools and possible limits to the central bank’s toolkit could hamper the predictability of policy actions. We study how monetary policy communication should and has worked under such circumstances. Our main results relate to announcements of asset purchase programmes and the use of forward guidance. We show that announcements of asset purchase programmes have lowered market uncertainty, particularly when accompanied by a contextual release of implementation details such as the envisaged size of the programme. We also show that forward guidance reduces uncertainty more effectively when it is state‐contingent or when it provides guidance about a long horizon than when it is open‐ended or covers only a short horizon, and that the credibility of forward guidance is strengthened if the central bank also has embarked on an asset purchase programme.
This paper studies a consumption-portfolio problem where money enters the agent's utility function. We solve the corresponding Hamilton-Jacobi-Bellman equation and provide closed-form solutions for the optimal consumption and portfolio strategy both in an infinite- and finite-horizon setting. For the infinite-horizon problem, the optimal stock demand is one particular root of a polynomial. In the finite-horizon case, the optimal stock demand is given by the inverse of the solution to an ordinary differential equation that can be solved explicitly. We also prove verification results showing that the solution to the Bellman equation is indeed the value function of the problem. From an economic point of view, we find that in the finite-horizon case the optimal stock demand is typically decreasing in age, which is in line with rules of thumb given by financial advisers and also with recent empirical evidence.
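For orientation, in the classical benchmark without money in the utility function (Merton's problem with CRRA risk aversion $\gamma$, stock drift $\mu$, volatility $\sigma$, and risk-free rate $r$), the optimal stock demand is the constant fraction

```latex
\pi^{*} \;=\; \frac{\mu - r}{\gamma \sigma^{2}}
```

The paper's polynomial root (infinite horizon) and ODE-based expression (finite horizon) can be read as generalizations of this constant once money services enter utility, which is also what allows the stock demand to vary with age.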
The paper provides an overview and an economic analysis of the development of the corporate governance of German banks since the 1950s, highlighting peculiarities – as seen from the meanwhile prevailing standard model perspective – of the German case. These peculiarities refer to the specific German notion and legal-institutional regime of corporate governance in general as well as to the specific three-pillar structure of the German banking system.
The most striking changes in the corporate governance of German banks during the past 50 years occurred in the case of the large shareholder-owned banks. For them, capital markets have become an important element of corporate governance, and their former orientation towards the interests of a broadly defined set of stakeholders has largely been replaced by a one-sided concentration on shareholders’ interests. In contrast, the corporate governance regimes of the smaller local public savings banks and the local cooperative banks have remained virtually unchanged. They acknowledge a broader horizon of stakeholder interests and put an emphasis on monitoring.
The Great Financial Crisis, beginning in 2007, has led to a considerable reassessment in the academic and political debate on bank governance. On an international level, it has revived the older notion that, in view of their high leverage and their innate complexity, banks are “special”, and that bank corporate governance needs to be seen in this light as well – not least because research indicates that banks with a strong and one-sided shareholder orientation – and thus with what appears to be the best corporate governance according to the standard model – have suffered most in the crisis. In the German case, the crisis has shown that the smaller local banks have survived the crisis much better than large private and public banks, whose funding strongly depends on wholesale markets. This may point to certain advantages of their governance and ownership regimes. But the differences in performance during the crisis years may also, or even more so, be a consequence of the business models of large versus small banks rather than of their different governance regimes.
Under Solvency II, corporate governance requirements are a complementary, but nonetheless essential, element to build a sound regulatory framework for insurance undertakings, also to address risks not specifically mitigated by the sole solvency capital requirements. After recalling the provisions of the Second Pillar concerning the system of governance, the paper highlights the emerging regulatory trends in the corporate governance of insurance firms. Among other things, it signals the exceptional extension of the duties and responsibilities assigned to the board of directors, far beyond the traditional role of both monitoring the chief executive officer and assessing the overall direction and strategy of the business. However, a better risk governance is not necessarily built on narrow rule-based approaches to corporate governance.
A counterparty credit limit (CCL) is a limit imposed by a financial institution to cap its maximum possible exposure to a specified counterparty. Although CCLs are designed to help institutions mitigate counterparty risk by selective diversification of their exposures, their implementation restricts the liquidity that institutions can access in an otherwise centralized pool. We address the question of how this mechanism impacts trade prices and volatility, both empirically and via a new model of trading with CCLs. We find empirically that CCLs cause little impact on trade. However, our model highlights that in extreme situations, CCLs could serve to destabilize prices and thereby influence systemic risk.
We analyze older individuals’ debt and financial vulnerability using data from the Health and Retirement Study (HRS) and the National Financial Capability Study (NFCS). Specifically, in the HRS we examine three different cohorts (individuals age 56–61) in 1992, 2004, and 2010 to evaluate cross-cohort changes in debt over time. We also use two waves of the NFCS (2012 and 2015) to gain additional insights into debt management and older individuals’ capacity to shield themselves against shocks. We show that recent cohorts have taken on more debt and face more financial insecurity, mostly due to having purchased more expensive homes with smaller down payments.
We study the impact of estimation errors of firms on social welfare. For this purpose, we present a model of the insurance market in which insurers face parameter uncertainty about expected loss sizes. As consumers react to under- and overestimation by increasing and decreasing demand, respectively, insurers require a safety loading for parameter uncertainty. If the safety loading is too small, less risk averse consumers benefit from less informed insurers by speculating on them underestimating expected losses. Otherwise, social welfare increases with insurers’ information. We empirically estimate safety loadings in the US property and casualty insurance market, and show that these are likely to be sufficiently large for consumers to benefit from more informed insurers.
To broaden the scope of monetary policy, cash abolishment is often suggested as a means of breaking through the zero lower bound. However, practically nothing is said about the welfare costs of such a proposal. Rösl, Seitz and Tödter argue that the welfare costs of bypassing the zero lower bound can be analyzed analytically and empirically by assuming negative interest rates on cash holdings. They gauge the welfare effects of abolishing cash both for the euro area and for Germany.
Their findings suggest that the welfare losses from negative interest rates incurred by money holders are large, notably if implemented in the current low interest rate environment. Imposing a negative interest rate of 3 percentage points on cash holdings and reducing the interest on all assets included in M3 creates a deadweight loss of €62bn for the euro area and of €18bn for Germany. Therefore, the authors argue that abolishing cash or imposing negative interest rates on cash to break through the zero lower bound at any price can hardly be a meaningful policy goal.
Causality is a widely-used concept in theoretical and empirical economics. The recent financial economics literature has used Granger causality to detect the presence of contemporaneous links between financial institutions and, in turn, to obtain a network structure. Subsequent studies combined the estimated networks with traditional pricing or risk measurement models to improve their fit to empirical data. In this paper, we provide two contributions: we show how to use a linear factor model as a device for estimating a combination of several networks that monitor the links across variables from different viewpoints; and we demonstrate that Granger causality should be combined with quantile-based causality when the focus is on risk propagation. The empirical evidence supports the latter claim.
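The Granger-causality building block behind such network estimates can be shown in a toy form. The sketch below uses simulated data and a plain residual-sum-of-squares comparison, not a formal test statistic and not the paper's factor-model or quantile-based estimators:

```python
# Toy Granger-causality check: does adding x's lag to an AR(1) model of y
# reduce the residual sum of squares? Data below are simulated so that x
# leads y; all numbers are illustrative.
import random

def ols_rss(X, y):
    """Least squares via the normal equations; return the residual SS."""
    n, k = len(y), len(X[0])
    # Augmented matrix [X'X | X'y], solved by Gauss-Jordan elimination.
    A = [[sum(X[t][i] * X[t][j] for t in range(n)) for j in range(k)]
         + [sum(X[t][i] * y[t] for t in range(n))] for i in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(A[r][c]))  # partial pivot
        A[c], A[p] = A[p], A[c]
        for r in range(k):
            if r != c:
                f = A[r][c] / A[c][c]
                A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    beta = [A[i][k] / A[i][i] for i in range(k)]
    return sum((y[t] - sum(beta[j] * X[t][j] for j in range(k))) ** 2
               for t in range(n))

def granger_gain(x, y):
    """Relative RSS reduction from adding x's lag to an AR(1) model of y."""
    T = len(y)
    restricted = [[1.0, y[t - 1]] for t in range(1, T)]
    unrestricted = [[1.0, y[t - 1], x[t - 1]] for t in range(1, T)]
    target = y[1:]
    rss_r = ols_rss(restricted, target)
    rss_u = ols_rss(unrestricted, target)
    return (rss_r - rss_u) / rss_r

random.seed(0)
x = [random.gauss(0, 1) for _ in range(300)]
y = [0.0] + [0.8 * x[t - 1] + 0.1 * random.gauss(0, 1) for t in range(1, 300)]

print(granger_gain(x, y) > 0.5)   # x's past strongly predicts y: True
print(granger_gain(y, x) < 0.05)  # y's past barely predicts x: True
```

Running this check for every ordered pair of variables and keeping the significant directions is what produces the directed network structure the literature then combines with pricing or risk models.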
This study provides a graphic overview on core legislation in the area of economic and financial services. The presentation essentially covers the areas within the responsibility of the Economic and Monetary Affairs Committee (ECON); hence it starts with core ECON areas but also displays neighbouring areas of other Committees' competences which are closely connected to and impacting on ECON's work. It shows legislation in force, proposals and other relevant provisions on banking, securities markets and investment firms, market infrastructure, insurance and occupational pensions, payment services, consumer protection in financial services, the European System of Financial Supervision, European Monetary Union, euro bills and coins and statistics, competition, taxation, commerce and company law, accounting and auditing. Moreover, it notes selected provisions that might become relevant in the upcoming Article 50 TEU negotiations.
We compare the cost effectiveness of two pronatalist policies:
(a) child allowances; and
(b) daycare subsidies.
We pay special attention to estimating how intended fertility (fertility before children are born) responds to these policies. We use two evaluation tools:
(i) a dynamic model on fertility, labor supply, outsourced childcare time, parental time, asset accumulation and consumption; and
(ii) randomized vignette-survey policy experiments.
We implement both tools in the United States and Germany, finding consistent evidence that daycare subsidies are more cost effective. Nevertheless, the required public expenditure to increase fertility to the replacement level might be viewed as prohibitively high.
This paper aims to analyze the effects of financial constraints and the financial crisis on the financing and investment policies of newly founded firms. The analysis thereby adds important new insights into a crucial segment of the economy. We make use of a large and comprehensive data set of French firms founded in the years 2004-2006, i.e. well before the financial crisis. Our panel data analysis shows that the global financial crisis imposed a shock (mostly demand-driven) on the financing as well as on the investments of these firms. Moreover, we find that financially constrained firms use less external debt financing and invest smaller amounts. They also rely on less trade credit. With regard to bank financing, newly founded firms which are more financially constrained accumulate less bank debt and repay initial bank debt slower than their non-financially constrained counterparts. Finally, we find that financially constrained firms are affected to a smaller degree by the financial crisis than their less financially constrained counterparts.
In this paper we propose a way forward towards increased financial resilience in times of growing disagreement concerning open borders, free trade and global regulatory standards. In light of these concerns, financial resilience remains a highly valued policy objective. We wish to contribute by suggesting an agenda of concrete, do-able steps supporting an enhanced level of resilience, combined with a deeper understanding of its relevance in the public domain.
First, remove inconsistencies across regulatory rules and territorial regimes, and ensure their credibility concerning implementation. Second, discourage the use of financial regulatory standards as means of international competition. Third, give more weight to pedagogically explaining the established regulatory standards in public, to strengthen their societal backing.
Bank regulators have the discretion to discipline banks by executing enforcement actions to ensure that banks correct deficiencies regarding safe and sound banking principles. We highlight the trade-offs regarding the execution of enforcement actions for financial stability. Following this, we provide an overview of the differences in the legal framework governing supervisors’ execution of enforcement actions in the Banking Union and the United States. After discussing work on the effect of enforcement actions on bank behaviour and the real economy, we present data on the evolution of enforcement actions and monetary penalties by U.S. regulators. We conclude by noting the importance of supervisors levying efficient monetary penalties and by stressing that a division of competences among different regulators should not lead to a loss of efficiency regarding the execution of enforcement actions.
The publication of the Liikanen Group's final report in October 2012 was surrounded by high expectations regarding the implementation of the reform plans through the proposed measures that reacted to the financial and sovereign debt crises. The recommendations mainly focused on introducing a mild version of banking separation and the creation of the preconditions for bail-in measures. In this article, we present an overview of the regulatory reforms, to which the financial sector has been subject over the past years in accordance with the concepts laid out in the Liikanen Report. It becomes clear from our assessment that more specific steps have yet to be taken before the agenda is accomplished. In particular, bail-in rules must be implemented more consistently. Beyond the question of the required minimum, the authors develop the notion of a maximum amount of liabilities subject to bail-in. The combination of both components leads to a three-layer structure of bank capital: a bail-in tranche, a deposit-insured bailout tranche, and an intermediate run-endangered mezzanine tranche. The size and treatment of the latter must be put to a political debate that weighs the costs and benefits of a further increase in financial stability beyond that achieved through loss-bearing of the bail-in tranche.
The Global Irrigation Model (GIM) is used within the framework of the global hydrological model WaterGAP to calculate monthly irrigation crop water use. Results on a 0.5-degree grid include consumptive water use (ICU) and, via division by irrigation efficiencies, water withdrawal (IWU). The model distinguishes up to two cropping periods of rice and non-rice crops, each grown for 150 days, using a grid of area equipped for irrigation (AEI). The historical development of AEI and of the fraction of area actually irrigated (AAI) was previously considered via scaling of cell-specific results with country-specific factors for each year. In this study, GIM was adapted to use the new Historical Irrigation Data set (HID) with cell-specific AEI for 14 time slices between 1900 and 2005. AEI grids were temporally interpolated, and using the optional grid of AAI/AEI, results for the years 1901-2014 were generated (runs "HID-ACT"). Thus, new installation or abandonment of irrigation infrastructure in new grid cells can be represented in a spatially explicit manner. For the evaluated years 1910, 1960, 1995, and 2005, ICU from HID-ACT was superior to the country-specific scaled results (run "HID-ACTHIST") in representing the historical development of the spatial pattern. Compared to US state-level reference data, spatial patterns were better captured, while country totals were not always better. Calculating the cropping periods requires 30-year climate means, and the choice of period is relevant: four chosen periods before 1981-2010 all resulted in considerable, persistent changes in the ICU spatial pattern and in various percentage changes in country totals, possibly because climate change was already present.
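Two of the bookkeeping steps described above are simple enough to sketch directly: linear interpolation of AEI between HID time slices, and conversion of consumptive use to withdrawals by dividing by the irrigation efficiency. All numbers below are illustrative, not WaterGAP/GIM output:

```python
# Minimal sketch (illustrative values): interpolate area equipped for
# irrigation (AEI) between HID time slices, and compute water withdrawal
# (IWU) from consumptive use (ICU) via the cell's irrigation efficiency.

def interpolate_aei(slices, year):
    """Linearly interpolate AEI between bracketing HID time slices.

    Outside the covered range, the nearest slice's value is held constant.
    """
    years = sorted(slices)
    if year <= years[0]:
        return slices[years[0]]
    if year >= years[-1]:
        return slices[years[-1]]
    for y0, y1 in zip(years, years[1:]):
        if y0 <= year <= y1:
            w = (year - y0) / (y1 - y0)
            return (1 - w) * slices[y0] + w * slices[y1]

def withdrawal(icu, efficiency):
    """IWU = ICU / efficiency, with efficiency in (0, 1]."""
    return icu / efficiency

# AEI (km^2) for one hypothetical grid cell at three HID time slices:
aei = {1900: 10.0, 1960: 40.0, 2005: 55.0}
print(interpolate_aei(aei, 1930))  # 25.0 (halfway between 1900 and 1960)
print(withdrawal(120.0, 0.6))      # 200.0
```

The actual model applies these operations per grid cell and per month, with the AAI/AEI grid additionally scaling down the equipped area to the area actually irrigated.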
We shed new light on the macroeconomic effects of rising temperatures. In the data, a shock to global temperature dampens expenditures in research and development (R&D). We rationalize this empirical evidence within a stochastic endogenous growth model, featuring temperature risk and growth sustained through innovations. In line with the novel evidence in the data, temperature shocks undermine economic growth via a drop in R&D. Moreover, in our endogenous growth setting temperature risk generates non-negligible welfare costs (i.e., 11% of lifetime utility). An active government, which is committed to a zero fiscal deficit policy, can offset the welfare costs of global temperature risk by subsidizing the aggregate capital investment with one-fifth of total public spending.
Empirical evidence suggests that investments in research and development (R&D) by older and larger firms are more spread out internationally than R&D investments by younger and smaller firms. In this paper, I explore the quantitative implications of this type of heterogeneity by assuming that incumbents, i.e. current monopolists engaging in incremental innovation, have a higher degree of internationalization in their R&D technologies than entrants, i.e. new firms engaging in radical innovation, in a two-country endogenous growth general equilibrium model. In particular, this assumption allows the model to break the perfect correlation between incumbents’ and entrants’ innovation probabilities and to match the empirical counterpart exactly.
Exploiting NASDAQ order book data and a difference-in-differences methodology, we identify the distinct effects of the trading pause mechanisms introduced on U.S. stock exchanges after May 2010. We show that the mere existence of such a regulation constitutes a safeguard which makes market participants behave differently in anticipation of a pause. Pauses tend to break local price trends, make liquidity suppliers revise positions, and enhance price discovery. In contrast, pauses do not have a “cool off” effect on markets, but rather amplify volatility and widen bid-ask spreads. This implies a regulatory trade-off between the protective role of trading pauses and their adverse effects on market quality.
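The difference-in-differences logic used here reduces to one subtraction of subtractions. The numbers below are made up, not the paper's NASDAQ estimates:

```python
# Difference-in-differences in miniature: compare the change in an outcome
# (say, the average bid-ask spread) for stocks subject to trading pauses
# against the change for unaffected control stocks over the same period.
# All values are illustrative.

def did(treated_pre, treated_post, control_pre, control_post):
    """Treatment effect = (treated change) - (control change)."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical average spreads in basis points, before/after the rule change:
effect = did(treated_pre=10.0, treated_post=13.0,
             control_pre=10.0, control_post=11.0)
print(effect)  # 2.0 bps attributable to the regulation under DiD assumptions
```

The subtraction of the control-group change is what nets out market-wide trends, so the remaining difference can be attributed to the regulation under the usual parallel-trends assumption.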
The Judgement of the EGC in Case T-122/15 – Landeskreditbank Baden-Württemberg – Förderbank v European Central Bank is the first statement of the European judiciary on the substantive law of the Banking Union. Beyond its specific holding, the decision is of great importance, because it hints at the methodological approach the EGC will take in interpreting prudential banking regulation in the appeals against supervisory measures that fall in its jurisdiction under TFEU, arts. 256(1) subpara 1 and 263(4). Specifically, the case pertained to the scope of direct ECB oversight of significant banks in the euro area and the reassignment of this competence to national competent authorities (NCAs) in individual circumstances (Single Supervisory Mechanism (SSM) Regulation, art. 6(4) subpara 2; SSM Framework Regulation, arts. 70, 71).
This Chapter explores how an environment of persistent low returns influences saving, investing, and retirement behaviors, as compared to what in the past had been thought of as more “normal” financial conditions. Our calibrated lifecycle dynamic model with realistic tax, minimum distribution, and Social Security benefit rules produces results that agree with observed saving, work, and claiming age behavior of U.S. households. In particular, our model generates a large peak at the earliest claiming age at 62, as in the data. Also in line with the evidence, our baseline results show a smaller second peak at the (system-defined) Full Retirement Age of 66. In the context of a zero-return environment, we show that workers will optimally devote more of their savings to non-retirement accounts and less to 401(k) accounts, since the relative appeal of investing in taxable versus tax-qualified retirement accounts is lower in a low return setting. Finally, we show that people claim Social Security benefits later in a low interest rate environment.
Since 2014 the ECB has implemented a massive expansion of monetary policy including large-scale asset purchases and negative policy rates. As the euro area economy has improved and inflation has risen, questions concerning the future normalization of monetary policy are starting to dominate the public debate.
The study argues that the ECB should develop a strategy for policy normalization and communicate it very soon to prepare the ground for subsequent steps towards tightening. It provides analysis and makes proposals concerning key aspects of this strategy. The aim is to facilitate the emergence of expectations among market participants that are consistent with a smooth process of policy normalization.
Low-probability events are overweighted in the pricing of out-of-the-money index puts and single-stock calls. We find that this behavioral bias is strongly time-varying, linked to equity market sentiment and to higher moments of the risk-neutral density. An implied volatility (IV) sentiment measure that is jointly derived from index and single-stock options best explains investors' overweighting of tail events. Our findings also suggest that IV-sentiment predicts equity market reversals better than the overweighting of small probabilities itself. When employed in a trading strategy, IV-sentiment delivers economically significant results, which are more consistent than those produced by the market sentiment factor. The joint use of information from the single-stock and index option markets seems to explain the forecasting power of IV-sentiment. Out-of-sample tests on reversal prediction show that our IV-sentiment measure adds value over and above traditional factors in the equity risk premium literature, especially as an equity-buying signal. This reversal prediction seems to improve time-series and cross-sectional momentum strategies.
We propose a model for measuring the runtime of concurrent programs by the minimal number of evaluation steps. The focus of this paper is on improvements, which are program transformations that improve this number in every context, where we distinguish between sequential and parallel improvements, for one or more processors, respectively. We apply the methods to CHF, a model of Concurrent Haskell extended by futures. The language CHF is a typed higher-order functional language with concurrent threads, monadic IO and MVars as synchronizing variables. We show that all deterministic reduction rules and 15 further program transformations are sequential and parallel improvements. We also show that the introduction of deterministic parallelism is a parallel improvement, and its inverse a sequential improvement, provided it is applicable. This is a step towards more automated precomputation of concurrent programs at compile time that is formally proven to be a correct optimization.
Given rising life expectancies around the world, it seems that old-age pension benefits will need to be cut and pension contributions boosted in many nations. Yet our research on old-age system reforms does not call for raising mandatory retirement ages or contributions. Instead, we offer ways to enhance incentives for people to work longer and delay retirement. There are good reasons to do so: rising longevity, the shrinking workforce, and emerging evidence that working longer can be associated with better mental and physical health for many people. Nevertheless, old-age Social Security systems in many nations find that people tend to claim benefits early, usually leading to reduced benefits. In the United States, for instance, a majority of Americans claim their Social Security benefits at the earliest feasible age, namely 62, even though their monthly benefits would be 75% higher if they waited until age 70. To test whether this is the result of people underweighting the economic value of higher lifetime benefit streams, we examine whether people would claim later and work longer if they were rewarded with a lump sum instead of a higher lifetime benefit stream for deferring. Two arguments have been offered to explain early claiming. One is that workers claim early to avoid potentially “forfeiting” their deferred benefits should they die too soon (Brown et al., 2016). A second is that many people underweight the economic value of lifetime benefit streams (Brown et al., 2017). This latter rationale motivates the present study.
We study the general equilibrium implications of different fiscal policies on macroeconomic quantities, asset prices, and welfare by utilizing two endogenous growth models. The expanding variety model features only homogeneous innovations by entrants. The Schumpeterian growth model features heterogeneous innovations: "incremental" innovations by incumbents and "radical" innovations by entrants. The government levies taxes on labor income and corporate profits and supplies subsidies to consumption, capital investment, and investments in research and development by entrants and, if applicable, incumbents. With these models at hand, we provide new insights on the interplay of innovation dynamics and fiscal policy.
The current debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. Beyer and Wieland re-estimate the U.S. equilibrium rate with the methodology of Laubach and Williams and further modifications. They provide new estimates for the United States, the euro area and Germany and subject them to sensitivity tests. Beyer and Wieland conclude that, due to the great uncertainty and sensitivity of these estimates, the observed decline is not a reliable indicator of a need for expansionary monetary and fiscal policy. Yet, if such estimates are employed to determine the appropriate monetary policy stance, they are best used together with a consistent estimate of the level of potential output.
We investigate how solvency and wholesale funding shocks to 84 OECD parent banks affect the lending of 375 foreign subsidiaries. We find that parent solvency shocks are more important than wholesale funding shocks for subsidiary lending. Furthermore, we find that parent undercapitalization does not affect the transmission of shocks, while wholesale shocks transmit to foreign subsidiaries of parents that rely primarily on wholesale funding. We also find that transmission is affected by the strategic role of the subsidiary for the parent and follows a locational, rather than an organizational, pecking order. Surprisingly, liquidity regulation exacerbates the transmission of adverse wholesale shocks. We further document that parent banks tend to use their own capital and liquidity buffers first, before transmitting shocks. Finally, we show that solvency shocks have a greater impact on large subsidiary banks with low growth opportunities in mature markets.
We propose a two-country asset-pricing model where agents' preferences change endogenously as a function of the popularity of internationally traded goods. We determine the effect of the time variation of preferences on equity markets, consumption and portfolio choices. When agents are more sensitive to the popularity of domestic consumption goods, the local stock market reacts more strongly to the preferences of local agents than to the preferences of foreign agents. Therefore, home bias arises because home-country stock represents a better investment opportunity for hedging against future fluctuations in preferences. We test our model and find that preference evolution is a plausible driver of key macroeconomic variables and stock returns.
On 15 August 2017, the Bundesverfassungsgericht (BVerfG) referred the case against the European Central Bank’s policy of Quantitative Easing (QE) to the European Court of Justice (ECJ). The author argues that this event differs in several aspects from the OMT case in 2015 – in content as well as in form. The BVerfG recognizes that it is a legitimate goal of the ECB’s monetary policy to bring inflation up close to 2%, and that the instrument employed for QE is one of monetary policy. However, it doubts whether the sheer volume of QE would not distort the character of the program as one of monetary policy. The ECJ will now have to clarify the extent to which the ECJ’s findings in its OMT judgment are relevant for QE as well as the standard of review applicable to monetary policy. The author raises the questions of whether the principle of democracy under German constitutional law can actually provide the standard by which the ECB is to be measured, and how tight judicial review could be exercised over the ECB without encroaching upon its autonomy in monetary policy matters – and thus upon the very essence of central bank independence.
We theoretically and empirically study large-scale portfolio allocation problems when transaction costs are taken into account in the optimization problem. We show that transaction costs act on the one hand as a turnover penalization and on the other hand as a regularization, which shrinks the covariance matrix. As an empirical framework, we propose a flexible econometric setting for portfolio optimization under transaction costs, which incorporates parameter uncertainty and combines predictive distributions of individual models using optimal prediction pooling. We consider predictive distributions resulting from high-frequency-based covariance matrix estimates, daily stochastic volatility factor models and regularized rolling-window covariance estimates, among others. Using data capturing several hundred Nasdaq stocks over more than 10 years, we illustrate that transaction cost regularization (even to a small extent) is crucial in order to produce allocations with positive Sharpe ratios. We moreover show that performance differences between individual models decline when transaction costs are considered. Nevertheless, it turns out that adaptive mixtures based on high-frequency and low-frequency information yield the highest performance. A portfolio bootstrap reveals that naive 1/N allocations and global minimum variance allocations (with and without short-sale constraints) are significantly outperformed in terms of Sharpe ratios and utility gains.
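The dual role of transaction costs described above can be sketched in a stylized one-period mean-variance setting (all parameters, including the risk-aversion and cost weights, are illustrative and not taken from the paper): a quadratic turnover penalty enters the first-order condition exactly like a ridge term that shrinks the scaled covariance matrix toward the identity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
mu = rng.normal(0.05, 0.02, n)            # expected returns
A = rng.normal(size=(n, n))
Sigma = A @ A.T / n + 0.01 * np.eye(n)    # positive-definite covariance
gamma, beta = 5.0, 2.0                    # risk aversion, cost weight
w0 = np.ones(n) / n                       # current holdings

def opt_weights(cost):
    # max_w  w'mu - (gamma/2) w'Sigma w - (cost/2) ||w - w0||^2
    # FOC:   (gamma*Sigma + cost*I) w = mu + cost*w0
    return np.linalg.solve(gamma * Sigma + cost * np.eye(n), mu + cost * w0)

w_free = opt_weights(0.0)   # frictionless mean-variance solution
w_cost = opt_weights(beta)  # with quadratic transaction costs

# (i) turnover penalization: the allocation stays closer to w0;
# (ii) regularization: gamma*Sigma is shrunk toward the identity.
assert np.linalg.norm(w_cost - w0) < np.linalg.norm(w_free - w0)
```

With a positive cost weight the optimizer trades toward, but never all the way to, the frictionless target; the modified system matrix makes the shrinkage interpretation explicit.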
We propose a long-run risk model with stochastic volatility, a time-varying mean reversion level of volatility, and jumps in the state variables. The special feature of our model is that the jump intensity is not affine in the conditional variance but driven by a separate process. We show that this separation of jump risk from volatility risk is needed to match the empirically weak link between the level and the slope of the implied volatility smile for S&P 500 options.
Fascicle XVI of the exsiccate "K. KALB & A. APTROOT: LICHENES NEOTROPICI" (new name for "K. KALB: LICHENES NEOTROPIC" from fascicle XVI onwards) with 23 lichen specimens (No. 628–650) from Brazil, Chile, Dominican Republic, Ecuador, Kenya, Peru and Venezuela is distributed. Three species are described as new, namely Lopadium subcoralloideum Aptroot & Kalb, Lecanactis caceresiana Kalb & Aptroot and Rhizocarpon sipmanianum Kalb & Aptroot. The holotypes of the new species are deposited at Universidade Federal de Mato Grosso do Sul (UFMS). Range extensions are reported for Hypocenomyce tinderreyensis (new to the Neotropics; so far only known from Australia, but apparently austral), Ocellularia baorucensis (new to Brazil), Physcidia striata (recently described from Rondônia and the Venezuelan Amazon, and subsequently reported from Amapá and Brazilian Amazonas. The collection from Brazil/Mato Grosso do Sul represents a major range extension to the south), Tephromela campestricola (new to the Neotropics; not different in any way from European material) and Xanthoparmelia arvidssonii (new to Venezuela).
We develop a state-space model to decompose bid and ask quotes of CDS into two components, a fair default premium and a liquidity premium. This approach gives a better estimate of the default premium than mid quotes, and it allows us to disentangle and compare the liquidity premium earned by the protection buyer and the protection seller. In contrast to other studies, our model is structurally much simpler, while it also allows for correlation between liquidity and default premia, as supported by empirical evidence. The model is implemented and applied to a large data set of 118 CDS over the period from 2004 to 2010. The model-generated output variables are analyzed in a difference-in-differences framework to determine how the default premium, as well as the liquidity premium of protection buyers and sellers, evolved during different periods of the financial crisis and to what extent they differ for financial institutions compared to non-financials.
We establish a benchmark result for the relationship between the loanable funds and the money-creation approach to banking. In particular, we show that both processes yield the same allocations when there is no uncertainty and thus no bank default. In such cases, using the much simpler loanable funds approach as a shortcut does not imply any loss of generality.
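The equivalence claim can be illustrated with a stylized bookkeeping example (a deliberately minimal sketch with a single bank and no reserves, not the paper's formal model): under certainty, both orderings of the deposit and loan entries leave the bank with the same balance sheet.

```python
def loanable_funds(amount):
    # Saver deposits cash first; the bank then lends the funds on.
    bank = {"loans": 0, "deposits": 0}
    bank["deposits"] += amount   # household deposits income
    bank["loans"] += amount      # bank lends the deposited funds to a firm
    return bank

def money_creation(amount):
    # The bank creates the deposit by crediting the borrower's account.
    bank = {"loans": 0, "deposits": 0}
    bank["loans"] += amount      # loan contract is signed
    bank["deposits"] += amount   # matching deposit is created
    # The borrower pays a supplier who banks at the same bank: the
    # deposit changes owner, but balance-sheet totals are unchanged.
    return bank

# With no uncertainty and no default, both processes end in the same
# allocation: loans of 100 funded by deposits of 100.
assert loanable_funds(100) == money_creation(100)
```

The interesting cases in the paper are exactly those excluded here: once uncertainty and possible bank default enter, the two processes can diverge.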
After the Lehman Brothers collapse, the stock index has exceeded its pre-collapse peak by 36% in real terms. Seemingly, markets have been demanding more stocks instead of bonds. Yet, instead of higher bond rates, paradoxically, bond rates have been persistently negative since the collapse. To explain this paradox, we suggest that, in the post-Lehman period, investors changed their perceptions of disasters, thinking that disasters occur once every 30 years on average instead of once every 60 years. In our asset-pricing calibration exercise, this rise in perceived market fragility alone can explain the drop in both bond rates and price-dividend ratios observed after the collapse, which indicates that markets mostly demanded bonds instead of stocks.
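The mechanism can be sketched in a Rietz–Barro-style endowment economy with CRRA utility (the parameter values below, including the disaster size and the leverage of the dividend claim, are illustrative and not the paper's calibration): doubling the perceived disaster frequency from once every 60 years to once every 30 years lowers both the risk-free rate and the price-dividend ratio.

```python
import math

# Illustrative parameters (not the paper's calibration):
beta, gamma = 0.97, 2.0      # time discount factor, risk aversion
g, b, lam = 0.02, 0.4, 3.0   # trend growth, disaster size, dividend leverage

def prices(p):
    # Gross consumption growth: normal times vs. a disaster of size b.
    G_n, G_d = math.exp(g), math.exp(g) * (1 - b)
    E = lambda f: (1 - p) * f(G_n) + p * f(G_d)
    rf = 1.0 / (beta * E(lambda G: G ** (-gamma)))   # gross risk-free rate
    x = beta * E(lambda G: G ** (lam - gamma))       # SDF-weighted dividend growth
    pd = x / (1 - x)                                 # constant price-dividend ratio
    return rf, pd

rf_60, pd_60 = prices(1 / 60)   # disasters once every 60 years
rf_30, pd_30 = prices(1 / 30)   # perceived fragility doubles

# Higher perceived disaster risk lowers bond rates and equity valuations.
assert rf_30 < rf_60 and pd_30 < pd_60
```

The precautionary demand for bonds pushes the risk-free rate down, while the stochastic discount factor's extra weight on disaster states depresses the valuation of the leveraged dividend claim.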
Public employees in many developing economies earn much higher wages than similar private-sector workers. These wage premia may reflect an efficient return to effort or unobserved skills, or an inefficient rent causing labor misallocation. To distinguish these explanations, we exploit the Kenyan government’s algorithm for hiring eighteen thousand new teachers in 2010 in a regression discontinuity design. Fuzzy regression discontinuity estimates yield a civil-service wage premium of over 100 percent (not attributable to observed or unobserved skills), but no effect on motivation, suggesting rent-sharing as the most plausible explanation for the wage premium.
The German savings and cooperative banks of the 19th century were precursors of modern microfinance. They provided access to financial services for the majority of the German population, which was formerly excluded from bank funding. Furthermore, they did this at low costs for themselves and affordable prices for their clients. By creating networks of financially viable and stable financial institutions covering the entire country, they contributed significantly to building a sound and “inclusive” financial infrastructure in Germany. A look back at the history of German savings and cooperative banks and combining these experiences with the lessons learned from modern microfinance can guide current policy and be valuable for present and future models of microfinance business.
For some time now, structural macroeconomic models used at central banks have been predominantly New Keynesian DSGE models featuring nominal rigidities and forward-looking decision-making. While these features are widely deemed crucial for policy evaluation exercises, most central banks have added more detailed characterizations of the financial sector to these models following the Great Recession in order to improve their fit to the data and their forecasting performance. We employ a comparative approach to investigate the characteristics of this new generation of New Keynesian DSGE models and document an elevated degree of model uncertainty relative to earlier model generations. Policy transmission is highly heterogeneous across types of financial frictions, and monetary policy has larger effects on average. The New Keynesian DSGE models we analyze suggest that a simple policy rule robust to model uncertainty involves a weaker response to inflation and the output gap in the presence of financial frictions than in earlier generations of such models. Leaning-against-the-wind policies in models of this class estimated for the euro area do not lead to substantial gains. With regard to forecasting performance, the inclusion of financial frictions can generate improvements, if conditioned on appropriate data. Looking forward, we argue that model averaging and embracing alternative modelling paradigms is likely to yield a more robust framework for the conduct of monetary policy.
Financial market interactions can lead to large and persistent booms and recessions. Instability is an inherent threat to economies with speculative financial markets. A central bank’s interest rate setting can amplify the expectation feedback in the financial market and this can lead to unstable dynamics and excess volatility. The paper suggests that policy institutions may be well-advised to handle tools like asset price targeting with care since such instruments might add a structural link between asset prices and macroeconomic aggregates. Neither stock prices nor indices are a good indicator to base decisions on.
We document that natural disasters significantly weaken the stability of banks with business activities in affected regions, as reflected in lower z-scores, higher probabilities of default, higher non-performing assets ratios, higher foreclosure ratios, lower returns on assets and lower bank equity ratios. The effects are economically relevant and suggest that insurance payments and public aid programs do not sufficiently protect bank borrowers against financial difficulties. We also find that the adverse effects on bank stability dissipate after some years if no further disasters occur in the meantime.
People who delay claiming Social Security receive higher lifelong benefits upon retirement. We survey individuals on their willingness to delay claiming later, if they could receive a lump sum in lieu of a higher annuity payment. Using a moment-matching approach, we calibrate a lifecycle model tracking observed claiming patterns under current rules and predict optimal claiming outcomes under the lump sum approach. Our model correctly predicts that early claimers under current rules would delay claiming most when offered actuarially fair lump sums, and for lump sums worth 87% as much, claiming ages would still be higher than at present.
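An actuarially fair lump sum in this sense is the present value of the lifelong benefit increase that delay would otherwise buy. A minimal sketch for a one-year delay (survival probabilities, discount rate, and benefit levels are hypothetical, not the paper's calibration):

```python
r = 0.02                                # real discount rate (illustrative)
pia = 12_000                            # annual full benefit (illustrative)
b62, b63 = 0.75 * pia, 0.80 * pia       # early-claiming reduction factors
surv = [0.99 ** t for t in range(40)]   # survival from age 63 (stylized)

# Fair lump sum = present value at 63 of the lifelong benefit increase
# the worker would otherwise earn by delaying claiming from 62 to 63.
fair = sum(s * (b63 - b62) / (1 + r) ** t for t, s in enumerate(surv))
offered = 0.87 * fair                   # the 87%-of-fair variant in the paper
assert 0 < offered < fair
```

The comparison in the abstract is between claiming behavior when deferral is rewarded with `fair` versus with the discounted amount `offered`.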
Optimal trend inflation
(2017)
We present a sticky-price model incorporating heterogeneous firms and systematic firm-level productivity trends. Aggregating the model in closed form, we show that it delivers radically different predictions for the optimal inflation rate than canonical sticky-price models featuring homogeneous firms:
(1) the optimal steady-state inflation rate generically differs from zero and,
(2) inflation optimally responds to productivity disturbances.
Using micro data from the US Census Bureau to estimate the inflation-relevant productivity trends at the firm level, we find that the optimal US inflation rate is positive. It was slightly above 2 percent in the year 1986, but continuously declined thereafter, reaching about 1 percent in the year 2013.
Patterns and interpretation
(2017)
One thing for sure: digitization has completely changed the literary archive. People like me used to work on a few hundred nineteenth-century novels; today, we work on thousands of them; tomorrow, hundreds of thousands. This has had a major effect on literary history, obviously enough, but also on critical methodology; because, when we work on 200,000 novels instead of 200, we are not doing the same thing, 1,000 times bigger; we are doing a different thing. The new scale changes our relationship to our object, and in fact 'it changes the object itself'.
Different insurance activities exhibit different levels of persistence of shocks and volatility. For example, life insurance is typically more persistent but less volatile than non-life insurance. We examine how diversification among life, non-life insurance, and active reinsurance business affects an insurer's contribution and exposure to the risk of other companies. Our model shows that a counterparty's credit risk exposure to an insurance group substantially depends on the relative proportion of the insurance group's life and non-life business. The empirical analysis confirms this finding with respect to several measures for spillover risk. The optimal proportion of life business that minimizes spillover risk decreases with leverage of the insurance group, and increases with active reinsurance business.
New provisioning rules introduced by IFRS 9 are expected to reduce the procyclicality of provisioning. Heterogeneity among banks in the procyclicality of provisioning may not only reflect the formal accounting rules, but also variation in discretionary provisioning policies. This paper presents empirical evidence on the heterogeneity of provisioning procyclicality among significant banks that are directly supervised by the ECB. In particular, this paper finds that provisioning is relatively procyclical at banks that have i) high loans-to-assets ratios, ii) high shares of non-interest income in total operating income, iii) low capitalization rates, and iv) low total assets. Supervisory guidance provided to banks on how to implement IFRS 9 has mostly been of a qualitative nature, and may prove inadequate to prevent an undesirably wide future variation in provisioning among EU banks.
This paper was provided at the request of the Committee on Economic and Monetary Affairs of the European Parliament and commissioned and drafted under the responsibility of the Economic Governance Support Unit (EGOV) of the European Parliament. It was originally published on the European Parliament’s webpage.
Recent work has analyzed the forecasting performance of standard dynamic stochastic general equilibrium (DSGE) models, but little attention has been given to DSGE models that incorporate nonlinearities in exogenous driving processes. Against that background, we explore whether incorporating stochastic volatility improves DSGE forecasts (point, interval, and density). We examine real-time forecast accuracy for key macroeconomic variables including output growth, inflation, and the policy rate. We find that incorporating stochastic volatility in DSGE models of macroeconomic fundamentals markedly improves their density forecasts, just as incorporating stochastic volatility in models of financial asset returns improves their density forecasts.
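The intuition for why stochastic volatility improves density forecasts is that a persistent volatility state generates the fat tails that constant-volatility shocks miss. A minimal simulation (an AR(1) log-variance process with illustrative parameters, not a DSGE model):

```python
import math
import random

random.seed(3)
T = 20_000

def kurtosis(x):
    n = len(x)
    m = sum(x) / n
    v = sum((xi - m) ** 2 for xi in x) / n
    return sum((xi - m) ** 4 for xi in x) / n / v ** 2

# Constant-volatility Gaussian shocks: kurtosis near 3.
const = [random.gauss(0, 1.0) for _ in range(T)]

# Stochastic volatility: a persistent AR(1) log-variance state.
sv, h = [], 0.0
for _ in range(T):
    h = 0.95 * h + random.gauss(0, 0.4)
    sv.append(random.gauss(0, math.exp(h / 2)))

# The volatility mixture produces the fat tails that improve density
# (but not necessarily point) forecasts.
assert kurtosis(sv) > kurtosis(const)
```

A point forecast can be identical across the two specifications; the gain shows up in the predictive distribution, which is the pattern the abstract reports.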
Crowdfunding is a buzzword that signifies a subset of the new forms of finance facilitated by advances in information technology, usually categorized as fintech. Concerns for financial stability, investor and consumer protection, and the prevention of money laundering or the funding of terrorism increasingly hinge on adequately including these new techniques for initiating financing relationships in the regulatory framework.
This paper analyzes the German regulation of crowdinvesting and finds that it does not fully live up to the regulatory challenges posed by this novel form of digitized matching of supply and demand on capital markets. It should better reflect the key importance of crowdinvesting platforms, which may become critical providers of market infrastructure in the not too distant future. Moreover, platforms can play an important role in investor protection that cannot be performed by traditional disclosure regimes geared towards more seasoned issuers. Against this background, the creation of an exemption from the traditional prospectus regime seems to be a plausible policy choice. However, it needs to be complemented by an adequate regulatory stimulation of platforms’ role as gatekeepers.
We introduce rewriting of meta-expressions which stem from a meta-language that uses higher-order abstract syntax augmented by meta-notation for recursive let, contexts, sets of bindings, and chain variables. Additionally, three kinds of constraints can be added to meta-expressions to express usual constraints on evaluation rules and program transformations. Rewriting of meta-expressions is required for automated reasoning on programs and their properties. A concrete application is a procedure to automatically prove correctness of program transformations in higher-order program calculi which may permit recursive let-bindings as they occur in functional programming languages. Rewriting on meta-expressions can be performed by solving the so-called letrec matching problem which we introduce. We provide a matching algorithm to solve it. We show that the letrec matching problem is NP-complete, that our matching algorithm is sound and complete, and that it runs in non-deterministic polynomial time.
This paper investigates the effects of a rise in interest rates and in the lapse risk of endowment life insurance policies on the liquidity and solvency of life insurers. We model the book and market value balance sheet of an average German life insurer, subject to both GAAP and Solvency II regulation, featuring an existing back book of policies and an existing asset allocation calibrated by historical data. The balance sheet is then projected forward under stochastic financial markets. Lapse rates are modeled stochastically and depend on the granted guaranteed rate of return and the prevailing level of interest rates. Our results suggest that in the case of a sharp increase in interest rates, policyholders sharply increase lapses and the solvency position of the insurer deteriorates in the short run. This result is particularly driven by the interaction between a reduction in the market value of assets, large guarantees for existing policies, and a very slow adjustment of asset returns to interest rates. A sharp or gradual rise in interest rates is associated with substantial and persistent liquidity needs that are particularly driven by lapse rates.
The growth and popularity of defined contribution pensions, along with the government’s increasing attention to retirement plan costs and investment choices provided, make it important to understand how people select their retirement plan investments. This paper shows how employees in a large firm altered their fund allocations when the employer streamlined its pension fund menu and deleted nearly half of the offered funds. Using administrative data, we examine the changes in plan participant investment choices that resulted from the streamlining and how these changes might affect participants’ eventual retirement wellbeing. We show that streamlined participants’ new allocations exhibited significantly lower within-fund turnover rates and expense ratios, and we estimate this could lead to aggregate savings for these participants over a 20-year period of $20.2M, or in excess of $9,400 per participant. Moreover, after the reform, streamlined participants’ portfolios held significantly less equity and exhibited significantly lower risks by way of reduced exposures to most systematic risk factors, compared to their non-streamlined counterparts.
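The aggregate savings figure rests on compounding: a small reduction in expense ratios and turnover costs grows with the account over the horizon. A stylized per-participant sketch (the balance, gross return, and 25-basis-point cost reduction are illustrative, not the paper's estimates):

```python
# Terminal-wealth effect of a lower annual cost drag, holding the
# gross return fixed. All inputs are illustrative, not the paper's.
balance, gross, years = 50_000, 0.06, 20
delta = 0.0025                          # 25 bp reduction in annual costs

def terminal(cost_drag):
    # Account value after compounding the net-of-cost return.
    return balance * (1 + gross - cost_drag) ** years

savings = terminal(0.0) - terminal(delta)  # gain from cheaper funds
assert savings > 0
```

Even a seemingly small fee differential compounds into thousands of dollars per participant over two decades, which is the order of magnitude the paper's aggregate estimate implies.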
This paper applies the theory of structured finance to the regulation of asset-backed securities. We find the current regulation in Europe (Article 405 of the CRR) and the US (Section D of the Dodd-Frank Act) to be severely flawed with respect to its key intention: the imposition of a strict loss retention requirement. While nominal retention is always 5%, the true level of loss retention varies across the available retention options between zero and full loss retention at the extreme ends. Based on a standard model of structured finance transactions, we propose a new risk retention metric, RM, measuring the level of an issuer’s skin-in-the-game. The new metric could help to achieve a better implementation of CRR/CRD-IV and the DFA by making disclosure of the RM number compulsory for all ABS transactions. There are also implications for the operation of rating agencies. On a general level, the RM metric will be instrumental in achieving simplicity and transparency in securitizations (STS).
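The point that nominal 5% retention can correspond to very different levels of loss retention can be sketched by comparing two admissible options on a simulated loss distribution (the Gaussian loss-rate assumption is purely illustrative, and the ratio computed here is a simplified stand-in for the paper's RM metric: the share of expected pool loss borne by the issuer):

```python
import random

random.seed(1)
N = 100_000
# Pool loss rate: thin-tailed around 2% (purely illustrative).
losses = [min(max(random.gauss(0.02, 0.01), 0.0), 1.0) for _ in range(N)]
el = sum(losses) / N                     # expected pool loss

# Option A: 5% vertical slice -> issuer bears 5% of every loss.
retained_vertical = 0.05 * el
# Option B: 5% first-loss (equity) tranche -> issuer bears min(L, 5%).
retained_equity = sum(min(l, 0.05) for l in losses) / N

rm_vertical = retained_vertical / el     # exactly 5% of expected loss
rm_equity = retained_equity / el         # near 100% for this loss profile
assert rm_equity > rm_vertical
```

Both options satisfy the nominal 5% rule, yet the issuer's expected loss retention differs by more than an order of magnitude, which is the disclosure gap the proposed metric is meant to close.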
The Capital Markets Union project of the European Commission aims for an increase of market-based debt financing of small and medium-sized enterprises (SMEs), complementing bank lending. In this essay we argue that rather than focusing on pure non-bank lending, a reasonable mix of bank- and market-based financing should be considered. Banks are said to have a comparative advantage in critical lending functions such as credit screening, debtor monitoring and debt renegotiation. All forms of lending require persistent skin-in-the-game of critical players in order to be effective. The regulator should insist on full disclosure of skin-in-the-game, thereby improving capital allocation and reducing systemic risks.
We introduce an innovative approach to measure bank integration, based on the corporate culture of multinational banking conglomerates. The new measure, the Power Index, assesses the prevalence of a language of power and authority in the financial reports of global banks. We employ a two-step approach: as a first step, we investigate whether parent-bank or parent-country characteristics are more important for bank integration. In a second step, we analyze whether bank integration affects the transmission of shocks across borders. We find that the level of integration of global banks is determined by parent-bank-specific factors, as well as by the social centralization in the parent’s country: ethnically diverse and linguistically homogeneous countries nurture decentralized corporate structures. Political and economic factors, such as corruption, political rights and economic development also affect bank integration. Furthermore, we find that organizational integration affects the transmission of exogenous shocks from parent banks to their subsidiaries: the more centralized a global bank is, the lower the lending of its subsidiaries after a solvency shock. Wholesale shocks do not appear to be transmitted through this channel. Also, past experience with solvency shocks reduces the integration between parents and subsidiaries.
We explore space improvements in LRP, a polymorphically typed call-by-need functional core language. A relaxed space measure is chosen for the maximal size usage during an evaluation. It abstracts from the details of the implementation via abstract machines, but it takes garbage collection into account and thus can be seen as a realistic approximation of space usage. The results are: a context lemma for space improving translations and for space equivalences; all but one reduction rule of the calculus are shown to be space improvements, and the exceptional one, the copy-rule, is shown to increase space only moderately.
Several further program transformations are shown to be space improvements or space equivalences; in particular, the translation into machine expressions is a space equivalence. These results are a step forward in making predictions about the change in runtime space behavior of optimizing transformations in call-by-need functional languages.
This paper examines the relationship between oil movements and the systemic risk of financial institutions in major petroleum-based economies. We estimate ΔCoVaR for those institutions and observe pronounced increases in its level corresponding to the subprime and global financial crises. The results provide evidence in favor of risk measurement improvements from accounting for oil returns in the risk functions. The spread between the standard CoVaR and the CoVaR that includes oil is absorbed over a time range longer than the duration of the oil shock. This indicates that the drop in the oil price has a longer-lasting effect on risk and requires more time to be discounted by the financial institutions. To support the analysis, we also consider the other major market-based systemic risk measures.
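The ΔCoVaR concept referenced here (the shift in the system's value-at-risk when an institution moves from its median state into distress, following Adrian and Brunnermeier) can be illustrated with a minimal sketch on simulated returns. The paper estimates it with quantile regressions on real institution-level data; this toy version instead uses conditional empirical quantiles, and all coefficients are hypothetical:

```python
# Toy ΔCoVaR: VaR of the "system" conditional on an institution's state,
# computed from conditional empirical quantiles on simulated data.
# All coefficients are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(42)
T = 100_000
inst = rng.standard_normal(T)                        # institution returns
system = 0.6 * inst + 0.8 * rng.standard_normal(T)   # correlated system returns

q = 0.05
var_inst = np.quantile(inst, q)                      # institution's 5% VaR
med_lo, med_hi = np.quantile(inst, [0.45, 0.55])     # band around median state

distress = inst <= var_inst                          # institution in distress
normal = (inst >= med_lo) & (inst <= med_hi)         # institution near median

covar_distress = np.quantile(system[distress], q)    # system VaR given distress
covar_normal = np.quantile(system[normal], q)        # system VaR given median state

delta_covar = covar_distress - covar_normal          # ΔCoVaR: negative here,
print(round(delta_covar, 2))                         # i.e. distress deepens tail risk
```

With a positive loading of the system on the institution, the conditional 5% quantile in distress lies well below the one in the median state, so ΔCoVaR comes out negative.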
During the last IAIS Global Seminar in June 2017, the IAIS disclosed the agenda for a gradual shift in the systemic risk assessment methodology from the current Entity-Based Approach (EBA) to a new Activity-Based Approach (ABA). The EBA, which was developed in the aftermath of the 2008/2009 financial crisis, defines a list of Global Systemically Important Insurers (G-SIIs) based on a pre-defined set of criteria related to the size of the institution. These G-SIIs are subject to additional regulatory requirements since their distress or disorderly failure would potentially cause significant disruption to the global financial system and economic activity. Even if size is still a needed element of a systemic risk assessment, the strong emphasis put on the too-big-to-fail approach in insurance, i.e. the EBA, might partially miss the underlying nature of systemic risk in insurance. Not only certain activities, including insurance activities such as life or non-life lines of business, but also common exposures and certain managerial practices such as leverage or funding structures tend to contribute to the systemic risk of insurers, yet are not covered by the current EBA (Berdin and Sottocornola, 2015). Therefore, we very much welcome the general development of the systemic risk assessment methodology, even if several important questions still need to be answered.
According to the Bank Recovery and Resolution Directive (BRRD), introduced as a lesson from the recent financial crisis, the losses a failing bank incurred should generally be borne by its investors. Before a minimum bail-in has occurred, government money can only be injected in emergency cases to remedy a serious disturbance in the economy and to preserve financial stability. This policy letter argues that in the case of the Italian bank Monte dei Paschi di Siena (MPS), which the Italian government currently plans to bail out, a resolution would most likely not cause such a systemic event. A bailout contrary to the existing rules will lead to a mispricing of bank capital and retard the restructuring of the European banking sector, the authors write. They appeal to the European Central Bank, the Systemic Risk Board and the EU Commission to follow the rules, as the test case MPS will have a direct impact on the credibility of the new BRRD regime and the responsible institutions.
The international diffusion of technology plays a key role in stimulating global growth and explaining co-movements of international equity returns. Existing empirical evidence suggests that countries are heterogeneous in their attitude toward innovation: Some countries rely more on technology adoption while other countries rely more on internal technology production. European countries that rely more on adoption are also typically characterized by lower fiscal policy flexibility and higher labor market rigidity. We develop a two-country model – where both countries rely on R&D and adoption – to study the short-run and long-run effects of aggregate technology and adoption probability shocks on economic growth in the presence of the aforementioned asymmetries. Our framework suggests that an increase in the ability to adopt technology from abroad stimulates economic growth in the country that benefits from higher adoption rates, but the beneficial effects also spread to the foreign country. Moreover, it helps explain the differences in macro quantities and equity returns observed in the international data.
This paper examines the welfare implications of rising temperatures. Using a standard VAR, we empirically show that a temperature shock has a sizable, negative and statistically significant impact on TFP, output, and labor productivity. We rationalize these findings within a production economy featuring long-run temperature risk. In the model, macro-aggregates drop in response to a temperature shock, consistent with the novel evidence in the data. Such adverse effects are long-lasting. Over a 50-year horizon, a one-standard-deviation temperature shock lowers both cumulative output and labor productivity growth by 1.4 percentage points. Based on the model, we also show that temperature risk is associated with non-negligible welfare costs which amount to 18.4% of the agent's lifetime utility and grow exponentially with the size of the impact of temperature on TFP. Finally, we show that faster adaptation to temperature shocks results in lower welfare costs. These welfare benefits become substantially higher in the presence of permanent improvements in the speed of adaptation.
We show an ambivalent role of high-frequency traders (HFTs) in the Eurex Bund Futures market around high-impact macroeconomic announcements and extreme events. Around macroeconomic announcements, HFTs serve as market makers, post competitive spreads, and earn most of their profits through liquidity supply. Right before the announcement, however, HFTs significantly widen spreads and cause a rapid but short-lived drying-out of liquidity. In turbulent periods, such as after the U.K. Brexit announcement, HFTs shift their focus from market making activities to aggressive (but not necessarily profitable) directional strategies. Then, HFT activity becomes dominant and market quality can degrade.
This paper reexamines the current legal landscape regarding the protection of trade marks and other industrial property rights in signs on the Internet. It is based on a comparative analysis of EU and national laws, in particular, German, U.S., and U.K. law. It starts with a short restatement of the principles governing trade mark conflicts that occur within a particular jurisdiction (part 2) and proceeds to the regulation of transnational disputes (part 3). This juxtaposition yields two basic approaches. Whereas trade mark conflicts within closed legal systems are generally adjudicated according to a binary either/or logic, transnational disputes are and should indeed be solved in a way that leads to a fair coexistence of conflicting trade mark laws and rights under multiple laws. This paper explains how geolocation technologies can alleviate the implementation of the principle of fair coexistence in concrete cases.
This paper sets the background for the Special Issue of the Journal of Empirical Finance on the European Sovereign Debt Crisis. It identifies the channel through which risks in the financial industry leaked into the public sector. It discusses the role of the bank rescues in igniting the sovereign debt crisis and reviews approaches to detect early warning signals to anticipate the buildup of crises. It concludes with a discussion of potential implications of sovereign distress for financial markets.
A tontine provides a mortality-driven, age-increasing payout structure through the pooling of mortality. Because a tontine does not entail any guarantees, the payout structure of a tontine is determined by the pooling of individual characteristics of tontinists. Therefore, the surrender decision of single tontinists directly affects the remaining members' payouts. Nevertheless, the opportunity to surrender is crucial to the success of a tontine from a regulatory as well as a policyholder perspective. Therefore, this paper derives the fair surrender value of a tontine, first on the basis of expected values, and then incorporates the increasing payout volatility to determine an equitable surrender value. Results show that the surrender decision requires a discount on the fair surrender value as security for the remaining members. The discount intensifies with decreasing tontine size and increasing risk aversion. However, tontinists are less willing to surrender with decreasing tontine size and increasing risk aversion, creating a natural protection against tontine runs stemming from short-term liquidity shocks. Furthermore, we argue that a surrender decision based on private information requires a discount on the fair surrender value as well.
On average, young people "undersave" whereas old people "oversave" with respect to the rational expectations model of life-cycle consumption and savings. According to numerous studies on subjective survival beliefs, young people also "underestimate" whereas old people "overestimate" their objective survival chances on average. We take a structural behavioral economics approach to jointly address both empirical phenomena by embedding subjective survival beliefs that are consistent with these biases into a rank-dependent utility (RDU) model over life-cycle consumption. The resulting consumption behavior is dynamically inconsistent. Considering both naive and sophisticated RDU agents, we show that within this framework underestimation of young-age and overestimation of old-age survival probabilities may (but need not) give rise to the joint occurrence of undersaving and oversaving. In contrast to this RDU model, the familiar quasi-hyperbolic discounting (QHD) model, which is nested as a special case, cannot generate oversaving.
During the 1970s, industrial countries, including the US and continental Europe, experienced a combination of slow productivity growth and high unemployment. Subsequent research has shown that the standard model of unemployment actually gives counterfactual predictions. Motivated by the observation that the 1970s were also characterized by high and rising inflation, Tesfaselassie and Wolters examine the effect of growth on unemployment in the presence of nominal price rigidity.
The authors demonstrate that the effect of growth on unemployment may be positive or negative. Faster growth leads to lower unemployment if the rate of inflation is high enough. There is a threshold level of inflation below which faster growth leads to higher unemployment and above which faster growth leads to lower unemployment. The threshold level in turn depends on labor market characteristics, such as hiring efficiency, the job destruction rate, workers' relative bargaining power and the opportunity cost of work.
The impact of network connectivity on factor exposures, asset pricing and portfolio diversification
(2017)
This paper extends the classic factor-based asset pricing model by including network linkages in linear factor models. We assume that the network linkages are exogenously provided. This extension of the model allows a better understanding of the causes of systematic risk and shows that (i) network exposures act as an inflating factor for systematic exposure to common factors and (ii) the power of diversification is reduced by the presence of network connections. Moreover, we show that in the presence of network links a misspecified traditional linear factor model produces residuals that are correlated and heteroskedastic. We support our claims with an extensive simulation experiment.
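Both claims, inflated factor exposures and correlated residuals under misspecification, can be seen in a small simulation. The setup below is our own minimal rendition of a one-factor model with an exogenous network matrix; the notation and numbers are not the paper's:

```python
# One-factor model with exogenous network links: r = W r + beta*f + eps.
# Reduced form: r = (I - W)^{-1} (beta*f + eps), so the effective loading
# on the common factor f is (I - W)^{-1} beta, inflated relative to beta.
# Numbers are illustrative only.
import numpy as np

n = 4
beta = np.full(n, 0.8)              # direct factor loadings
W = np.full((n, n), 0.1)            # exogenous network matrix
np.fill_diagonal(W, 0.0)            # no self-links

L = np.linalg.inv(np.eye(n) - W)    # Leontief-type inverse
effective_beta = L @ beta           # network-inflated exposures
print(effective_beta[0])            # ≈ 1.14 > 0.8: exposure is inflated

# Even i.i.d. idiosyncratic shocks become cross-correlated in reduced form,
# which is what a misspecified plain factor model leaves in its residuals.
rng = np.random.default_rng(0)
eps = rng.standard_normal((100_000, n))
u = eps @ L.T                       # reduced-form residuals
print(np.corrcoef(u, rowvar=False)[0, 1] > 0)   # positive cross-correlation
```

With symmetric links of strength 0.1, the effective loading solves beta / (1 - 0.3) = 0.8 / 0.7, so every asset's exposure rises from 0.8 to roughly 1.14 purely through the network.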
Very few people doubt that it is a fundamental demand of justice that members of legal-political normative orders ought to have legal rights that define their basic standing as subjects of such an order. But when it comes to the concrete understanding of such rights, debates abound. What is the nature of these rights – are they an expression of the sovereign will of individuals, or are they based on important human interests? How should these rights be justified – do they have a particular moral ground, and if so, only one or many?
This paper studies the long-run effects of credit market disruptions on real firm outcomes and how these effects depend on nominal wage rigidities at the firm level. I trace out the long-run investment and growth trajectories of firms which are more adversely affected by a transitory shock to aggregate credit supply. Affected firms exhibit a temporary investment gap for two years following the shock, resulting in a persistent accumulated growth gap. I show that affected firms with a higher degree of wage rigidity exhibit a steeper drop in investment and grow more slowly than affected firms with more flexible wages.
Fleckenstein et al. (2014) document that nominal Treasuries trade at higher prices than inflation-swapped indexed bonds, which exactly replicate the nominal cash flows. We study whether this mispricing arises from liquidity premiums in inflation-indexed bonds (TIPS) and inflation swaps. Using US data, we show that the level of liquidity affects TIPS, whereas swap yields include a liquidity risk premium. We also allow for liquidity effects in nominal bonds. These results are based on a model with a systematic liquidity risk factor and asset-specific liquidity characteristics. We show that these liquidity (risk) premiums explain a substantial part of the TIPS underpricing.
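The replication behind this puzzle reduces to a yield identity: swapping a TIPS's inflation-linked cash flows into fixed nominal ones via a zero-coupon inflation swap produces a synthetic nominal bond, and any yield gap to the actual nominal Treasury is the mispricing. A stylized calculation, with all numbers hypothetical:

```python
# Stylized Treasury-TIPS mispricing in the spirit of Fleckenstein et al. (2014).
# Synthetic nominal yield = TIPS real yield + inflation swap rate; the gap
# to the actual nominal Treasury yield is the "mispricing".
# All numbers are hypothetical, for illustration only.

def mispricing_bp(nominal_yield: float, tips_real_yield: float,
                  inflation_swap_rate: float) -> float:
    """Yield gap in basis points; positive means the nominal Treasury
    trades rich (TIPS cheap) relative to the swap-based replication."""
    synthetic_nominal = tips_real_yield + inflation_swap_rate
    return (synthetic_nominal - nominal_yield) * 10_000

# Real TIPS yield 1.0%, inflation swap rate 2.5%, nominal Treasury 3.2%:
print(round(mispricing_bp(0.032, 0.010, 0.025)))  # 30 bp
```

In the paper's framing, such a positive gap is what liquidity premiums in TIPS and inflation swaps are asked to explain.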
Despite various policy and management responses, biodiversity continues to decline worldwide. We must redouble our efforts to halt biodiversity loss. The current lack of policy action can be partly linked to an insufficient knowledge base regarding the conservation and sustainable use of biodiversity. Biodiversity research needs to incorporate both social and ecological factors to gain a deeper understanding of the interrelations between society and nature that affect biodiversity. A transdisciplinary research approach is crucial to fulfilling these requirements. It aims to produce new insights by integrating scientific and nonscientific knowledge. Several measures need to be taken to strengthen transdisciplinary social-ecological biodiversity research: Within the science community: firstly, scientists themselves must promote transdisciplinarity; secondly, the reward system for scientists must be brought into line with transdisciplinary research processes; and thirdly, academic training needs to advocate transdisciplinarity. As for research policies, research funding priorities need to be linked to large scale biodiversity policy frameworks, and funding for transdisciplinary social-ecological research on biodiversity must be increased significantly.
I analyze the real effects of the quality of the judicial enforcement by showing that an increase in the average duration of civil proceedings reduces firms' employment. I exploit a reorganization of court districts in Italy as an exogenous shock to court productivity and, using an instrumental variable approach, estimate an elasticity of employment to average trial length between -0.24 and -0.29. These results are very different from OLS estimates which do not control for endogeneity, and suggest that stronger law enforcement eases financing constraints. The effects are more pronounced in highly levered and more financially dependent firms, and appear to affect mainly firms in less financially developed areas. Revenues respond more slowly than employment to the reform, and wages fall as the judiciary improves. There is no evidence of effects on capital structure and profitability. These results offer a more complete picture of the interplay between legal institutions and real economic outcomes.
Coming (great) events cast their (long) shadow before. As the financial crisis gave birth to the creation of the European System of Financial Supervision (ESFS), the imminent Brexit now serves as an impulse to rather extensively reorganize it. Pursuant to the preferences of the Commission—as revealed in its draft for a regulation amending the regulations founding the European Supervisory Authorities (ESA)—the supervision (and regulation) of the financial sectors should be further centralized and integrated and additional powers should be given to the ESAs. To a large degree these alterations are intended to adjust the competences of the European Securities and Markets Authority (ESMA) to better meet its new objectives under the Capital Markets Union (“CMU”). Given that an equivalent to the CMU or the Banking Union—in the sense of a European Insurance Union—is not yet on the horizon for the insurance sector (or the occupational pensions sector), one could prima vista take the view that insurance supervision and regulation is once again taken captive by the necessity of regulatory reforms stemming from other financial sectors. However, even if that is partially the case, the outcome of the intended reforms might still be advantageous for the insurance sector and an important step in the right direction. Therefore, it needs to be intensively discussed.
At this stage, some of the most prominent envisioned changes to the structure, tasks and powers of the European Insurance and Occupational Pensions Authority (EIOPA) and their necessity, usefulness or counter-productivity still have to be examined.
We investigate the effect of overreaction in the fine art market. Using a unique sample of auction prices of modern prints, we define an overvalued (undervalued) print as a print that was bought for a price above (below) its high (low) auction pricing estimate. Based on the overreaction hypothesis, we predict that overvalued (undervalued) prints generate a negative (positive) excess return at a subsequent sale. Our empirical findings confirm our expectations. We report that prints that were bought for a price 10 percent above (below) their high (low) pricing estimates generate a negative (positive) excess return of 12 percent (17 percent) after controlling for the general price movement on the prints market. The price correction for overvalued (undervalued) prints is more pronounced during recessions (expansions).
This paper analyzes the bail-in tool under the Bank Recovery and Resolution Directive (BRRD) and predicts that it will not reach its policy objective. To make this argument, this paper first describes the policy rationale that calls for mandatory private sector involvement (PSI). From this analysis, the key features for an effective bail-in tool can be derived.
These insights serve as the background to make the case that the European resolution framework is likely ineffective in establishing adequate market discipline through risk-reflecting prices for bank capital. The main reason for this lies in the avoidable embeddedness of the BRRD’s bail-in tool in the much broader resolution process, which grants the authorities ample discretion, also in forcing private sector involvement. Moreover, the idea that nearly all positions on the liability side of a bank’s balance sheet should be subjected to bail-in is misguided. Instead, a concentration of PSI in instruments that fall under the minimum requirements for own funds and eligible liabilities (MREL) is preferable.
Finally, this paper synthesizes the prior analysis by putting forward an alternative regulatory approach that seeks to disentangle private sector involvement as a precondition for effective bank resolution as much as possible from the resolution process as such.
This paper analyses the bail-in tool under the BRRD and predicts that it will not reach its policy objective. To make this argument, this paper first describes the policy rationale that calls for mandatory PSI. From this analysis the key features for an effective bail-in tool can be derived. These insights serve as the background to make the case that the European resolution framework is likely ineffective in establishing adequate market discipline through risk-reflecting prices for bank capital. The main reason for this lies in the avoidable embeddedness of the BRRD’s bail-in tool in the much broader resolution process, which entails ample discretion of the authorities also in forcing private sector involvement. Finally, this paper synthesizes the prior analysis by putting forward an alternative regulatory approach that seeks to disentangle private sector involvement as a precondition for effective bank resolution as much as possible from the resolution process as such.
The object of this study is one of the most ambitious projects of twentieth-century art history: Aby Warburg's 'Atlas Mnemosyne', conceived in the summer of 1926 – when the first mention of a 'Bilderatlas', or "atlas of images", occurs in his journal – and truncated three years later, unfinished, by his sudden death in October 1929. Mnemosyne consisted of a series of large black panels, about 170 × 140 cm, on which were attached black-and-white photographs of paintings, sculptures, book pages, stamps, newspaper clippings, tarot cards, coins, and other types of images. Warburg kept changing the order of the panels and the position of the images until the very end, and three main versions of the Atlas have been recorded: one from 1928 (the "1-43 version", with 682 images); one from the early months of 1929, with 71 panels and 1050 images; and the one Warburg was working on at the time of his death, also known as the "1-79 version", with 63 panels and 971 images (which is the one we will examine). But Warburg was planning to have more panels – possibly many more – and there is no doubt that Mnemosyne is a dramatically unfinished and controversial object of study.
Telemonitoring devices can be used to screen consumers' characteristics and mitigate information asymmetries that lead to adverse selection in insurance markets. However, some consumers value their privacy and dislike sharing private information with insurers. In the second-best efficient Wilson-Miyazaki-Spence framework, we allow for consumers to reveal their risk type for an individual subjective cost and show analytically how this affects insurance market equilibria as well as utilitarian social welfare. Our analysis shows that the choice of information disclosure with respect to revelation of their risk type can substitute for deductibles for consumers whose transparency aversion is sufficiently low. This can lead to a Pareto improvement of social welfare and a Pareto efficient market allocation. However, if all consumers are offered cross-subsidizing contracts, the introduction of a transparency contract decreases or even eliminates cross-subsidies. Given the prior existence of a WMS equilibrium, utility is shifted from individuals who do not reveal their private information to those who choose to reveal. Our analysis provides a theoretical foundation for the discussion on consumer protection in the context of digitalization. It shows that new technologies bring new ways to challenge cross-subsidization in insurance markets and stresses the negative externalities that digitalization has on consumers who are not willing to take part in this development.