We establish a benchmark result for the relationship between the loanable funds and the money-creation approach to banking. In particular, we show that both processes yield the same allocations when there is no uncertainty and thus no bank default. In such cases, using the much simpler loanable funds approach as a shortcut does not imply any loss of generality.
The impact of network connectivity on factor exposures, asset pricing and portfolio diversification
(2017)
This paper extends the classic factor-based asset pricing model by including network linkages in linear factor models. We assume that the network linkages are exogenously provided. This extension of the model allows a better understanding of the causes of systematic risk and shows that (i) network exposures act as an inflating factor for systematic exposure to common factors and (ii) the power of diversification is reduced by the presence of network connections. Moreover, we show that in the presence of network links a misspecified traditional linear factor model presents residuals that are correlated and heteroskedastic. We support our claims with an extensive simulation experiment.
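The inflating effect of network exposure can be sketched in a short simulation. The following is my own illustration, not the paper's code: returns are generated from a spatial-autoregressive factor structure on a hypothetical ring network, and asset-by-asset OLS betas from a misspecified (network-free) one-factor regression exceed the true loading.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T, rho, beta, sig = 5, 20000, 0.3, 1.0, 0.5

# Ring network: each asset linked to its two neighbours (row-normalized).
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

# Network return process: r_t = (I - rho*W)^{-1} (beta*f_t + eps_t)
L = np.linalg.inv(np.eye(n) - rho * W)
f = rng.normal(size=T)
eps = rng.normal(scale=sig, size=(T, n))
r = (beta * f[:, None] + eps) @ L.T

# Misspecified one-factor OLS per asset ignores the network links.
b_hat = np.array([np.polyfit(f, r[:, i], 1)[0] for i in range(n)])
print(b_hat)   # each ~ beta/(1-rho) = 1.43 > beta = 1: exposures inflated
```

With a row-normalized network and a common loading of 1, each fitted beta converges to 1/(1 − ρ) ≈ 1.43, and the residuals Lε are cross-correlated and heteroskedastic, matching the misspecification the paper highlights.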
The growth and popularity of defined contribution pensions, along with the government’s increasing attention to retirement plan costs and investment choices provided, make it important to understand how people select their retirement plan investments. This paper shows how employees in a large firm altered their fund allocations when the employer streamlined its pension fund menu and deleted nearly half of the offered funds. Using administrative data, we examine the changes in plan participant investment choices that resulted from the streamlining and how these changes might affect participants’ eventual retirement wellbeing. We show that streamlined participants’ new allocations exhibited significantly lower within-fund turnover rates and expense ratios, and we estimate this could lead to aggregate savings for these participants over a 20-year period of $20.2M, or in excess of $9,400 per participant. Moreover, after the reform, streamlined participants’ portfolios held significantly less equity and exhibited significantly lower risks by way of reduced exposures to most systematic risk factors, compared to their non-streamlined counterparts.
During the 1970s, industrial countries, including the US and continental Europe, experienced a combination of slow productivity growth and high unemployment. Subsequent research has shown that the standard model of unemployment actually gives counterfactual predictions. Motivated by the observation that the 1970s were also characterized by high and rising inflation, Tesfaselassie and Wolters examine the effect of growth on unemployment in the presence of nominal price rigidity.
The authors demonstrate that the effect of growth on unemployment may be positive or negative. Faster growth leads to lower unemployment if the rate of inflation is high enough. There is a threshold level of inflation below which faster growth leads to higher unemployment and above which faster growth leads to lower unemployment. The threshold level in turn depends on labor market characteristics, such as hiring efficiency, the job destruction rate, workers' relative bargaining power and the opportunity cost of work.
This study provides a graphic overview on core legislation in the area of economic and financial services. The presentation essentially covers the areas within the responsibility of the Economic and Monetary Affairs Committee (ECON); hence it starts with core ECON areas but also displays neighbouring areas of other Committees' competences which are closely connected to and impacting on ECON's work. It shows legislation in force, proposals and other relevant provisions on banking, securities markets and investment firms, market infrastructure, insurance and occupational pensions, payment services, consumer protection in financial services, the European System of Financial Supervision, European Monetary Union, euro bills and coins and statistics, competition, taxation, commerce and company law, accounting and auditing. Moreover, it notes selected provisions that might become relevant in the upcoming Article 50 TEU negotiations.
We compare the cost effectiveness of two pronatalist policies:
(a) child allowances; and
(b) daycare subsidies.
We pay special attention to estimating how intended fertility (fertility before children are born) responds to these policies. We use two evaluation tools:
(i) a dynamic model on fertility, labor supply, outsourced childcare time, parental time, asset accumulation and consumption; and
(ii) randomized vignette-survey policy experiments.
We implement both tools in the United States and Germany, finding consistent evidence that daycare subsidies are more cost effective. Nevertheless, the required public expenditure to increase fertility to the replacement level might be viewed as prohibitively high.
After the Lehman-Brothers collapse, the stock index has exceeded its pre-Lehman-Brothers peak by 36% in real terms. Seemingly, markets have been demanding more stocks instead of bonds. Yet, instead of observing higher bond rates, paradoxically, bond rates have been persistently negative after the Lehman-Brothers collapse. To explain this paradox, we suggest that, in the post-Lehman-Brothers period, investors changed their perceptions on disasters, thinking that disasters occur once every 30 years on average, instead of disasters occurring once every 60 years. In our asset-pricing calibration exercise, this rise in perceived market fragility alone can explain the drop in both bond rates and price-dividend ratios observed after the Lehman-Brothers collapse, which indicates that markets mostly demanded bonds instead of stocks.
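The qualitative mechanism can be illustrated with a textbook Lucas-tree Euler equation under rare disasters, in the spirit of Rietz and Barro. The parameters below are illustrative choices of mine, not the paper's calibration:

```python
# Textbook Euler equation with rare disasters; illustrative parameters only.
beta, gamma, g, b = 0.98, 4.0, 0.02, 0.30   # discount, risk aversion, growth, disaster size

def riskfree(p):
    """Net risk-free rate when a disaster (consumption drop b) hits with probability p."""
    sdf = beta * (1 + g) ** -gamma * ((1 - p) + p * (1 - b) ** -gamma)
    return 1 / sdf - 1

print(riskfree(1 / 60))   # ~ +4.9%: disasters once every 60 years on average
print(riskfree(1 / 30))   # ~ -0.1%: once every 30 years -> the rate turns negative
```

Doubling the perceived disaster probability raises the demand for the safe asset enough to push the model's risk-free rate from clearly positive to slightly negative, which is the direction of the effect the abstract describes.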
For some time now, structural macroeconomic models used at central banks have been predominantly New Keynesian DSGE models featuring nominal rigidities and forward-looking decision-making. While these features are widely deemed crucial for policy evaluation exercises, most central banks have added more detailed characterizations of the financial sector to these models following the Great Recession in order to improve their fit to the data and their forecasting performance. We employ a comparative approach to investigate the characteristics of this new generation of New Keynesian DSGE models and document an elevated degree of model uncertainty relative to earlier model generations. Policy transmission is highly heterogeneous across types of financial frictions and monetary policy causes larger effects, on average. The New Keynesian DSGE models we analyze suggest that a simple policy rule robust to model uncertainty involves a weaker response to inflation and the output gap in the presence of financial frictions as compared to earlier generations of such models. Leaning-against-the-wind policies in models of this class estimated for the Euro Area do not lead to substantial gains. With regard to forecasting performance, the inclusion of financial frictions can generate improvements, if conditioned on appropriate data. Looking forward, we argue that model-averaging and embracing alternative modelling paradigms is likely to yield a more robust framework for the conduct of monetary policy.
Since 2014 the ECB has implemented a massive expansion of monetary policy including large-scale asset purchases and negative policy rates. As the euro area economy has improved and inflation has risen, questions concerning the future normalization of monetary policy are starting to dominate the public debate.
The study argues that the ECB should develop a strategy for policy normalization and communicate it very soon to prepare the ground for subsequent steps towards tightening. It provides analysis and makes proposals concerning key aspects of this strategy. The aim is to facilitate the emergence of expectations among market participants that are consistent with a smooth process of policy normalization.
What processes transform (im)mobile individuals into ‘migrants’ and geographic movements across political-territorial borders into ‘migration’? To address this question, the article develops the doing migration approach, which combines perspectives from social constructivism, praxeology and the sociologies of knowledge and culture. ‘Doing migration’ starts with the processes of social attribution that differentiate between ‘migrants’ and ‘non-migrants’. Embedded in institutional, organizational and interactional routines, these attributions generate unique social orders of migration. Illustrating these conceptual ideas, the article provides insights into the elements of the contemporary European order of ‘migration’. Its institutional routines contribute to the emergence of a European migration regime that involves narratives of economization, securitization and humanitarization. The organizational routines of the European migration order involve surveillance and diversity management, which have disciplining effects on those defined as ‘migrants’. The routines of everyday face-to-face interactions produce various micro-forms of doing ‘migration’ through stigmatization and othering, but they also provide opportunities to resist a social attribution as ‘migrant’.
This paper reviews social network analysis (SNA) as a method to be utilized in biographical research, which is a novel contribution. We argue that applying SNA in biographical research, through standardized data collection as well as the visualization of networks, can open up participants’ interpretations of relations throughout their lives and allow a creative and innovative way of data collection that is responsive to participants’ own meanings and associations while still enabling systematic data analysis. The paper discusses the analytical potential of SNA in biographical research and critically assesses the method’s efficacy and limitations.
Bank regulators have the discretion to discipline banks by executing enforcement actions to ensure that banks correct deficiencies regarding safe and sound banking principles. We highlight the trade-offs that the execution of enforcement actions entails for financial stability. We then provide an overview of the differences in the legal framework governing supervisors’ execution of enforcement actions in the Banking Union and the United States. After discussing work on the effect of enforcement actions on bank behaviour and the real economy, we present data on the evolution of enforcement actions and monetary penalties by U.S. regulators. We conclude by noting the importance of supervisors levying efficient monetary penalties and by stressing that a division of competences among different regulators should not lead to a loss of efficiency in the execution of enforcement actions.
This paper analyzes the effects of financial constraints and the financial crisis on the financing and investment policies of newly founded firms, adding important new insights on a crucial segment of the economy. We make use of a large and comprehensive data set of French firms founded in the years 2004-2006, i.e. well before the financial crisis. Our panel data analysis shows that the global financial crisis imposed a (mostly demand-driven) shock on the financing as well as the investments of these firms. Moreover, we find that financially constrained firms use less external debt financing, invest smaller amounts, and rely less on trade credit. With regard to bank financing, newly founded firms that are more financially constrained accumulate less bank debt and repay initial bank debt more slowly than their non-financially constrained counterparts. Finally, we find that financially constrained firms are affected to a smaller degree by the financial crisis than their less constrained counterparts.
We develop a state-space model to decompose bid and ask quotes of CDS into two components, a fair default premium and a liquidity premium. This approach gives a better estimate of the default premium than mid quotes, and it allows us to disentangle and compare the liquidity premium earned by the protection buyer and the protection seller. In contrast to other studies, our model is structurally much simpler, while it also allows for correlation between liquidity and default premia, as supported by empirical evidence. The model is implemented and applied to a large data set of 118 CDS for a period ranging from 2004 to 2010. The model-generated output variables are analyzed in a difference-in-difference framework to determine how the default premium, as well as the liquidity premium of protection buyers and sellers, evolved during different periods of the financial crisis and to what extent they differ for financial institutions compared to non-financials.
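A minimal sketch of the state-space idea, as a toy of my own construction: the paper's model is richer (it lets the premia vary over time and be correlated, and works with real CDS quotes), but a scalar Kalman filter already shows how a latent default premium can be recovered from noisy bid and ask quotes once the liquidity offsets are accounted for.

```python
# Toy scalar Kalman filter: a random-walk "fair default premium" observed
# through bid/ask quotes shifted by constant liquidity premia.
import numpy as np

rng = np.random.default_rng(1)
T, q, r = 500, 0.5, 2.0            # periods, state noise var, obs noise var
lam_b, lam_s = 3.0, 4.0            # liquidity premia of protection buyer / seller

D = 100.0 + np.cumsum(rng.normal(scale=np.sqrt(q), size=T))   # true default premium
bid = D - lam_b + rng.normal(scale=np.sqrt(r), size=T)
ask = D + lam_s + rng.normal(scale=np.sqrt(r), size=T)

# Filter, treating the liquidity offsets as known: bid + lam_b and ask - lam_s
# are both noisy measurements of the latent default premium D.
x, P, est = 100.0, 10.0, []
for t in range(T):
    P += q                                       # predict step (random walk)
    for z in (bid[t] + lam_b, ask[t] - lam_s):   # two sequential updates
        K = P / (P + r)                          # Kalman gain
        x += K * (z - x)
        P *= 1 - K
    est.append(x)

rmse = float(np.sqrt(np.mean((np.array(est) - D) ** 2)))
print(rmse)   # well below the quote noise sd of ~1.41
```

Note the filter tracks the default premium more accurately than either raw quote, which is the sense in which the decomposition beats simply using mid quotes.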
This paper examines the relationship between oil movements and the systemic risk of financial institutions in major petroleum-based economies. We estimate ΔCoVaR for those institutions and observe pronounced increases in its levels corresponding to the subprime and global financial crises. The results provide evidence in favor of risk-measurement improvements from accounting for oil returns in the risk functions. The spread between the standard CoVaR and the CoVaR that includes oil dissipates over a horizon longer than the duration of the oil shock itself. This indicates that the drop in the oil price has a longer-lived effect on risk and requires more time to be discounted by financial institutions. To support the analysis, we also consider the other major market-based systemic risk measures.
Motivated by tools for automated deduction on functional programming languages and programs, we propose a formalism to symbolically represent $\alpha$-renamings for meta-expressions. The formalism is an extension of usual higher-order meta-syntax which allows all valid ground instances of a meta-expression to be $\alpha$-renamed so as to fulfill the distinct variable convention. The renaming mechanism may be helpful for several reasoning tasks in deduction systems. We present our approach for a meta-language which uses higher-order abstract syntax and a meta-notation for recursive let-bindings, contexts, and environments. It is used in the LRSX Tool -- a tool to reason about the correctness of program transformations in higher-order program calculi with respect to their operational semantics. Besides introducing a formalism to represent symbolic $\alpha$-renamings, we present and analyze algorithms for the simplification of $\alpha$-renamings, matching, rewriting, and checking $\alpha$-equivalence of symbolically $\alpha$-renamed meta-expressions.
We introduce rewriting of meta-expressions which stem from a meta-language that uses higher-order abstract syntax augmented by meta-notation for recursive let, contexts, sets of bindings, and chain variables. Additionally, three kinds of constraints can be added to meta-expressions to express usual constraints on evaluation rules and program transformations. Rewriting of meta-expressions is required for automated reasoning on programs and their properties. A concrete application is a procedure to automatically prove correctness of program transformations in higher-order program calculi which may permit recursive let-bindings as they occur in functional programming languages. Rewriting on meta-expressions can be performed by solving the so-called letrec matching problem which we introduce. We provide a matching algorithm to solve it. We show that the letrec matching problem is NP-complete, that our matching algorithm is sound and complete, and that it runs in non-deterministic polynomial time.
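To give a flavour of matching with meta-variables, here is a plain first-order matcher (an illustrative toy of mine, not the paper's algorithm). The actual letrec matching problem additionally treats binders up to $\alpha$-renaming and environments as unordered sets of bindings, which is what makes it NP-complete.

```python
# First-order syntactic matching: uppercase strings are meta-variables,
# tuples are term constructors, everything else matches only itself.

def match(pattern, term, subst=None):
    """Return a substitution s with s(pattern) == term, or None if none exists."""
    subst = dict(subst or {})
    if isinstance(pattern, str) and pattern.isupper():   # meta-variable
        if pattern in subst:                             # already bound: must agree
            return subst if subst[pattern] == term else None
        subst[pattern] = term
        return subst
    if isinstance(pattern, tuple) and isinstance(term, tuple):
        if len(pattern) != len(term):
            return None
        for p, t in zip(pattern, term):                  # match argument-wise
            subst = match(p, t, subst)
            if subst is None:
                return None
        return subst
    return subst if pattern == term else None

# ('app', S, S) matches only applications of a term to itself:
print(match(('app', 'S', 'S'), ('app', 'x', 'x')))   # {'S': 'x'}
print(match(('app', 'S', 'S'), ('app', 'x', 'y')))   # None
```

The non-linear pattern in the example (the meta-variable S occurring twice) is the simplest case in which matching must check consistency of bindings; the letrec setting compounds this with permutations of bindings, which drives the NP-completeness result.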
Financial market interactions can lead to large and persistent booms and recessions. Instability is an inherent threat to economies with speculative financial markets. A central bank’s interest rate setting can amplify the expectation feedback in the financial market and this can lead to unstable dynamics and excess volatility. The paper suggests that policy institutions may be well-advised to handle tools like asset price targeting with care since such instruments might add a structural link between asset prices and macroeconomic aggregates. Neither stock prices nor indices are a good indicator to base decisions on.
The copyright-inflected concept of plagiarism is one of the recognized core categories of scientific misconduct. This article shows, however, that copyright law and the law of science do not form concentric circles but pursue different purposes with distinct regulatory concepts. Importing copyright-law patterns of argument into research ethics and the law of science impedes the development of genuinely science-specific criteria for assessing scientific misconduct. As an alternative, the article develops a concept of scientific integrity modeled on the law against unfair competition. To this end, it uncovers far-reaching teleological and structural commonalities between unfair competition law and the rules on scientific misconduct. In particular, both bodies of law pursue a functional teleology: unfair competition law safeguards the functioning of economic competition, while the prohibition of scientific misconduct secures the functioning, and thereby the effectiveness, of the open scientific process and of the competition for scientific reputation.
I propose a dynamic stochastic general equilibrium model in which the leverage of borrowers as well as banks and housing finance play a crucial role in the model dynamics. The model is used to evaluate the relative effectiveness of a policy to inject capital into banks versus a policy to relieve households of mortgage debt. In normal times, when the economy is near the steady state and policy rates are set according to a Taylor-type rule, capital injections to banks are more effective in stimulating the economy in the long run. However, in the middle of a housing debt crisis, when households are highly leveraged, the short-run output effects of the debt relief are more substantial. When the zero lower bound (ZLB) is additionally considered, the debt relief policy can be much more powerful in boosting the economy both in the short run and in the long run. Moreover, the output effects of the debt relief become increasingly larger the longer the ZLB is binding.
This paper analyses the bail-in tool under the BRRD and predicts that it will not reach its policy objective. To make this argument, the paper first describes the policy rationale that calls for mandatory private sector involvement (PSI). From this analysis, the key features of an effective bail-in tool can be derived. These insights serve as the background for the case that the European resolution framework is likely ineffective in establishing adequate market discipline through risk-reflecting prices for bank capital. The main reason for this lies in the avoidable embeddedness of the BRRD’s bail-in tool in the much broader resolution process, which entails ample discretion for the authorities also in forcing private sector involvement. Finally, the paper synthesizes the prior analysis by putting forward an alternative regulatory approach that seeks to disentangle private sector involvement, as a precondition for effective bank resolution, as far as possible from the resolution process as such.
The bail-in tool as implemented in the European bank resolution framework suffers from severe shortcomings. To some extent, the regulatory framework can remedy the impediments to the desirable incentive effect of private sector involvement (PSI) that emanate from a lack of predictability of outcomes, if it compels banks to issue a sufficiently sized minimum of high-quality, easy to bail-in (subordinated) liabilities. Yet, even the limited improvements any prescription of bail-in capital can offer for PSI’s operational effectiveness seem compromised in important respects.
The main problem, echoing the general concerns voiced against the European bail-in regime, is that the specifications for minimum requirements for own funds and eligible liabilities (MREL) are also highly detailed and discretionary and thus alleviate the predicament of investors in bail-in debt, at best, only insufficiently. Quite importantly, given the character of typical MREL instruments as non-runnable long-term debt, even if investors are able to gauge the relevant risk of PSI in a bank’s failure correctly at the time of purchase, subsequent adjustments of MREL prescriptions by competent or resolution authorities potentially change the risk profile of the pertinent instruments. Therefore, original pricing decisions may prove inadequate, and so may the market discipline that follows from them.
The pending European legislation aims at an implementation of the already complex specifications of the Financial Stability Board (FSB) for Total Loss Absorbing Capacity (TLAC) by very detailed and case specific amendments to both the regulatory capital and the resolution regime with an exorbitant emphasis on proportionality and technical fine-tuning. What gets lost in this approach, however, is the key policy objective of enhanced market discipline through predictable PSI: it is hardly conceivable that the pricing of MREL-instruments reflects an accurate risk-assessment of investors because of the many discretionary choices a multitude of agencies are supposed to make and revisit in the administration of the new regime. To prove this conclusion, this chapter looks in more detail at the regulatory objectives of the BRRD’s prescriptions for MREL and their implementation in the prospectively amended European supervisory and resolution framework.
The Clearing of Euro OTC Derivatives post Brexit – An Analysis of the Available Cost Estimates
(2017)
In the context of the upcoming Brexit, a relocation of the clearing of euro OTC derivatives for EU-based firms is the subject of controversial discussion. The opponents of a relocation argue that it would cause additional costs for market participants of up to USD 100 bn over a period of 5 years. This paper shows that this cost estimate is far too high: relocation costs would amount to approximately USD 0.6 bn p.a., which translates into cumulative costs of around USD 3.2 bn for a transition period of 5 years. In light of the strategic importance of systemically relevant CCPs for the financial stability of the eurozone, the potential relocation costs should not be a decision criterion.
Contents
1. Strauß, Johann (father/Sr.) (* 14.3.1804 – † 25.9.1849)
2. Strauß, Johann (son/Jr.) (* 25.10.1825 – † 3.6.1899)
3. The Operetta Adaptations
3.1 Die Fledermaus (operetta in 3 acts, Johann Strauß Jr.) – premiere: 5.4.1874
3.2 Eine Nacht in Venedig (operetta in 3 acts, Johann Strauß Jr.) – premiere: 3.10.1883
3.3 Der Zigeunerbaron (operetta in 3 acts, Johann Strauß Jr.) – premiere: 24.10.1885
3.4 Wiener Blut (Johann Strauß Jr.) – premiere: 25.10.1899
3.5 Frühlingsluft (Josef Strauß) – premiere: 9.5.1903
Les Blank
(2017)
To broaden the scope of monetary policy, cash abolishment is often suggested as a means of breaking through the zero lower bound. However, practically nothing is said about the welfare costs of such a proposal. Rösl, Seitz and Tödter argue that the welfare costs of bypassing the zero lower bound can be analyzed analytically and empirically by assuming negative interest rates on cash holdings. They gauge the welfare effects of abolishing cash, both, for the euro area and for Germany.
Their findings suggest that the welfare losses of negative interest rates incurred by money holders are large, notably if implemented in the current low-interest-rate environment. Imposing a negative interest rate of 3 percentage points on cash holdings and reducing the interest on all assets included in M3 creates a deadweight loss of €62bn for the euro area and of €18bn for Germany. Therefore, the authors argue that cash abolishment or negative interest rates on cash to break through the zero lower bound at any price can hardly be a meaningful policy goal.
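The logic of such welfare calculations can be sketched with Bailey's classic consumer-surplus triangle under a semi-log money demand. The numbers below are hypothetical stand-ins; the study's €62bn and €18bn figures rest on the authors' own money-demand estimates, not on this sketch.

```python
# Bailey-style welfare cost under a semi-log money demand m(i) = m0*exp(-a*i).
# m0 and a are made-up illustrative values, not estimates from the study.
import math

m0, a = 12e12, 10.0        # hypothetical: EUR 12tn money stock, semi-log slope

def money_demand(i):
    return m0 * math.exp(-a * i)

def welfare_cost(i):
    """Bailey (1956): area under money demand minus the 'tax' revenue i*m(i)."""
    area = (m0 / a) * (1 - math.exp(-a * i))   # integral of m from 0 to i
    return area - i * money_demand(i)

# A 3-percentage-point tax on money holdings:
print(welfare_cost(0.03) / 1e9)   # deadweight loss in EUR bn
```

The deadweight loss grows more than proportionally in the interest-rate wedge, which is why losses look especially large when a sizeable negative rate is imposed on top of an already low-rate environment.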
How do freedom and money relate to one another? In the liberal tradition of philosophy and economics, money is usually conceived as a mere instrument whose introduction facilitates the exchange of goods but otherwise has no deeper social consequences. In contrast, this working paper elaborates the connection between money and (un)freedom. Following the tradition of critical social philosophy and engaging with Marx, Simmel and the more recent sociology of money, a first step sets out the paradoxical character of the freedom that money socially opens up: on the one hand, in capitalist economies money cultivates an individual form of freedom of choice; on the other hand, money structures access to social wealth in an unequal and disciplining way: depending on one's individual command over financial means, one is compelled in different ways to sell one's own labour power in order to secure access to goods and one's own reproduction. A second step interrogates this paradoxical form of freedom with regard to its tendency toward alienation: insofar as the freedom opened up by the institution of money conceals its social conditions of possibility, it can be understood as a fetishized form of freedom.
Right-wing populist movements in many Western states currently style themselves as the mouthpiece of allegedly suppressed population groups and opinions. The Identitarian movement develops this approach further into a project of authoritarian statehood directed against multiculturalism, Islam and immigration. In doing so, it combines its campaign for an ethnically homogeneous nation state with a critique of capitalist globalization. In a rhetoric that emotionalizes politics, a programme of defensive ethno-nationalism is unfolded through "intellectual radicalization". It draws on elements of the tradition of völkisch anti-modernism and on the Eurasian geopolitics devised by the Russian philosopher Alexander Dugin.
A European Keynesianism as the basis for a pan-European economic concept could, as an offensive counter-strategy, propagate the idea of a renewal of the welfare state. In addition, actors from civil society are called upon to work against xenophobia and loss of orientation in the spirit of enlightenment.
Despite various policy and management responses, biodiversity continues to decline worldwide. We must redouble our efforts to halt biodiversity loss. The current lack of policy action can be partly linked to an insufficient knowledge base regarding the conservation and sustainable use of biodiversity. Biodiversity research needs to incorporate both social and ecological factors to gain a deeper understanding of the interrelations between society and nature that affect biodiversity. A transdisciplinary research approach is crucial to fulfilling these requirements. It aims to produce new insights by integrating scientific and nonscientific knowledge. Several measures need to be taken to strengthen transdisciplinary social-ecological biodiversity research: Within the science community: firstly, scientists themselves must promote transdisciplinarity; secondly, the reward system for scientists must be brought into line with transdisciplinary research processes; and thirdly, academic training needs to advocate transdisciplinarity. As for research policies, research funding priorities need to be linked to large scale biodiversity policy frameworks, and funding for transdisciplinary social-ecological research on biodiversity must be increased significantly.
We propose a two-country asset-pricing model where agents' preferences change endogenously as a function of the popularity of internationally traded goods. We determine the effect of the time variation of preferences on equity markets, consumption and portfolio choices. When agents are more sensitive to the popularity of domestic consumption goods, the local stock market reacts more strongly to the preferences of local agents than to the preferences of foreign agents. Therefore, home bias arises because home-country stock represents a better investment opportunity for hedging against future fluctuations in preferences. We test our model and find that preference evolution is a plausible driver of key macroeconomic variables and stock returns.
The international diffusion of technology plays a key role in stimulating global growth and explaining co-movements of international equity returns. Existing empirical evidence suggests that countries are heterogeneous in their attitude toward innovation: some countries rely more on technology adoption while other countries rely more on internal technology production. European countries that rely more on adoption are also typically characterized by lower fiscal policy flexibility and higher labor market rigidity. We develop a two-country model – where both countries rely on R&D and adoption – to study the short-run and long-run effects of aggregate technology and adoption probability shocks on economic growth in the presence of the aforementioned asymmetries. Our framework suggests that an increase in the ability to adopt technology from abroad stimulates economic growth in the country that benefits from higher adoption rates, but the beneficial effects also spread to the foreign country. Moreover, it helps explain the differences in macro quantities and equity returns observed in the international data.
On average, young people "undersave" whereas old people "oversave" with respect to the rational expectations model of life-cycle consumption and savings. According to numerous studies on subjective survival beliefs, young people also "underestimate" whereas old people "overestimate" their objective survival chances on average. We take a structural behavioral economics approach to jointly address both empirical phenomena by embedding subjective survival beliefs that are consistent with these biases into a rank-dependent utility (RDU) model over life-cycle consumption. The resulting consumption behavior is dynamically inconsistent. Considering both naive and sophisticated RDU agents, we show that within this framework underestimation of young-age and overestimation of old-age survival probabilities may (but need not) give rise to the joint occurrence of undersaving and oversaving. In contrast to this RDU model, the familiar quasi-hyperbolic discounting (QHD) model, which is nested as a special case, cannot generate oversaving.
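One standard way to generate exactly this belief pattern within RDU is an inverse-S probability weighting function such as Prelec's. This is an illustration of the mechanism only; the paper's own specification of survival beliefs may differ.

```python
# Prelec (1998) inverse-S probability weighting, a common RDU ingredient.
# The curvature parameter alpha is an illustrative choice.
import math

def prelec(p, alpha=0.65):
    """w(p) = exp(-(-ln p)^alpha): overweights small p, underweights large p."""
    return math.exp(-((-math.log(p)) ** alpha))

# A young agent's high survival probability is weighted down ("underestimated"),
# an old agent's low survival probability is weighted up ("overestimated"):
print(prelec(0.95))   # < 0.95
print(prelec(0.10))   # > 0.10
```

Underweighting high (young-age) survival probabilities and overweighting low (old-age) ones maps directly into the undersaving/oversaving pattern the abstract describes.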
We analyze the market reaction to the sentiment of the CEO speech at the Annual General Meeting (AGM). As the AGM is typically preceded by several information disclosures, the CEO speech may be expected to contribute only marginally to investors’ decision-making. Surprisingly, however, we observe from the transcripts of 338 CEO speeches of German corporates between 2008 and 2016 that their sentiment is significantly related to abnormal stock returns and trading volumes following the AGM. Using a novel business-specific German dictionary based on Loughran and McDonald (2011), we find a negative association of the post-AGM returns with the speeches’ negativity and a positive association with the speeches’ relative positivity (i.e. positivity relative to negativity). Relative positivity moreover corresponds with a lower trading volume in a short time window surrounding the AGM. Investors hence seem to perceive the sentiment of CEO speeches at AGMs as a valuable indicator of future firm performance.
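The dictionary-based sentiment measures can be sketched as simple word counts. The mini word lists below and the exact formula for relative positivity are hypothetical stand-ins; the paper uses a business-specific German dictionary based on Loughran and McDonald (2011), and its precise definitions may differ.

```python
# Hypothetical mini word lists; the paper uses a business-specific
# German dictionary based on Loughran and McDonald (2011).
NEGATIVE = {"loss", "decline", "risk", "litigation", "impairment"}
POSITIVE = {"growth", "profit", "success", "strong", "improve"}

def speech_sentiment(tokens):
    """Dictionary-based sentiment scores:
    negativity = share of negative words among all words;
    relative positivity = (pos - neg) / (pos + neg), one plausible way
    to measure positivity relative to negativity."""
    n = sum(t in NEGATIVE for t in tokens)
    p = sum(t in POSITIVE for t in tokens)
    negativity = n / len(tokens) if tokens else 0.0
    rel_pos = (p - n) / (p + n) if (p + n) else 0.0
    return negativity, rel_pos

neg, relpos = speech_sentiment(
    "strong growth despite litigation risk and a small loss".split())
```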
The series "Papers of Excellence 2.0: Selected Theses from the Subject Didactics and Educational Sciences of Goethe University Frankfurt a.M." is a new, expanded, and additional edition of the well-known series "Papers of Excellence: Selected Theses from the Subject Didactics", which has been published by Daniela Elsner and Anja Wildemann with Shaker-Verlag since 2010. In keeping with tradition, the online version of this book series, which now appears in addition to the print edition, presents summaries of outstanding state-examination and master's theses that are distinguished by a thorough empirical, didactic engagement with a topic. What is new is that the online version now also includes theses with an educational-science focus as well as theses situated at the interface between subject didactics and educational sciences. Papers of Excellence 2.0, which currently includes only studies completed at Goethe University Frankfurt am Main, is edited by Astrid Jurecka (educational sciences) and Daniela Elsner (subject didactics) and is freely accessible.
We propose a model for measuring the runtime of concurrent programs by the minimal number of evaluation steps. The focus of this paper is on improvements: program transformations that improve this number in every context, where we distinguish between sequential and parallel improvements, for one or more processors, respectively. We apply the methods to CHF, a model of Concurrent Haskell extended by futures. The language CHF is a typed higher-order functional language with concurrent threads, monadic IO, and MVars as synchronizing variables. We show that all deterministic reduction rules and 15 further program transformations are sequential and parallel improvements. We also show that the introduction of deterministic parallelism is a parallel improvement, and its inverse a sequential improvement, provided it is applicable. This is a step towards more automated precomputation of concurrent programs at compile time that is formally proven to optimize correctly.
We explore space improvements in LRP, a polymorphically typed call-by-need functional core language. A relaxed space measure is chosen for the maximal size usage during an evaluation. It abstracts from the details of the implementation via abstract machines, but it takes garbage collection into account and can thus be seen as a realistic approximation of space usage. The results are: a context lemma for space-improving translations and for space equivalences; all but one reduction rule of the calculus are shown to be space improvements, and the exceptional one, the copy rule, is shown to increase space only moderately.
Several further program transformations are shown to be space improvements or space equivalences; in particular, the translation into machine expressions is a space equivalence. These results are a step forward in making predictions about the change in runtime space behavior of optimizing transformations in call-by-need functional languages.
Under Solvency II, corporate governance requirements are a complementary, but nonetheless essential, element of a sound regulatory framework for insurance undertakings, also addressing risks not specifically mitigated by the solvency capital requirements alone. After recalling the provisions of the second pillar concerning the system of governance, the paper highlights the emerging regulatory trends in the corporate governance of insurance firms. Among other things, it signals the exceptional extension of the duties and responsibilities assigned to the board of directors, far beyond the traditional role of both monitoring the chief executive officer and assessing the overall direction and strategy of the business. However, better risk governance is not necessarily built on narrow rule-based approaches to corporate governance.
This paper investigates the effects of a rise in interest rates and of lapse risk of endowment life insurance policies on the liquidity and solvency of life insurers. We model the book- and market-value balance sheet of an average German life insurer, subject to both GAAP and Solvency II regulation, featuring an existing back book of policies and an existing asset allocation calibrated to historical data. The balance sheet is then projected forward under stochastic financial markets. Lapse rates are modeled stochastically and depend on the granted guaranteed rate of return and the prevailing level of interest rates. Our results suggest that in the case of a sharp increase in interest rates, policyholders sharply increase lapses and the solvency position of the insurer deteriorates in the short run. This result is driven in particular by the interaction between a reduction in the market value of assets, large guarantees for existing policies, and a very slow adjustment of asset returns to interest rates. A sharp or gradual rise in interest rates is associated with substantial and persistent liquidity needs, which are driven in particular by lapses.
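The interest-rate-dependent lapse dynamics can be sketched with a stylised lapse function. The functional form and all parameter values below are illustrative assumptions, not the paper's calibration: lapses stay at a base level while the guaranteed rate is attractive and rise with the spread of market rates over the guarantee.

```python
def lapse_rate(r_market, r_guaranteed, base=0.03, slope=2.0, cap=0.5):
    """Stylised dynamic lapse model: the annual lapse rate equals a base
    level plus a term increasing in the spread between the prevailing
    market rate and the policy's guaranteed rate, capped below 1.
    All parameters are illustrative."""
    spread = max(0.0, r_market - r_guaranteed)
    return min(cap, base + slope * spread)

# While market rates are below the guarantee, only base lapses occur;
# a sharp rise in rates triggers sharply higher lapses:
low = lapse_rate(0.01, r_guaranteed=0.03)   # 0.03
high = lapse_rate(0.06, r_guaranteed=0.03)  # ~0.09
```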
Different insurance activities exhibit different levels of persistence of shocks and volatility. For example, life insurance is typically more persistent but less volatile than non-life insurance. We examine how diversification among life, non-life insurance, and active reinsurance business affects an insurer's contribution and exposure to the risk of other companies. Our model shows that a counterparty's credit risk exposure to an insurance group substantially depends on the relative proportion of the insurance group's life and non-life business. The empirical analysis confirms this finding with respect to several measures for spillover risk. The optimal proportion of life business that minimizes spillover risk decreases with leverage of the insurance group, and increases with active reinsurance business.
A tontine provides a mortality-driven, age-increasing payout structure through the pooling of mortality. Because a tontine does not entail any guarantees, its payout structure is determined by the pooled individual characteristics of the tontinists. Therefore, the surrender decision of a single tontinist directly affects the remaining members' payouts. Nevertheless, the opportunity to surrender is crucial to the success of a tontine from a regulatory as well as a policyholder perspective. This paper therefore derives the fair surrender value of a tontine, first on the basis of expected values, and then incorporates the increasing payout volatility to determine an equitable surrender value. Results show that the surrender decision requires a discount on the fair surrender value as security for the remaining members. The discount intensifies with decreasing tontine size and increasing risk aversion. However, tontinists are less willing to surrender for smaller tontines and higher risk aversion, creating a natural protection against tontine runs stemming from short-term liquidity shocks. Furthermore, we argue that a surrender decision based on private information requires a discount on the fair surrender value as well.
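The mortality-credit mechanism behind the age-increasing payout can be sketched as follows. The payout rule and all parameters are simplifying assumptions for illustration; the paper's actuarial model is considerably richer.

```python
def tontine_payouts(capital_per_member, n_members, cum_survival,
                    payout_rate=0.05):
    """Expected per-survivor payout of a stylised tontine: a fixed share
    of the pooled capital is paid out each year and split among the
    expected survivors, so individual payouts rise as the pool shrinks
    (mortality credits). All parameters are illustrative."""
    annual_pool_payout = capital_per_member * n_members * payout_rate
    return [annual_pool_payout / (n_members * p) for p in cum_survival]

# Cumulative survival probabilities falling over time imply an
# age-increasing payout per surviving member:
payouts = tontine_payouts(100.0, 1000, [1.0, 0.9, 0.8])  # increasing
```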
Telemonitoring devices can be used to screen consumers' characteristics and mitigate information asymmetries that lead to adverse selection in insurance markets. However, some consumers value their privacy and dislike sharing private information with insurers. In the second-best efficient Wilson-Miyazaki-Spence (WMS) framework, we allow consumers to reveal their risk type at an individual subjective cost and show analytically how this affects insurance market equilibria as well as utilitarian social welfare. Our analysis shows that the choice of information disclosure, i.e. revelation of one's risk type, can substitute for deductibles for consumers whose transparency aversion is sufficiently low. This can lead to a Pareto improvement of social welfare and a Pareto-efficient market allocation. However, if all consumers are offered cross-subsidizing contracts, the introduction of a transparency contract decreases or even eliminates cross-subsidies. Given the prior existence of a WMS equilibrium, utility is shifted from individuals who do not reveal their private information to those who choose to reveal. Our analysis provides a theoretical foundation for the discussion on consumer protection in the context of digitalization. It shows that new technologies bring new ways to challenge cross-subsidization in insurance markets and stresses the negative externalities that digitalization has on consumers who are not willing to take part in this development.
We study the impact of firms' estimation errors on social welfare. For this purpose, we present a model of the insurance market in which insurers face parameter uncertainty about expected loss sizes. As consumers react to under- and overestimation by increasing and decreasing demand, respectively, insurers require a safety loading to cover parameter uncertainty. If the safety loading is too small, less risk-averse consumers benefit from less informed insurers by speculating on their underestimating expected losses. Otherwise, social welfare increases with insurers' information. We empirically estimate safety loadings in the US property and casualty insurance market and show that these are likely to be sufficiently large for consumers to benefit from more informed insurers.
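The role of the safety loading can be sketched with a simple premium rule. This is a textbook-style construction under our own assumptions, not the paper's estimator: the insurer marks up its loss estimate in proportion to the estimation uncertainty, so a better-informed insurer needs a smaller loading.

```python
def premium_with_loading(estimated_mean_loss, estimation_std, z=1.645):
    """Stylised safety loading for parameter uncertainty: the premium
    marks up the estimated expected loss by z standard errors of the
    estimate (z = 1.645, a one-sided 95% bound, is illustrative)."""
    return estimated_mean_loss + z * estimation_std

# A better-informed insurer (smaller estimation error) can charge a
# smaller loading for the same estimated expected loss:
p_noisy = premium_with_loading(1000.0, estimation_std=100.0)
p_informed = premium_with_loading(1000.0, estimation_std=20.0)
```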