Crowdfunding is a buzzword signifying a subset of the new forms of finance facilitated by advances in information technology, usually categorized as fintech. Addressing concerns about financial stability, investor and consumer protection, and the prevention of money laundering and terrorist financing increasingly hinges on adequately integrating these new techniques for initiating financing relationships into the regulatory framework.
This paper analyzes the German regulation of crowdinvesting and finds that it does not fully live up to the regulatory challenges posed by this novel form of digitized matching of supply and demand on capital markets. It should better reflect the key importance of crowdinvesting platforms, which may become critical providers of market infrastructure in the not too distant future. Moreover, platforms can play an important role in investor protection that cannot be performed by traditional disclosure regimes geared towards more seasoned issuers. Against this background, the creation of an exemption from the traditional prospectus regime seems to be a plausible policy choice. However, it needs to be complemented by an adequate regulatory stimulation of platforms’ role as gatekeepers.
This paper reexamines the current legal landscape regarding the protection of trade marks and other industrial property rights in signs on the Internet. It is based on a comparative analysis of EU and national laws, in particular, German, U.S., and U.K. law. It starts with a short restatement of the principles governing trade mark conflicts that occur within a particular jurisdiction (part 2) and proceeds to the regulation of transnational disputes (part 3). This juxtaposition yields two basic approaches. Whereas trade mark conflicts within closed legal systems are generally adjudicated according to a binary either/or logic, transnational disputes are and should indeed be solved in a way that leads to a fair coexistence of conflicting trade mark laws and rights under multiple laws. This paper explains how geolocation technologies can alleviate the implementation of the principle of fair coexistence in concrete cases.
Hans-Joachim Schlegel
(2017)
Linking the bicycle with public transport (ÖV) can strengthen the environmentally friendly modes of transport (Umweltverbund), ease the transfer from one transport system to the other, and create an attractive alternative to motorized private transport. This paper is the first comprehensive project report within the research project "Verbesserte Integration des Fahrrads in den öffentlichen Verkehr – Systematische Erschließung von Handlungsoptionen und Bewertung von Best-Practices" (Improved integration of the bicycle into public transport: systematic exploration of options for action and evaluation of best practices). The aim of this publication is an up-to-date review of the state of research and practice on this topic.
Following a systematic overview of important strands of research, results from the "Mobilität in Deutschland" dataset on intermodal bicycle/public-transport usage patterns are presented first. Building on this, data sources and general requirements for survey methods are discussed so that inter- and multimodal mode use can actually be assessed. The current state of practice is presented along three focus areas: bike-and-ride parking facilities at public transport stops, bicycle carriage on public transport vehicles, and bike-sharing systems. Infrastructure measures as well as operator concepts and marketing aspects are covered. A further chapter deals with the opportunities and challenges arising from advancing digitization and the spread of mobile devices.
Riding public transport without a (valid) ticket has been a problem for numerous transport companies since the 1960s. It is also a problem for public transport users, since using public transport without a (valid) ticket constitutes a criminal offense in Germany. The topic has been examined from various academic perspectives (above all law, business administration and criminology, along with some social science approaches), but research has concentrated primarily on socio-demographic characteristics, market segmentation, and the consequences of fare evasion for transport companies and associations. Existing studies consider the motives for riding without a (valid) ticket only from an external, objective standpoint. This working paper presents the results of a study of these motives in the service area of the Rhein-Main-Verkehrsverbund (RMV), explored through qualitative interviews with persons who could not present a (valid) ticket during a fare inspection.
In this paper we propose a way forward towards increased financial resilience in times of growing disagreement concerning open borders, free trade and global regulatory standards. In light of these concerns, financial resilience remains a highly valued policy objective. We wish to contribute by suggesting an agenda of concrete, doable steps supporting an enhanced level of resilience, combined with a deeper understanding of its relevance in the public domain.
First, remove inconsistencies across regulatory rules and territorial regimes, and ensure their credible implementation. Second, discourage the use of financial regulatory standards as a means of international competition. Third, give more weight to pedagogically explaining the established regulatory standards in public, to strengthen their societal backing.
On 15 August 2017, the Bundesverfassungsgericht (BVerfG) referred the case against the European Central Bank's policy of Quantitative Easing (QE) to the European Court of Justice (ECJ). The author argues that this event differs in several respects from the OMT case in 2015, in content as well as in form. The BVerfG recognizes that it is a legitimate goal of the ECB's monetary policy to bring inflation up close to 2%, and that the instrument employed for QE is one of monetary policy. However, it doubts whether the sheer volume of QE distorts the character of the program as one of monetary policy. The ECJ will now have to clarify the extent to which its findings in the OMT judgment are relevant for QE, as well as the standard of review applicable to monetary policy. The author raises the questions of whether the principle of democracy under German constitutional law can actually provide the standard by which the ECB is to be measured, and how tight judicial review can be exercised over the ECB without encroaching upon its autonomy in monetary policy matters, and thus upon the very essence of central bank independence.
Since 2006, the German federal states have had the right to set the rate of the real estate transfer tax (Grunderwerbsteuer) themselves. Most states, with the exception of Bavaria and Saxony, have made extensive use of this right. With this development, various negative side effects of the tax have moved further into the foreground. Avoidance reactions and price effects on the real estate market meant that each percent by which the tax rate was raised generated an estimated increase in revenue of only about 0.6 percent, whereas without avoidance reactions and price effects a revenue increase of one percent would have been expected. Several mechanisms are likely to lie behind this under-proportional revenue effect, such as circumvention by purchasing the real estate as part of a corporation.
In view of the increased tax rates, calls were made by the CDU and the FDP in the last federal election campaign for a tax-free allowance for buyers who intend to occupy the acquired residential property themselves. Depending on the proposal, the allowance would rise with the number of children.
This article critically discusses the call for a family component in the real estate transfer tax and also outlines possible alternatives for curbing tax planning through share deals.
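A back-of-the-envelope sketch of the under-proportional revenue effect described above; the 0.6 factor is the estimate quoted in the abstract, while the rates and the base revenue figure are purely hypothetical:

```python
base_rate = 3.5        # % statutory rate before the increase (hypothetical)
new_rate = 5.0         # % statutory rate after the increase (hypothetical)
base_revenue = 1000.0  # million EUR collected at the base rate (hypothetical)

# Mechanical expectation: revenue scales one-for-one with the rate.
rate_increase = (new_rate - base_rate) / base_rate          # ~42.9%
mechanical = base_revenue * (1 + rate_increase)

# Estimated reality: each 1% rate increase yields only ~0.6% extra revenue
# because of avoidance reactions (e.g. share deals) and price effects.
realized = base_revenue * (1 + 0.6 * rate_increase)

print(f"expected: {mechanical:.0f}m EUR, realized: {realized:.0f}m EUR")
```

Under these made-up numbers, roughly a third of the mechanically expected extra revenue never materializes, which is the gap the proposed anti-avoidance measures target.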
Coming (great) events cast their (long) shadow before. As the financial crisis gave birth to the European System of Financial Supervision (ESFS), the imminent Brexit now serves as an impulse to reorganize it rather extensively. According to the preferences of the Commission, as revealed in its draft regulation amending the regulations founding the European Supervisory Authorities (ESAs), the supervision (and regulation) of the financial sectors should be further centralized and integrated, and additional powers should be given to the ESAs. To a large degree these alterations are intended to adjust the competences of the European Securities and Markets Authority (ESMA) to better meet its new objectives under the Capital Markets Union (CMU). Given that an equivalent to the CMU or the Banking Union, in the sense of a European Insurance Union, is not yet on the horizon for the insurance sector (or the occupational pensions sector), one could prima vista take the view that insurance supervision and regulation is once again taken captive by regulatory reforms stemming from other financial sectors. However, even if that is partially the case, the outcome of the intended reforms might still be advantageous for the insurance sector and an important step in the right direction. It therefore deserves intensive discussion.
At this stage, some of the most prominent envisioned changes to the structure, tasks and powers of the European Insurance and Occupational Pensions Authority (EIOPA) and their necessity, usefulness or counter-productivity still have to be examined.
The judgment of the EGC in Case T-122/15, Landeskreditbank Baden-Württemberg – Förderbank v European Central Bank, is the first statement of the European judiciary on the substantive law of the Banking Union. Beyond its specific holding, the decision is of great importance because it hints at the methodological approach the EGC will take in interpreting prudential banking regulation in the appeals against supervisory measures that fall within its jurisdiction under TFEU arts. 256(1) subpara 1 and 263(4). Specifically, the case pertained to the scope of direct ECB oversight of significant banks in the euro area and the reassignment of this competence to national competent authorities (NCAs) in individual circumstances (Single Supervisory Mechanism (SSM) Regulation, art. 6(4) subpara 2; SSM Framework Regulation, arts. 70, 71).
According to the Bank Recovery and Resolution Directive (BRRD), introduced as a lesson from the recent financial crisis, the losses incurred by a failing bank should generally be borne by its investors. Before a minimum bail-in has occurred, government money can only be injected in emergency cases to remedy a serious disturbance in the economy and to preserve financial stability. This policy letter argues that in the case of the Italian bank Monte dei Paschi di Siena (MPS), which the Italian government currently plans to bail out, a resolution would most likely not cause such a systemic event. A bailout contrary to the existing rules will lead to a mispricing of bank capital and retard the restructuring of the European banking sector, the authors write. They appeal to the European Central Bank, the Systemic Risk Board and the EU Commission to follow the rules, as the test case MPS will have a direct impact on the credibility of the new BRRD regime and the responsible institutions.
Fascicle XVI of the exsiccata "K. KALB & A. APTROOT: LICHENES NEOTROPICI" (new name for "K. KALB: LICHENES NEOTROPIC" from fascicle XVI onwards), with 23 lichen specimens (No. 628–650) from Brazil, Chile, the Dominican Republic, Ecuador, Kenya, Peru and Venezuela, is distributed. Three species are described as new, namely Lopadium subcoralloideum Aptroot & Kalb, Lecanactis caceresiana Kalb & Aptroot and Rhizocarpon sipmanianum Kalb & Aptroot. The holotypes of the new species are deposited at the Universidade Federal de Mato Grosso do Sul (UFMS). Range extensions are reported for Hypocenomyce tinderreyensis (new to the Neotropics; so far only known from Australia, but apparently austral), Ocellularia baorucensis (new to Brazil), Physcidia striata (recently described from Rondônia and the Venezuelan Amazon, and subsequently reported from Amapá and the Brazilian Amazonas; the collection from Brazil/Mato Grosso do Sul represents a major range extension to the south), Tephromela campestricola (new to the Neotropics; not different in any way from European material) and Xanthoparmelia arvidssonii (new to Venezuela).
We extend the classical "martingale-plus-noise" model for high-frequency prices by an error correction mechanism originating from prevailing mispricing. The speed of price reversal is a natural measure for informational efficiency. The strength of the price reversal relative to the signal-to-noise ratio determines the signs of the return serial correlation and the bias in standard realized variance estimates. We derive the model's properties and locally estimate it based on mid-quote returns of the NASDAQ 100 constituents. There is evidence of mildly persistent local regimes of positive and negative serial correlation, arising from lagged feedback effects and sluggish price adjustments. The model performance is decidedly superior to existing stylized microstructure models. Finally, we document intraday periodicities in the speed of price reversion and noise-to-signal ratios.
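A minimal simulation sketch of such an error-correction extension of the martingale-plus-noise setup (all parameter values are illustrative assumptions, not estimates from the paper): each step the observed price closes a fraction alpha of the prevailing mispricing, and when noise dominates the efficient-price signal, returns come out negatively serially correlated, as the abstract's sign result suggests.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
alpha = 0.3      # speed of price reversal toward the efficient price (assumed)
sigma_m = 0.01   # std of efficient-price (martingale) innovations (assumed)
sigma_u = 0.02   # std of microstructure noise (assumed)

m = np.cumsum(rng.normal(0.0, sigma_m, n))   # latent efficient price
p = np.empty(n)
p[0] = m[0]
for t in range(1, n):
    # error correction: remove a fraction alpha of the prevailing mispricing,
    # then absorb the new efficient-price move plus fresh noise
    p[t] = (p[t - 1]
            - alpha * (p[t - 1] - m[t - 1])
            + (m[t] - m[t - 1])
            + rng.normal(0.0, sigma_u))

r = np.diff(p)
rho1 = np.corrcoef(r[:-1], r[1:])[0, 1]  # first-order return autocorrelation
print(rho1)  # negative here, since noise dominates the signal
```

Raising sigma_m relative to sigma_u (or lowering alpha) flips the balance the abstract describes and can push the serial correlation toward zero or positive values.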
We show an ambivalent role of high-frequency traders (HFTs) in the Eurex Bund Futures market around high-impact macroeconomic announcements and extreme events. Around macroeconomic announcements, HFTs serve as market makers, post competitive spreads, and earn most of their profits through liquidity supply. Right before the announcement, however, HFTs significantly widen spreads and cause a rapid but short-lived drying-out of liquidity. In turbulent periods, such as after the U.K. Brexit announcement, HFTs shift their focus from market making activities to aggressive (but not necessarily profitable) directional strategies. Then, HFT activity becomes dominant and market quality can degrade.
We theoretically and empirically study large-scale portfolio allocation problems when transaction costs are taken into account in the optimization problem. We show that transaction costs act on the one hand as a turnover penalization and on the other hand as a regularization, which shrinks the covariance matrix. As an empirical framework, we propose a flexible econometric setting for portfolio optimization under transaction costs, which incorporates parameter uncertainty and combines predictive distributions of individual models using optimal prediction pooling. We consider predictive distributions resulting from high-frequency-based covariance matrix estimates, daily stochastic volatility factor models and regularized rolling window covariance estimates, among others. Using data capturing several hundred Nasdaq stocks over more than 10 years, we illustrate that transaction cost regularization (even to a small extent) is crucial in order to produce allocations with positive Sharpe ratios. Moreover, we show that performance differences between individual models decline when transaction costs are considered. Nevertheless, it turns out that adaptive mixtures based on high-frequency and low-frequency information yield the highest performance. A portfolio bootstrap reveals that naive 1/N allocations and global minimum variance allocations (with and without short sales constraints) are significantly outperformed in terms of Sharpe ratios and utility gains.
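The shrinkage interpretation can be illustrated in a few lines (toy numbers, a sketch rather than the paper's full setting): with a quadratic transaction cost beta * ||w - w0||^2 added to the minimum-variance objective, the first-order conditions are those of a problem with the shrunk covariance matrix Sigma + beta*I plus a reward for staying near current holdings, i.e. transaction costs shrink the covariance toward the identity.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.normal(size=(n, n))
Sigma = A @ A.T + n * np.eye(n)   # toy covariance matrix
w0 = np.full(n, 1.0 / n)          # current holdings (1/N start)
beta = 0.5                        # quadratic transaction-cost weight (assumed)
ones = np.ones(n)

# min_w  w' Sigma w + beta * ||w - w0||^2   s.t.  1'w = 1
# has the same first-order conditions as a problem with the *shrunk*
# covariance Sigma + beta*I and a "return"-like term beta*w0.
S = Sigma + beta * np.eye(n)
S_inv = np.linalg.inv(S)
lam = (1.0 - beta * ones @ S_inv @ w0) / (ones @ S_inv @ ones)
w = S_inv @ (beta * w0 + lam * ones)

# Optimality check: the gradient is a multiple of the ones vector.
grad = 2 * Sigma @ w + 2 * beta * (w - w0)
print(np.allclose(grad, 2 * lam * ones), round(w.sum(), 10))
```

As beta grows, S becomes better conditioned and w is pulled toward w0, which is exactly the joint turnover-penalization and regularization effect described above.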
A counterparty credit limit (CCL) is a limit imposed by a financial institution to cap its maximum possible exposure to a specified counterparty. Although CCLs are designed to help institutions mitigate counterparty risk by selective diversification of their exposures, their implementation restricts the liquidity that institutions can access in an otherwise centralized pool. We address the question of how this mechanism impacts trade prices and volatility, both empirically and via a new model of trading with CCLs. Empirically, we find that CCLs have little impact on trade. However, our model highlights that in extreme situations, CCLs could serve to destabilize prices and thereby influence systemic risk.
Exploiting NASDAQ order book data and a difference-in-differences methodology, we identify the distinct effects of trading pause mechanisms introduced on U.S. stock exchanges after May 2010. We show that the mere existence of such a regulation constitutes a safeguard which makes market participants behave differently in anticipation of a pause. Pauses tend to break local price trends, make liquidity suppliers revise positions, and enhance price discovery. In contrast, pauses do not have a "cool-off" effect on markets, but rather exacerbate volatility and widen bid-ask spreads. This implies a regulatory trade-off between the protective role of trading pauses and their adverse effects on market quality.
Effective market discipline incentivizes financial institutions to limit their risk-taking behavior, making it a key element of financial regulation. However, without adequate incentives to monitor and control the risk-taking behavior of financial institutions, market discipline erodes. As a consequence, bailing out financial institutions, as happened on an unprecedented scale during the recent financial crisis, may impose indirect costs on financial stability if investors' bailout expectations change. Analyzing US data covering the period between 2004 and 2014, Hett and Schmidt (2017) find that market participants adjusted their bailout expectations in response to government interventions, undermining market discipline mechanisms. Given these findings, policymakers need to take the potential effects on market discipline into account when deciding about public support for troubled financial institutions in the future. Considering the parallelism of events and public responses during the financial crisis, as well as the recent developments at Italian banks, these results concern not only the US but also have important implications for European financial markets and policymakers.
A recent decision of the Bundesgerichtshof (BGH) on the requirements for the notification of an acquisition of a shareholding under § 20 AktG gives occasion to consider the legal consequences of a breach of notification duties by indirectly participating shareholders.
Without engaging with dissenting views, the BGH confirmed the prevailing opinion that, where a controlling undertaking breaches a notification duty, the legal consequence of a loss of rights strikes the directly participating subsidiary even if the subsidiary has duly fulfilled its own notification duty. With regard to the (temporary) loss of dividend claims, which was at issue in the case decided by the BGH, the decisive substantive consideration is likely that otherwise the controlling undertaking would retain the indirect benefits of the profit distribution even where it knew, or ought to have known, of its own breach of the notification duty and the resulting temporary lapse of the right to dividends.
This Chapter explores how an environment of persistent low returns influences saving, investing, and retirement behaviors, as compared to what in the past had been thought of as more “normal” financial conditions. Our calibrated lifecycle dynamic model with realistic tax, minimum distribution, and Social Security benefit rules produces results that agree with observed saving, work, and claiming age behavior of U.S. households. In particular, our model generates a large peak at the earliest claiming age at 62, as in the data. Also in line with the evidence, our baseline results show a smaller second peak at the (system-defined) Full Retirement Age of 66. In the context of a zero-return environment, we show that workers will optimally devote more of their savings to non-retirement accounts and less to 401(k) accounts, since the relative appeal of investing in taxable versus tax-qualified retirement accounts is lower in a low return setting. Finally, we show that people claim Social Security benefits later in a low interest rate environment.
Given rising life expectancy around the world, it seems that old-age pension benefits will need to be cut and pension contributions boosted in many nations. Yet the reforms we study do not require raising mandatory retirement ages or contributions. Instead, we offer ways to enhance incentives for people to work longer and delay retirement. There are good reasons to incentivize older people to work longer and delay retirement: rising longevity, the shrinking workforce, and emerging evidence indicating that working longer can be associated with better mental and physical health for many people. Nevertheless, old-age Social Security systems in many nations find that people tend to claim benefits early, usually leading to reduced benefits. In the United States, for instance, a majority of Americans claim their Social Security benefits at the earliest feasible age, namely 62, even though their monthly benefits would be 75% higher if they waited until age 70. To test whether this is the result of people underweighting the economic value of higher lifetime benefit streams, we examine whether people would claim later and work longer if they were rewarded with a lump sum instead of a higher lifetime benefit stream for deferring. Two arguments have been offered to explain early claiming. One is that workers claim early to avoid potentially "forfeiting" their deferred benefits should they die too soon (Brown et al., 2016). A second explanation is that many people underweight the economic value of lifetime benefit streams (Brown et al., 2017). This latter rationale motivates the present study.
People who delay claiming Social Security receive higher lifelong benefits upon retirement. We survey individuals on their willingness to delay claiming if they could receive a lump sum in lieu of a higher annuity payment. Using a moment-matching approach, we calibrate a lifecycle model that tracks observed claiming patterns under current rules and predicts optimal claiming outcomes under the lump sum approach. Our model correctly predicts that early claimers under current rules would delay claiming most when offered actuarially fair lump sums, and for lump sums worth only 87% as much, claiming ages would still be higher than at present.
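A stylized calculation of the lump-sum trade-off: every number below is hypothetical except the roughly 75% benefit increase and the 87% fraction quoted above, and mortality is ignored for simplicity (an actuarially fair lump sum would use survival-weighted discounting).

```python
def annuity_pv(monthly, annual_rate, years):
    """Present value of a level monthly annuity (no mortality adjustment)."""
    r = annual_rate / 12.0
    n = int(years * 12)
    return monthly * (1 - (1 + r) ** -n) / r

benefit_62 = 1000.0             # hypothetical monthly benefit claimed at 62
benefit_70 = benefit_62 * 1.75  # ~75% higher if claiming is deferred to 70
delta = benefit_70 - benefit_62

# "Fair" lump sum for delaying: value at 70 of the benefit increment,
# here over an assumed 15 remaining years at a 3% discount rate.
fair_lump_sum = annuity_pv(delta, 0.03, 15)

# The study finds claiming ages still rise for lump sums worth 87% of this.
offered = 0.87 * fair_lump_sum
print(round(fair_lump_sum), round(offered))
```

The point of the sketch: even the discounted 87% offer is a six-figure amount relative to a USD 1,000 monthly benefit, which helps explain why a lump sum can move claiming ages when a larger annuity stream does not.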
We test two hypotheses, based on sexual selection theory, about gender differences in costly social interactions. Differential selectivity states that women invest less than men in interactions with new individuals. Differential opportunism states that women’s investment in social interactions is less responsive to information about the interaction’s payoffs. The hypotheses imply that women’s social networks are more stable and path dependent and composed of a greater proportion of strong relative to weak links. During their introductory week, we let new university students play an experimental trust game, first with one anonymous partner, then with the same and a new partner. Consistent with our hypotheses, we find that women invest less than men in new partners and that their investments are only half as responsive to information about the likely returns to the investment. Moreover, subsequent formation of students’ real social networks is consistent with the experimental results: being randomly assigned to the same introductory group has a much larger positive effect on women’s likelihood of reporting a subsequent friendship.
The current debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. Beyer and Wieland re-estimate the U.S. equilibrium rate with the methodology of Laubach and Williams and further modifications. They provide new estimates for the United States, the euro area and Germany and subject them to sensitivity tests. Beyer and Wieland conclude that, owing to the great uncertainty and sensitivity of the estimates, the observed decline is not a reliable indicator of a need for expansionary monetary and fiscal policy. If such estimates are nevertheless employed to determine the appropriate monetary policy stance, they are best used together with consistent estimates of the level of potential output.
This paper examines the welfare implications of rising temperatures. Using a standard VAR, we empirically show that a temperature shock has a sizable, negative and statistically significant impact on TFP, output, and labor productivity. We rationalize these findings within a production economy featuring long-run temperature risk. In the model, macro-aggregates drop in response to a temperature shock, consistent with the novel evidence in the data. Such adverse effects are long-lasting. Over a 50-year horizon, a one-standard deviation temperature shock lowers both cumulative output and labor productivity growth by 1.4 percentage points. Based on the model, we also show that temperature risk is associated with non-negligible welfare costs which amount to 18.4% of the agent's lifetime utility and grow exponentially with the size of the impact of temperature on TFP. Finally, we show that faster adaptation to temperature shocks results in lower welfare costs. These welfare benefits become substantially higher in the presence of permanent improvements in the speed of adaptation.
We propose a model for measuring the runtime of concurrent programs by the minimal number of evaluation steps. The focus of this paper is improvements: program transformations that improve this number in every context, where we distinguish between sequential and parallel improvements, for one or more processors, respectively. We apply the methods to CHF, a model of Concurrent Haskell extended by futures. The language CHF is a typed higher-order functional language with concurrent threads, monadic IO and MVars as synchronizing variables. We show that all deterministic reduction rules and 15 further program transformations are sequential and parallel improvements. We also show that the introduction of deterministic parallelism is a parallel improvement, and its inverse a sequential improvement, provided it is applicable. This is a step towards more automated precomputation of concurrent programs at compile time that is formally proven to be correctly optimizing.
We explore space improvements in LRP, a polymorphically typed call-by-need functional core language. A relaxed space measure is chosen for the maximal size usage during an evaluation. It abstracts from the details of the implementation via abstract machines, but takes garbage collection into account and can thus be seen as a realistic approximation of space usage. The results are: a context lemma for space-improving translations and for space equivalences; all but one reduction rule of the calculus are shown to be space improvements, and the exceptional one, the copy rule, is shown to increase space only moderately.
Several further program transformations are shown to be space improvements or space equivalences; in particular, the translation into machine expressions is a space equivalence. These results are a step forward in making predictions about the change in runtime space behavior of optimizing transformations in call-by-need functional languages.
Coming early to the party
(2017)
We examine the strategic behavior of High Frequency Traders (HFTs) during the pre-opening phase and the opening auction of the NYSE-Euronext Paris exchange. HFTs actively participate and profitably extract information from the order flow. They also post "flash crash" orders to gain time priority. They make profits on their last-second orders; however, so do others, suggesting that there is no speed advantage. HFTs lead price discovery, and neither harm nor improve liquidity. They "come early to the party" and enjoy it (make profits); however, they also help others enjoy the party (improve market quality) and do not have privileges (their speed advantage is not crucial).
The European concept of the employee has become indispensable in labor law practice. The influence of European law on national labor law has been considerable, especially since the decisions of the ECJ in Danosa (ECJ, 11 November 2010, C-232/09) and Balkaya (ECJ, 9 July 2015, C-229/14) on the employee status of the managing director of a corporation. This article examines the effects of this case law on the national concept of the employee.
During the last IAIS Global Seminar in June 2017, the IAIS disclosed the agenda for a gradual shift in its systemic risk assessment methodology from the current Entity-Based Approach (EBA) to a new Activity-Based Approach (ABA). The EBA, which was developed in the aftermath of the 2008/2009 financial crisis, defines a list of Global Systemically Important Insurers (G-SIIs) based on a pre-defined set of criteria related to the size of the institution. These G-SIIs are subject to additional regulatory requirements, since their distress or disorderly failure would potentially cause significant disruption to the global financial system and economic activity. Even if size remains a necessary element of a systemic risk assessment, the strong emphasis on the too-big-to-fail approach in insurance, i.e. the EBA, may partially miss the underlying nature of systemic risk in insurance. Not only certain activities, including insurance activities such as life or non-life lines of business, but also common exposures and certain managerial practices such as leverage or funding structures tend to contribute to the systemic risk of insurers, yet are not covered by the current EBA (Berdin and Sottocornola, 2015). Therefore, we very much welcome the general development of the systemic risk assessment methodology, even if several important questions still need to be answered.
Fleckenstein et al. (2014) document that nominal Treasuries trade at higher prices than inflation-swapped indexed bonds, which exactly replicate the nominal cash flows. We study whether this mispricing arises from liquidity premiums in inflation-indexed bonds (TIPS) and inflation swaps. Using US data, we show that the level of liquidity affects TIPS, whereas swap yields include a liquidity risk premium. We also allow for liquidity effects in nominal bonds. These results are based on a model with a systematic liquidity risk factor and asset-specific liquidity characteristics. We show that these liquidity (risk) premiums explain a substantial part of the TIPS underpricing.
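The mispricing measure described above rests on a simple replication argument: a TIPS plus a matched inflation swap replicates a nominal Treasury's cash flows, so the two should carry the same yield. A minimal sketch of this comparison, using made-up example values rather than any data from the paper:

```python
# Illustrative sketch of the TIPS-Treasury mispricing comparison:
# a TIPS plus a zero-coupon inflation swap replicates a nominal bond.
# All input values are hypothetical examples, not data from the paper.

def synthetic_nominal_yield(tips_real_yield, inflation_swap_rate):
    """Approximate nominal yield of the replicating portfolio:
    real TIPS yield plus the fixed rate of the inflation swap."""
    return tips_real_yield + inflation_swap_rate

nominal_treasury_yield = 0.035   # 3.5% observed nominal yield (example)
tips_real_yield        = 0.015   # 1.5% real yield on matched TIPS (example)
inflation_swap_rate    = 0.022   # 2.2% fixed leg of inflation swap (example)

synthetic = synthetic_nominal_yield(tips_real_yield, inflation_swap_rate)
# Nominal Treasuries trading "rich" means they carry a LOWER yield than
# the synthetic replication, i.e. a positive spread in yield terms.
mispricing_bp = (synthetic - nominal_treasury_yield) * 10_000
print(f"synthetic nominal yield: {synthetic:.3%}")
print(f"mispricing: {mispricing_bp:.0f} bp")
```

In the abstract's terms, a liquidity premium in TIPS (a higher real yield than the frictionless benchmark) and a liquidity risk premium in swap rates both widen this spread without any true arbitrage profit being available.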
Causality is a widely-used concept in theoretical and empirical economics. The recent financial economics literature has used Granger causality to detect the presence of contemporaneous links between financial institutions and, in turn, to obtain a network structure. Subsequent studies combined the estimated networks with traditional pricing or risk measurement models to improve their fit to empirical data. In this paper, we provide two contributions: we show how to use a linear factor model as a device for estimating a combination of several networks that monitor the links across variables from different viewpoints; and we demonstrate that Granger causality should be combined with quantile-based causality when the focus is on risk propagation. The empirical evidence supports the latter claim.
We establish a benchmark result for the relationship between the loanable funds and the money-creation approach to banking. In particular, we show that both processes yield the same allocations when there is no uncertainty and thus no bank default. In such cases, using the much simpler loanable funds approach as a shortcut does not imply any loss of generality.
The impact of network connectivity on factor exposures, asset pricing and portfolio diversification
(2017)
This paper extends the classic factor-based asset pricing model by including network linkages in linear factor models. We assume that the network linkages are exogenously provided. This extension of the model allows a better understanding of the causes of systematic risk and shows that (i) network exposures act as an inflating factor for systematic exposure to common factors and (ii) the power of diversification is reduced by the presence of network connections. Moreover, we show that in the presence of network links a misspecified traditional linear factor model presents residuals that are correlated and heteroskedastic. We support our claims with an extensive simulation experiment.
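The inflation mechanism claimed in (i) can be illustrated with a stylized version of such a model. Assuming returns follow r = beta*f + rho*W*r + eps, so that r = (I - rho*W)^(-1)(beta*f + eps), a two-asset sketch (the structure and numbers are illustrative, not the paper's specification) shows the effective factor exposure exceeding beta once network links are present:

```python
# Sketch of how network links inflate factor exposures under the
# stylized structure r = beta*f + rho*W*r + eps, i.e.
# r = (I - rho*W)^(-1) (beta*f + eps).  Two assets, W = [[0,1],[1,0]].
# Hypothetical illustration, not the paper's exact model.

def equilibrium_returns(beta, rho, f, eps):
    """Solve the 2x2 system r = beta*f + rho*W*r + eps for W=[[0,1],[1,0]]."""
    b1 = beta[0] * f + eps[0]
    b2 = beta[1] * f + eps[1]
    det = 1 - rho * rho          # det(I - rho*W) for this W
    r1 = (b1 + rho * b2) / det
    r2 = (b2 + rho * b1) / det
    return r1, r2

beta = (1.0, 1.0)
# Without links (rho = 0) a unit factor shock moves returns by beta = 1.
r_no_net = equilibrium_returns(beta, 0.0, 1.0, (0.0, 0.0))
# With links (rho = 0.3) the same shock moves returns by
# (1 + rho)/(1 - rho**2) = 1/(1 - rho) ~ 1.43: exposure is inflated.
r_net = equilibrium_returns(beta, 0.3, 1.0, (0.0, 0.0))
print(r_no_net, r_net)
```

The same inverse matrix mixes the idiosyncratic terms eps across assets, which is why diversification weakens and why residuals of a misspecified model without the network term appear correlated.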
The growth and popularity of defined contribution pensions, along with the government’s increasing attention to retirement plan costs and investment choices provided, make it important to understand how people select their retirement plan investments. This paper shows how employees in a large firm altered their fund allocations when the employer streamlined its pension fund menu and deleted nearly half of the offered funds. Using administrative data, we examine the changes in plan participant investment choices that resulted from the streamlining and how these changes might affect participants’ eventual retirement wellbeing. We show that streamlined participants’ new allocations exhibited significantly lower within-fund turnover rates and expense ratios, and we estimate that this could lead to aggregate savings of $20.2M for these participants over a 20-year period, or in excess of $9,400 per participant. Moreover, after the reform, streamlined participants’ portfolios held significantly less equity and exhibited significantly lower risks by way of reduced exposures to most systematic risk factors, compared to their non-streamlined counterparts.
During the 1970s, industrial countries, including the US and continental Europe, experienced a combination of slow productivity growth and high unemployment. Subsequent research has shown that the standard model of unemployment actually gives counterfactual predictions. Motivated by the observation that the 1970s were also characterized by high and rising inflation, Tesfaselassie and Wolters examine the effect of growth on unemployment in the presence of nominal price rigidity.
The authors demonstrate that the effect of growth on unemployment may be positive or negative. Faster growth leads to lower unemployment if the rate of inflation is high enough. There is a threshold level of inflation below which faster growth leads to higher unemployment and above which faster growth leads to lower unemployment. The threshold level in turn depends on labor market characteristics, such as hiring efficiency, the job destruction rate, workers' relative bargaining power and the opportunity cost of work.
This study provides a graphic overview on core legislation in the area of economic and financial services. The presentation essentially covers the areas within the responsibility of the Economic and Monetary Affairs Committee (ECON); hence it starts with core ECON areas but also displays neighbouring areas of other Committees' competences which are closely connected to and impacting on ECON's work. It shows legislation in force, proposals and other relevant provisions on banking, securities markets and investment firms, market infrastructure, insurance and occupational pensions, payment services, consumer protection in financial services, the European System of Financial Supervision, European Monetary Union, euro bills and coins and statistics, competition, taxation, commerce and company law, accounting and auditing. Moreover, it notes selected provisions that might become relevant in the upcoming Article 50 TEU negotiations.
We compare the cost effectiveness of two pronatalist policies:
(a) child allowances; and
(b) daycare subsidies.
We pay special attention to estimating how intended fertility (fertility before children are born) responds to these policies. We use two evaluation tools:
(i) a dynamic model on fertility, labor supply, outsourced childcare time, parental time, asset accumulation and consumption; and
(ii) randomized vignette-survey policy experiments.
We implement both tools in the United States and Germany, finding consistent evidence that daycare subsidies are more cost effective. Nevertheless, the required public expenditure to increase fertility to the replacement level might be viewed as prohibitively high.
After the collapse of Lehman Brothers, the stock index has exceeded its pre-collapse peak by 36% in real terms. Seemingly, markets have been demanding more stocks instead of bonds. Yet, instead of higher bond rates, paradoxically, bond rates have been persistently negative after the collapse. To explain this paradox, we suggest that, in the post-Lehman period, investors changed their perceptions of disasters, thinking that disasters occur once every 30 years on average instead of once every 60 years. In our asset-pricing calibration exercise, this rise in perceived market fragility alone can explain the drop in both bond rates and price-dividend ratios observed after the collapse, which indicates that markets mostly demanded bonds instead of stocks.
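The direction of the mechanism can be checked in a textbook rare-disaster endowment model (Rietz/Barro style; the parameter values below are illustrative placeholders, not the authors' calibration): raising the perceived disaster probability increases precautionary demand for the safe asset and pushes the risk-free rate down.

```python
# Hedged sketch of the rare-disaster mechanism.  Parameters (beta,
# gamma, g, b) are illustrative, not the paper's calibration.

def risk_free_rate(p, beta=0.98, gamma=4.0, g=0.02, b=0.30):
    """Net risk-free rate from the Euler equation
    1/R_f = beta * E[(C'/C)^(-gamma)], where with probability p a
    disaster shrinks consumption by the fraction b."""
    m = beta * ((1 - p) * (1 + g) ** (-gamma)
                + p * ((1 + g) * (1 - b)) ** (-gamma))
    return 1.0 / m - 1.0

r_calm    = risk_free_rate(p=1/60)   # disasters once every 60 years
r_fragile = risk_free_rate(p=1/30)   # disasters once every 30 years
print(f"r_f (p=1/60): {r_calm:.3%}")
print(f"r_f (p=1/30): {r_fragile:.3%}")
# Higher perceived fragility lowers the safe rate; with these example
# parameters it even turns negative, as in the post-Lehman data.
assert r_fragile < r_calm
```

The same rise in perceived disaster risk raises the equity premium and depresses price-dividend ratios, which is the joint pattern the abstract emphasizes.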
For some time now, structural macroeconomic models used at central banks have been predominantly New Keynesian DSGE models featuring nominal rigidities and forward-looking decision-making. While these features are widely deemed crucial for policy evaluation exercises, most central banks have added more detailed characterizations of the financial sector to these models following the Great Recession in order to improve their fit to the data and their forecasting performance. We employ a comparative approach to investigate the characteristics of this new generation of New Keynesian DSGE models and document an elevated degree of model uncertainty relative to earlier model generations. Policy transmission is highly heterogeneous across types of financial frictions and monetary policy causes larger effects, on average. The New Keynesian DSGE models we analyze suggest that a simple policy rule robust to model uncertainty involves a weaker response to inflation and the output gap in the presence of financial frictions as compared to earlier generations of such models. Leaning-against-the-wind policies in models of this class estimated for the Euro Area do not lead to substantial gains. With regard to forecasting performance, the inclusion of financial frictions can generate improvements, if conditioned on appropriate data. Looking forward, we argue that model-averaging and embracing alternative modelling paradigms is likely to yield a more robust framework for the conduct of monetary policy.
Since 2014 the ECB has implemented a massive expansion of monetary policy including large-scale asset purchases and negative policy rates. As the euro area economy has improved and inflation has risen, questions concerning the future normalization of monetary policy are starting to dominate the public debate.
The study argues that the ECB should develop a strategy for policy normalization and communicate it very soon to prepare the ground for subsequent steps towards tightening. It provides analysis and makes proposals concerning key aspects of this strategy. The aim is to facilitate the emergence of expectations among market participants that are consistent with a smooth process of policy normalization.
What processes transform (im)mobile individuals into ‘migrants’ and geographic movements across political-territorial borders into ‘migration’? To address this question, the article develops the doing migration approach, which combines perspectives from social constructivism, praxeology and the sociologies of knowledge and culture. ‘Doing migration’ starts with the processes of social attribution that differentiate between ‘migrants’ and ‘non-migrants’. Embedded in institutional, organizational and interactional routines, these attributions generate unique social orders of migration. By illustrating these conceptual ideas, the article provides insights into the elements of the contemporary European order of ‘migration’. Its institutional routines contribute to the emergence of a European migration regime that involves narratives of economization, securitization and humanitarization. The organizational routines of the European migration order involve surveillance and diversity management, which have disciplining effects on those defined as ‘migrants’. The routines of everyday face-to-face interactions produce various micro-forms of doing ‘migration’ through stigmatization and othering, but they also provide opportunities to resist a social attribution as ‘migrant’.
This paper reviews social network analysis (SNA) as a method to be utilized in biographical research, which is a novel contribution. We argue that applying SNA in the context of biographical research, through standardized data collection as well as visualization of networks, can open up participants’ interpretations of relations throughout their lives and allow a creative and innovative way of data collection that is responsive to participants’ own meanings and associations, while allowing the researchers to conduct systematic data analysis. The paper discusses the analytical potential of SNA within biographical research and critically assesses the efficacy and limitations of the method.
Bank regulators have the discretion to discipline banks by executing enforcement actions to ensure that banks correct deficiencies regarding safe and sound banking principles. We highlight the trade-offs regarding the execution of enforcement actions for financial stability. Following this, we provide an overview of the differences in the legal framework governing supervisors’ execution of enforcement actions in the Banking Union and the United States. After discussing work on the effect of enforcement actions on bank behaviour and the real economy, we present data on the evolution of enforcement actions and monetary penalties by U.S. regulators. We conclude by noting the importance of supervisors levying efficient monetary penalties and stressing that a division of competences among different regulators should not lead to a loss of efficiency regarding the execution of enforcement actions.
This paper aims to analyze the effects of financial constraints and the financial crisis on the financing and investment policies of newly founded firms. The analysis thereby adds important new insights on a crucial segment of the economy. We make use of a large and comprehensive data set of French firms founded in the years 2004-2006, i.e. well before the financial crisis. Our panel data analysis shows that the global financial crisis imposed a shock (mostly demand-driven) on the financing as well as on the investments of these firms. Moreover, we find that financially constrained firms use less external debt financing and invest smaller amounts. They also rely less on trade credit. With regard to bank financing, newly founded firms which are more financially constrained accumulate less bank debt and repay initial bank debt more slowly than their non-financially constrained counterparts. Finally, we find that financially constrained firms are affected to a smaller degree by the financial crisis than their less financially constrained counterparts.
We develop a state-space model to decompose bid and ask quotes of CDS into two components, fair default premium and liquidity premium. This approach gives a better estimate of the default premium than mid quotes, and it allows us to disentangle and compare the liquidity premium earned by the protection buyer and the protection seller. In contrast to other studies, our model is structurally much simpler, while it also allows for correlation between liquidity and default premia, as supported by empirical evidence. The model is implemented and applied to a large data set of 118 CDS for a period ranging from 2004 to 2010. The model-generated output variables are analyzed in a difference-in-difference framework to determine how the default premium, as well as the liquidity premium of protection buyers and sellers, evolved during different periods of the financial crisis and to what extent they differ for financial institutions compared to non-financials.
This paper examines the relationship between oil price movements and the systemic risk of financial institutions in major petroleum-based economies. We estimate ΔCoVaR for those institutions and observe pronounced increases in its levels during the subprime and global financial crises. The results provide evidence in favor of risk measurement improvements from accounting for oil returns in the risk functions. The spread between the standard CoVaR and the CoVaR that includes oil is absorbed over a period longer than the duration of the oil shock itself. This indicates that the drop in the oil price has a longer-lasting effect on risk and requires more time to be discounted by the financial institutions. To support the analysis, we also consider the other major market-based systemic risk measures.
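ΔCoVaR (Adrian and Brunnermeier) is the VaR of the system conditional on an institution being in distress, minus the system VaR when the institution is in its median state. A minimal empirical sketch on synthetic data, using quantile sorting rather than the quantile regressions typically used in this literature:

```python
# Minimal empirical sketch of the Delta-CoVaR idea on synthetic data.
# Quantile sorting is a simplification of the quantile-regression
# approach normally used; nothing here reproduces the paper's estimates.
import random

def quantile(xs, q):
    """Simple empirical quantile (lower interpolation)."""
    s = sorted(xs)
    return s[int(q * (len(s) - 1))]

def delta_covar(inst, system, q=0.05):
    """System q-VaR given the institution at its q-VaR, minus the
    system q-VaR given the institution near its median state."""
    var_q = quantile(inst, q)
    med_lo, med_hi = quantile(inst, 0.45), quantile(inst, 0.55)
    distress = [s for i, s in zip(inst, system) if i <= var_q]
    median_state = [s for i, s in zip(inst, system) if med_lo <= i <= med_hi]
    return quantile(distress, q) - quantile(median_state, q)

random.seed(0)
# Synthetic returns: the system loads on the institution, so distress
# in the institution spills over to the system and Delta-CoVaR < 0.
inst = [random.gauss(0, 0.02) for _ in range(5000)]
system = [0.6 * i + random.gauss(0, 0.01) for i in inst]
print(f"Delta-CoVaR: {delta_covar(inst, system):.4f}")
```

Adding oil returns to the conditioning set, as the paper does, amounts to augmenting the risk function that defines these conditional quantiles.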
Motivated by tools for automated deduction on functional programming languages and programs, we propose a formalism to symbolically represent $\alpha$-renamings for meta-expressions. The formalism is an extension of the usual higher-order meta-syntax which allows all valid ground instances of a meta-expression to be $\alpha$-renamed to fulfill the distinct variable convention. The renaming mechanism may be helpful for several reasoning tasks in deduction systems. We present our approach for a meta-language which uses higher-order abstract syntax and a meta-notation for recursive let-bindings, contexts, and environments. It is used in the LRSX Tool -- a tool to reason about the correctness of program transformations in higher-order program calculi with respect to their operational semantics. Besides introducing a formalism to represent symbolic $\alpha$-renamings, we present and analyze algorithms for simplification of $\alpha$-renamings, matching, rewriting, and checking $\alpha$-equivalence of symbolically $\alpha$-renamed meta-expressions.
We introduce rewriting of meta-expressions which stem from a meta-language that uses higher-order abstract syntax augmented by meta-notation for recursive let, contexts, sets of bindings, and chain variables. Additionally, three kinds of constraints can be added to meta-expressions to express usual constraints on evaluation rules and program transformations. Rewriting of meta-expressions is required for automated reasoning on programs and their properties. A concrete application is a procedure to automatically prove correctness of program transformations in higher-order program calculi which may permit recursive let-bindings as they occur in functional programming languages. Rewriting on meta-expressions can be performed by solving the so-called letrec matching problem which we introduce. We provide a matching algorithm to solve it. We show that the letrec matching problem is NP-complete, that our matching algorithm is sound and complete, and that it runs in non-deterministic polynomial time.
Financial market interactions can lead to large and persistent booms and recessions. Instability is an inherent threat to economies with speculative financial markets. A central bank’s interest rate setting can amplify the expectation feedback in the financial market and this can lead to unstable dynamics and excess volatility. The paper suggests that policy institutions may be well-advised to handle tools like asset price targeting with care since such instruments might add a structural link between asset prices and macroeconomic aggregates. Neither stock prices nor indices are a good indicator to base decisions on.
The concept of plagiarism, with its copyright connotations, is one of the recognized core categories of scientific misconduct. This contribution shows, however, that copyright law and the law governing science do not form concentric circles but pursue different purposes with distinct regulatory concepts. Importing copyright-based patterns of argument into research ethics and the law of science impedes the development of genuinely science-specific criteria for assessing scientific misconduct. As an alternative, the contribution develops a concept of scientific integrity modeled on the law against unfair competition. To this end, it uncovers far-reaching teleological and structural commonalities between unfair competition law and the rules on scientific misconduct. In particular, both bodies of law pursue a functional teleology: unfair competition law safeguards the functioning of economic competition, while the prohibition of scientific misconduct secures the functioning, and thus the effectiveness, of the open scientific process and of the competition for scientific reputation.
I propose a dynamic stochastic general equilibrium model in which the leverage of borrowers as well as banks and housing finance play a crucial role in the model dynamics. The model is used to evaluate the relative effectiveness of a policy to inject capital into banks versus a policy to relieve households of mortgage debt. In normal times, when the economy is near the steady state and policy rates are set according to a Taylor-type rule, capital injections to banks are more effective in stimulating the economy in the long-run. However, in the middle of a housing debt crisis, when households are highly leveraged, the short-run output effects of the debt relief are more substantial. When the zero lower bound (ZLB) is additionally considered, the debt relief policy can be much more powerful in boosting the economy both in the short-run and in the long-run. Moreover, the output effects of the debt relief become increasingly larger, the longer the ZLB is binding.
This paper analyses the bail-in tool under the BRRD and predicts that it will not reach its policy objective. To make this argument, this paper first describes the policy rationale that calls for mandatory PSI. From this analysis the key features for an effective bail-in tool can be derived. These insights serve as the background to make the case that the European resolution framework is likely ineffective in establishing adequate market discipline through risk-reflecting prices for bank capital. The main reason for this lies in the avoidable embeddedness of the BRRD’s bail-in tool in the much broader resolution process which entails ample discretion of the authorities also in forcing private sector involvement. Finally, this paper synthesizes the prior analysis by putting forward an alternative regulatory approach that seeks to disentangle private sector involvement as a precondition for effective bank resolution as much as possible from the resolution process as such.
The bail-in tool as implemented in the European bank resolution framework suffers from severe shortcomings. To some extent, the regulatory framework can remedy the impediments to the desirable incentive effect of private sector involvement (PSI) that emanate from a lack of predictability of outcomes, if it compels banks to issue a sufficiently sized minimum of high-quality, easy to bail-in (subordinated) liabilities. Yet, even the limited improvements any prescription of bail-in capital can offer for PSI’s operational effectiveness seem compromised in important respects.
The main problem, echoing the general concerns voiced against the European bail-in regime, is that the specifications for minimum requirements for own funds and eligible liabilities (MREL) are also highly detailed and discretionary and thus alleviate the predicament of investors in bail-in debt, at best, only insufficiently. Quite importantly, given the character of typical MREL instruments as non-runnable long-term debt, even if investors are able to gauge the relevant risk of PSI in a bank’s failure correctly at the time of purchase, subsequent adjustments of MREL prescriptions by competent or resolution authorities potentially change the risk profile of the pertinent instruments. Therefore, original pricing decisions may prove inadequate and so may market discipline that follows from them.
The pending European legislation aims at an implementation of the already complex specifications of the Financial Stability Board (FSB) for Total Loss Absorbing Capacity (TLAC) by very detailed and case specific amendments to both the regulatory capital and the resolution regime with an exorbitant emphasis on proportionality and technical fine-tuning. What gets lost in this approach, however, is the key policy objective of enhanced market discipline through predictable PSI: it is hardly conceivable that the pricing of MREL-instruments reflects an accurate risk-assessment of investors because of the many discretionary choices a multitude of agencies are supposed to make and revisit in the administration of the new regime. To prove this conclusion, this chapter looks in more detail at the regulatory objectives of the BRRD’s prescriptions for MREL and their implementation in the prospectively amended European supervisory and resolution framework.
The clearing of euro-OTC derivatives post Brexit – an analysis of the available cost estimates
(2017)
In the context of Brexit, the costs of relocating the clearing of the euro-OTC derivatives business to an EU CCP are under discussion. This paper shows that the cost estimates available so far, which assume costs of up to USD 100 bn over a period of five years, are far too high. The expected costs of a relocation instead amount to approximately USD 0.6 bn p.a., or around USD 3.2 bn over a transition period of five years. Given the high importance of systemically relevant CCPs for the stability of the eurozone, these costs should not be decisive for a relocation decision.
In the context of the upcoming Brexit, a relocation of the clearing of euro-OTC derivatives for EU-based firms is the subject of controversial discussion. The opponents of a relocation argue that a relocation would cause additional costs for market participants of up to USD 100 bn over a period of 5 years. This paper shows that this cost estimate is fairly unrealistic and that relocation costs would amount to approximately USD 0.6 bn p.a., which translates to cumulative costs of around USD 3.2 bn for a transition period of 5 years. In light of the strategic importance of systemically relevant CCPs for the financial stability of the eurozone, the potential relocation costs should not be a decision criterion.
Contents
1. Strauß, Johann (father/Sr.) (* 14.3.1804 – † 25.9.1849)
2. Strauß, Johann (son/Jr.) (* 25.10.1825 – † 3.6.1899)
3. The operetta adaptations
3.1 Die Fledermaus (operetta in 3 acts, Johann Strauß, son); premiere: 5.4.1874
3.2 Eine Nacht in Venedig (operetta in 3 acts, Johann Strauß, son); premiere: 3.10.1883
3.3 Der Zigeunerbaron (operetta in 3 acts, Johann Strauß, son); premiere: 24.10.1885
3.4 Wiener Blut (Johann Strauß, son); premiere: 25.10.1899
3.5 Frühlingsluft (Josef Strauß); premiere: 9.5.1903
Les Blank
(2017)
To broaden the scope of monetary policy, cash abolishment is often suggested as a means of breaking through the zero lower bound. However, practically nothing is said about the welfare costs of such a proposal. Rösl, Seitz and Tödter argue that the welfare costs of bypassing the zero lower bound can be analyzed analytically and empirically by assuming negative interest rates on cash holdings. They gauge the welfare effects of abolishing cash both for the euro area and for Germany.
Their findings suggest that the welfare losses of negative interest rates incurred by money holders are large, notably if implemented in the current low interest rate environment. Imposing a negative interest rate of 3 percentage points on cash holdings and reducing the interest on all assets included in M3 creates a deadweight loss of €62bn for the euro area and of €18bn for Germany. Therefore, the authors argue that cash abolishment or negative interest rates on cash to break through the zero lower bound at any price can hardly be a meaningful policy goal.
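The logic behind such deadweight-loss figures can be sketched with a Bailey-style welfare triangle under a semi-log money demand curve. The money-demand parameters below are illustrative placeholders, not the calibration used by Rösl, Seitz and Tödter:

```python
# Back-of-the-envelope Harberger/Bailey triangle for taxing cash
# holdings.  Money-demand parameters m0 and eta are hypothetical,
# NOT the authors' calibration.
import math

def money_demand(i, m0=1_000e9, eta=8.0):
    """Semi-log money demand M(i) = m0 * exp(-eta * i)."""
    return m0 * math.exp(-eta * i)

def deadweight_loss(i0, i1, m0=1_000e9, eta=8.0):
    """Welfare triangle: area under the money-demand curve between
    i0 and i1, minus the revenue rectangle (i1 - i0) * M(i1),
    which is a pure transfer to the currency issuer."""
    area = (money_demand(i0, m0, eta) - money_demand(i1, m0, eta)) / eta
    revenue = (i1 - i0) * money_demand(i1, m0, eta)
    return area - revenue

# Raising the cost of holding cash by 3 percentage points:
dwl = deadweight_loss(0.0, 0.03)
print(f"deadweight loss: EUR {dwl / 1e9:.1f} bn (illustrative)")
```

The closed-form integral of the semi-log curve is what makes this analytically tractable; the authors' much larger euro-area figure reflects their estimated money-demand schedule and the assumption that interest on all M3 assets is reduced as well.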
How do freedom and money relate to one another? In the liberal tradition of philosophy and economics, money is usually conceived as a mere means whose introduction facilitates the exchange of goods but otherwise has no deeper social consequences. In contrast, this working paper works out the connection between money and (un)freedom. Following the tradition of critical social philosophy and engaging with Marx, Simmel and the more recent sociology of money, the paper first sets out the paradoxical character of this socially enabled freedom: on the one hand, money cultivates an individual form of freedom of choice in capitalist economies. On the other hand, money structures access to social wealth in an unequal and disciplining way: depending on one's individual command over financial means, one is compelled in different ways to sell one's own labor power in order to secure access to goods and one's own reproduction. In a second step, this paradoxical form of freedom is examined with regard to its tendency toward alienation: insofar as the freedom opened up by the institution of money conceals its social conditions of possibility, it can be understood as a fetishized form of freedom.
Right-wing populist movements in many Western states currently present themselves as the mouthpiece of population groups and opinions that have allegedly been suppressed until now. The identitarian movement develops this approach further into a project of authoritarian statehood directed against multiculturalism, Islam and immigration. In doing so, it combines its campaign for an ethnically homogeneous nation state with a critique of capitalist globalization. Using a language that emotionalizes politics, a program of defensive ethnonationalism is unfolded through "intellectual radicalization". This program draws on elements of a völkisch anti-modernism and on the Eurasian geopolitics developed by the Russian philosopher Alexander Dugin.
A European Keynesianism as the basis for a pan-European economic concept could, as an offensive counter-strategy, propagate the idea of a renewal of the welfare state. In addition, actors from civil society are called upon to work in an enlightening way against xenophobia and loss of orientation.
Despite various policy and management responses, biodiversity continues to decline worldwide. We must redouble our efforts to halt biodiversity loss. The current lack of policy action can be partly linked to an insufficient knowledge base regarding the conservation and sustainable use of biodiversity. Biodiversity research needs to incorporate both social and ecological factors to gain a deeper understanding of the interrelations between society and nature that affect biodiversity. A transdisciplinary research approach is crucial to fulfilling these requirements. It aims to produce new insights by integrating scientific and nonscientific knowledge. Several measures need to be taken to strengthen transdisciplinary social-ecological biodiversity research: Within the science community: firstly, scientists themselves must promote transdisciplinarity; secondly, the reward system for scientists must be brought into line with transdisciplinary research processes; and thirdly, academic training needs to advocate transdisciplinarity. As for research policies, research funding priorities need to be linked to large scale biodiversity policy frameworks, and funding for transdisciplinary social-ecological research on biodiversity must be increased significantly.
We propose a 2-country asset-pricing model where agents' preferences change endogenously as a function of the popularity of internationally traded goods. We determine the effect of the time-variation of preferences on equity markets, consumption and portfolio choices. When agents are more sensitive to the popularity of domestic consumption goods, the local stock market reacts more strongly to the preferences of local agents than to the preferences of foreign agents. Therefore, home bias arises because home-country stock represents a better investment opportunity for hedging against future fluctuations in preferences. We test our model and find that preference evolution is a plausible driver of key macroeconomic variables and stock returns.
The international diffusion of technology plays a key role in stimulating global growth and explaining co-movements of international equity returns. Existing empirical evidence suggests that countries are heterogeneous in their attitude toward innovation: Some countries rely more on technology adoption while other countries rely more on internal technology production. European countries that rely more on adoption are also typically characterized by lower fiscal policy flexibility and higher labor market rigidity. We develop a two-country model – where both countries rely on R&D and adoption – to study the short-run and long-run effects of aggregate technology and adoption probability shocks on economic growth in the presence of the aforementioned asymmetries. Our framework suggests that an increase in the ability to adopt technology from abroad stimulates economic growth in the country that benefits from higher adoption rates, but the beneficial effects also spread to the foreign country. Moreover, it helps explain the differences in macro quantities and equity returns observed in the international data.
On average, young people "undersave" whereas old people "oversave" with respect to the rational expectations model of life-cycle consumption and savings. According to numerous studies on subjective survival beliefs, young people also "underestimate" whereas old people "overestimate" their objective survival chances on average. We take a structural behavioral economics approach to jointly address both empirical phenomena by embedding subjective survival beliefs that are consistent with these biases into a rank-dependent utility (RDU) model over life-cycle consumption. The resulting consumption behavior is dynamically inconsistent. Considering both naive and sophisticated RDU agents, we show that within this framework underestimation of young-age and overestimation of old-age survival probabilities may (but need not) give rise to the joint occurrence of undersaving and oversaving. In contrast to this RDU model, the familiar quasi-hyperbolic discounting (QHD) model, which is nested as a special case, cannot generate oversaving.
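To see intuitively why quasi-hyperbolic discounting cannot generate oversaving, consider its discount weights: a QHD agent applies full weight to today and a uniformly shrunken weight to every future period, so the future is always valued weakly less than under exponential discounting. The sketch below illustrates this; the parameter values are illustrative assumptions, not estimates from the paper.

```python
# Quasi-hyperbolic discounting (QHD): weight 1 on today, beta * delta**t thereafter.
# With beta < 1, every future period gets *less* weight than under exponential
# discounting, pushing behavior toward undersaving, never oversaving.
# Parameter values below are illustrative assumptions.

def qhd_weights(beta, delta, horizon):
    """Discount weights w_t for t = 0..horizon-1 under quasi-hyperbolic discounting."""
    return [1.0 if t == 0 else beta * delta ** t for t in range(horizon)]

def exponential_weights(delta, horizon):
    """Standard exponential discount weights delta**t."""
    return [delta ** t for t in range(horizon)]

beta, delta = 0.7, 0.96
qhd = qhd_weights(beta, delta, 5)
expo = exponential_weights(delta, 5)

# Every future period is weighted less under QHD than under exponential discounting:
assert all(q < e for q, e in zip(qhd[1:], expo[1:]))
```

With beta = 1 the two weighting schemes coincide, which is the sense in which exponential discounting is nested in QHD as a special case.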
We analyze the market reaction to the sentiment of the CEO speech at the Annual General Meeting (AGM). As the AGM is typically preceded by several information disclosures, the CEO speech may be expected to contribute only marginally to investors’ decision-making. Surprisingly, however, we observe from the transcripts of 338 CEO speeches of German corporates between 2008 and 2016 that their sentiment is significantly related to abnormal stock returns and trading volumes following the AGM. Using a novel business-specific German dictionary based on Loughran and McDonald (2011), we find a negative association of the post-AGM returns with the speeches’ negativity and a positive association with the speeches’ relative positivity (i.e. positivity relative to negativity). Relative positivity moreover corresponds with a lower trading volume in a short time window surrounding the AGM. Investors hence seem to perceive the sentiment of CEO speeches at AGMs as a valuable indicator of future firm performance.
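To make the sentiment measures concrete, here is a toy tone scorer in the spirit of Loughran-McDonald word lists. The word lists are abbreviated illustrative stand-ins (the paper uses a full business-specific German dictionary), and the relative-positivity formula below, (pos − neg) / (pos + neg), is one plausible way to operationalize "positivity relative to negativity", assumed for illustration.

```python
# Toy speech-tone scorer: count hits against (hypothetical, abbreviated) positive
# and negative word lists, then form negativity and relative-positivity scores.
# Word lists and the relative-positivity formula are illustrative assumptions.

POSITIVE = {"growth", "profit", "strong", "success", "improve"}
NEGATIVE = {"loss", "weak", "decline", "risk", "impair"}

def tone_scores(text):
    """Return (negativity, relative positivity) for a speech transcript."""
    words = [w.strip(".,;:!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    negativity = neg / len(words) if words else 0.0
    rel_positivity = (pos - neg) / (pos + neg) if pos + neg else 0.0
    return negativity, rel_positivity

negativity, rel_pos = tone_scores(
    "Strong growth this year, despite some decline in margins."
)
```

Here two positive hits ("strong", "growth") against one negative hit ("decline") yield a positive relative-positivity score, the kind of signal the paper associates with post-AGM returns.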
Under Solvency II, corporate governance requirements are a complementary, but nonetheless essential, element of a sound regulatory framework for insurance undertakings, also addressing risks not specifically mitigated by the solvency capital requirements alone. After recalling the provisions of the second pillar concerning the system of governance, the paper highlights the emerging regulatory trends in the corporate governance of insurance firms. Among other things, it signals the exceptional extension of the duties and responsibilities assigned to the board of directors, far beyond the traditional role of monitoring the chief executive officer and assessing the overall direction and strategy of the business. However, better risk governance is not necessarily built on narrow rule-based approaches to corporate governance.
This paper investigates the effects of a rise in interest rates and in the lapse risk of endowment life insurance policies on the liquidity and solvency of life insurers. We model the book-value and market-value balance sheet of an average German life insurer, subject to both GAAP and Solvency II regulation, featuring an existing back book of policies and an existing asset allocation calibrated with historical data. The balance sheet is then projected forward under stochastic financial markets. Lapse rates are modeled stochastically and depend on the granted guaranteed rate of return and the prevailing level of interest rates. Our results suggest that in the case of a sharp increase in interest rates, policyholders lapse at substantially higher rates and the solvency position of the insurer deteriorates in the short run. This result is particularly driven by the interaction between a reduction in the market value of assets, large guarantees for existing policies, and a very slow adjustment of asset returns to interest rates. A sharp or gradual rise in interest rates is associated with substantial and persistent liquidity needs that are particularly driven by lapse rates.
Different insurance activities exhibit different levels of persistence of shocks and volatility. For example, life insurance is typically more persistent but less volatile than non-life insurance. We examine how diversification among life, non-life insurance, and active reinsurance business affects an insurer's contribution and exposure to the risk of other companies. Our model shows that a counterparty's credit risk exposure to an insurance group substantially depends on the relative proportion of the insurance group's life and non-life business. The empirical analysis confirms this finding with respect to several measures for spillover risk. The optimal proportion of life business that minimizes spillover risk decreases with leverage of the insurance group, and increases with active reinsurance business.
A tontine provides a mortality-driven, age-increasing payout structure through the pooling of mortality. Because a tontine does not entail any guarantees, its payout structure is determined by the pooled individual characteristics of the tontinists. Therefore, the surrender decision of a single tontinist directly affects the remaining members' payouts. Nevertheless, the opportunity to surrender is crucial to the success of a tontine from a regulatory as well as a policyholder perspective. This paper therefore derives the fair surrender value of a tontine, first on the basis of expected values, and then incorporates the increasing payout volatility to determine an equitable surrender value. Results show that the surrender decision requires a discount on the fair surrender value as security for the remaining members. The discount intensifies with decreasing tontine size and increasing risk aversion. However, tontinists are less willing to surrender as tontine size decreases and risk aversion increases, creating a natural protection against tontine runs stemming from short-term liquidity shocks. Furthermore, we argue that a surrender decision based on private information requires a discount on the fair surrender value as well.
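The pooling mechanism behind the tontine's payout structure can be made concrete with a toy simulation: members contribute to a common fund, the fund earns interest, and each period the survivors split a fixed share of the fund equally, so per-survivor payouts tend to rise as the pool shrinks. Pool size, survival probability, interest rate, and payout rate are all assumed for illustration; the paper's actual model is richer.

```python
import random

# Stylized equal-split tontine: n members each contribute 1 unit; the fund earns
# interest r per period; survivors split a fixed share of the fund equally.
# All parameters are illustrative assumptions, not the paper's calibration.

def simulate_tontine_payouts(n=100, p_survive=0.95, r=0.03,
                             payout_rate=0.05, periods=10, seed=1):
    """Per-survivor payout stream of a stylized equal-split tontine."""
    rng = random.Random(seed)
    fund = float(n)                      # total contributions
    alive = n
    payouts = []
    for _ in range(periods):
        fund *= 1.0 + r                  # fund earns interest
        alive = sum(rng.random() < p_survive for _ in range(alive))
        if alive == 0:
            break
        payment = fund * payout_rate     # pay out a fixed share of the fund
        fund -= payment
        payouts.append(payment / alive)  # survivors split the payment equally
    return payouts

payouts = simulate_tontine_payouts()
```

The sketch also makes the externality of surrender visible: removing a member (and their fair share of the fund) changes the payout stream of everyone who remains, which is why the paper argues for a discount on the fair surrender value.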
Telemonitoring devices can be used to screen consumers' characteristics and mitigate information asymmetries that lead to adverse selection in insurance markets. However, some consumers value their privacy and dislike sharing private information with insurers. In the second-best efficient Wilson-Miyazaki-Spence (WMS) framework, we allow consumers to reveal their risk type at an individual subjective cost and show analytically how this affects insurance market equilibria as well as utilitarian social welfare. Our analysis shows that the option to disclose one's risk type can substitute for deductibles for consumers whose transparency aversion is sufficiently low. This can lead to a Pareto improvement of social welfare and a Pareto-efficient market allocation. However, if all consumers are offered cross-subsidizing contracts, the introduction of a transparency contract decreases or even eliminates cross-subsidies. Given the prior existence of a WMS equilibrium, utility is shifted from individuals who do not reveal their private information to those who choose to reveal. Our analysis provides a theoretical foundation for the discussion on consumer protection in the context of digitalization. It shows that new technologies bring new ways to challenge cross-subsidization in insurance markets and stresses the negative externalities that digitalization has on consumers who are not willing to take part in this development.
We study the impact of firms' estimation errors on social welfare. For this purpose, we present a model of the insurance market in which insurers face parameter uncertainty about expected loss sizes. As consumers react to underestimation and overestimation by increasing and decreasing demand, respectively, insurers require a safety loading for parameter uncertainty. If the safety loading is too small, less risk-averse consumers benefit from less informed insurers by speculating on them underestimating expected losses. Otherwise, social welfare increases with insurers' information. We empirically estimate safety loadings in the US property and casualty insurance market and show that these are likely to be sufficiently large for consumers to benefit from more informed insurers.
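The role of a safety loading under parameter uncertainty can be illustrated with a small sketch: an insurer estimates the expected loss from a finite claims sample and adds a loading that scales with the statistical uncertainty of that estimate. The loading rule (k standard errors) and the numbers are assumptions for illustration, not the paper's estimates.

```python
import statistics

# Premium = estimated expected loss + safety loading for parameter uncertainty.
# Here the loading is k standard errors of the sample mean, so it shrinks as the
# insurer becomes better informed (larger sample). The rule is an illustrative
# assumption, not the paper's empirical loading.

def premium_with_loading(claims, k=2.0):
    """Estimated expected loss plus k standard errors as a safety loading."""
    mean = statistics.mean(claims)
    std_err = statistics.stdev(claims) / len(claims) ** 0.5
    return mean + k * std_err

claims = [120.0, 80.0, 150.0, 95.0, 110.0]
premium = premium_with_loading(claims)
```

As the sample grows, the standard error vanishes and the premium converges to the better-informed expected loss, which is the sense in which consumers can benefit from more informed insurers.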
The series "Papers of Excellence 2.0: Ausgewählte Arbeiten aus den Fachdidaktiken und Bildungswissenschaften der Goethe-Universität Frankfurt a.M." is a new, expanded, and additional edition of the well-known series "Papers of Excellence: Ausgewählte Arbeiten aus den Fachdidaktiken", which has been published by Daniela Elsner and Anja Wildemann with Shaker-Verlag since 2010. In keeping with tradition, the online version of this book series, which now appears alongside the print edition, presents summaries of outstanding state-examination and master's theses distinguished by a rigorous empirical, subject-didactic engagement with a topic. What is new is that the online version now also accepts theses with a focus on educational science, as well as theses situated at the interface between subject didactics and educational science. Papers of Excellence 2.0, which currently includes only studies completed at Goethe University Frankfurt am Main, is edited by Astrid Jurecka (educational science) and Daniela Elsner (subject didactics) and is freely accessible.