This paper compares the shareholder-value-maximising capital structure and pricing policy of insurance groups with those of stand-alone insurers. Groups can utilise intra-group risk diversification by means of capital and risk transfer instruments. We show that using these instruments enables the group to offer insurance with less default risk and at lower premiums than is optimal for stand-alone insurers. We also take into account that shareholders of groups may find it more difficult to prevent inefficient overinvestment or cross-subsidisation, which we model as higher dead-weight costs of carrying capital. The trade-off between risk diversification on the one hand and higher dead-weight costs on the other can make group building beneficial for shareholders but detrimental for policyholders.
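A stylised way to state the trade-off (my notation for illustration, not the paper's model; the deadweight-cost parameters and the default-put framing are assumptions). With claims \(L\), assets \(A(K)\) backed by capital \(K\), the policyholders' expected default shortfall is
\[
D(K) \;=\; \mathbb{E}\big[(L - A(K))^{+}\big], \qquad D_{\text{group}}(K) \;<\; \sum_i D_i(K_i),
\]
i.e. pooling risks across group units lowers the shortfall for given capital, which is what permits lower default risk and lower premiums. If carrying capital inside the group costs \(c_G K\) with \(c_G > c_S\) (the higher dead-weight cost of weaker shareholder control), group building pays off for shareholders only when the diversification gain outweighs the extra cost:
\[
\Big(\sum_i D_i - D_{\text{group}}\Big) \;>\; (c_G - c_S)\,K .
\]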
Depending on period and jurisdiction, insurance companies are subject to different forms of solvency regulation. In modern regimes, such as the forthcoming Solvency II standard in the EU, insurance pricing is liberalised and risk-based capital requirements are introduced. In many economies in Asia and Latin America, by contrast, supervisors require prior approval of policy conditions and insurance premiums but do not conduct risk-based capital regulation. This paper compares the outcomes of insurance rate regulation and risk-based capital requirements by deriving stock insurers' best responses. It turns out that binding price floors affect insurers' optimal capital structures and induce them to choose higher safety levels. Risk-based capital requirements are a more efficient instrument of solvency regulation and allow for lower insurance premiums, but may come at the cost of investments in adequate risk monitoring systems. The paper derives threshold values for the regulator's investment in risk-based capital regulation and provides starting points for designing a welfare-enhancing insurance regulation scheme.
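The threshold logic can be stated in a stylised form (my notation, not the paper's): risk-based capital regulation dominates rate regulation whenever the welfare gain from lower premiums and better-targeted capital exceeds the required monitoring investment \(I\), so the threshold \(I^{*}\) satisfies
\[
W_{\text{risk-based}} - W_{\text{rate}} \;=\; I^{*},
\]
and investing in risk-based regulation is welfare-enhancing only if \(I \le I^{*}\).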
This final report summarises the results of the study "Berufliche Weiterbildung von Teilzeitkräften" (continuing vocational training for part-time employees). The project ran from 15 June 2010 to 31 March 2011. The study was funded by the Hessisches Ministerium für Wirtschaft, Verkehr und Landesentwicklung (HMWVL) and the European Social Fund (ESF).
If there is one thing to be learned from David Foster Wallace, it is that cultural transmission is a tricky game. This was a problem Wallace confronted as a literary professional, a university-based writer during what Mark McGurl has called the Program Era. But it was also a philosophical issue he grappled with on a deep level as he struggled to combat his own loneliness through writing. This fundamental concern with literature as a social, collaborative enterprise has also attracted attention among scholars of contemporary American literature, particularly McGurl and James English: both critics explore the rules by which prestige or cultural distinction is awarded to authors (English; McGurl). Their approach requires a certain amount of empirical work, since these claims move beyond the individual experience of the text into forms of collective reading and cultural exchange influenced by social class, geographical location, education, ethnicity, and other factors. Yet McGurl's and English's groundbreaking work is limited by the very forms of exclusivity they analyze: the protective bubble of creative writing programs in the academy and the elite economy of prestige surrounding literary prizes, respectively. To really study the problem of cultural transmission, we need to look beyond the symbolic markets of prestige to the real market, the site of mass literary consumption, where authors succeed or fail based on their ability to speak to that most diverse and complicated of readerships: the general public. Unless we study what I call the social lives of books, we make the mistake of keeping literature in the same ascetic laboratory that Wallace tried to break out of with his intense authorial focus on popular culture, mass media, and everyday life.
In the last few years, literary studies have experienced what we could call the rise of quantitative evidence. This had happened before, of course, without producing lasting effects, but this time it's probably going to be different, because this time we have digital databases and automated data retrieval. As Michel and Lieberman's recent article on "Culturomics" made clear, the width of the corpus and the speed of the search have increased beyond all expectations: today, we can replicate in a few minutes investigations that took a giant like Leo Spitzer months and years of work. When it comes to phenomena of language and style, we can do things that previous generations could only dream of.
When it comes to language and style. But if you work on novels or plays, style is only part of the picture. What about plot – how can that be quantified? This paper is the beginning of an answer, and the beginning of the beginning is network theory. This is a theory that studies connections within large groups of objects: the objects can be just about anything – banks, neurons, film actors, research papers, friends... – and are usually called nodes or vertices; their connections are usually called edges; and the analysis of how vertices are linked by edges has revealed many unexpected features of large systems, the most famous one being the so-called "small-world" property, or "six degrees of separation": the uncanny rapidity with which one can reach any vertex in the network from any other vertex. The theory proper requires a level of mathematical intelligence which I unfortunately lack; and it typically uses vast quantities of data which will also be missing from my paper. But this is only the first in a series of studies we're doing at the Stanford Literary Lab; and even at this early stage, a few things emerge.
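To make the vocabulary concrete, here is a minimal sketch of a character network in Python with the networkx library; the cast and the edges are invented for illustration and are not data from this study:

import networkx as nx

# Toy character network: nodes are characters, and an edge means
# the two characters exchange dialogue. Invented example data.
G = nx.Graph()
G.add_edges_from([
    ("King", "Queen"), ("King", "Counselor"),
    ("Queen", "Prince"), ("Prince", "Friend"),
    ("Friend", "Soldier"), ("Counselor", "Soldier"),
])

# "Degrees of separation": the shortest path between two vertices.
print(nx.shortest_path_length(G, "King", "Soldier"))    # -> 2

# The small-world property shows up as a short average distance
# between vertices, even in networks far larger than this one.
print(round(nx.average_shortest_path_length(G), 2))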
This paper is the report of a study conducted by five people – four at Stanford, and one at the University of Wisconsin – which tried to establish whether computer-generated algorithms could "recognize" literary genres. You take 'David Copperfield', run it through a program without any human input – "unsupervised", as the expression goes – and ... can the program figure out whether it's a gothic novel or a 'Bildungsroman'? The answer is, fundamentally, Yes: but a Yes with so many complications that it is necessary to look at the entire process of our study. These are new methods we are using, and with new methods the process is almost as important as the results.
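The study's actual pipeline is not reproduced here; purely as an illustration of what "unsupervised" means in this setting, the following sketch clusters texts by their word-frequency profiles with k-means. The file names, the TfidfVectorizer/KMeans choices, and the two-cluster setting are assumptions for the example, not the method of the paper:

from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Placeholder corpus: substitute real plain-text novels here.
paths = ["david_copperfield.txt", "mysteries_of_udolpho.txt", "wilhelm_meister.txt"]
texts = [open(p, encoding="utf-8").read() for p in paths]

# Represent each novel by a word-frequency vector; no genre labels are given.
vectors = TfidfVectorizer(max_features=1000).fit_transform(texts)

# The algorithm groups the novels on its own ("unsupervised");
# only afterwards do we check whether the clusters match known genres.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
print(dict(zip(paths, labels)))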
The article discusses the methodology adopted for a cross-linguistic synchronic and diachronic corpus study of indefinites. The study covered five indefinite expressions, each in a different language. Its main goal was to map the synchronic distribution of these indefinites and to trace their historical development. The methodology we used is a form of functional labelling which combines context (syntax) and meaning (semantics), taking Haspelmath's (1997) functional map as a starting point. In the article we identify Haspelmath's functions with logico-semantic interpretations and propose a binary branching decision tree that assigns each instance of an indefinite exactly one function in the map.
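As a sketch of how such a binary branching tree can be operationalised over annotated corpus instances (the function names follow Haspelmath's 1997 map, but the feature names and the ordering of the questions are my assumptions, not the article's actual tree):

# Illustrative binary decision tree over an annotated corpus instance,
# returning exactly one function from Haspelmath's (1997) map.
# Feature names and question order are assumptions, not the article's tree.
def classify(ctx: dict) -> str:
    if ctx["in_scope_of_negation"]:
        return ("direct negation" if ctx["negation_is_clausemate"]
                else "indirect negation")
    if not ctx["referent_exists"]:                      # non-specific uses
        if ctx["clause_type"] == "question":
            return "question"
        if ctx["clause_type"] == "conditional":
            return "conditional protasis"
        return ("free choice" if ctx["any_alternative_ok"]
                else "irrealis non-specific")
    # Specific uses: a particular referent exists.
    return "specific known" if ctx["known_to_speaker"] else "specific unknown"

# "Somebody called (and I know who)":
print(classify({"in_scope_of_negation": False, "referent_exists": True,
                "known_to_speaker": True, "clause_type": "declarative",
                "any_alternative_ok": False}))          # -> specific known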
This paper examines to what extent the build-up of 'global imbalances' since the mid-1990s can be explained in a purely real open-economy DSGE model in which agents' perceptions of long-run growth are based on filtering observed changes in productivity. We show that long-run growth estimates based on filtering U.S. productivity data comove strongly with long-horizon survey expectations. By simulating the model in which agents filter data on U.S. productivity growth, we closely match the U.S. current account evolution. Moreover, with household preferences that control the wealth effect on labor supply, we can generate output movements in line with the data.
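The paper's filtering setup is not reproduced here; as a minimal sketch of the mechanism that agents update their long-run growth estimate from observed productivity changes, the code below applies a constant-gain (steady-state Kalman) update to a simulated growth series. The gain, the trend break, and the noise level are assumptions, not values from the paper:

import numpy as np

rng = np.random.default_rng(0)

# Simulated quarterly productivity growth: a slow-moving trend plus noise
# (illustrative data, not the U.S. series used in the paper).
trend = np.concatenate([np.full(80, 0.5), np.full(80, 0.8)])   # percent/quarter
observed = trend + rng.normal(0.0, 0.4, size=trend.size)

# Constant-gain learning: each period the long-run growth belief moves
# a fraction `gain` toward the latest observation.
gain, belief = 0.05, 0.5        # assumed gain and initial belief
beliefs = []
for g in observed:
    belief += gain * (g - belief)
    beliefs.append(belief)

# Beliefs adjust only gradually to the mid-sample shift in trend growth;
# such slow-moving perceived growth is what can sustain persistent imbalances.
print(round(beliefs[79], 3), round(beliefs[-1], 3))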