Valence is a time bomb deposited in the lexicon that detonates in the grammar. This paper lays the foundations of a new valence theory whose task is to construct this bomb so sensitively that it can no longer be defused. Let me stress at the outset that valence theory, precisely and only in the sense of the above metaphor, constitutes a partial theory of grammar that is not tied to any particular grammatical model. Although valence theory arose in close connection with dependency grammar, valence theory and dependency grammar have clearly distinct subject matters. I return to the delimitation of these subject matters at the end of my discussion (cf. 5.). The following working definition is adopted: (I) Valence potential (in short: valence) is the potential of relational lexeme words ('lexeme word' in Coseriu's sense) to predetermine the grammatical structure to be realized (cf. also Welke 1993; on relationality cf. Lehmann 1992:437f.). From this working definition it follows (a) that valence is responsible for part of the grammatical realization, but also (b) that valence is by no means responsible for everything in the grammatical realization. A whole range of morphological, syntactic, semantic, and conceptual processes, e.g. derivation (verbal prefixation), conjugation type, syntactic conversion, serialization, gradations of transitivity, determination, focusing, etc., interacts with valence as soon as valence has to contribute to generating a grammatical structure (cf. also 3.6).
On the occasion of the start of television broadcasting in West and East Germany 50 years earlier, a symposium entitled "Fernsehgeschichte als Zeitgeschichte - Zeitgeschichte als Fernsehgeschichte" ("Television History as Contemporary History - Contemporary History as Television History") took place at the Hans-Bredow-Institut in Hamburg on 5 and 6 December 2002. In a critical contribution, Peter Zimmermann examined above all the "enemy-image constructions" of Western television, which in his view can be traced even beyond the fall of the Berlin Wall. Zimmermann: "In the joy-filled month of November 1989, the German winter's tale finally seemed to find a happy ending with the fall of the Wall. Yet the reunification of the 'German brothers and sisters' did not proceed entirely untroubled, in media terms either. With the so-called winding-up of GDR television and of DEFA, the television of the Federal Republic also took over East German 'sovereignty over the images'. The positive self-images hitherto dominant in GDR film and television were henceforth replaced by the negative images of the other that dominated in the West. It is therefore hardly surprising that since reunification the West German perspective has dominated almost without exception in television documentaries on German history, while the history of the GDR is marginalized, devalued, or caricatured."
The fragmented juridification of the international sphere, the proliferation of regulatory arrangements beyond the state, and the diffusion of global norms, together with the resulting conflicts over validity, competence, and authority, have for some time been a much-discussed phenomenon in the social-science literature. Overlaps between national systems of government and classical international regimes anchored in international law have existed since the creation of the Westphalian state system. More recently, however, the pluralism of normative orders has intensified globally through novel types of regulatory arrangements beyond the state. Even among intergovernmentally created international institutions there are some that have been granted autonomous powers of action and decision-making and that exercise them as actors with their own subjectivity. Added to this is the ever-greater inclusion of "behind the border issues" in the remit of these regimes and organizations (Zürn 2004). These developments lead to a new degree of contestation of global normative orders. Neither the establishment of a unified global normative order nor a re-nationalization of law appears to be a realistic prognosis today. It is all the more important, therefore, to engage with the consequences of this pluralism of normative orders.
This paper is the first to conduct an incentive-compatible experiment using real monetary payoffs to test the hypothesis of probabilistic insurance which states that willingness to pay for insurance decreases sharply in the presence of even small default probabilities as compared to a risk-free insurance contract. In our experiment, 181 participants state their willingness to pay for insurance contracts with different levels of default risk. We find that the willingness to pay sharply decreases with increasing default risk. Our results hence strongly support the hypothesis of probabilistic insurance. Furthermore, we study the impact of customer reaction to default risk on an insurer’s optimal solvency level using our experimentally obtained data on insurance demand. We show that an insurer should choose to be default-free rather than having even a very small default probability. This risk strategy is also optimal when assuming substantial transaction costs for risk management activities undertaken to achieve the maximum solvency level.
The main thesis of this dissertation is that Northern Sotho makes no obligatory use of grammatical means to mark focus, neither in syntax nor in prosody or morphology. Nevertheless, the language structures an utterance according to information-structural considerations. Constituents that are given in discourse are either deleted, pronominalized, or moved to the right or left edge of the clause. These (morpho-)syntactic processes interact in such a way that the focused constituent often appears finally in its clause. Although the final position is not a designated focus position, awareness of this tendency is nonetheless crucial for understanding a morphological alternation that appears on the verb in Northern Sotho and has been discussed in the literature in connection with focus.
Although Northern Sotho thus lacks a direct grammatical expression of formal F(ocus)-marking, F-marking is nevertheless crucial for the grammar of this language: focused logical subjects cannot appear in the canonical preverbal position. Instead, they appear either postverbally or in a cleft sentence, depending on the valence of the verb. Although Northern Sotho shows a correspondence of complex form with complex meaning in the use of clefts for objects, this correspondence does not hold for logical subjects.
This dissertation models the above findings within the theoretical framework of Optimality Theory (OT). Syntactic in situ focus and the absence of prosodic focus marking can be captured with uncontroversial constraints. For the ungrammaticality of focused logical subjects in preverbal position, the present work proposes a modification of a constraint found in the literature that is of crucial importance in Northern Sotho. The form-meaning correspondence is treated, like other phenomena of pragmatic division of labor, within weakly bidirectional Optimality Theory.
An essential prerequisite for decoding prevailing understandings of the judiciary is an engagement with the roles that the actors involved assume in a legal system, together with an examination of the legal and institutional conditions under which these actors operate. This contribution first addresses the distribution of power and tasks between judges and parties. It becomes clear that the allocation of roles is not uniform but varies depending on different procedural and institutional preconditions. In jury trials, judicial authority is strongly constrained by a maximally developed party autonomy. By contrast, judges act as legal honoratiores (in Weber's sense) whenever they adjudicate without a jury. This occurs in particular in the state supreme courts and the federal courts of appeals, but also in first-instance proceedings in which "claims in equity" are to be decided. The contribution concludes with the influence that the peculiarities of American legal education exert on the American understanding of the judiciary: they shape and reproduce the roles and self-images of American lawyers, in the bar as well as on the bench.
Biodiversity loss poses a significant threat to the global economy and affects ecosystem services on which most large companies rely heavily. The severe financial implications of such reduced species diversity have attracted the attention of companies and stakeholders, with numerous calls to increase corporate transparency. Using textual analysis, this study thus investigates the current state of voluntary biodiversity reporting of 359 European blue-chip companies and assesses the extent to which it aligns with the upcoming disclosure framework of the Task Force on Nature-related Financial Disclosures (TNFD). The descriptive results suggest a substantial gap between current reporting practices and the proposed TNFD framework, with disclosures largely lacking quantification, detail, and clear targets. In addition, the disclosures appear to be relatively unstandardized. Companies in sectors or regions exposed to higher nature-related risks, as well as larger companies, are more likely to report on aspects of biodiversity. This study contributes to the emerging literature on nature-related risks and provides detailed insights into the extent of the reporting gap in light of the upcoming standards.
To monitor one's speech means to check the speech plan for errors, both before and after talking. There are several theories of how this process works. We give a short overview of the most influential theories before focusing on the most widely received one, the Perceptual Loop Theory of monitoring by Levelt (1983). One of the underlying assumptions of this theory is the existence of an Inner Loop, a monitoring device that checks for errors before speech is articulated. This paper collects evidence for the existence of such an internal monitoring device and asks how it might work. Levelt's theory argues that internal monitoring works by means of perception, but other empirical findings allow for the assumption that an Inner Loop could also use our speech production devices. Based on data from both experimental and aphasiological studies, we develop a model based on Levelt (1983) which shows that internal monitoring might in fact make use of both perception and production means.
With free delivery of products now virtually standard in e-commerce, product returns pose a major challenge for online retailers and society. For retailers, product returns involve significant transportation, labor, disposal, and administrative costs. From a societal perspective, product returns contribute to greenhouse gas emissions and packaging disposal and are often a waste of natural resources. Therefore, reducing product returns has become a key challenge. This paper develops and validates a novel smart green nudging approach to tackle the problem of product returns during customers' online shopping processes. We combine a green nudge with a novel data enrichment strategy and a modern causal machine learning method. We first run a large-scale randomized field experiment in the online shop of a German fashion retailer to test the efficacy of a novel green nudge. Subsequently, we fuse the data from about 50,000 customers with publicly available aggregate data to create what we call enriched digital footprints and train a causal machine learning system capable of optimizing the administration of the green nudge. We report two main findings: First, our field study shows that the large-scale deployment of a simple, low-cost green nudge can significantly reduce product returns while increasing retailer profits. Second, we show how a causal machine learning system trained on the enriched digital footprint can amplify the effectiveness of the green nudge by "smartly" administering it only to certain types of customers. Overall, this paper demonstrates how combining a low-cost marketing instrument, a privacy-preserving data enrichment strategy, and a causal machine learning method can create a win-win situation from both an environmental and economic perspective by simultaneously reducing product returns and increasing retailers' profits.
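The "smart" administration step can be illustrated with a minimal sketch, under assumptions of our own: hypothetical customer segments and effect sizes, not the study's data or its actual causal machine learning system. The idea is to estimate the nudge's effect on the return probability within each segment from a randomized experiment, then administer the nudge only where the estimated effect actually reduces returns.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical customer segments (stand-ins for an enriched digital
# footprint) with assumed true effects of the nudge on return probability.
true_effect = {"impulse": -0.08, "planned": -0.01, "bargain": +0.02}
base_rate = 0.50  # assumed baseline return probability

# Simulated randomized experiment: per segment, draw returns for a
# nudged (treated) group and a non-nudged (control) group.
n = 20_000
est_effect = {}
for seg, eff in true_effect.items():
    treated = rng.random(n) < base_rate + eff   # returns under the nudge
    control = rng.random(n) < base_rate         # returns without the nudge
    est_effect[seg] = treated.mean() - control.mean()

# Targeting rule: nudge only segments whose estimated effect
# lowers the return probability.
targeted = {seg for seg, e in est_effect.items() if e < 0}
print(targeted)
```

A production system would replace the per-segment means with a heterogeneous-treatment-effect estimator over many covariates, but the targeting logic, treat only where the estimated effect has the desired sign, is the same.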
By focusing on the cost conditions at issuance, I find not only that the effects of the Covid-19 pandemic differed across bonds and firms at different stages, but also that the market composition was significantly affected, collapsing onto investment-grade bonds, a segment in which the share of bonds eligible for the ECB corporate programmes strikingly increased from 15% to 40%. At the same time, the high-yield segment shrank almost to the point of disappearing, at 4%. In addition to a market segmentation along bond grade and eligibility for the ECB programmes, another source of risk detected in the pricing mechanism is weak resilience to the pandemic: the premium requested is around 30 basis points and started to be priced only after the early containment actions taken by the national authorities. By contrast, I find no evidence supporting an increased risk for corporations headquartered in countries with reduced fiscal space, nor the existence of a premium in favour of green bonds, which should be the backbone of a possible “green recovery”.
We assess the degree of market fragmentation in the euro-area corporate bond market by disentangling the determinants of the risk premium paid on bonds at origination. Looking at over 2,400 bonds, we are able to isolate the country-specific effects, which are a suitable indicator of market fragmentation. We find that, after peaking during the sovereign debt crisis, fragmentation shrank in 2013 and receded to pre-crisis levels only in 2014. However, the low level of estimated market fragmentation is coupled with still high heterogeneity in actual bond yields, challenging the consistency of the new equilibrium.
We analyze the risk premium on bank bonds at origination with a special focus on the role of implicit and explicit public guarantees and the systemic relevance of the issuing institutions. By looking at the asset swap spread on 5,500 bonds, we find that explicit guarantees and sovereign creditworthiness have a substantial effect on the risk premium. In addition, while large institutions still enjoy lower issuance costs linked to the too-big-to-fail (TBTF) framework, we find evidence of enhanced market discipline for systemically important banks, which have faced, since the onset of the financial crisis, an increased premium on bond placements.
Unconventional green
(2023)
We analyze the effects of the PEPP (Pandemic Emergency Purchase Programme), the temporary quantitative easing implemented by the ECB immediately after the outbreak of the Covid-19 pandemic. We show that the differences in aim, size, and flexibility with respect to the traditional Corporate Sector Purchase Programme (CSPP) meant that, in addition to the directly targeted bonds, the green bond segment was also significantly affected. Via a standard difference-in-differences model we estimate that the yield on green bonds declined by more than 20 basis points after the PEPP. In order to also take into account the differences attributable to eligibility for the programme, we employ a triple difference estimator. Bonds that were both green and eligible benefited from an additional premium of 39 basis points.
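The estimation logic described above can be sketched with a small numeric example (hypothetical average yields, not the paper's data): the difference-in-differences estimate is the before/after change for the treated group minus the change for the control group, and the triple difference nets out the eligibility dimension by differencing two such DiD estimates.

```python
import numpy as np

# Hypothetical average bond yields (percent), [before, after] the PEPP.
green     = np.array([1.50, 1.20])  # green bonds (treated)
non_green = np.array([1.40, 1.35])  # conventional bonds (control)

# Difference-in-differences: change for treated minus change for control.
did = (green[1] - green[0]) - (non_green[1] - non_green[0])
print(f"DiD estimate: {did * 100:.0f} basis points")  # -25 bp in this toy example

# Triple difference: compute a DiD within eligible and within
# non-eligible bonds, then difference the two DiD estimates to
# isolate the extra effect attributable to eligibility.
green_elig,     non_green_elig     = np.array([1.45, 1.00]), np.array([1.40, 1.30])
green_non_elig, non_green_non_elig = np.array([1.55, 1.35]), np.array([1.42, 1.37])

did_elig     = (green_elig[1] - green_elig[0]) - (non_green_elig[1] - non_green_elig[0])
did_non_elig = (green_non_elig[1] - green_non_elig[0]) - (non_green_non_elig[1] - non_green_non_elig[0])
ddd = did_elig - did_non_elig
print(f"Triple-difference estimate: {ddd * 100:.0f} basis points")
```

In practice these estimators are run as regressions with interaction terms and controls, but the group-mean arithmetic above is exactly what the interaction coefficients recover.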
Chen and Zadrozny (1998) developed the linear extended Yule-Walker (XYW) method for determining the parameters of a vector autoregressive (VAR) model with available covariances of mixed-frequency observations on the variables of the model. If the parameters are determined uniquely for available population covariances, then, the VAR model is identified. The present paper extends the original XYW method to an extended XYW method for determining all ARMA parameters of a vector autoregressive moving-average (VARMA) model with available covariances of single- or mixed-frequency observations on the variables of the model. The paper proves that under conditions of stationarity, regularity, miniphaseness, controllability, observability, and diagonalizability on the parameters of the model, the parameters are determined uniquely with available population covariances of single- or mixed-frequency observations on the variables of the model, so that the VARMA model is identified with the single- or mixed-frequency covariances.
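For the simplest single-frequency VAR(1) case, the classical Yule-Walker moment equations underlying the XYW approach can be sketched as follows. This is an illustration of the basic idea only, not the authors' mixed-frequency algorithm: stationarity implies the lag-1 autocovariance satisfies C1 = A C0, so the coefficient matrix is recovered as A = C1 C0^{-1}.

```python
import numpy as np

rng = np.random.default_rng(0)

# A stable "true" VAR(1): x_t = A x_{t-1} + e_t  (eigenvalues inside unit circle).
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.3]])

# Simulate a long sample so sample covariances approximate population ones.
T = 200_000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + rng.standard_normal(2)

xc = x - x.mean(axis=0)
C0 = xc[:-1].T @ xc[:-1] / (T - 1)  # contemporaneous covariance E[x_t x_t']
C1 = xc[1:].T @ xc[:-1] / (T - 1)   # lag-1 autocovariance E[x_t x_{t-1}']

# Yule-Walker moment equation: C1 = A C0  =>  A = C1 C0^{-1}.
A_hat = C1 @ np.linalg.inv(C0)
print(np.round(A_hat, 2))
```

The extended XYW method generalizes this idea to moving-average terms and to covariances of mixed-frequency observations, where some variables are observed only at coarser sampling intervals.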
Linear rational-expectations models (LREMs) are conventionally "forwardly" estimated as follows. Structural coefficients are restricted by economic restrictions in terms of deep parameters. For given deep parameters, structural equations are solved for "rational-expectations solution" (RES) equations that determine endogenous variables. For given vector autoregressive (VAR) equations that determine exogenous variables, RES equations reduce to reduced-form VAR equations for endogenous variables with exogenous variables (VARX). The combined endogenous-VARX and exogenous-VAR equations comprise the reduced-form overall VAR (OVAR) equations of all variables in a LREM. The sequence of specified, solved, and combined equations defines a mapping from deep parameters to OVAR coefficients that is used to forwardly estimate a LREM in terms of deep parameters. Forwardly-estimated deep parameters determine forwardly-estimated RES equations that Lucas (1976) advocated for making policy predictions in his critique of policy predictions made with reduced-form equations.
Sims (1980) called economic identifying restrictions on deep parameters of forwardly-estimated LREMs "incredible", because he considered in-sample fits of forwardly-estimated OVAR equations inadequate and out-of-sample policy predictions of forwardly-estimated RES equations inaccurate. Sims (1980, 1986) instead advocated directly estimating OVAR equations restricted by statistical shrinkage restrictions and directly using the directly-estimated OVAR equations to make policy predictions. However, if assumed or predicted out-of-sample policy variables in directly-made policy predictions differ significantly from in-sample values, then, the out-of-sample policy predictions won't satisfy Lucas's critique.
If directly-estimated OVAR equations are reduced-form equations of underlying RES and LREM-structural equations, then, identification 2 derived in the paper can linearly "inversely" estimate the underlying RES equations from the directly-estimated OVAR equations and the inversely-estimated RES equations can be used to make policy predictions that satisfy Lucas's critique. If Sims considered directly-estimated OVAR equations to fit in-sample data adequately (credibly) and their inversely-estimated RES equations to make accurate (credible) out-of-sample policy predictions, then, he should consider the inversely-estimated RES equations to be credible. Thus, inversely-estimated RES equations by identification 2 can reconcile Lucas's advocacy for making policy predictions with RES equations and Sims's advocacy for directly estimating OVAR equations.
The paper also derives identification 1 of structural coefficients from RES coefficients that contributes mainly by showing that directly estimated reduced-form OVAR equations can have underlying LREM-structural equations.
Over the past few decades, changes in market conditions such as globalisation and deregulation of financial markets as well as product innovation and technical advancements have induced financial institutions to expand their business activities beyond their traditional boundaries and to engage in cross-sectoral operations. As combining different sectoral businesses offers opportunities for operational synergies and diversification benefits, financial groups comprising banks, insurance undertakings and/or investment firms, usually referred to as financial conglomerates, have rapidly emerged, providing a wide range of services and products in distinct financial sectors and oftentimes in different geographic locations. In the European Union (EU), financial conglomerates have in recent years become some of the biggest and most active financial market participants. Financial conglomerates generally pose new problems for financial authorities as they can raise new risks and exacerbate existing ones. In particular, their cross-sectoral business activities can involve prudentially substantial risks such as the risk of regulatory arbitrage and contagion risk arising from intra-group transactions. Moreover, the generally large size of financial conglomerates as well as the high complexity and interconnectedness of their corporate structures and risk exposures can entail substantial systemic risk and can therefore threaten the stability of the financial system as a whole. Until a few years ago, there was no supervisory framework in place which addressed a financial conglomerate in its entirety as a group. Instead, each group entity within a financial conglomerate was subject to the supervisory rules of its pertinent sector only. Such a silo-based supervisory approach had the drawback of not taking account of risks which arise or aggravate at the group level.
It also failed to consider how the risks from different business lines within the group interrelate with each other and affect the group as a whole. In order to address this lack of group-wide prudential supervision of financial conglomerates, the European legislator adopted the Financial Conglomerates Directive 2002/87/EC ('FCD') on 16 December 2002. The FCD was transposed into national law in the member states of the EU ('Member States') by 11 August 2004 for application to financial years beginning on 1 January 2005 and after. The FCD primarily aims at supplementing the existing sectoral directives to address the additional risks of concentration, contagion and complexity presented by financial conglomerates. It therefore provides for a supervisory framework which is applicable in addition to the sectoral supervision. Most importantly, the FCD has introduced additional capital requirements at the conglomerate level so as to prevent the multiple use of the same capital by different group entities. This paper seeks to examine to what extent the FCD provides for an adequate capital regulation of financial conglomerates in the EU while taking into account the underlying sectoral capital requirements and the inherent risks associated with financial conglomerates. In Part 1, the definition and the basic corporate models of financial conglomerates will be presented (I), followed by an illustration of the core motives behind the phenomenon of financial conglomeration (II) and an overview of the development of the supervision over financial conglomerates in the EU (III). Part 2 begins with a brief elaboration on the role of regulatory capital (I) and gives a general overview of the EU capital requirements applicable to banks and insurance undertakings respectively. A delineation of the commonalities and differences of the banking and the insurance capital requirements will be provided (II).
Part 2 then goes on to examine the need for group-wide capital regulation of financial conglomerates and analyses the adequacy of the FCD capital requirements. In this context, the technical advice rendered by the Joint Committee on Financial Conglomerates (JCFC) as well as the currently ongoing legislative reforms at the EU level will be discussed (III). The paper closes with a conclusion and an outlook on remaining open issues (IV).
The financial services industry worldwide has undergone major transformation since the late 1970s. Technological advancements in information processing and communication facilitated financial innovation and narrowed traditional distinctions in financial products and services, allowing them to become close substitutes for one another. The deregulation process in many major economies prior to the recent financial crisis blurred the traditional lines of demarcation between the distinct types of financial institutions, exposing those firms to new competitors in their traditional business areas, while the increasing globalization of financial markets fostered the provision of financial services across national borders. Against this backdrop, a trend toward consolidation across financial sectors as well as across national borders increasingly manifested itself from the 1990s onward. The developments in the financial markets ever more intensified competition in the financial services industry and induced financial institutions to redefine their business strategies in search of higher profitability and growth opportunities. Consolidation across distinct financial sectors, i.e. financial conglomeration, in particular became a popular business strategy in light of the potential operational synergies and diversification benefits it can offer. This trend spurred the growth of diversified financial groups, the so-called financial conglomerates, which commingle banking, securities, and insurance activities under one corporate umbrella. Still today, large, complex financial conglomerates are represented among major players in the financial markets worldwide, whose activities cut not only across traditional boundaries of the banking, securities, and insurance sectors but also across national borders.
Notwithstanding the economic benefits that conglomeration may produce as a business strategy, the emergence of financial conglomerates also exacerbated existing and created new prudential risks in the financial system. The mixing of a variety of financial products and services under one corporate roof and the generally large and complex group structure of financial conglomerates expose such organizations to specific group risks such as contagion and arbitrage risk as well as systemic risk. When realized, these risks may not only cause the failure of an entire financial group but also threaten the stability of the financial system as a whole, as evidenced by the events during the recent financial crisis of 2007-2009...
I propose a dynamic stochastic general equilibrium model in which the leverage of borrowers as well as banks and housing finance play a crucial role in the model dynamics. The model is used to evaluate the relative effectiveness of a policy to inject capital into banks versus a policy to relieve households of mortgage debt. In normal times, when the economy is near the steady state and policy rates are set according to a Taylor-type rule, capital injections to banks are more effective in stimulating the economy in the long run. However, in the middle of a housing debt crisis, when households are highly leveraged, the short-run output effects of the debt relief are more substantial. When the zero lower bound (ZLB) is additionally considered, the debt relief policy can be much more powerful in boosting the economy both in the short run and in the long run. Moreover, the output effects of the debt relief become increasingly larger the longer the ZLB is binding.
Permanent conflict resolution at the high courts was one of the Holy Roman Empire's main characteristics. This applies even to conflicts between rural communities and their lords, which could be dealt with, at least under certain circumstances, at the Imperial Chamber Court or the Aulic Council. These trials, however, were embedded in complicated processes of establishing and legitimizing claims on a local level as well as attempts to achieve a solution by violence or by arbitration. Researchers have stated that conflict resolution underwent, in the long run, a process of "juridification" ("Verrechtlichung"). This working paper proposes a method, based on Niklas Luhmann's theory of procedural legitimation ("Legitimation durch Verfahren"), which may make it possible to detect elements of juridification and conflict resolution in the actions of parties and courts.
The opening of German accounting law has led to an increasingly broad application of international accounting standards (in particular US GAAP and IAS) by German practitioners; the heterogeneous types of norms and, concomitantly, the differing economic properties of these norms require a comparative theory of accounting for any meaningful comparison of accounting regimes. A return to economic theory, prompted not least by the internationalization of accounting, can be observed here in principle, just as modern German accounting law received its present shape from economic theory rather than primarily from practitioners. The aim of this essay is to contribute to an institutional-economics theory of accounting for the purpose of determining information content and profit claims, and to a comparative theory of accounting. The first main part (2), building on the research program of institutional economics, sketches the significance to be attached to institutions within the utility calculus of decision-makers; the individual types of institutions relevant to comparative accounting are then classified (into formal and informal rules) and their attributes introduced into the individual target-stream calculus (namely predicates of freedom from manipulation and predicates of decision relevance). The following section (3) develops the relationship of the institutions to one another on the basis of a legally informed and an economic understanding of systems. It is shown that both concepts of system rest on a non-additivity of their constituent institutions, which complicates the qualitative comparison of different systems; the differences between the legal and the economic understanding of systems are, however, overestimated: the two are functionally similar.
The final main part (4) then presents, against the background of a design-oriented theory, the relevant subdomains (sub-systems) of the accounting order and offers a functional interpretation of individual disclosure norms. The contribution closes with summarizing theses (5).
Deficiencies in statutory audits: factual reports and analyses from a business economics perspective
(2001)
Corporate crises, "surprising" ones especially, stood at the beginning of the statutory regulation of audits in Germany. It is therefore a legitimate concern of the public and of the profession to regard the prevailing standards for the quality of audits and the credibility of auditors as a heightened problem whenever companies slide into crisis despite the statutory protective purposes and norms of the established audit: within the institutional mechanisms for the early detection of such crises, a functional part of the German system of corporate governance, the statutory audit is rightly regarded as a pivotal element. Much of the criticism voiced may be owed to a lay understanding unable to bridge the complexity of the matters at issue; some of it, however, can surely be explained by statutory provisions capable of improvement, theoretical (economic and legal) problems yet to be solved, and good professional practice yet to be fostered. Recent questionable audit deficiencies provide the occasion for the present factual reports and business analyses. The selection of companies here is as arbitrary as that of the audit firms concerned; not accidental, however, is the selection of the fundamental business problems: these concern essential expectations of the audit which have evidently been disappointed so regularly that even government explanatory memoranda to draft legislation now invoke a "so-called expectation gap".
These expectation gaps arise in particular from the notions that (a) the statutory auditor, in a proper audit, must necessarily uncover fraudulent conduct; that (b) balance-sheet valuations constitute sufficiently reliable figures reporting on the company's net assets; and finally that (c) auditing the company's actual economic situation, and reporting on it in the audit report and the auditor's opinion, is a matter of course in the statutory audit. The course of the investigation follows these expectations.
The present publication follows a different concept. It, too, aims to draw attention to interesting offerings in the digital realm. However, it does not confine itself to a bare link or a briefly annotated reference to a program; instead, it characterizes each offering with a view to its practical usefulness. The criterion of usefulness is the author's own experience with it. One might call such a selection subjective, but who, other than subjects, can judge anything at all?
For anyone who likes to go on voyages of discovery and is occasionally seized by wanderlust without having the time or money for grand expeditions, Google Earth can be something of an addictive substance. From Ayers Rock to Mount Fuji, and from the street canyons of Manhattan to the African steppe: with Google Earth, a matter of seconds.
The first part of the following paper deals with various points of criticism leveled against Ordoliberalism. The aim here is not to refute each argument individually; rather, the author seeks to give a precise overview of the spectrum of critique. The second section singles out one line of criticism, namely that the ordoliberal concept of the state is somewhat elitist and grounded in intellectual experts. Based on the previous sections, the final part differentiates two kinds of genesis of norms, an evolutionary and an elitist one, both (latently) present within Ordoliberalism. In combination with the two-level differentiation between individual and regulatory ethics, the essay allows for a distinction between individual-ethical norms based on an evolutionary genesis of norms and regulatory-ethical norms based on an elitist understanding of norms. A by-product of the author's argument is a (further) demarcation within neoliberalism.
Based on Foucault's analysis of German Neoliberalism and his thesis of ambiguity, the following paper draws a two-level distinction between individual and regulatory ethics. The individual ethics level, which has received surprisingly little attention, contains the Christian foundation of values and the liberal-Kantian heritage of so-called Ordoliberalism, one variety of neoliberalism. The regulatory or formal-institutional ethics level, by contrast, refers to the ordoliberal framework of a socio-economic order. By differentiating these two levels of ethics incorporated in German Neoliberalism, it becomes possible to distinguish dissimilar varieties of neoliberalism and to link Ordoliberalism to modern economic ethics. Furthermore, it allows a revision of the dominant reception of Ordoliberalism, which focuses solely on the formal-institutional level while largely neglecting the individual ethics level.
June 4th, 2013 marks the formal launch of the third generation of the Equator Principles (EP III) and the tenth anniversary of the EPs, reasons enough for evaluating the EPs initiative from an economic and business ethics perspective. In particular, this essay deals with the following questions: What are the EPs and where are they going? What has been achieved so far by the EPs? What are the strengths and weaknesses of the EPs? Which reform steps must be taken to further strengthen the EPs framework? Can the EPs be regarded as a role model in the field of sustainable finance and CSR? The paper is structured as follows: The first chapter defines the term EPs and introduces the keywords related to the EPs framework. The second chapter gives a brief overview of the history of the EPs. The third chapter discusses the Equator Principles Association, the governing, administering, and managing institution behind the EPs. The fourth chapter summarizes the main features and characteristics of the newly released third generation of the EPs. The fifth chapter critically evaluates the EP III from an economic and business ethics perspective. The paper concludes with a summary of the main findings.
This paper analyzes liquidity in an order driven market. We not only investigate the best limits in the limit order book, but also take into account the book behind these inside prices. When subsequent prices are close to the best ones and depth at them is substantial, larger orders can be executed without an extensive price impact and without deterring liquidity. We develop and estimate several econometric models, based on depth and prices in the book, as well as on the slopes of the limit order book. The dynamics of different dimensions of liquidity are analyzed: prices, depth at and beyond the best prices, as well as resiliency, i.e. how fast the different liquidity measures recover after a liquidity shock. Our results show a somewhat less favorable image of liquidity than often found in the literature. After a liquidity shock (in the spread, in depth, or in the book beyond the best limits), several dimensions of liquidity deteriorate at the same time. Not only does the inside spread increase and depth at the best prices decrease; the difference between subsequent bid and ask prices may also widen, and the depth provided at those prices decreases. The impacts are both econometrically and economically significant. Our findings also point to an interaction between different measures of liquidity, between liquidity at the best prices and beyond in the book, and between the ask and bid sides of the market.
Sissi in Film
(2018)
Ludwig van Beethoven in Film
(2018)
Contents
1. Strauß, Johann (Father / Sr.) (* 14.3.1804 – † 25.9.1849)
2. Strauß, Johann (Son / Jr.) (* 25.10.1825 – † 3.6.1899)
3. The Operetta Adaptations
3.1 Die Fledermaus (operetta in 3 acts, Johann Strauß Jr.) – premiere: 5.4.1874
3.2 Eine Nacht in Venedig (operetta in 3 acts, Johann Strauß Jr.) – premiere: 3.10.1883
3.3 Der Zigeunerbaron (operetta in 3 acts, Johann Strauß Jr.) – premiere: 24.10.1885
3.4 Wiener Blut (Johann Strauß Jr.) – premiere: 25.10.1899
3.5 Frühlingsluft (Josef Strauß) – premiere: 9.5.1903
Les Blank
(2017)
Venture capital (VC) investment has long been conceptualized as a local business, in which the VC's ability to source, syndicate, fund, monitor, and add value to portfolio firms critically depends on access to knowledge obtained through ties to the local (i.e., geographically proximate) network. Consistent with the view that local networks matter, existing research confirms that local and geographically distant portfolio firms are sourced, syndicated, funded, and monitored differently. Curiously, emerging research on VC investment practice within the United States finds that distant investments, as measured by "exits" (either initial public offering or merger & acquisition), outperform local investments. These findings raise important questions about the assumed benefits of local network membership and proximity. To probe these questions more deeply, we contrast the deal structure of cross-border VC investment with that of domestic VC investment, and contrast the deal structure of cross-border VC investments that include a local partner with those that do not. Evidence from 139,892 rounds of venture capital financing in the period 1980-2009 suggests that cross-border investment practice, in terms of deal sourcing, syndication, and performance, indeed changes with proximity, but that monitoring practices do not. Further, we find that the inclusion of a local partner in the investment syndicate yields surprisingly few benefits. This evidence, we argue, raises important questions about VC investment practice as well as the ability of firms to capture and leverage the presumed benefits of network membership.
We examine the dynamics of assets under management (AUM) and management fees at the portfolio manager level in the closed-end fund (CEF) industry. We find that managers capitalize on good past performance and favorable investor perception about future performance, as reflected in fund premiums, through AUM expansions and fee increases. However, the penalties for poor performance or unfavorable investor perception are either insignificant or substantially mitigated by manager tenure. Long tenure is generally associated with poor performance and high discounts. Our findings suggest substantial managerial power in capturing CEF rents. We also document significant diseconomies of scale at the manager level.
This paper considers the desirability of the observed tendency of central banks to adjust interest rates only gradually in response to changes in economic conditions. It shows, in the context of a simple model of optimizing private-sector behavior, that such inertial behavior on the part of the central bank may indeed be optimal, in the sense of minimizing a loss function that penalizes inflation variations, deviations of output from potential, and interest-rate variability. Sluggish adjustment characterizes an optimal policy commitment, even though no such inertia would be present in the case of a reputationless (Markovian) equilibrium under discretion. Optimal interest-rate feedback rules are also characterized, and shown to involve substantial positive coefficients on lagged interest rates. This provides a theoretical explanation for the numerical results obtained by Rotemberg and Woodford (1998) in their quantitative model of the U.S. economy.
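The loss function described, penalizing inflation variation, deviations of output from potential, and interest-rate variability, is a quadratic criterion of the kind standard in this literature. A minimal sketch of such a criterion follows; the specific weights $\lambda_x$, $\lambda_i$ and the form of the interest-rate term are illustrative assumptions, not taken from the paper itself:

```latex
\mathcal{L} = E_0 \sum_{t=0}^{\infty} \beta^t
  \left[ \pi_t^2 + \lambda_x \, x_t^2 + \lambda_i \, (i_t - i^*)^2 \right]
```

Here $\pi_t$ denotes inflation, $x_t$ the output gap, $i_t$ the nominal interest rate with target level $i^*$, $\beta \in (0,1)$ a discount factor, and $\lambda_x, \lambda_i > 0$ the relative weights on output and interest-rate stabilization. The paper's point can be read off this form: even with the penalty on interest-rate variability, the optimal commitment policy minimizing such a criterion exhibits inertia (a positive coefficient on the lagged interest rate), whereas the discretionary Markovian solution does not.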