This paper examines the political-economy and cultural dynamics and discourses underlying the emergence of the Palestinian Hamas and the Algerian Islamic Salvation Front. Both movements emerged in the late 1980s as responses to continuing (neo)colonial conditions in their countries. I explore to what extent the various processes commonly referred to as "globalization," both the world-wide economic transformations epitomized by post-Fordism at the macro/system level and neo-liberal structural adjustment programs within countries, and, perhaps more importantly, its cultural dynamics, contributed to the rise and power of both movements. I examine the socio-economic situation in Algeria and Palestine-Israel during the 1980s and link it to the political developments in both countries. Next I review the events behind the founding of both movements and the main components of their ideologies and strategies. Finally I explore their arguments to determine whether the political-economic or cultural pressures unleashed by globalization were the determining factor in their emergence and ideological development. I conclude by comparing the two case studies to determine if there are common threads that can serve as the basis for a region-wide investigation of the role of globalization in the emergence and/or rise to social hegemony of Islamist movements in other MENA countries.
In early July of this year, the Government Commission on Corporate Governance ("Regierungskommission Corporate Governance") concluded its deliberations on its report and handed it over to the Federal Chancellor. Today, by contrast, it is presented to the academic public. The accompanying general report is divided into three parts: the first part describes the background and scope of the Government Commission's mandate (II. below). The second part highlights those topics that appear particularly significant from the perspective of the Commission's work (III. below). The conclusion deals with the implementation of the Government Commission's recommendations (IV. below).
This paper uses laboratory experiments to provide a systematic analysis of how different presentation formats affect individuals' investment decisions. The results indicate that the type of presentation as well as personal characteristics influence both the consistency of decisions and the riskiness of investment choices. However, while personal characteristics have a larger impact on consistency, the chosen risk level is determined more by framing effects. On the level of personal characteristics, participants' decisions show that better financial literacy and a better understanding of the presentation format enhance consistency and thus decision quality. Moreover, female participants on average make less consistent decisions and tend to prefer less risky alternatives. On the level of framing dimensions, subjects choose riskier investments when possible outcomes are shown in absolute values rather than rates of return and when the loss potential is less obvious. In particular, reducing the emphasis on downside risk and upside potential simultaneously leads to a substantial increase in risk taking.
German Expressionist cinema is a movement that began in 1919. Expressionist film is marked by distinct visual features and performance styles that rebel against prior realist art movements. More than 20 years before the Expressionist movement, Sigmund Freud published "The Interpretation of Dreams" in 1899, a groundbreaking study that links dreams to unconscious impulses. This thesis argues that the unexplained dream-like imagery found in two Expressionist films, The Cabinet of Dr. Caligari (Robert Wiene, 1920) and Dr. Mabuse, the Gambler (Fritz Lang, 1922), can be understood in terms of Freud's model of dreaming.
This article has examined whether the practice of submitting "defensive bids" in sovereign bond auctions, or the publication of an artificially inflated bid-to-cover ratio by sovereign bond issuers and their "primary dealers," constitutes prohibited market manipulation within the meaning of § 20a WpHG. The result, unconvincing from a legal policy perspective, is that the issuer, although it initiates these practices, does not violate the prohibition of § 20a WpHG because the provision does not apply to it, whereas the privately organized "primary dealers" certainly do, through their publication of the ratio. To what extent the conduct identified here as market manipulation can be sanctioned under criminal law is not discussed, as this depends heavily on the individual case. The reputational damage arising from an accusation of market manipulation, however, should not be underestimated. Perhaps for precisely this reason, the "primary dealers" may succeed, by pointing to § 20a WpHG, in dissuading the issuers' debt agencies from demanding defensive bids from them. In particular, the Community-law background of the prohibition of market manipulation, and the pursuit of uniform supervisory practice regarding market manipulation, suggest that the supervisory authorities of the respective issuers should examine the events surrounding sovereign bond auctions more closely. After all, France too prohibits market manipulation, in the form of Art. L. 465-2 Code monétaire et financier in conjunction with Art. 631-1 Règlement général de l'Autorité des marchés financiers (AMF). Moreover, the requirements of the Markets in Financial Instruments Directive (MiFID) concerning "best practice" and the transparency principle for the protection of investors conflict with the conduct described. Against this background, defensive bids should neither be demanded nor submitted.
This would serve not only the "primary dealers" but above all the sovereign bond market and its investors.
The foundations of today's part-of-speech classifications go back to antiquity: even then, Dionysius Thrax established a scheme of eight parts of speech. The parts of speech appearing in it are nouns, verbs, adjectives, articles, pronouns, prepositions, adverbs, and conjunctions. This number varies across the different grammatical approaches of our time: the generative approach, for example, works with four parts of speech, whereas Bergenholtz/Schaeder (1977) record no fewer than 51 different parts of speech plus 5 lexeme classes. These strong fluctuations in the assumed number of parts of speech alone illustrate the general difficulty of delimiting parts of speech by clear criteria.
The quotation "Denn sie gliedern sich in Stämme wie die Menschen" ("For they form tribes like human beings") from Érik Orsenna's "Die Grammatik ist ein sanftes Lied" introduces the title of this thesis and at the same time marks an intersection between literary studies and linguistics, grammar in particular. As a metalinguistic narrative, Orsenna's tale engages literarily with language and its grammar. In this thesis I am primarily concerned with analyzing the criteria for classifying parts of speech and their literary representation and elaboration in Orsenna's text about the words that live together in tribes in the city of words and can be joined into sentences in a factory. Orsenna's original text is a narrative in French. The translator Caroline Vollmann adapted the text to the particularities and specific phenomena of the German language. For this reason, I refer to Orsenna and Vollmann jointly as the authors.
Since Orsenna and Vollmann depict the parts of speech primarily through metaphors, assigning human characteristics to the words as "tribes" in a city, I pay particular attention to the foundations of Lakoff and Johnson's cognitive theory of metaphor. To provide as scientifically sound a basis as possible for the analysis of part-of-speech classification criteria, I have selected three grammars as a point of comparison for the later analysis of Orsenna and Vollmann's text. This gives me a syntactically as well as morphologically and semantically oriented perspective on the object of study. From the grammars of Hentschel/Weydt (2003), Helbig/Buscha (2005), and Boettcher (2009), a catalogue of criteria will be compiled in the course of the thesis, which can then be applied to the analysis of the part-of-speech classification in the literary text.
The experience during and after the Asian crisis of 1997-98 has provoked an extensive debate about the credit rating agencies' evaluation of sovereign risk in emerging markets lending. This study analyzes the role of credit rating agencies in international financial markets, particularly whether sovereign credit ratings have an impact on financial stability in emerging market economies. The event study and panel regression results indicate that credit rating agencies have substantial influence on the size and volatility of emerging markets lending. The empirical results are significantly stronger for government downgrades and negative imminent sovereign credit rating actions, such as credit watches and rating outlooks, than for positive adjustments, while sovereign credit rating changes already anticipated by market participants have a smaller impact on financial markets in emerging economies.
A vast number of new Anglicisms enter everyday German via technical and group languages, and some of them have by now found a firm place there. […] Particularly in pronunciation and spelling, the more recent borrowings superficially remain very close to the structures of the donor language. Some experts and politicians […] cite this development as evidence of a creeping 'colonization' of the German language by English. [...] Numerous bodies […] and authors […] explicitly contradict this assessment. […] The present study is situated in the context of this debate. Its aim is to show that speakers of German do indeed integrate Anglicisms phonologically, graphematically, and morphologically into the German language. The object of study is multi-part verbs borrowed from English that occur predominantly in technical and group languages and/or in informal, mainly spoken text. For the problem area of verbal word formation, it is shown that morphological integration does not proceed unsystematically but follows the inflectional patterns of German complex verbs. The degree of integration of individual lexemes is dynamic and speaker-dependent.
We assess the relationship between finance and growth over the period 1980-2014. We estimate a cross-country growth regression for 48 countries during 20 overlapping periods of 15 years, starting in 1980 (to 1995) and ending in 1999 (to 2014). Using OLS and IV estimations, we find that: 1) overall financial development had a positive effect on economic growth during all periods of our sample, i.e., we confirm that from 1980 to 2014 the financial services provided by the various financial systems were significant (to varying degrees) for firm creation, industrial expansion and economic growth; but that 2) the structure of financial markets was particularly relevant for economic growth until the financial crisis; while 3) the structure of the banking sector has played a major role since then; and finally that 4) the legal system is the primary determinant of the effectiveness of the overall financial system in facilitating innovation and growth in (almost) all of our sample period. Hence, overall our results suggest that the relationship between finance and growth matters, but also that it varies over time in strength and in sector origination.
JEL Classification: O16, G16, G20.
It is an established policy in the United States to separate commercial banking (the business of taking deposits and making commercial loans) from other commercial activities. The separation of banking and commercial activities is achieved by federal and state banking laws, which enumerate the powers that banks may exercise, the activities that banks may engage in, and the investments that banks may lawfully make, and expressly exclude banks from certain activities or relationships. Some of these provisions could be circumvented if a nonbank company could carry on banking activities through a banking subsidiary and nonbanking activities either itself or through a nonbanking subsidiary.
The following discusses alternatives for delisting, i.e. terminating the listing of a company admitted to stock exchange trading. With its inclusion in the Third Financial Market Promotion Act (III. FMFG), this topic has gained renewed relevance. By way of introduction, an overview is given of delisting in the narrower sense (i.e. without a prior transformation), that is, withdrawal from the stock exchange solely upon application by the company and without a change in its legal form. The article confines itself to examining complete withdrawal (so-called going private). The interest in going private can be of many kinds: for example, a complete withdrawal from the exchange to escape the disclosure obligations of stock exchange law and thus, through reduced transparency, to get out of negative headlines; to cut costs; to counter a takeover threat; or even to prepare a freezeout of minority shareholders. However, besides the requirements of stock exchange law, the corporate law requirements in particular are unclear and therefore impede a practicable complete withdrawal. In advisory practice, at any rate, it cannot be predicted with certainty which corporate law and stock exchange law conditions must be met for a regular going private to proceed. This becomes particularly clear when one considers the various proposed solutions concerning the involvement of the general meeting, where widely differing views are held, especially regarding the required majority [see I 1) b) below], and clarifying case law is lacking. A legally secure alternative to the regular going private is therefore presented, enabling a company inclined to withdraw to achieve a complete delisting in a practicable manner.
The main part of this article is therefore devoted to going private via the UmwG (Transformation Act), namely the presentation of the possibility of withdrawing from the stock exchange by merger or change of legal form on the one hand (so-called cold delisting), and the discussion of the circumvention problems arising on this path to delisting on the other [see II below].
The subject of this article is the conceptual problem of key qualifications in legal education, which so far remains unsolved at the level of legislation, scholarship, and practice. Paradoxically, this very fact could, in the medium and long term, sharpen the profile of those law faculties that integrate key qualifications into their legal education systematically, but not naively. This requires a functional view of the key qualifications to be integrated into everyday university life. They are not self-explanatory, but merely means serving a particular purpose, which itself requires justification. Students who want to benefit from the profile-building of law faculties should engage with the increasingly visible developments that lead to key qualifications being incorporated into a faculty's everyday training in a characteristic way, or excluded from it. It would be desirable for a cross-university forum to emerge sooner or later. Its goal would be to increase the practical relevance of legal education and to enable it to turn to questions of legal didactics, deontology, and curriculum design, without restricting the scholarly foundation of the universities and their autonomy, but ideally stimulating both. The authors, who have both led seminars for years at the center for key qualifications of Department FB 01 of Johann Wolfgang Goethe University Frankfurt am Main, do not thereby predict that the integration of key qualifications in the sense described here will take place, or succeed, across the board.
Nevertheless, the handling of key qualifications exemplifies what analyses and conclusions a university has developed from the legally mandated dialogue between theory and practice. From this, in turn, conclusions can be drawn about the state of higher education in Germany. In the authors' view, universities with an inclusive approach have a better chance of preserving their autonomy than those that (supposedly) seal themselves off categorically from potentially intrusive actors in business and politics. Universities that reach the stage of a dialogue capability true to their mission statement find it easier to assert not only their competitiveness but also their core mission. The significance of universities that close their eyes to a dilution of their research, teaching, and educational mandate will, the authors predict, dwindle in the long run in society, business, and politics.
The media images of the terrorist attacks of September 11 reminded many people of Hollywood disaster films. Michael Staiger argues that the visual language of such films also shapes how we perceive real terror. He observes a growing entanglement of the aesthetics of Hollywood cinema with media coverage of real events: "The staging of television news now refers to fictional image worlds, just as film aesthetics has for years borrowed from the look of television images." Using the 1998 thriller "Ausnahmezustand" (English title: "The Siege") as an example, Michael Staiger analyzes the staging of terror in Hollywood films and thereby demonstrates one way of addressing the topic in media education.
Legal scholarship and courses on international courts are often titled "international dispute settlement." The central thesis of this article is that such texts and courses would be far better titled "international adjudication." This is by no means a mere quarrel over words, for behind these alternatives stand different jurisprudential conceptions. The article shows that, contrary to what the label "international dispute settlement" suggests, not one but four functions characterize the jurisprudence of today's international courts: settling individual disputes, stabilizing normative expectations, law-making, and the control and legitimation of public authority. The analysis of these functions shows that the label "international dispute settlement" is outdated. Accordingly, the name of the field should be changed, and the field should be situated as part of the study of international institutions.
We present a simple model of personal finance in which an incumbent lender has an information advantage vis-à-vis both potential competitors and households. In order to extract more consumer surplus, a lender with sufficient market power may engage in "irresponsible" lending, approving credit even if this is knowingly against a household's best interest. Unless rival lenders are equally well informed, competition may reduce welfare. This holds, in particular, if less informed rivals can free ride on the incumbent's superior screening ability.
On the initiative of Professor Paul Krüger Andersen, Denmark, and the author of this article, the first meeting of a commission took place in Denmark on 27 and 28 September 2007, whose objective is to draft a European Model Company Law Act (EMCLA). The project is described in what follows. It aims neither at the mandatory harmonization of national company laws nor at the creation of an additional form of European company. The goal is to develop model rules for companies limited by shares, initially for the public limited company, which national legislators could adopt in whole or in part. The project should thus be understood as an alternative or complement to the existing instruments of legal harmonization at Community level (II.). The American experience with such "model laws" in company law is then described (III.). Finally, the specific problems the EMCLA will face are sketched, and the composition and work plan of the commission are set out (IV.).
Luis de Molina (1535-1600) grants slaves a legal status through which they occupy a position vis-à-vis their masters between equivalent legal entity and legal object. Decisive here is the figure of the subjective right, which both for Molina and for modern proponents of this legal concept describes the 'right per se'. According to Molina's definition of ius, the denial of a subjective right, or the hindrance of its exercise, constitutes an injustice. The rights granted to a slave in virtue of his being regarded as a human being (despite the condition of slavery) serve to protect the slave against unjust acts. Insofar as the slave is to be protected against injustices committed against him or his property, injustices for which he would be entitled to compensation, Molina does not distinguish him as a legal entity from his master. Yet the slave is unable to assert a particular right, because it is not possible for him to take the matter to court. His coequal legal status vis-à-vis his master, justified under natural law, is limited by the positive legal order (by means of which slavery is made possible in the first place) in such a way that he must be held legally incompetent as a legal entity with regard to defending and enforcing his rights 'qua homo'. This precarious situation results from the complicated intermediate legal position of a human legal entity who, at the same time, constitutes the legal object of another person.
The concept of length, the concept is synonymous, the concept is nothing more than, the proper definition of a concept ... Forget programs and visions; the operational approach refers specifically to concepts, and in a very specific way: it describes the process whereby concepts are transformed into a series of operations, which, in their turn, allow us to measure all sorts of objects. Operationalizing means building a bridge from concepts to measurement, and then to the world. In our case: from the concepts of literary theory, through some form of quantification, to literary texts.
This note discusses the basic economics of central clearing for derivatives and the need for proper regulation, supervision and resolution of central counterparty clearing houses (CCPs). New regulation in the U.S. and in Europe renders the involvement of a central counterparty mandatory for standardized OTC derivatives trading and sets higher capital and collateral requirements for non-centrally cleared derivatives.
From a macrofinance perspective, CCPs present a trade-off between reduced contagion risk in the financial industry and the creation of a significant systemic risk. However, so far, the regulation and supervision of CCPs are fragmented and limited, and they ignore two important aspects: the risk of consolidation of CCPs on the one hand and competition among CCPs on the other. i) Since the economies of scale of CCP operations in risk and cost reduction can be large, they provide an argument in favor of consolidation, leading at the extreme to a monopoly CCP that poses the ultimate default risk: a systemic risk for the entire financial sector. As a systemic risk event requires a government bailout, there is a public policy issue here. ii) As long as no monopoly CCP exists, there is competition for market share among existing CCPs. Such competition may undermine the stability of the entire financial system because it induces "predatory margining": a reduction of margin requirements to increase market share.
The policy lesson from our analysis emphasizes the importance of a single authority supervising all competing CCPs, as well as of a specific regulation and resolution framework for CCPs. Our general recommendations can be applied to the current situation in Europe, including the proposed merger between Deutsche Börse and the London Stock Exchange.
The Act Regulating Public Offers for the Acquisition of Securities and Takeovers (WpÜG), which entered into force on 1 January 2002, is not limited (unlike the earlier discussion draft of the WpÜG) to regulating public offers to acquire securities that are aimed at acquiring control of a target company or that presuppose an existing controlling majority; rather, in §§ 10-28 WpÜG it also lays down rules for all public offers to acquire securities. The obvious question whether these also cover public offers to repurchase a company's own shares, in particular on the basis of an authorization by the general meeting pursuant to § 71(1) no. 8 sentence 1 AktG, is left unanswered by the Act. Initial comments in the literature assume that the WpÜG applies directly to such self tender offers as well, though individual ill-fitting provisions of §§ 10-28 WpÜG must be narrowed teleologically. The authors reject the thesis of the WpÜG's direct applicability to public offers to repurchase own shares and then address the question whether individual provisions of the WpÜG apply to self tender offers by analogy.
The present article explores perceptions and cultural constructions of the terms capitalism and the capitalist West among ex-Soviet, highly qualified Jewish migrants from Russia and Ukraine after their emigration to Germany between 1990 and 1996. Migration, it seems, offers migrants a unique opportunity to become aware of knowledge that is normally taken for granted, of behaviour patterns and values, and to reflect on them. How do they acquire such presumed capitalist knowledge of the new society and new social world, how do they create it, and with what concrete contents do they connect the illusion of monolithic cultural, economic and political capital, an illusion which contributes to group formation and serves as an orientation for action? As my research shows, immigrants try to disparage much of what appeared to them in the Soviet Union as normative, right and appropriate; now they often act according to categories which were defined in the previous context as "capitalist" and interpreted as immoral. Without exact ideas or knowledge of the behaviour codes, unspoken norms and silent values of the new society, many immigrants orient themselves towards the opposite of what counted as morally proper in the society of origin. At the same time, they revive the old system by establishing and developing a Russian-language enclave. Nevertheless, this enclave is not located in a vacuum of "dusty" memories of the past, but builds a transnational cross-border space connected and corresponding to the processes in today's CIS and to the lives of those relatives and friends who still live there, and with whom the emigrants maintain intensive social networks.
This study analyzes the social policy reforms in the USA and Canada during the 1990s in a comparative perspective. It focuses in particular on the role of tax policy instruments in the reforms and asks whether a new type of welfare state is emerging here. The first part of the paper sketches the model of the liberal welfare state established in comparative welfare state research, against which background the reforms in the USA and Canada are examined and compared. The output performance of the two welfare states is then analyzed in a broader comparative perspective. The normative criterion applied here is primarily the redistributive function of social policy instruments, understood chiefly as income redistribution.
On April 24, 2001 the European Commission presented a proposal for a Directive introducing supplementary supervision of financial conglomerates (the Proposed Directive). The Proposed Directive requires closer coordination among the supervisory authorities of different sectors of the financial industry and entails changes to a number of existing Directives relating to the supervision of credit institutions, insurance undertakings and investment firms.
After narrowly losing the election of July 2006, Andrés Manuel López Obrador and his Coalición claimed fraud and asserted that unfair conditions during the campaign had diminished his chances of winning the presidency. The paper investigates this latter allegation, centering on a perceived campaign of hate, unequal access to campaign resources, and malicious treatment by the mass media. It further analyzes the mass media's performance during the conflictual postelectoral period until the final decision of the Federal Electoral Tribunal on September 5, 2006. While the media's performance during the campaign tells us about their compliance with the fair-coverage mechanisms implemented by the electoral reforms of the 1990s, the mass media were unconstrained by such measures after the election. Thus, their mode of covering the postelectoral conflicts allows us to "test" the mass media's transformation into a more unbiased, socially responsible "fourth estate." Finally, the paper scrutinizes whether the claims of fraud and the protests by the leftist movement resulted in lower levels of institutional trust and democratic support. The analysis of media performance is based on data provided by the Federal Electoral Institute (IFE); its Media Monitor encompassed more than 150 TV stations, 240 radio stations and 200 press publications. However, no comparable data are available for the postelectoral period. Interviews with Mexican media experts, which the author conducted during the postelectoral period, serve as the empirical basis for the second part. Data on the public opinions and attitudes of Mexican citizens are taken from the 2007 Latinobarómetro, the 2006 Encuesta Nacional and several polls conducted by Grupo Reforma. The results do not support López Obrador's allegations.
Even though a strong party bias is characteristic of the Mexican media system, all findings point to a continuity of balanced campaign coverage and fair access to mass media publicity. Coverage during the postelectoral period was more polarized, yet both sides remained at least partially open to oppositional views. The claims of fraud, mass protest mobilization and anti-institutional discourse by López Obrador's leftist movement seem not to have caused a significant loss in institutional trust, support of and satisfaction with democracy, even though these levels remain quite low.
The securitization of a company's cash flow is a well-known and established form of corporate financing in the United Kingdom. In Germany, only two transactions of this kind have taken place so far. The reasons lie in the two countries' different legal systems and the different options available for collateralizing loans. This article describes the essential differences in this respect and presents structures with which corresponding transactions can also be implemented under German law.
The late antique 'Disticha Catonis', written by an unknown author in the 3rd/4th century, served from Carolingian times onward in the teaching of gramatica. There they provided the Latin pupil with linguistic material for study as well as, no less important to medieval elementary instruction, basic moral teaching in an easily memorized form. Thus the roughly 140 hexameter distichs, from a broadly popular-Stoic stance, give instruction in the right way to deal with property, with one's own emotions, and with suffering and death, and in how to behave toward strangers, friends or one's own wife. Already in several pre-Carolingian redactions the body of distichs was divided into four books and expanded by prose sentences (breves sententiae) before Book I and metrical prefaces to Books II-IV. A brief prose preface (praefatio) opening the work is put into the mouth of a caring father who commends the teachings to his beloved son.
In this paper we analyze an economy with two heterogeneous investors who both exhibit misspecified filtering models for the unobservable expected growth rate of the aggregated dividend. A key result of our analysis with respect to long-run investor survival is that there are degrees of model misspecification on the part of one investor for which there is no compensation by the other investor's deficiency. The main finding with respect to the asset pricing properties of our model is that the two dimensions of asset pricing and survival are basically independent. In scenarios when the investors are more similar with respect to their expected consumption shares, return volatilities can nevertheless be higher than in cases when they are very different.
This paper examines the Chilean Pension Reform in detail, first giving an overview of the mandatory saving plan, the relevant institutions, and the rules for the transition from the old to the new system. The main part of the paper contains a critical evaluation of the reform: in particular, it discusses the macroeconomic performance with respect to capital formation and growth, as well as the effects on the savings rate, rates of return, and the labor market. Furthermore, the development of capital markets is reviewed. A short critique is presented with respect to intergenerational distribution and risk sharing as well as to the social consequences. This paper is the result of a CFS-sponsored research project. A preliminary version was presented at the meeting of the committee of Social Policy of the Verein fuer Socialpolitik, May 1999, and at the 55th Congress of the IIPF, 23-26 August 1999, in Moscow.
Motivated by the U.S. events of the 2000s, we address whether a "too low for too long" interest rate policy may generate a boom-bust cycle. We simulate anticipated and unanticipated monetary policies in state-of-the-art DSGE models and in a model with bond financing via a shadow banking system, in which the bond spread is calibrated for normal and optimistic times. Our results suggest that the U.S. boom-bust was caused by the combination of (i) too low for too long interest rates, (ii) excessive optimism and (iii) a failure of agents to anticipate the extent of the abnormally favorable conditions.
The term structure of interest rates is crucial for the transmission of monetary policy to financial markets and the macroeconomy. Disentangling the impact of monetary policy on the components of interest rates, expected short rates and term premia, is essential to understanding this channel. To accomplish this, we provide a quantitative structural model with endogenous, time-varying term premia that are consistent with empirical findings. News about future policy, in contrast to unexpected policy shocks, has quantitatively significant effects on term premia along the entire term structure. This provides a plausible explanation for partly contradictory estimates in the empirical literature.
June 4th, 2013 marks the formal launch of the third generation of the Equator Principles (EP III) and the tenth anniversary of the EPs – reasons enough to evaluate the EPs initiative from an economic ethics and business ethics perspective. In particular, this essay deals with the following questions: What are the EPs and where are they going? What has been achieved so far by the EPs? What are the strengths and weaknesses of the EPs? Which reform steps need to be adopted in order to further strengthen the EPs framework? Can the EPs be regarded as a role model in the field of sustainable finance and CSR? The paper is structured as follows: The first chapter defines the term EPs and introduces the keywords related to the EPs framework. The second chapter gives a brief overview of the history of the EPs. The third chapter discusses the Equator Principles Association, the governing, administering, and managing institution behind the EPs. The fourth chapter summarizes the main features and characteristics of the newly released third generation of the EPs. The fifth chapter critically evaluates EP III from an economic ethics and business ethics perspective. The paper concludes with a summary of the main findings.
We collect data on the size distribution of all U.S. corporate businesses for 100 years. We document that corporate concentration (e.g., asset share or sales share of the top 1%) has increased persistently over the past century. Rising concentration was stronger in manufacturing and mining before the 1970s, and stronger in services, retail, and wholesale after the 1970s. Furthermore, rising concentration in an industry aligns closely with investment intensity in research and development and information technology. Industries with higher increases in concentration also exhibit higher output growth. The long-run trends of rising corporate concentration indicate increasingly stronger economies of scale.
This policy letter provides an overview of the strengths, weaknesses, risks and opportunities of the upcoming comprehensive risk assessment, a euro area-wide evaluation of bank balance sheets and business models. If carried out properly, the 2014 comprehensive assessment will lead the euro area into a new era of banking supervision. Policy makers in euro area countries are now under severe pressure to define a credible backstop framework for banks. This framework, as the author argues, needs to be a broad, quasi-European system of mutually reinforcing backstops.
We introduce a regularization and blocking estimator for well-conditioned high-dimensional daily covariances using high-frequency data. Using the Barndorff-Nielsen, Hansen, Lunde, and Shephard (2008a) kernel estimator, we estimate the covariance matrix block-wise and regularize it. A data-driven grouping of assets of similar trading frequency ensures the reduction of data loss due to refresh time sampling. In an extensive simulation study mimicking the empirical features of the S&P 1500 universe we show that the ’RnB’ estimator yields efficiency gains and outperforms competing kernel estimators for varying liquidity settings, noise-to-signal ratios, and dimensions. An empirical application of forecasting daily covariances of the S&P 500 index confirms the simulation results.
Consumers purchase energy in many forms. Sometimes energy goods are consumed directly, for instance, in the form of gasoline used to operate a vehicle, electricity to light a home, or natural gas to heat a home. At other times, the cost of energy is embodied in the prices of goods and services that consumers buy, say when purchasing an airline ticket or when buying online garden furniture made from plastic to be delivered by mail. Previous research has focused on quantifying the pass-through of the price of crude oil or the price of motor gasoline to U.S. inflation. Neither approach accounts for the fact that percent changes in refined product prices need not be proportionate to the percent change in the price of oil, that not all energy is derived from oil, and that the correlation of price shocks across energy markets is far from one. This paper develops a vector autoregressive model that quantifies the joint impact of shocks to several energy prices on headline and core CPI inflation. Our analysis confirms that focusing on gasoline price shocks alone will underestimate the inflationary pressures emanating from the energy sector, but not enough to overturn the conclusion that much of the observed increase in headline inflation in 2021 and 2022 reflected non-energy price shocks.
In May 2008, Cyclone Nargis swept across Myanmar/Burma, killing 140,000 people. The autocratically governed country, however, rejected disaster relief as interference in its internal affairs and refused the import of medicine and food. In view of this situation, the French foreign minister Kouchner urged the UN to act on the basis of the Responsibility to Protect (R2P).
This act of securitization, however, stands in contrast to the media coverage, as Gabi Schlag examines in this paper. The visual material from the disaster area in particular tells a different story. The photos in BBC.com's coverage of the topic form a visual narrative that suggests not helplessness but controlled, level-headed action by local forces. This contrast points to the proverbial power of images, which pre-structure the respective conditions for possible action.
I present a new business cycle model in which decision making follows a simple mental process motivated by neuroeconomics. Decision makers first compute the value of two different options and then choose the option that offers the highest value, but with errors. The resulting model is highly tractable and intuitive. A demand function in levels replaces the traditional Euler equation. As a result, even liquid consumers can have a large marginal propensity to consume. The interest rate affects consumption through the cost of borrowing and not through intertemporal substitution. I discuss the implications for stimulus policies.
A call on art investments
(2010)
The art market has seen boom and bust during the last years and, despite the downturn, has received more attention from investors given the low interest rate environment following the financial crisis. However, participation has been reserved for a few investors and the hedging of exposures remains difficult. This paper proposes to overcome these problems by introducing a call option on an art index, derived from one of the most comprehensive data sets of art market transactions. The option allows investors to optimize their exposure to art. For pricing purposes, non-tradability of the art index is acknowledged and option prices are derived in an equilibrium setting as well as by replication arguments. In the former, option prices depend on the attractiveness of gaining exposure to a previously non-traded risk. This setting further overcomes the problem of art market exposures being difficult to hedge. Results in the replication case are primarily driven by the ability to reduce residual hedging risk. Even if this is not entirely possible, the replication approach serves as a pricing benchmark for investors who are significantly exposed to art and try to hedge their art exposure by selling a derivative. JEL Classification: G11, G13, Z11
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation and a non-deterministic operator amb that is locally bottom-avoiding. We use a small-step operational semantics in the form of a single-step rewriting system that defines a (non-deterministic) normal order reduction. This strategy can be made fair by adding resources for bookkeeping. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into any program context, their termination behaviour is the same, where we use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We show that we can drop the fairness condition for equational reasoning, since the valid equations w.r.t. normal order reduction are the same as for fair normal order reduction. We develop different proof tools for proving correctness of program transformations; in particular, a context lemma for may- as well as must-convergence is proved, which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. We also prove a standardisation theorem for fair normal order reduction. The structure of the ordering <=c is also analysed: Ω is not a least element, and <=c already implies contextual equivalence w.r.t. may-convergence.
We present a higher-order call-by-need lambda calculus enriched with constructors, case-expressions, recursive letrec-expressions, a seq-operator for sequential evaluation and a non-deterministic operator amb, which is locally bottom-avoiding. We use a small-step operational semantics in the form of a normal order reduction. As equational theory we use contextual equivalence, i.e. terms are equal if, plugged into an arbitrary program context, their termination behaviour is the same. We use a combination of may- as well as must-convergence, which is appropriate for non-deterministic computations. We develop different proof tools for proving correctness of program transformations. We provide a context lemma for may- as well as must-convergence which restricts the number of contexts that need to be examined for proving contextual equivalence. In combination with so-called complete sets of commuting and forking diagrams we show that all the deterministic reduction rules and also some additional transformations preserve contextual equivalence. In contrast to other approaches, our syntax as well as our semantics does not make use of a heap for sharing expressions. Instead we represent these expressions explicitly via letrec-bindings.
Extending the data set used in Beyer (2009) to 2017, we estimate I(1) and I(2) money demand models for euro area M3. After including two broken trends and a few dummies to account for shifts in the variables following the global financial crisis and the ECB's non-standard monetary policy measures, we find that the money demand and the real wealth relations identified in Beyer (2009) have remained remarkably stable throughout the extended sample period. Testing for price homogeneity in the I(2) model we find that the nominal-to-real transformation is not rejected for the money relation whereas the wealth relation cannot be expressed in real terms.
This paper proves the correctness of Nöcker's method of strictness analysis, implemented in the Clean compiler, which is an effective method of strictness analysis in lazy functional languages based on their operational semantics. We improve upon the work of Clark, Hankin and Hunt on the correctness of the abstract reduction rules. Our method fully considers the cycle detection rules, which are the main strength of Nöcker's strictness analysis. Our algorithm SAL is a reformulation of Nöcker's strictness analysis algorithm in a higher-order call-by-need lambda calculus with case, constructors, letrec, and seq, extended by set constants like Top or Inf, denoting sets of expressions. It is also possible to define new set constants by recursive equations with a greatest fixpoint semantics. The operational semantics is a small-step semantics. Equality of expressions is defined by a contextual semantics that observes termination of expressions. Basically, SAL is a non-termination checker. The proof of its correctness, and hence of Nöcker's strictness analysis, is based mainly on an exact analysis of the lengths of normal order reduction sequences; the main measure is the number of 'essential' reductions in a normal order reduction sequence. Our tools and results provide new insights into call-by-need lambda calculi, the role of sharing in functional programming languages, and strictness analysis in general. The correctness result provides a foundation for Nöcker's strictness analysis in Clean, and also for its use in Haskell.
In this paper we analyze the semantics of a higher-order functional language with concurrent threads, monadic IO and synchronizing variables as in Concurrent Haskell. To assure declarativeness of concurrent programming we extend the language by implicit, monadic, and concurrent futures. As semantic model we introduce and analyze the process calculus CHF, which represents a typed core language of Concurrent Haskell extended by concurrent futures. Evaluation in CHF is defined by a small-step reduction relation. Using contextual equivalence based on may- and should-convergence as program equivalence, we show that various transformations preserve program equivalence. We establish a context lemma easing those correctness proofs. An important result is that call-by-need and call-by-name evaluation are equivalent in CHF, since they induce the same program equivalence. Finally we show that the monad laws hold in CHF under mild restrictions on Haskell’s seq-operator, which for instance justifies the use of the do-notation.
Undoubtedly, every competent speaker has at some point been in doubt about which of two or more almost identical competing variants of words, word forms or sentence and phrase structures is correct or appropriate and should be used (in the standard language) (e.g. German "Pizzas/Pizzen/Pizze" 'pizzas', Dutch "de drie mooiste/mooiste drie stranden" 'the three most beautiful/most beautiful three beaches', Swedish "större än jag/mig" 'taller than I/me'). Such linguistic uncertainties or "cases of doubt" (cf. i.a. Klein 2003, 2009, 2018; Müller & Szczepaniak 2017; Schmitt, Szczepaniak & Vieregge 2019; Stark 2019 as well as the useful collections of data in Duden vol. 9, Taaladvies.net, Språkriktighetsboken etc.) occur systematically even among native speakers, and they do not necessarily coincide with the difficulties of second language learners. In present-day German, most grammatical uncertainties occur in the domains of inflection (nominal plural formation, genitive singular allomorphy of strong masc./neut. nouns, inflectional variation of weak masc. nouns, strong/weak adjectival inflection and comparison forms, strong/weak verb forms, perfect auxiliary selection) and word-formation (linking elements in compounds, separability of complex verbs). As for syntax, there are often doubts in connection with case choice (pseudo-partitive constructions, prepositional case government) and agreement (especially due to coordination or appositional structures). This contribution presents a contrastive approach to morphological and syntactic uncertainties in contemporary Germanic languages (mostly German, Dutch, and Swedish) in order to obtain a broader and more fine-grained typology of grammatical instabilities and their causes.
As will be discussed, most doubts of competent speakers - a problem also for general linguistic theory - can be attributed to processes of language change in progress, to language or variety contact, to gaps and rule conflicts in the grammar of every language, or to psycholinguistic conditions of language processing. Our main concerns are which (kinds of) common or different critical areas there are within Germanic (and, on the other hand, in which areas there are no doubts), which of the established (cross-linguistically valid) explanatory approaches apply to which phenomena and, ultimately, whether the new data reveal further lines of explanation for the empirically observable (standard) variation.
This paper examines optimal environmental policy when external financing is costly for firms. We introduce emission externalities and industry equilibrium in the Holmström and Tirole (1997) model of corporate finance. While a cap-and-trade system optimally governs both firms' abatement activities (internal emission margin) and industry size (external emission margin) when firms have sufficient internal funds, external financing constraints introduce a wedge between these two objectives. When a sector is financially constrained in the aggregate, the optimal cap is strictly above the Pigouvian benchmark and emission allowances should be allocated below market prices. When a sector is not financially constrained in the aggregate, a cap that is below the Pigouvian benchmark optimally shifts market share to less polluting firms and, moreover, there should be no "grandfathering" of emission allowances. With financial constraints and heterogeneity across firms or sectors, a uniform policy, such as a single cap-and-trade system, is typically not optimal.
This paper deals with the proposed use of sovereign credit ratings in the "Basel Accord on Capital Adequacy" (Basel II) and considers its potential effect on emerging market financing. It offers a first investigation of the consequences of the planned revisions for the two central aspects of international bank credit flows: the impact on capital costs and the volatility of credit supply across the risk spectrum of borrowers. The empirical findings cast doubt on the usefulness of credit ratings in determining commercial banks' capital adequacy ratios, since the standardized approach to credit risk would lead to more divergence rather than convergence between investment-grade and speculative-grade borrowers. This conclusion is based on the lateness and cyclical determination of credit rating agencies' sovereign risk assessments and the continuing incentives for short-term rather than long-term interbank lending ingrained in the proposed Basel II framework.
The merchant language of the Georgian Jews deserves scholarly attention for several reasons. The political and social developments of the last fifty years have caused the extinction of this very interesting form of communication, as most Georgian Jews have emigrated to Israel. In a natural interaction, the type of language described in this article can be found very rarely, if at all. Records of this communication have been preserved in various contexts and received different levels of scholarly attention. Our interest concerns the linguistic aspects as well as the classification.
In the following paper we argue that the specific merchant language of Georgian Jews belongs to the pragmatic phenomenon of “very indirect language.” The use of mostly Hebrew lexemes in Georgian conversation leads to an unfounded assumption that the speakers are equally competent in Hebrew and Georgian. It is reported that a high level of linguistic competence in Hebrew does not guarantee understanding of the Jewish merchant language. In the Georgian context, the decisive factors are membership in the professional interest group of merchants and residential membership in the Jewish community. These factors seem to be equivalent, because Jewish members of other professional groups (and those from outside the particular urban residential area) have difficulties in following the language that are similar to those of the Georgian majority. We describe the pragmatic structure of interactions conducted with the help of the merchant language and take into account the purpose of the language’s use or the intention of the speakers. Relevant linguistic examples are analysed and their sociocultural contexts explained.
In the EU there are longstanding and ongoing pressures towards a tax that is levied on the EU level to substitute for national contributions. We discuss conditions under which such a transition can make sense, starting from what we call a "decentralization theorem of taxation" that is analogous to Oates' (1972) famous result that, in the absence of spill-over effects and economies of scale, decentralized public good provision weakly dominates central provision. We then drop assumptions that turn out to be unnecessary for this result. While spill-over effects of taxation may call for central rules for taxation, as long as spill-over effects do not depend on the intra-regional distribution of the tax burden, decentralized taxation plus tax coordination is found superior to a union-wide tax.
Context unification is a variant of second-order unification and also a generalization of string unification. Currently it is not known whether context unification is decidable. An expressive fragment of context unification is stratified context unification. Recently, it turned out that stratified context unification and one-step rewrite constraints are equivalent. This paper contains a description of a decision algorithm SCU for stratified context unification, together with a proof of its correctness, which shows the decidability of stratified context unification as well as of the satisfiability of one-step rewrite constraints.
The emergence of Capitalism is said to always lead to extreme changes in the structure of a society. This view implies that Capitalism is a universal and unique concept that needs an explicit institutional framework and does not discriminate between, say, a German or a US Capitalism. In contrast, this work argues that the 'ideal type' of Capitalism in a Weberian sense does not exist. It will be demonstrated that Capitalism is not a concept that shapes a uniform institutional framework within every society, constructing a specific economic system. Rather, depending on the institutional environment - family structures in particular - different forms of Capitalism arise. To exemplify this, the networking (Guanxi) Capitalism of contemporary China will be presented, where social institutions known from the past were reinforced for successful development. It will be argued that especially the change, destruction and creation of family and kinship structures are key factors that determined the further development and success of the Chinese economy and the type of Capitalism arising there. In contrast to Weber, it will be argued that Capitalism does not necessarily lead to a process of destruction of traditional structures and to large-scale enterprises under rational, bureaucratic management, leaving no space for socio-cultural structures like family businesses. Flexible global production increasingly favours small business production over larger corporations. Small Chinese family firms are able to respond to rapidly changing market conditions and motivate maximum efforts for modest pay. The structure of the Chinese family has proved to be very persistent over time and able to accommodate diverse economic and political environments while maintaining its core identity. This implies that Chinese Capitalism may be an entirely new economic system, based on Guanxi and the family.
Artificial drainage of agricultural land, for example with ditches or drainage tubes, is used to avoid waterlogging and to manage high groundwater tables. Among other impacts, it influences nutrient balances by increasing leaching losses and by decreasing denitrification. To simulate terrestrial transport of nitrogen on the global scale, a digital global map of artificially drained agricultural areas was developed. The map depicts the percentage of each 5' by 5' grid cell that is equipped for artificial drainage. Information on artificial drainage in countries or sub-national units was mainly derived from international inventories. Distribution to grid cells was based, for most countries, on the "Global Croplands Dataset" of Ramankutty et al. (1998) and the "Digital Global Map of Irrigation Areas" of Siebert et al. (2005). For some European countries the CORINE land cover dataset was used instead of the two datasets mentioned above. Maps with outlines of artificially drained areas were available for 6 countries. The global drainage area on the map is 167 million hectares. For only 11 of the 116 countries with information on artificial drainage areas could sub-national information be taken into account. Due to this coarse spatial resolution of the data sources, we recommend using the map of artificially drained areas only for continental- to global-scale assessments. This documentation describes the dataset, the data sources and the map generation, and it discusses the data uncertainty.
The Land and Water Development Division of the Food and Agriculture Organization of the United Nations and the Johann Wolfgang Goethe University, Frankfurt am Main, Germany, are cooperating in the development of a global irrigation-mapping facility. This report describes an update of the Digital Global Map of Irrigated Areas for the continent of Asia. For this update, an inventory of subnational irrigation statistics for the continent was compiled. The reference year for the statistics is 2000. Adding up the irrigated areas per country as documented in the report gives a total of 188.5 million ha for the entire continent. The total number of subnational units used in the inventory is 4 428. In order to distribute the irrigation statistics per subnational unit, digital spatial data layers and printed maps were used. Irrigation maps were derived from project reports, irrigation subsector studies, and books related to irrigation and drainage. These maps were digitized and compared with satellite images of many regions. In areas without spatial information on irrigated areas, additional information was used to locate areas where irrigation is likely, such as land-cover and land-use maps that indicate agricultural areas or areas with crops that are usually grown under irrigation. Contents: 1. Working Report I: Generation of a map of administrative units compatible with statistics used to update the Digital Global Map of Irrigated Areas in Asia. 2. Working Report II: The inventory of subnational irrigation statistics for the Asian part of the Digital Global Map of Irrigated Areas. 3. Working Report III: Geospatial information used to locate irrigated areas within the subnational units in the Asian part of the Digital Global Map of Irrigated Areas. 4. Working Report IV: Update of the Digital Global Map of Irrigated Areas in Asia, Results Maps.
The Stanford Project on Language Universals began its activities in October 1967 and brought them to an end in August 1976. Its directors were Joseph H. Greenberg and Charles A. Ferguson. The Cologne Project on Language Universals and Typology [with particular reference to functional aspects], abbreviated UNITYP, had its early beginnings in 1972, but deployed its full activities from 1976 onwards and is still operating. This writer, who is the principal investigator, had the privilege of collaborating with the Stanford Project during the spring of 1976. […] One of the leading Greenbergian ideas, that of implicational generalizations, has been integrated as a fundamental principle in the construction of continua and of universal dimensions as proposed by UNITYP. It is hoped that the following considerations on numeral systems will bear witness to this situation. They would be unthinkable without Greenberg’s pioneering work on "Generalizations about numeral systems" (Greenberg 1978: 249 ff., henceforth referred to as Greenberg, NS). Further work on this domain and on other comparable domains almost inevitably leads one to the view that generalizations of the Greenberg type have a functional significance and that a dimensional framework is apt to bring this to the fore. This is the view of linguistic behaviour as purposeful, and of language as a problem-solving device. The problem consists in the linguistic representation of cognitive-conceptual ideas. The solution is represented by the corresponding linguistic structures in their diversity, and the task of the linguist consists in reconstructing the program and subprograms underlying the process of problem-solving. It is claimed that the construct of continua and of universal dimensions makes these programs intelligible.
Recent models with liquidity constraints and impatience emphasize that consumers use savings to buffer income fluctuations. When wealth is below an optimal target, consumers try to increase their buffer stock of wealth by saving more. When it is above target, they increase consumption. This important implication of the buffer stock model of saving has not been subject to direct empirical testing. We derive from the model an appropriate theoretical restriction and test it using data on working-age individuals drawn from the 2002 and 2004 Italian Surveys of Household Income and Wealth. One of the most appealing features of the survey is that it has data on the amount of wealth held for precautionary purposes, which we interpret as target wealth in a buffer stock model. The test results do not support buffer stock behavior, even among population groups that are more likely, a priori, to display such behavior. The saving behavior of young households is instead consistent with models in which impatience, relative to prudence, is not as high as in buffer stock models. JEL Classification: D91
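The target-reversion mechanism of the buffer stock model described above can be illustrated with a stylized simulation: whenever wealth is below the target, the consumer spends less than income and the buffer grows; above target, spending exceeds income. This is a toy sketch with an invented adjustment rule and parameters, not the paper's model or test.

```python
import random

def simulate_buffer_stock(target=4.0, adjust=0.25, periods=200, seed=1):
    """Stylized buffer-stock behavior: each period the consumer spends
    permanent income (1.0) plus a fraction `adjust` of the gap between
    current wealth and the target buffer, so wealth mean-reverts to the
    target despite transitory income shocks."""
    random.seed(seed)
    wealth = 0.0
    path = []
    for _ in range(periods):
        income = 1.0 + random.uniform(-0.3, 0.3)   # transitory shock
        consumption = 1.0 + adjust * (wealth - target)
        wealth += income - consumption
        path.append(wealth)
    return path

path = simulate_buffer_stock()
```

Starting from zero wealth, the simulated path drifts up toward the target of 4 and then fluctuates around it, which is the behavior the paper's restriction is designed to detect in the survey data.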
The article, which summarizes key findings of my German book ‘Die Gemeinfreiheit. Begriff, Funktion, Dogmatik’ (‘The Public Domain: Theory, Function, Doctrine’), asks whether there are any provisions or principles under German and EU law that protect the public domain from interference by the legislature, courts and private parties. In order to answer this question, it is necessary to step out of the intellectual property (IP) system and to analyze this body of law from the outside, and – even more important – to develop a positive legal conception of the public domain as such. By giving the public domain a proper doctrinal place in the legal system, the structural asymmetry between heavily theorized and protected IP rights on the one hand and a neglected public domain on the other is countered. The overarching normative purpose is to develop a framework for a balanced IP system, which can only be achieved if the public domain forms an integral part of the overall regulation of information.
This paper considers a trading game in which sequentially arriving liquidity traders opt either for a market order or for a limit order. One class of traders is considered to have an extended trading horizon, implying that their impatience is linked to their trading orientation. More specifically, sellers are considered to have a trading horizon of two periods, whereas buyers have only a single-period trading scope (the extended buyer-horizon case is completely symmetric). Clearly, as the life span of their submitted limit orders is longer, this setting implies that sellers are granted a natural advantage in supplying liquidity. This benefit is hampered, however, by the direct competition arising between consecutively arriving sellers. Closed-form characterizations of the order submission strategies are obtained when solving for the equilibrium of this dynamic game. These allow us to examine how these forces affect traders' order placement decisions. Further, the analysis yields insight into the dynamic process of price formation and into the market clearing process of a non-intermediated, order-driven market.
This paper studies constrained portfolio problems that may involve constraints on the probability or the expected size of a shortfall of wealth or consumption. Our first contribution is that we solve the problems by dynamic programming, in contrast to the existing literature, which applies the martingale method. More precisely, we construct the non-separable value function by formalizing the optimal constrained terminal wealth as a (conjectured) contingent claim on the optimal non-constrained terminal wealth. This is relevant by itself, but also opens up the opportunity to derive new solutions to constrained problems. As a second contribution, we thus derive new results for non-strict constraints on the shortfall of intermediate wealth and/or consumption.
Though often adopting a feminist perspective, the sociological literature on migrant domestic services (MDS) does not make explicit which feminist paradigm it speaks from. This article situates this literature within ongoing debates in feminist theory, in particular the tension between materialist and poststructuralist approaches. It then discusses the empirical relevance of each of these two paradigms, using as an example the results of original research into the personalization of employment relationships in MDS.
The contribution proposes a new way of making sense of the diversity of feminist theories, distinguishing between modern and postmodern approaches. Indeed, since the 1980s, feminist theory in the US and Western Europe has undergone a ‘postmodern turn’, which renders previous typologies much less in step with recent developments in the field. The article then examines which paradigms are implicit in the sociological literature on MDS. Initially, personalization in MDS was mainly seen in materialist terms, as a way to maximize the quantity and quality of labour (including emotional labour) extracted from domestic workers. The emergence of postmodern approaches in feminist theory set off a progressive shift in the MDS literature. First, this literature showed that personalization also fulfils identity functions for employers and workers; then it widened its focus to include the affective dimensions of domestic labour (not to be confused with emotional labour). The final section shows how modern and postmodern feminist approaches can be combined within a single study, using as an example original research on personalization in MDS in Belgium and Poland. In particular, the contribution shows that the distinction between the material functions of personalization on the one hand and its emotional/identity functions on the other is not empirically operative. Indeed, migrant domestic workers generally use emotional/identity categories to frame material questions, and vice versa. This final part shows that, rather than representing incompatible approaches, modern and postmodern feminisms complement each other, in this case yielding a fuller picture of personalization processes in MDS.
A version of this paper was originally written for a plenary session about "The Futures of Ethnography" at the 1998 EASA conference in Frankfurt/Main. In the preparation of the paper, I sent out some questions to my former fellow researchers by e-mail. I thank Douglas Anthony, Jan-Patrick Heiß, Alaine Hutson, Matthias Krings, and Brian Larkin for their answers.
The paper proposes a variation of simulation for checking and proving contextual equivalence in a non-deterministic call-by-need lambda-calculus with constructors, case, seq, and a letrec with cyclic dependencies. It also proposes a novel method to prove its correctness. The calculus’ semantics is based on a small-step rewrite semantics and on may-convergence. The cyclic nature of letrec bindings, as well as nondeterminism, makes known approaches to prove that simulation implies contextual equivalence, such as Howe’s proof technique, inapplicable in this setting. The basic technique for the simulation as well as the correctness proof is called pre-evaluation, which computes a set of answers for every closed expression. If simulation succeeds in finite computation depth, then it is guaranteed to show contextual preorder of expressions.
We selectively survey, unify and extend the literature on realized volatility of financial asset returns. Rather than focusing exclusively on characterizing the properties of realized volatility, we progress by examining economically interesting functions of realized volatility, namely realized betas for equity portfolios, relating them both to their underlying realized variance and covariance parts and to underlying macroeconomic fundamentals.
The human mind may produce prototypization within virtually any realm of cognition and behavior. A "comparative prototype-typology" might prove to be an interesting field of study – perhaps a new subfield of semiotics. This, however, would presuppose a clear view of the samenesses and differences of prototypization in these various fields. It seems realistic for the time being that the linguist first confine himself to describing prototypization within the realm of language proper. The literature on prototypes has grown steadily in the past ten years or so. I confine myself to mentioning the volume Noun Classes and Categorization, edited by C. Craig (1986), which contains a wealth of factual information on the subject, along with some theoretical vistas. By and large, however, linguistic prototype research is still basically in a taxonomic stage – which, of course, represents the precondition for moving beyond it. The procedure is largely per ostensionem, by accumulating examples of prototypes. We still lack a comprehensive prototype theory. The following pages are intended not to provide such a theory, but to take the first steps in this direction. Section 2 will present some elements of a functional theory of prototypes, developed by this author within the frame of the UNITYP model of research on language universals and typology. Section 3 will discuss prototypization with regard to selected phenomena across a wide range of levels of analysis: phonology, morphosyntax, speech acts, and the lexicon. Prototypization will finally be studied within one of the universal dimensions, that of APPREHENSION – the linguistic representation of the concepts of objects – as proposed by Seiler (1986).
Futures markets are a potentially valuable source of information about market expectations. Exploiting this information has proved difficult in practice, because the presence of a time-varying risk premium often renders the futures price a poor measure of the market expectation of the price of the underlying asset. Even though the expectation in principle may be recovered by adjusting the futures price by the estimated risk premium, a common problem in applied work is that there are as many measures of market expectations as there are estimates of the risk premium. We propose a general solution to this problem that allows us to uniquely pin down the best possible estimate of the market expectation for any set of risk premium estimates. We illustrate this approach by solving the long-standing problem of how to recover the market expectation of the price of crude oil. We provide a new measure of oil price expectations that is considerably more accurate than the alternatives and more economically plausible. We discuss implications of our analysis for the estimation of economic models of energy-intensive durables, for the debate on speculation in oil markets, and for oil price forecasting.
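The adjustment logic described above can be sketched in a toy form: each candidate risk-premium estimate implies an expectation series (futures price minus premium), and candidates can be ranked by how accurately the implied expectation forecasts realized prices. This only illustrates the idea of ranking premium adjustments by forecast accuracy; the paper's criterion for uniquely pinning down the best estimate is more general, and all numbers and names below are hypothetical.

```python
# Toy comparison of risk-premium candidates: the implied expectation is
# F_t - rp_t; rank candidates by mean squared forecast error against the
# realized price of the underlying (invented figures, not the paper's data).

def best_expectation(futures, realized, premium_candidates):
    def mse(premia):
        errs = [(f - rp - s) ** 2
                for f, rp, s in zip(futures, premia, realized)]
        return sum(errs) / len(errs)
    scores = {name: mse(premia)
              for name, premia in premium_candidates.items()}
    return min(scores, key=scores.get), scores

futures  = [70.0, 72.0, 68.0, 75.0]
realized = [68.5, 70.2, 66.8, 73.1]          # later spot prices
candidates = {
    "zero_premium":     [0.0, 0.0, 0.0, 0.0],   # futures price as-is
    "constant_premium": [1.6, 1.6, 1.6, 1.6],
}
best, scores = best_expectation(futures, realized, candidates)
```

In this invented sample the constant-premium adjustment forecasts realized prices far better than the raw futures price, mirroring the paper's point that the unadjusted futures price is a poor measure of the market expectation.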
We examine how U.S. monetary policy affects the international activities of U.S. banks. We access a rarely studied U.S. bank-level dataset to assess, at a quarterly frequency, how changes in the U.S. federal funds rate (before the crisis) and quantitative easing (after the onset of the crisis) affect changes in cross-border claims by U.S. banks across countries, maturities and sectors, as well as changes in claims by their foreign affiliates. We find robust evidence consistent with the existence of a potent global bank lending channel. In response to changes in U.S. monetary conditions, U.S. banks strongly adjust their cross-border claims in both the pre- and post-crisis periods. However, we also find that U.S. bank affiliate claims respond mainly to host-country monetary conditions.
As part of the Next Generation EU (NGEU) program, the European Commission has pledged to issue up to EUR 250 billion of the NGEU bonds as green bonds, in order to confirm its commitment to sustainable finance and to support the transition towards a greener Europe. Thereby, the EU is not only entering the green bond market but is also set to become one of the biggest green bond issuers. Consequently, financial market participants are eager to know what to expect from the EU as a new green bond issuer and whether a negative green bond premium, a so-called Greenium, can be expected for the NGEU green bonds. This research paper formulates an expectation regarding a potential Greenium for the NGEU green bonds by conducting interviews with 15 sustainable finance experts and analyzing the public green bond market from September 2014 until June 2021 with respect to a potential green bond premium and its underlying drivers. The regression results confirm the existence of a significant Greenium (-0.7 bps) in the public green bond market and show that the Greenium increases for supranational issuers with AAA rating, such as the EU. Moreover, the green bond premium is influenced by issuer sector and credit rating, but issue size and modified duration have no significant effect. Overall, the evaluated expert interviews and regression analysis lead to an expected Greenium for the NGEU green bonds of up to -4 bps, with the potential to increase further in the secondary market.
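The sign convention behind a Greenium can be illustrated with a matched-pair comparison: for each green bond, subtract the yield of an otherwise comparable conventional bond from the same issuer; a negative average difference means investors accept lower yields on green bonds. This is only a sketch of the matched-pair logic with invented yields; the paper's evidence comes from a regression analysis with issuer-level controls, not this simple average.

```python
# Hypothetical matched-pair Greenium estimate: average yield difference
# (green minus matched conventional), in basis points. Figures invented.

def greenium_bps(pairs):
    """pairs: list of (green_yield_bps, matched_conventional_yield_bps).
    Returns the average yield difference; negative values indicate a
    Greenium, i.e. green bonds trading at lower yields."""
    diffs = [g - c for g, c in pairs]
    return sum(diffs) / len(diffs)

pairs = [(-18.0, -17.4), (12.0, 12.5), (35.0, 36.1), (5.0, 5.8)]
premium = greenium_bps(pairs)   # negative => Greenium
```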
A hero in the box
(1998)
Among the most bizarre and at the same time most impressive passages in the posthumously published part of Musil's Mann ohne Eigenschaften (The Man Without Qualities) are the fragmentary drafts in which Clarisse pursues the liberation of Moosbrugger, the murderer of women, from the insane asylum. Clarisse belongs to the inner circle of the novel's main characters. Walter's companion is a fervent Nietzsche adept: together with Ulrich, the man without qualities, this yields a three-way constellation whose tension, as the novel progresses, discharges itself more and more in scenic thunderstorms. It is therefore no accident that the fragments revolving around Moosbrugger's liberation display both an intellectual and a pathological dimension. Ulrich is to carry out the deed, and there exists a version from the years 1923/25 in which the undertaking fails and the patient is henceforth taken into stricter custody. In passages written later, however, Moosbrugger is at large. He even commits a further murder, so that there could hardly be any doubt about the author's intention to replace the failed escape with a successful one, were it not that, in the thicket of the late drafts, doubt and retraction themselves had already become the decisive tools with which the narrator charts his course.
We consider an imperfectly competitive loan market in which a local relationship lender has an information advantage vis-à-vis distant transaction lenders. Competitive pressure from the transaction lenders prevents the local lender from extracting the full surplus from projects, so that she inefficiently rejects marginally profitable projects. Collateral mitigates the inefficiency by increasing the local lender’s payoff from precisely those marginal projects that she inefficiently rejects. The model predicts that, controlling for observable borrower risk, collateralized loans are more likely to default ex post, which is consistent with the empirical evidence. The model also predicts that borrowers for whom local lenders have a relatively smaller information advantage face higher collateral requirements, and that technological innovations that narrow the information advantage of local lenders, such as small business credit scoring, lead to a greater use of collateral in lending relationships. JEL classification: D82; G21 Keywords: Collateral; Soft information; Loan market competition; Relationship lending
On average, "young" people underestimate whereas "old" people overestimate their chances to survive into the future. We adopt a Bayesian learning model of ambiguous survival beliefs which replicates these patterns. The model is embedded within a non-expected utility model of life-cycle consumption and saving. Our analysis shows that agents with ambiguous survival beliefs (i) save less than originally planned, (ii) exhibit undersaving at younger ages, and (iii) hold larger amounts of assets in old age than their rational expectations counterparts who correctly assess their survival probabilities. Our ambiguity-driven model therefore simultaneously accounts for three important empirical findings on household saving behavior.
Based on a cognitive notion of neo-additive capacities reflecting likelihood insensitivity with respect to survival chances, we construct a Choquet Bayesian learning model over the life-cycle that generates a motivational notion of neo-additive survival beliefs expressing ambiguity attitudes. We embed these neo-additive survival beliefs as decision weights in a Choquet expected utility life-cycle consumption model and calibrate it with data on subjective survival beliefs from the Health and Retirement Study. Our quantitative analysis shows that agents with calibrated neo-additive survival beliefs (i) save less than originally planned, (ii) exhibit undersaving at younger ages, and (iii) hold larger amounts of assets in old age than their rational expectations counterparts who correctly assess their survival chances. Our neo-additive life-cycle model can therefore simultaneously accommodate three important empirical findings on household saving behavior.
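The neo-additive decision weights described above admit a compact illustration: under a common parametrization of neo-additive capacities (following Chateauneuf, Eichberger and Grant), the Choquet expectation mixes the additive expectation with the best and worst outcomes, with an ambiguity weight delta and an optimism weight lam. The parameter names and numbers below are illustrative assumptions; the paper's calibration to Health and Retirement Study data is far more detailed.

```python
def neo_additive_ceu(utilities, probs, delta, lam):
    """Choquet expected utility under a neo-additive capacity:
    weight (1 - delta) on the additive expectation, plus weight delta
    split between the best outcome (optimism lam) and the worst outcome
    (pessimism 1 - lam). delta = 0 recovers expected utility."""
    expected = sum(u * p for u, p in zip(utilities, probs))
    return ((1 - delta) * expected
            + delta * (lam * max(utilities) + (1 - lam) * min(utilities)))

# Utility of surviving vs. not surviving a period (hypothetical values)
u = [10.0, 2.0]
p = [0.7, 0.3]
rational  = neo_additive_ceu(u, p, delta=0.0, lam=0.5)    # plain E[u]
ambiguous = neo_additive_ceu(u, p, delta=0.4, lam=0.25)
```

With delta = 0 the agent is the rational-expectations benchmark; a positive delta with a pessimistic lam shades the valuation of survival-contingent outcomes downward, the kind of likelihood insensitivity the calibrated model exploits.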
The Inuit inhabit a vast area of land, from a European point of view most inhospitable, stretching from the northeastern tip of Asia to the east coast of Greenland. Inuit peoples have never been numerous, their settlements being scattered over enormous distances. Nevertheless, from an ethnological point of view, all Inuit peoples shared a distinct culture, featuring sea-mammal and caribou hunting, sophisticated survival skills, and technical and social devices, including the sharing of essential goods and strategies for minimizing and controlling aggression.
This paper solves a dynamic model of households' mortgage decisions incorporating labor income, house price, inflation, and interest rate risk. It uses a zero-profit condition for mortgage lenders to solve for equilibrium mortgage rates given borrower characteristics and optimal decisions. The model quantifies the effects of adjustable vs. fixed mortgage rates, loan-to-value ratios, and mortgage affordability measures on mortgage premia and default. Heterogeneity in borrowers' labor income risk is important for explaining the higher default rates on adjustable-rate mortgages during the recent US housing downturn, and the variation in mortgage premia with the level of interest rates.
We focus on the role of social media as a high-frequency, unfiltered mass information transmission channel and how its use for government communication affects the aggregate stock markets. To measure this effect, we concentrate on one of the most prominent Twitter users, the 45th President of the United States, Donald J. Trump. We analyze around 1,400 of his tweets related to the US economy and classify them by topic and textual sentiment using machine learning algorithms. We investigate whether the tweets contain relevant information for financial markets, i.e. whether they affect market returns, volatility, and trading volumes. Using high-frequency data, we find that Trump’s tweets are most often a reaction to pre-existing market trends and therefore do not provide material new information that would influence prices or trading. We show that past market information can help predict Trump’s decision to tweet about the economy.
We present an empirical study focusing on the estimation of a fundamental multi-factor model for a universe of European stocks. Following the approach of the BARRA model, we have adopted a cross-sectional methodology. The proportion of explained variance ranges from 7.3% to 66.3% in the weekly regressions with a mean of 32.9%. For the individual factors we give the percentage of the weeks when they yielded statistically significant influence on stock returns. The best explanatory power – apart from the dominant country factors – was found among the statistical constructs "success" and "variability in markets".
This paper reviews social network analysis (SNA) as a method to be utilized in biographical research, which is a novel contribution. We argue that applying SNA in the context of biographical research, through standardized data collection as well as the visualization of networks, can open up participants' interpretations of relations throughout their lives and allow a creative and innovative way of data collection that is responsive to participants' own meanings and associations while enabling researchers to conduct systematic data analysis. The paper then discusses the analytical potential of SNA in biographical research, critically assessing the method's efficacy and limitations alongside its promise for the field.
Central banks have faced a succession of crises over the past years, as well as a number of structural factors such as the transition to a greener economy, demographic developments, digitalisation and possibly increased onshoring. These suggest that the future inflation environment will differ from the one we know. Uncertainty about important macroeconomic variables, and in particular about inflation dynamics, will therefore likely remain high.
In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis has come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost and on a large scale. Using this approach, a model archive is built that includes many well-known empirically estimated models that may be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of models’ implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the database and compare the effects of novel extensions to established benchmarks, thereby fostering a comparative instead of insular approach to model development.
One of the dangers of the harmonisation and unification processes taking place within the framework of the EU is that they may result in the codification of the lowest common denominator. This is precisely what threatens to happen in respect of assignment. Referring the transfer of receivables by way of assignment to the law of the assignor’s residence, as Article 13 of the Proposal does, would be opting for the most conservative solution and would for many Member States be a step backward rather than forward. A conflict rule referring assignment to the law of the assignor's residence is too rigid to do justice to the dynamic nature of assignments in cross-border transactions, and it is unjustly one-sided. It offers no real advantages compared to other conflict rules; it even has serious disadvantages which make it unsuitable for efficient assignment-based cross-border transactions. It is not inconceivable that this conflict rule would even be contrary to the fundamental freedoms of the EC Treaty. The Community legislators in particular should be careful not to needlessly adopt rules which create insurmountable obstacles for cross-border business where a choice of law by the parties would perfectly suffice. Community legislation has a special responsibility to create a smooth legal environment for single-market transactions.
A new governance architecture for European financial markets? Towards a European supervision of CCPs
(2018)
Does the new European outlook on financial markets, as voiced by the EU Commission since the beginning of the Capital Markets Union, imply a movement of the EU towards an alignment of market integration and direct supervision of common rules? This paper sets out to answer this question for the case of common supervision of Central Counterparties (CCPs) in the European Union. Those entities gained crucial importance post-crisis due to new regulation which requires the mandatory clearing of standardized derivative contracts, transforming clearing houses into central nodes for cross-border financial transactions. While the EU-wide regulatory framework EMIR, enacted in 2012, stipulates common regulatory requirements, the framework still relies on home-country supervision of those rules, arguably leading to regulatory as well as supervisory arbitrage. The regulatory reform to stabilize the OTC derivatives market therefore replicated at its center a governance flaw which had been identified as one of the major causes of the gravity of the financial crisis in the EU: the coupling of intense competition based on private risk management systems with national supervision of European rules. This paper traces the history of this problem awareness and inquires which factors account for the fact that only in 2017 did serious negotiations ensue at the EU level envisioning a common supervision of CCPs to fix the flawed system of governance. Analyzing this shift in the European governance architecture, we argue that Brexit has opened a window of opportunity for a centralization of supervision of CCPs. Brexit aligns the urgency of the problem with the material interests of crucial political stakeholders, in particular Germany and France, providing the possibility for a grand European bargain.
We develop a utility-based model of fluctuations with nominal rigidities and unemployment. In doing so, we combine two strands of research: the New Keynesian model, with its focus on nominal rigidities, and the Diamond-Mortensen-Pissarides model, with its focus on labor market frictions and unemployment. In developing this model, we proceed in two steps. We first leave nominal rigidities aside. We show that, under a standard utility specification, productivity shocks have no effect on unemployment in the constrained efficient allocation. We then focus on the implications of alternative real wage setting mechanisms for fluctuations in unemployment. We then introduce nominal rigidities in the form of staggered price setting by firms. We derive the relation between inflation and unemployment and discuss how it is influenced by the presence of real wage rigidities. We show the nature of the tradeoff between inflation and unemployment stabilization, and we draw the implications for optimal monetary policy. JEL Classification: E32, E50
We extend the important idea of range-based volatility estimation to the multivariate case. In particular, we propose a range-based covariance estimator that is motivated by financial economic considerations (the absence of arbitrage), in addition to statistical considerations. We show that, unlike other univariate and multivariate volatility estimators, the range-based estimator is highly efficient yet robust to market microstructure noise arising from bid-ask bounce and asynchronous trading. Finally, we provide an empirical example illustrating the value of the high-frequency sample path information contained in the range-based estimates in a multivariate GARCH framework.
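The no-arbitrage motivation sketched above can be made concrete: range-based variances (Parkinson's estimator) are available for each asset and for their sum whenever the summed log price has an observable high-low range (for exchange rates, via a traded cross rate), and the covariance then follows from the identity Cov(x, y) = [Var(x + y) - Var(x) - Var(y)] / 2. This is a minimal one-period sketch with invented ranges, not the paper's full estimator.

```python
import math

def parkinson_var(log_range):
    """Parkinson (1980) range-based variance proxy for one period:
    sigma^2 ~= range^2 / (4 ln 2), range = ln(high) - ln(low)."""
    return log_range ** 2 / (4 * math.log(2))

def range_based_cov(range_x, range_y, range_sum):
    """Covariance via the polarization identity
    Cov(x, y) = [Var(x + y) - Var(x) - Var(y)] / 2, with each variance
    estimated from the observed high-low range of the respective series."""
    return 0.5 * (parkinson_var(range_sum)
                  - parkinson_var(range_x)
                  - parkinson_var(range_y))

# Hypothetical daily log-price ranges for two assets and for their sum
cov = range_based_cov(0.02, 0.03, 0.045)
corr = cov / math.sqrt(parkinson_var(0.02) * parkinson_var(0.03))
```

Because the ranges are insensitive to bid-ask bounce at the open and close, the resulting covariance inherits the robustness to microstructure noise emphasized in the abstract.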
The Box-Cox quantile regression model estimated with the two-stage method introduced by Chamberlain (1994) and Buchinsky (1995) provides an attractive extension of linear quantile regression techniques. However, a major numerical problem arises when implementing this method, which has not been addressed so far in the literature. We suggest a simple solution that modifies the estimator slightly. This modification is easy to implement. The modified estimator is still √n-consistent, and its asymptotic distribution can easily be derived. A simulation study confirms that the modified estimator works well.
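For concreteness, the two building blocks of the approach, the Box-Cox transform and the quantile check loss, can be sketched as follows; the inverse transform makes the kind of numerical issue at stake visible, since (λz + 1)^(1/λ) is only defined when λz + 1 > 0. This is an illustrative sketch of the positivity constraint, not the paper's specific modification:

```python
import math

def box_cox(y, lam):
    # Box-Cox transform of a positive observation y; lam = 0 gives log(y).
    if lam == 0:
        return math.log(y)
    return (y ** lam - 1.0) / lam

def inv_box_cox(z, lam):
    # Inverse transform; only defined when lam * z + 1 > 0, a restriction
    # that can fail numerically when evaluating the second-stage objective
    # at trial parameter values.
    if lam == 0:
        return math.exp(z)
    base = lam * z + 1.0
    if base <= 0:
        raise ValueError("lam * z + 1 must be positive")
    return base ** (1.0 / lam)

def check_loss(u, tau):
    # Koenker-Bassett check function rho_tau(u) = u * (tau - 1{u < 0}).
    return u * (tau - (1.0 if u < 0 else 0.0))
```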
This note argues that, in a situation of inelastic natural gas supply, a restrictive monetary policy in the euro zone could reduce the energy bill and therefore has additional merit. A more hawkish monetary policy may be able to indirectly exert monopsony power on the gas market. The welfare benefits of such a policy are diluted to the extent that some of the supply (approximately 10 percent) comes from within the euro zone, which may give rise to distributional concerns.
Riley's (1979) reactive equilibrium concept addresses problems of equilibrium existence in competitive markets with adverse selection. The game-theoretic interpretation of the reactive equilibrium concept in Engers and Fernandez (1987) yields the Rothschild-Stiglitz (1976)/Riley (1979) allocation as an equilibrium allocation; however, multiple equilibria emerge. In this note we embed the reactive equilibrium's logic in a dynamic market context with active consumers. We show that the Riley/Rothschild-Stiglitz contracts constitute the unique equilibrium allocation in any pure-strategy subgame perfect Nash equilibrium.
We build a novel leading indicator (LI) for EU industrial production (IP). Unlike previous studies, the technique developed in this paper produces an ex-ante LI that is immune to “overlapping information drawbacks”. In addition, the set of variables composing the LI is selected by a dynamic and systematic criterion, which ensures that the choice of variables is not driven by subjective views. Our LI anticipates swings in EU industrial production (including the 2007-2008 crisis) by 2 to 3 months on average. Its predictive power improves if the indicator is revised every five or ten years. In a forward-looking framework, using a general-to-specific procedure, we also show that our LI is the most informative variable for forming expectations about EU IP growth.
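A claim that an indicator anticipates a target series by 2 to 3 months can be checked with a simple lead-lag cross-correlation scan. The sketch below is a generic diagnostic of that kind, not the paper's variable-selection procedure, and all names are illustrative:

```python
import math

def corr(a, b):
    # Pearson correlation of two equal-length series.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if sa == 0.0 or sb == 0.0:
        return 0.0
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (sa * sb)

def best_lead(li, target, max_lead):
    # Lead k (in months) at which the indicator best anticipates the
    # target, i.e. k maximizing corr(li[t], target[t + k]).
    best_k, best_c = 0, -2.0
    for k in range(max_lead + 1):
        c = corr(li[:len(li) - k] if k else li, target[k:])
        if c > best_c:
            best_k, best_c = k, c
    return best_k
```

Applied to a candidate indicator and the IP growth series, the returned lead would be expected to land at 2 or 3 for an indicator with the properties described above.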
A partial rehabilitation of side-effecting I/O: non-determinism in non-strict functional languages
(1996)
We investigate the extension of non-strict functional languages like Haskell or Clean by a non-deterministic interaction with the external world. Using call-by-need and a natural semantics that describes the reduction of graphs, this can be done such that the Church-Rosser Theorems 1 and 2 hold. Our operational semantics is a basis for recognising which particular equivalences are preserved by program transformations. The amount of sequentialisation may be smaller than that enforced by other approaches, and the programming style is closer to the common style of side-effecting programming. However, not all program transformations used by an optimising compiler for Haskell remain correct in all contexts. Our result can be interpreted as a possibility to extend the current I/O mechanism by non-deterministic memoryless function calls; for example, this permits a call to a random number generator. Adding memoryless function calls to monadic I/O is possible and has the potential to extend the Haskell I/O system.