Regulations in the pre-Sarbanes–Oxley era allowed corporate insiders considerable flexibility in strategically timing their trades and SEC filings, for example by executing several trades and reporting them jointly after the last trade. We document that even these lax reporting requirements were frequently violated and that the strategic timing of trades and reports was common. Event study abnormal returns are larger after reports of strategic insider trades than after reports of otherwise similar nonstrategic trades. Our results also imply that delayed reporting is detrimental to market efficiency and lend strong support to the more stringent trade reporting requirements established by the Sarbanes–Oxley Act. JEL Classification: G14, G30, G32 Keywords: Insider Trading, Directors' Dealings, Corporate Governance, Market Efficiency
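The event-study logic behind this abstract can be sketched in a few lines. The snippet below is a hypothetical illustration, not the paper's methodology: it computes market-adjusted abnormal returns and their cumulative sum over an event window, with invented return figures.

```python
# Hypothetical sketch: market-adjusted abnormal returns around a report date.
# The return figures are invented for illustration, not data from the paper.

def abnormal_returns(stock, market):
    """Market-adjusted abnormal return per day: AR_t = R_t - R_m,t."""
    return [r - m for r, m in zip(stock, market)]

def car(stock, market):
    """Cumulative abnormal return over the event window."""
    return sum(abnormal_returns(stock, market))

stock_r  = [0.012, 0.034, 0.021, -0.004]   # daily stock returns after the report
market_r = [0.003, 0.005, -0.002, 0.001]   # daily market returns
print(round(car(stock_r, market_r), 4))    # 0.056
```

In the paper's setting, a larger CAR after reports of strategically timed trades than after comparable nonstrategic trades is what identifies the effect.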
The overvaluation hypothesis (Miller 1977) predicts (a) that stocks are overvalued in the presence of short selling restrictions and (b) that the overvaluation increases in the degree of divergence of opinion. We design an experiment that allows us to test these predictions in the laboratory. The results indicate that prices are higher with short selling constraints, but the overvaluation does not increase in the degree of divergence of opinion. We further find that trading volume is lower and bid-ask spreads are higher when short sale restrictions are imposed. JEL Classification: C92, G14 Keywords: Overvaluation Hypothesis, Short Selling Constraints, Divergence of Opinion
We analytically show that a Stone-Geary utility function with subsistence consumption, common across rich and poor individuals, is capable in the context of a simple two-asset portfolio-choice model of qualitatively and quantitatively explaining: (i) the higher saving rates of the rich, (ii) the higher fraction of personal wealth held in risky assets by the rich, and (iii) the higher volatility of consumption of the wealthier. By contrast, a time-variant "keeping-up-with-the-Joneses" weighted average consumption, which plays the role of a moving benchmark subsistence level, gives the same portfolio composition and saving rates across the rich and the poor, failing to reconcile the model with what the micro data show. JEL Classification: G11, D91, E21, D81, D14, D11
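The mechanism behind finding (ii) can be illustrated with a Merton-style portfolio rule and a subsistence floor. This is a minimal sketch under assumed parameter values, not the paper's model: only wealth in excess of the present value of subsistence consumption is exposed to risk, so the risky share rises with wealth.

```python
# Illustrative sketch (hypothetical parameters, not the paper's calibration):
# with utility defined over consumption above a subsistence level, only wealth
# beyond the present value of subsistence is invested by the Merton rule.

def risky_share(wealth, subsistence_pv, mu=0.06, r=0.02, gamma=3.0, sigma=0.2):
    """Fraction of total wealth held in the risky asset."""
    merton = (mu - r) / (gamma * sigma ** 2)          # share without subsistence
    return merton * max(wealth - subsistence_pv, 0) / wealth

poor, rich = risky_share(50.0, 40.0), risky_share(500.0, 40.0)
print(poor < rich)  # True: the rich hold a larger fraction in risky assets
```

As wealth grows relative to the subsistence level, the rule converges to the constant Merton share, matching the qualitative pattern (ii) in the abstract.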
The papers in this volume were originally presented at the Workshop on Bantu Wh-questions, held at the Institut des Sciences de l’Homme, Université Lyon 2, on 25-26 March 2011, which was organized by the French-German cooperative project on the Phonology/Syntax Interface in Bantu Languages (BANTU PSYN). This project, which is funded by the ANR and the DFG, comprises three research teams, based in Berlin, Paris and Lyon. The Berlin team, at the ZAS, is: Laura Downing (project leader) and Kristina Riedel (post-doc). The Paris team, at the Laboratoire de phonétique et phonologie (LPP; UMR 7018), is: Annie Rialland (project leader), Cédric Patin (Maître de Conférences, STL, Université Lille 3), Jean-Marc Beltzung (post-doc), Martial Embanga Aborobongui (doctoral student), Fatima Hamlaoui (post-doc). The Lyon team, at the Dynamique du Langage (UMR 5596), is: Gérard Philippson (project leader) and Sophie Manus (Maître de Conférences, Université Lyon 2). These three research teams bring together the range of theoretical expertise necessary to investigate the phonology-syntax interface: intonation (Patin, Rialland), tonal phonology (Aborobongui, Downing, Manus, Patin, Philippson, Rialland), phonology-syntax interface (Downing, Patin) and formal syntax (Riedel, Hamlaoui). They also bring together a range of Bantu language expertise: Western Bantu (Aborobongui, Rialland), Eastern Bantu (Manus, Patin, Philippson, Riedel), and Southern Bantu (Downing).
This paper is the report of a study conducted by five people – four at Stanford, and one at the University of Wisconsin – which tried to establish whether computer-generated algorithms could "recognize" literary genres. You take 'David Copperfield', run it through a program without any human input – "unsupervised", as the expression goes – and ... can the program figure out whether it's a gothic novel or a 'Bildungsroman'? The answer is, fundamentally, Yes: but a Yes with so many complications that it is necessary to look at the entire process of our study. These are new methods we are using, and with new methods the process is almost as important as the results.
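The "unsupervised" idea the abstract describes can be made concrete with a toy sketch: represent each text as a word-frequency vector and compare texts by cosine similarity, so that similar texts group together without any human labels. The texts and the bag-of-words representation below are invented illustrations, not the study's actual features or corpus.

```python
# Toy sketch of unsupervised text comparison: bag-of-words vectors compared
# by cosine similarity. The example "texts" are invented.
from collections import Counter
import math

def vectorize(text):
    """Word-frequency vector of a text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two frequency vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

gothic_1 = "the castle loomed dark and the ghost wailed in the ruined tower"
gothic_2 = "a dark ghost haunted the ruined castle tower"
bildung  = "the young hero left home to learn a trade and find his place"

v1, v2, v3 = map(vectorize, (gothic_1, gothic_2, bildung))
print(cosine(v1, v2) > cosine(v1, v3))  # True: like texts score as more similar
```

A clustering algorithm run on such similarities is the kind of procedure that could, in principle, separate gothic novels from Bildungsromane; the study's actual features and methods are more elaborate.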
Conclusion and outlook: Measured against the high demands that case law and legal scholarship place on the qualifications and performance of supervisory board members, there appears to be considerable need for a professionalization of supervisory board work. In fact, however, the expectations placed in the supervision of management by the supervisory board are likely to be excessive. Moreover, developing the supervisory board into a body of professional monitors would entail a number of disadvantages that would likely outweigh the supposed advantages. Professional monitoring of management might therefore be better located within the management board itself, by entrusting one or more management board members exclusively with monitoring tasks. The management board is responsible for ensuring the legality, propriety and expediency of the organization and decision-making processes within the company. This includes not only supervising one's own department but also the duty to observe business operations as a whole and to bring irregularities in other departments to the attention of the full management board. Existing law thus already takes account of the fact that effective monitoring requires constant presence in the company and the support of a staff of employees. Entrusting individual management board members with full-time monitoring tasks would expand the supervision already incumbent on the management board for these reasons, provide the supervisory board with a contact on the management board specialized in monitoring, and in this way help reduce the supervisory board's monitoring task to a level that can realistically be handled in a part-time capacity.
Plagiarism in medicine has been increasingly researched abroad over the last decade, but not in Germany. Prominent cases of plagiarism, including outside medicine, moreover raise fundamental questions about the quality of science. This working paper discusses plagiarism and unethical behavior in science in the context of the fundamental institutional and organizational transformation of the science and higher education system brought about by the transfer of New Public Management (NPM) concepts to the governance of the higher education and science system. Possibilities and limits of various strategies for dealing with plagiarism are presented, with particular attention to the use of plagiarism-detection software. The use of a software solution in the Department of Human Medicine is assessed critically for several reasons. First results from an empirical study on plagiarism by students likewise show that more attention must be paid to the prevention of plagiarism through education and training. On the basis of the theoretical considerations, the literature reviewed and our own empirical surveys, building blocks for a systematic approach to plagiarism in academic medicine are developed.
A generalization of the compressed string pattern match that applies to terms with variables is investigated: Given terms s and t compressed by singleton tree grammars, the task is to find an instance of s that occurs as a subterm in t. We show that this problem is in NP and that the task can be performed in time O(n^(c·|Var(s)|)), including the construction of the compressed substitution and a representation of all occurrences. We show that the special case where s is uncompressed can be performed in polynomial time. As a nice application we show that for an equational deduction from t to t′ by an equality axiom l = r (a rewrite), a single step can be performed in polynomial time in the size of the compression of t and l, r if the number of variables in l is fixed. We also show that n rewriting steps can be performed in polynomial time, if the equational axioms are compressed and assumed to be constant for the rewriting sequence. Another potential application is querying mechanisms on compressed XML databases.
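The uncompressed special case mentioned in the abstract (find an instance of a term s with variables among the subterms of t) can be illustrated with naive first-order matching. This sketch uses a hypothetical tuple encoding of terms and ignores compression entirely, so it is only the polynomial-time baseline, not the paper's grammar-compressed algorithm.

```python
# Naive illustration of the uncompressed case: first-order matching of a term
# s (with variables) against every subterm of t. Terms are encoded as tuples
# ("f", arg1, arg2, ...); variables are strings starting with "?".

def match(s, t, subst=None):
    """Try to extend subst so that s instantiated by subst equals t."""
    subst = dict(subst or {})
    if isinstance(s, str) and s.startswith("?"):      # variable case
        if s in subst:
            return subst if subst[s] == t else None
        subst[s] = t
        return subst
    if isinstance(s, tuple) and isinstance(t, tuple) \
            and s[0] == t[0] and len(s) == len(t):
        for sa, ta in zip(s[1:], t[1:]):              # match arguments in order
            subst = match(sa, ta, subst)
            if subst is None:
                return None
        return subst
    return subst if s == t else None                  # matching constants

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for arg in t[1:]:
            yield from subterms(arg)

def find_instance(s, t):
    """First subterm of t that is an instance of s, with its substitution."""
    for u in subterms(t):
        theta = match(s, u)
        if theta is not None:
            return u, theta
    return None

t = ("f", ("g", "a", "b"), ("g", "c", "c"))
s = ("g", "?x", "?x")
print(find_instance(s, t))  # (('g', 'c', 'c'), {'?x': 'c'})
```

On compressed inputs the subterms of t are not enumerated explicitly, which is exactly where the singleton-tree-grammar machinery of the paper comes in.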
Prepared by Christian Laux, Vienna University of Economics and Business & Center for Financial Studies (CFS) for the “Workshop on Liquidity Premium in Solvency II: Conceptual and Measurement Issues,” DNB Amsterdam, March 18, 2011. The insurance industry and the Committee of European Insurance and Occupational Pension Supervisors (CEIOPS) propose to add a liquidity premium to the risk-free rate when discounting liabilities in times of financial turmoil. The objective is to counterbalance adverse effects on regulatory capital due to a decrease in asset values caused by illiquidity in a crisis. As I argue in this note, although the motive might be sensible, the proposal to add a liquidity premium when discounting liabilities is not the right approach to tackle the problem.
The calculus CHF models Concurrent Haskell extended by concurrent, implicit futures. It is a process calculus with concurrent threads, monadic concurrent evaluation, and includes a pure functional lambda-calculus which comprises data constructors, case-expressions, letrec-expressions, and Haskell’s seq. Futures can be implemented in Concurrent Haskell using the primitive unsafeInterleaveIO, which is available in most implementations of Haskell. Our main result is conservativity of CHF, that is, all equivalences of pure functional expressions are also valid in CHF. This implies that compiler optimizations and transformations from pure Haskell remain valid in Concurrent Haskell even if it is extended by futures. We also show that this is no longer valid if Concurrent Haskell is extended by the arbitrary use of unsafeInterleaveIO.
Central banks have recently introduced new policy initiatives, including a policy called ‘Quantitative Easing’ (QE). Since it has been argued by the Bank of England that “Standard economic models are of limited use in these unusual circumstances, and the empirical evidence is extremely limited” (Bank of England, 2009b), we have taken an entirely empirical approach and have focused on the QE-experience, on which substantial data is available, namely that of Japan (2001-2006). Recent literature on the effectiveness of QE has neglected any reference to final policy goals. In this paper, we adopt the view that ultimately effectiveness will be measured by whether it will be able to “boost spending” (Bank of England, 2009b) and “will ultimately be judged by their impact on the wider macroeconomy” (Bank of England, 2010). In line with a widely held view among leading macroeconomists from various persuasions, while attempting to stay agnostic and open-minded on the distribution of demand changes between real output and inflation, we have thus identified nominal GDP growth as the key final policy goal of monetary policy. The empirical research finds that the policy conducted by the Bank of Japan between 2001 and 2006 makes little empirical difference while an alternative policy targeting credit creation (the original definition of QE) would likely have been more successful.
In the last few years, literary studies have experienced what we could call the rise of quantitative evidence. This had happened before of course, without producing lasting effects, but this time it’s probably going to be different, because this time we have digital databases, and automated data retrieval. As Michel and Lieberman’s recent article on "Culturomics" made clear, the width of the corpus and the speed of the search have increased beyond all expectations: today, we can replicate in a few minutes investigations that took a giant like Leo Spitzer months and years of work. When it comes to phenomena of language and style, we can do things that previous generations could only dream of.
When it comes to language and style. But if you work on novels or plays, style is only part of the picture. What about plot – how can that be quantified? This paper is the beginning of an answer, and the beginning of the beginning is network theory. This is a theory that studies connections within large groups of objects: the objects can be just about anything – banks, neurons, film actors, research papers, friends... – and are usually called nodes or vertices; their connections are usually called edges; and the analysis of how vertices are linked by edges has revealed many unexpected features of large systems, the most famous one being the so-called "small-world" property, or "six degrees of separation": the uncanny rapidity with which one can reach any vertex in the network from any other vertex. The theory proper requires a level of mathematical intelligence which I unfortunately lack; and it typically uses vast quantities of data which will also be missing from my paper. But this is only the first in a series of studies we’re doing at the Stanford Literary Lab; and then, even at this early stage, a few things emerge.
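The "degrees of separation" idea sketched above reduces to shortest paths in a graph. The snippet below is a toy illustration with an invented character network (the nodes and edges are not data from the paper): breadth-first search measures how many edges separate any two vertices.

```python
# Minimal illustration of degrees of separation: breadth-first search on a
# toy character network. The nodes and edges are invented for illustration.
from collections import deque

edges = [("Hamlet", "Claudius"), ("Hamlet", "Horatio"), ("Hamlet", "Ophelia"),
         ("Claudius", "Gertrude"), ("Ophelia", "Polonius"),
         ("Polonius", "Reynaldo")]

graph = {}
for a, b in edges:                      # build an undirected adjacency map
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def distance(start, goal):
    """Number of edges on the shortest path (None if unreachable)."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, d = queue.popleft()
        if node == goal:
            return d
        for nxt in graph[node] - seen:
            seen.add(nxt)
            queue.append((nxt, d + 1))
    return None

print(distance("Gertrude", "Reynaldo"))  # 5
```

In a genuinely "small-world" network, such distances stay surprisingly short even as the number of vertices grows; measuring them on plot networks is the quantitative handle on plot this paper begins to develop.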
To monitor one's speech means to check the speech plan for errors, both before and after talking. There are several theories as to how this process works. We give a short overview of the most influential theories before focusing on the most widely received one, the Perceptual Loop Theory of monitoring by Levelt (1983). One of the underlying assumptions of this theory is the existence of an Inner Loop, a monitoring device that checks for errors before speech is articulated. This paper collects evidence for the existence of such an internal monitoring device and asks how it might work. Levelt's theory argues that internal monitoring works by means of perception, but other empirical findings allow for the assumption that an Inner Loop could also use our speech production devices. Based on data from both experimental and aphasiological studies, we develop a model based on Levelt (1983) which shows that internal monitoring might in fact make use of both perception and production means.
We make three points. First, the decade before the financial crisis in 2007 was characterized by a collapse in the yield on TIPS. Second, estimated VARs for the federal funds rate and the TIPS yield show that while monetary policy shocks had negligible effects on the TIPS yield, shocks to the latter had one-to-one effects on the federal funds rate. Third, these findings can be rationalized in a New Keynesian model.
Measuring confidence and uncertainty during the financial crisis: evidence from the CFS survey
(2011)
The CFS survey covers the individual situations of banks and other companies of the financial sector during the financial crisis. This provides a rare possibility to analyze appraisals, expectations and forecast errors of the core sector of the recent turmoil. Following standard ways of aggregating individual survey data, we first present and introduce the CFS survey by comparing CFS indicators of confidence and predicted confidence to ifo and ZEW indicators. The major contribution is the analysis of several indicators of uncertainty. In addition to well-established concepts, we introduce innovative measures based on the skewness of forecast errors and on the share of ‘no response’ replies. Results show that the uncertainty indicators fit quite well with the patterns of real and financial time series over the period 2007 to 2010. Keywords: Business Sentiment, Financial Crisis, Survey Indicator, Uncertainty. CFS working paper series, 2010, 18. Revised Version July 2011
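Two of the uncertainty measures named in the abstract are simple enough to sketch. The snippet below is a hypothetical illustration with invented survey replies, not CFS data: it computes the skewness of a set of forecast errors and the share of 'no response' replies.

```python
# Hypothetical sketch of two uncertainty proxies: skewness of forecast errors
# and the share of 'no response' replies. All numbers are invented.

def skewness(xs):
    """Sample skewness: third central moment over variance^(3/2)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def no_response_share(replies):
    """Share of missing ('no response') replies in a survey wave."""
    return sum(1 for r in replies if r is None) / len(replies)

forecast_errors = [-0.5, -0.2, 0.0, 0.1, 2.1]       # one large positive surprise
replies = [1.2, None, 0.8, None, 1.1, 0.9, None, 1.0]
print(skewness(forecast_errors) > 0, no_response_share(replies))  # True 0.375
```

A strongly asymmetric forecast-error distribution, or a rising share of non-responses, would both signal elevated uncertainty in the sense the abstract describes.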