Refine
- Year of publication: 2010 (47)
- Document Type: Working Paper (47)
- Language: English (47)
- Has Fulltext: yes (47)
- Is part of the Bibliography: no (47)
Keywords
- Formale Semantik (4)
- Finanzkrise (3)
- Logik (3)
- Verifikation (3)
- Asset Allocation (2)
- Bank (2)
- Financial Knowledge (2)
- Financial Markets (2)
- Finanzwirtschaft (2)
- Kreditwesen (2)
Institute
- Center for Financial Studies (CFS) (28)
- Informatik (6)
- Exzellenzcluster Die Herausbildung normativer Ordnungen (4)
- Wirtschaftswissenschaften (4)
- Institute for Law and Finance (ILF) (2)
- Extern (1)
- Institut für sozial-ökologische Forschung (ISOE) (1)
- Institute for Monetary and Financial Stability (IMFS) (1)
According to disposition effect theory, people hold losing investments too long. However, many investors eventually sell at a loss, and little is known about which psychological factors contribute to these capitulation decisions. This study integrates prospect theory, utility maximization theory, and theory on reference point adaptation to argue that the combination of a negative expectation about an investment’s future performance and a low level of adaptation to previous losses leads to a greater capitulation probability. The test of this hypothesis in a dynamic experimental setting reveals that a larger total loss and longer time spent in a losing position lead to downward adaptations of the reference point. Negative expectations about future investment performance lead to a greater capitulation probability. Consistent with the theoretical framework, empirical evidence supports the relevance of the interaction between adaptation and expectation as a determinant of capitulation decisions. Keywords: Investments, Adaptation, Reference Point, Capitulation, Selling Decisions, Disposition Effect, Financial Markets JEL Classification: D91, D03, D81
We show that average excess returns during the last two years of the presidential cycle are significantly higher than during the first two years: 9.8 percent over the period 1948–2008. This pattern in returns cannot be explained by business-cycle variables capturing time-varying risk premia, differences in risk levels, or by consumer and investor sentiment. In this paper, we formally test the presidential election cycle (PEC) hypothesis, the alternative explanation offered in the literature for the presidential cycle anomaly. The PEC hypothesis states that incumbent parties and presidents have an incentive to manipulate the economy (via budget expansions and taxes) to remain in power. We formulate eight empirically testable propositions relating to the fiscal, monetary, tax, unexpected-inflation and political implications of the PEC hypothesis. We do not find statistically significant evidence confirming the PEC hypothesis as a plausible explanation for the presidential cycle effect. The existence of the presidential cycle effect in U.S. financial markets thus remains a puzzle that cannot be easily explained by politicians employing their economic influence to remain in power. JEL Classification: E32, G14, P16 Keywords: Political Economy, Market Efficiency, Anomalies, Calendar Effects
We investigate the incentives for vertical or horizontal integration in the financial security service industry, consisting of trading, clearing and settlement. We thereby focus on firms’ decisions but also look at the implications of these decisions for competition and welfare. Our analysis shows that the incentives for vertical integration crucially depend on industry as well as market characteristics. A more pronounced demand for liquidity clearly favors vertical integration, whereas deeper financial integration increases the incentives to undertake vertical integration only if the efficiency gains associated with vertical integration are sufficiently large. Furthermore, we show that market forces can suffer from a coordination problem that ends in vertically integrated structures that are not in the best interest of the firms. We believe this problem can be addressed by policy measures such as the TARGET2-Securities program. Finally, we use our framework to discuss major industry trends and policy initiatives. Keywords: Vertical Integration, Horizontal Integration, Competition, Trading, Settlement JEL Classification: G15, L13, L22
Towards correctness of program transformations through unification and critical pair computation
(2010)
Correctness of program transformations in extended lambda-calculi with a contextual semantics is usually based on reasoning about the operational semantics, which is a rewrite semantics. A successful approach is the combination of a context lemma with the computation of overlaps between program transformations and the reduction rules, which results in so-called complete sets of diagrams. The method is similar to the computation of critical pairs for the completion of term rewriting systems. We explore cases where the computation of these overlaps can be done in a first-order way by variants of critical pair computation that use unification algorithms. As a case study of an application, we describe a finitary and decidable unification algorithm for the combination of the equational theory of left-commutativity modelling multi-sets, context variables and many-sorted unification. Sets of equations are restricted to be almost linear, i.e., every variable and context variable occurs at most once, where we allow one exception: variables of a sort without ground terms may occur several times. Every context variable must have an argument-sort in the free part of the signature. We also extend the unification algorithm by the treatment of binding-chains in let- and letrec-environments and by context-classes. This results in a unification algorithm that can be applied to all overlaps of normal-order reductions and transformations in an extended lambda calculus with letrec that we use as a case study.
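The first-order unification underlying such critical pair computation can be illustrated with a minimal sketch. This is plain syntactic (Robinson-style) unification only; the algorithm described in the abstract additionally handles left-commutativity, context variables, sorts and binding chains, none of which appear here.

```python
# Minimal syntactic first-order unification sketch.
# Terms: variables are strings starting with '?'; compound terms are
# tuples (functor, arg1, ..., argn). Returns a substitution dict or None.

def unify(t1, t2, subst=None):
    if subst is None:
        subst = {}
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return bind(t1, t2, subst)
    if is_var(t2):
        return bind(t2, t1, subst)
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None  # clash in some argument position
        return subst
    return None  # functor clash or variable/constant mismatch

def is_var(t):
    return isinstance(t, str) and t.startswith('?')

def walk(t, subst):
    # Follow variable bindings to the current representative.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, a, subst) for a in t[1:])
    return False

def bind(v, t, subst):
    if occurs(v, t, subst):
        return None  # occurs check: refuse cyclic bindings
    s = dict(subst)
    s[v] = t
    return s
```

For example, `unify(('f', '?x', ('g', '?y')), ('f', ('a',), ('g', ('b',))))` yields the unifier binding `?x` to `a` and `?y` to `b`, while `unify('?x', ('f', '?x'))` fails on the occurs check.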
The concept of embeddedness plays a central role in the segment of economic sociology and social theory inspired by the works of Karl Polanyi. But to the extent that embeddedness is understood in a substantialist manner, implying the existence of a unitary lifeworld, the desire for embeddedness is an impossible aspiration under modern conditions. Throughout the modern era it is, however, possible to observe the emergence of complex societal stabilisation mechanisms, which serve as substitutes for traditional forms of embeddedness. The emergence of function-specific cultures, in the form of, for example, legal, political and scientific cultures, establishing a ‘second nature’ in the Hegelian sense, is one example of this. Other examples are (neo-)corporatist institutions, which fulfilled a central stabilising role in classical modernity, and the kind of network-based governance arrangements which play a similar role in today’s radicalised modernity.
We study the impact of the arrival of macroeconomic news on the informational and noise-driven components in high-frequency quote processes and their conditional variances. Bid and ask returns are decomposed into a common ("efficient return") factor and two market-side-specific components capturing market microstructure effects. The corresponding variance components reflect information-driven and noise-induced volatilities. We find that all volatility components reveal distinct dynamics and are positively influenced by news. The proportion of noise-induced variances is highest before announcements and significantly declines thereafter. Moreover, news-affected responses in all volatility components are influenced by order flow imbalances. JEL Classification: C32, G14, E44
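As a stylized illustration of such a decomposition (not the paper's econometric model, which works with conditional variances and announcement effects), the split of bid and ask returns into a common "efficient" component and side-specific noise can be mimicked on simulated data, where the bid/ask cross-covariance identifies the information-driven variance component:

```python
import numpy as np

# Simulate bid and ask returns sharing a common efficient component
# plus independent, equally sized side-specific microstructure noise.
rng = np.random.default_rng(1)
n = 10_000
eff = rng.normal(0, 0.10, n)        # efficient-return innovations (sd 0.10)
bid = eff + rng.normal(0, 0.03, n)  # bid-side noise (sd 0.03)
ask = eff + rng.normal(0, 0.03, n)  # ask-side noise (sd 0.03)

# The noise terms are independent across market sides, so the
# bid/ask cross-covariance recovers the information-driven variance,
# and the residual variance on each side is noise-induced.
info_var = np.cov(bid, ask)[0, 1]          # ~ 0.10**2 = 0.01
noise_var_bid = np.var(bid) - info_var     # ~ 0.03**2 = 0.0009
```

The proportion `noise_var_bid / np.var(bid)` is a crude analogue of the noise-induced variance share whose decline after announcements the abstract documents.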
The purpose of this essay is to assess the automatic exchange of information as described in EU Directive 2003/48 of 3 June 2003 on taxation of savings income in the form of interest payments with regard to the fundamental right of the individual to a private life, to banking secrecy and the freedoms on which the European internal market is based. The assessment reveals the conflicts of interests and values involved in the holding by banks (particularly those offering private banking services) of increasingly extensive, detailed and intimate information about their clients and in the automatic processing of that information by ever more powerful and sophisticated systems. Banking secrecy plays an essential role in protecting clients against the dangers which the disclosure of such information without their permission might produce. Banking secrecy exists not only in Luxembourg but also in many other European countries, and in Germany and France in particular it is not very different from the system applying in Luxembourg. While the French and German tax authorities do have some investigative powers not enjoyed by their Luxembourg counterparts, those powers are strictly circumscribed and cannot rely on the electronic exchange of information set out in EU Directive 2003/48/EC. While banking secrecy is totally incompatible with the electronic exchange of information, the core question is whether the latter can be reconciled with the respect for private life. In a Europe that sets itself up as the cradle of human rights, the general and en-masse exchange of private information cannot provide adequate and sufficient guarantees that the information exchanged will not be misused. 
The amount of interference in private life is clearly out of proportion to the public interest involved and is contrary to sub-section 2, article 8 of the European Convention for the Protection of Human Rights and Fundamental Freedoms and to articles 7 and 8 of the Charter of Fundamental Rights of the European Union. Since the automatic exchange of information at least potentially risks restricting the free flow of capital among Member States and discouraging the use of transborder banking services, its compliance with the fundamental principles of the internal market also needs to be closely examined. The restrictions imposed by such exchange very probably go beyond the limits within which the free movement of capital and services is possible. The European Court of Justice has found that there is no proportionality if the measures supposedly undertaken in the general interest are actually based on a general presumption of tax evasion or tax fraud. However, it would be true to say that the ECJ does not always examine the tax restrictions placed on the free movement of capital particularly thoroughly to ensure that they are necessary or proportionate. The economic effectiveness of the automatic exchange of information is far from being proved and involves significant cost to the banks providing the information and to the tax authorities using it. To date the system does not appear to have produced any significant new tax revenue nor does it prevent the continuing outflow of capital from Europe. Yet withholding at source, which respects individual and economic freedoms, does generate tax revenue that is cost-free to the State. Exchange of information on request in justified cases using the OECD Tax Convention on Income and Capital model does also fight tax fraud while at the same time providing citizens with the guarantees required to ensure their private lives are respected. 
A combination of these two systems - withholding at source and exchange of information on request in justified cases - would create the proper balance between the public and private interest that the automatic exchange of information cannot provide.
We use a unique, nationally representative cross-national dataset to document the reduction in individuals’ usage of routine non-emergency medical care in the midst of the economic crisis. A substantially larger fraction of Americans have reduced medical care than have individuals in Great Britain, Canada, France, and Germany, all countries with universal health care systems. At the national level, reductions in medical care are related to the degree to which individuals must pay for it, and within countries are strongly associated with exogenous shocks to wealth and employment.
This paper investigates the accuracy and heterogeneity of output growth and inflation forecasts during the current and the four preceding NBER-dated U.S. recessions. We generate forecasts from six different models of the U.S. economy and compare them to professional forecasts from the Federal Reserve’s Greenbook and the Survey of Professional Forecasters (SPF). The model parameters and model forecasts are derived from historical data vintages so as to ensure comparability to historical forecasts by professionals. The mean model forecast comes surprisingly close to the mean SPF and Greenbook forecasts in terms of accuracy even though the models only make use of a small number of data series. Model forecasts compare particularly well to professional forecasts at a horizon of three to four quarters and during recoveries. The extent of forecast heterogeneity is similar for model and professional forecasts but varies substantially over time. Thus, forecast heterogeneity constitutes a potentially important source of economic fluctuations. While the particular reasons for diversity in professional forecasts are not observable, the diversity in model forecasts can be traced to different modeling assumptions, information sets and parameter estimates. JEL Classification: C53, D84, E31, E32, E37 Keywords: Forecasting, Business Cycles, Heterogeneous Beliefs, Forecast Distribution, Model Uncertainty, Bayesian Estimation
This study examines the legal environment of netting agreements covering financial contracts. It concludes that an international instrument should be developed capable of improving the effectiveness of netting agreements in mitigating systemic risk. To this end, two different aspects of the enforceability of netting agreements are considered: (i) the general enforceability of netting, and (ii) the possibility of precluding the operation of a netting mechanism by way of a regulatory moratorium for considerations of systemic stability. The first part of the study presents the use of netting and the various forms it may take before going on to explain the benefits and drawbacks of enforceable netting agreements. Benefits for individual firms consist in lower counterparty risk and more favourable capital requirements. Benefits for the financial market as a whole flow from greater financial market stability since the contagion of systemically relevant institutions by the default or insolvency of another institution is limited, thus helping to avoid systemic effects. Additionally, the use of netting arrangements can improve overall market liquidity. A potential drawback of enforceability of netting, in certain situations, is that the operation of a netting mechanism could actually work against the purpose of systemic stability where the transfer of parts of the business of an insolvent financial institution to a solvent bridge entity would enhance or maintain value to a greater extent than the operation of a netting agreement would. Regulatory authorities are considering the conditions under which a moratorium halting the netting mechanism until the situation is resolved could avert this threat to systemic stability. The second part of the study examines whether there is the potential to support the purpose of enhanced systemic stability by way of international harmonisation of private and insolvency law.
As regards the issue of general enforceability, the global picture of netting legislation is heterogeneous. Given the great practical relevance of the matter, an international instrument could be very useful. As to the issue of private law consequences of regulatory moratoria, the absence of a harmonised framework appears to lead to actual cross-border inconsistency and legal uncertainty as regards financial contracts that are governed by a foreign law. Taking these two aspects into account, this paper recommends that work on developing an international instrument be undertaken. The final part of the study suggests a set of preliminary guidelines for the development of such an instrument. In the light of the findings of the previous sections, a mixed, two-step approach is recommended. First, a non-binding instrument could be developed, serving as a benchmark and reservoir of legal solutions in respect of the relevant issues. Secondly, isolated aspects relating to both the general enforceability of netting and the accommodation of a regulatory moratorium in foreign private and insolvency law could be dealt with in an international Convention, in particular where cross-border situations involving netting require uniformity of applicable legal rules.
This paper shows the equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in the deterministic call-by-need lambda calculus with letrec. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations. Although this property may be a natural one to expect, to the best of our knowledge, this paper is the first one providing a proof. The proof technique is to transfer the contextual approximation into Abramsky's lazy lambda calculus by a fully abstract and surjective translation. This also shows that the natural embedding of Abramsky's lazy lambda calculus into the call-by-need lambda calculus with letrec is an isomorphism between the respective term-models. We show that the equivalence property proven in this paper transfers to a call-by-need letrec calculus developed by Ariola and Felleisen.
The 'Russian language proficiency test for multilingual children' is a linguistically and psycholinguistically grounded test for L1-Russian bilingual children of pre-school and elementary school age. It allows the evaluation of language proficiency in Russian for scientific, therapeutic, and pedagogical purposes. The test is based on preliminary norms: data from 167 German-Russian bilingual children between the ages of 3 years and 6 years 11 months were evaluated.
Bilingual children's proficiency is examined in the following language domains:
- productive and receptive lexicon for verbs and nouns
- production of morphological marking on verbs (first- and second-person singular present verbal inflection) and nouns (accusative and dative case singular)
- comprehension of grammatical constructions on the sentence level
The test should be administered by a competent – ideally native – speaker of Russian, and takes approximately 60 minutes to administer.
In addition to the test itself, the 'Russian language proficiency test for multilingual children' contains a questionnaire for gathering detailed information on the input situation as well as the child's previous linguistic and extra-linguistic development. The questionnaire is written in English and Russian and is intended to be filled out by the parents.
We show that if an agent is uncertain about the precise form of his utility function, his actual relative risk aversion may depend on wealth even if he knows his utility function lies in the class of constant relative risk aversion (CRRA) utility functions. We illustrate the consequences of this result for asset allocation: poor agents who are uncertain about their risk aversion parameter invest less in risky assets than wealthy investors with identical risk aversion uncertainty. Keywords: Risk Aversion, Preference Uncertainty, Risk-taking, Asset Allocation JEL Classification: D81, D84, G11 This Version: November 25, 2010
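The mechanism can be checked numerically. Under a hypothetical two-point prior over the CRRA parameter gamma, expected utility is E[W^(1-gamma)/(1-gamma)], and the implied relative risk aversion -W*u''(W)/u'(W) works out to a wealth-weighted average of the candidate gammas (the prior below is an illustrative assumption, not taken from the paper):

```python
import numpy as np

# Hypothetical prior: gamma is 2 or 5 with equal probability.
gammas = np.array([2.0, 5.0])
probs = np.array([0.5, 0.5])

def implied_rra(w):
    """Implied relative risk aversion at wealth w under gamma-uncertainty.

    u'(w)  = E[w**(-gamma)],  u''(w) = E[-gamma * w**(-gamma - 1)], so
    RRA(w) = -w * u''(w) / u'(w) = E[gamma * w**(-gamma)] / E[w**(-gamma)],
    i.e. a weighted average of the candidate gammas with weights w**(-gamma).
    """
    weights = probs * w ** (-gammas)
    return np.sum(gammas * weights) / np.sum(weights)
```

At w = 1 the weights are equal and the implied RRA equals the prior mean (3.5 here); as wealth rises the weights tilt toward the lowest gamma, so wealthy agents behave as if less risk-averse and poor agents as if more risk-averse, in line with the asset-allocation result stated in the abstract.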
We estimate the risk and expected returns of private equity investments based on the market prices of exchange traded funds of funds that invest in unlisted private equity funds. Our results indicate that the market expects unlisted private equity funds to earn abnormal returns of about one to two percent. We also find that the market expects listed private equity funds to earn zero to marginally negative abnormal returns net of fees. Both listed and unlisted private equity funds have market betas close to one and positive factor loadings on the Fama-French SMB factor. Private equity fund returns are positively correlated with GDP growth and negatively correlated with the credit spread. Finally, we find that market returns of exchange traded funds of funds and listed private equity funds predict changes in self-reported book values of unlisted private equity funds.
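The abnormal returns and market betas reported above come from factor regressions; a bare-bones version of such an estimation, run here on synthetic monthly data rather than the paper's fund-of-funds prices, is a single OLS of fund excess returns on market excess returns:

```python
import numpy as np

# Synthetic data: a fund with true monthly alpha 0.1% and market beta 1.
rng = np.random.default_rng(0)
mkt = rng.normal(0.005, 0.04, 240)                    # market excess returns
fund = 0.001 + 1.0 * mkt + rng.normal(0, 0.02, 240)   # fund excess returns

# OLS with an intercept: the intercept is the (abnormal return) alpha,
# the slope on the market factor is the beta.
X = np.column_stack([np.ones_like(mkt), mkt])
alpha, beta = np.linalg.lstsq(X, fund, rcond=None)[0]
```

The paper's specification adds further factors (e.g. the Fama-French SMB loading mentioned above), which simply become extra columns of `X`.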
The interactive verification system VeriFun is based on a polymorphic call-by-value functional language and on a first-order logic with initial model semantics w.r.t. constructors. It is designed to perform automatic induction proofs and can also deal with partial functions. This paper provides a reconstruction of the corresponding logic and semantics using the standard treatment of undefinedness, which adapts and improves the VeriFun logic by allowing reasoning on nonterminating expressions and functions. Equality of expressions is defined as contextual equivalence based on observing termination in all closing contexts. The reconstruction shows that several restrictions of the VeriFun framework can easily be removed by natural generalizations: mutually recursive functions, abstractions in the data values, and formulas with an arbitrary quantifier prefix can be formulated. The main results of this paper are: an extended set of deduction rules usable in VeriFun under the adapted semantics is proved to be correct, i.e., the rules respect observational equivalence in all extensions of a program. We also show that certain classes of theorems, such as universally quantified equations, are conservative under extensions. Other special classes of theorems are also analyzed for conservativity.
The interactive verification system VeriFun is based on a polymorphic call-by-value functional language and on a first-order logic with initial model semantics w.r.t. constructors. This paper provides a reconstruction of the corresponding logic when partial functions are permitted. Typing is polymorphic for the definition of functions but monomorphic for terms in formulas. Equality of terms is defined as contextual equivalence based on observing termination in all contexts. The reconstruction also allows several generalizations of the functional language, such as mutually recursive functions and abstractions in the data values. The main results are: correctness of several program transformations for all extensions of a program, which are potentially useful in a deduction system. We also prove that universally quantified equations are conservative, i.e. if a universally quantified equation is valid w.r.t. a program P, then it remains valid if the program is extended by new functions and/or new data types.
A logical framework consisting of a polymorphic call-by-value functional language and a first-order logic on the values is presented, which is a reconstruction of the logic of the verification system VeriFun. The reconstruction uses contextual semantics to define the logical value of equations. It equates undefinedness and non-termination, which is a standard semantical approach. The main results of this paper are: meta-theorems about the globality of several classes of theorems in the logic, and proofs of global correctness of transformations and deduction rules. The deduction rules of VeriFun are globally correct if rules depending on termination are appropriately formulated. The reconstruction also gives hints on generalizations of the VeriFun framework: reasoning on nonterminating expressions and functions, mutually recursive functions and abstractions in the data values, and formulas with an arbitrary quantifier prefix could be allowed.
Price pressures
(2010)
We study price pressures in stock prices—price deviations from fundamental value due to a risk-averse intermediary supplying liquidity to asynchronously arriving investors. Empirically, twelve years of daily New York Stock Exchange intermediary data reveal economically large price pressures. A $100,000 inventory shock causes an average price pressure of 0.28% with a half-life of 0.92 days. Price pressure causes average transitory volatility in daily stock returns of 0.49%. Price pressure effects are substantially larger, and last longer, in smaller stocks. Theoretically, in a simple dynamic inventory model the ‘representative’ intermediary uses price pressure to control risk through inventory mean reversion. She trades off the revenue loss due to price pressure against the price risk associated with remaining in a nonzero inventory state. The model’s closed-form solution identifies the intermediary’s relative risk aversion and the distribution of investors’ private values for trading from the observed time series patterns. These allow us to estimate the social costs—deviations from constrained Pareto efficiency—due to price pressure, which average 0.35 basis points of the value traded. JEL Classification: G12, G14, D53, D61
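The half-life figure quoted above implies, assuming exponential mean reversion of the kind the inventory model delivers, a simple decay law for the remaining pressure after a shock:

```python
# Back-of-the-envelope sketch of the numbers quoted in the abstract:
# a shock decaying with half-life h follows p(t) = p0 * 0.5 ** (t / h).
def price_pressure(t_days, p0=0.28, half_life=0.92):
    """Remaining price pressure (in percent) t_days after the shock,
    using the abstract's average shock size and half-life as defaults."""
    return p0 * 0.5 ** (t_days / half_life)
```

One half-life (0.92 days) after a shock the 0.28% average pressure has decayed to 0.14%; after roughly three days only about a tenth of it remains.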