University Publications
This paper presents a model to analyze the consequences of order-flow competition between a profit-maximizing stock exchange and an alternative trading platform for decisions concerning trading fees and listing requirements. Listing requirements, set by the exchange, provide public information on listed firms and contribute to better liquidity on all trading venues. It is sometimes asserted that competition induces the exchange to lower its listing standards relative to a monopoly situation, because the trading platform can free-ride on this regulatory activity and compete more aggressively on trading fees. The present analysis shows that this is not always true: the outcome depends on the existence and size of gains from multi-market trading. These gains relax competition on trading fees. The higher these gains are, the more the exchange can increase its revenue from listing and trading when it raises its listing standards. For sufficiently large gains from multi-market trading, the exchange is not induced to lower its listing standards when a competing trading platform appears. As a second result, the analysis also reveals a cross-subsidization effect between the listing and the trading activity when listing is not competitive. The model yields implications for fee structures on stock markets, the regulation of listings, and the social optimality of competition for volume. JEL Classification: G10, G18, G12
This paper proposes the Shannon entropy as an appropriate one-dimensional measure of behavioural trading patterns in financial markets. The concept is applied to the illustrative example of algorithmic vs. non-algorithmic trading and empirical data from Deutsche Börse's electronic cash equity trading system, Xetra. The results reveal pronounced differences between algorithmic and non-algorithmic traders. In particular, trading patterns of algorithmic traders exhibit a medium degree of regularity while non-algorithmic trading tends towards either very regular or very irregular trading patterns. JEL Classification: C40, D0, G14, G15, G20
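The entropy measure described above can be illustrated with a short sketch. This is a minimal illustration, not the paper's actual implementation: order flow is symbolized as a sequence of buy/sell events (the sequences and the block length k are hypothetical), and the Shannon entropy of short event patterns is computed, so that very regular flow scores low and irregular flow scores high.

```python
import math
import random
from collections import Counter

def pattern_entropy(events, k=2):
    """Shannon entropy (in bits) of the empirical distribution of
    length-k blocks of a symbolized event sequence."""
    blocks = [tuple(events[i:i + k]) for i in range(len(events) - k + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

# Hypothetical order-flow sequences: "B" = buy, "S" = sell
constant    = ["B"] * 100                                 # one repeated pattern
alternating = ["B", "S"] * 50                             # strictly regular
random.seed(0)
irregular   = [random.choice("BS") for _ in range(100)]   # irregular flow

# Regular flow yields low pattern entropy, irregular flow higher entropy,
# mirroring the regular-vs-irregular distinction drawn in the abstract.
```

With k = 2, the constant sequence scores zero, the alternating one about one bit, and the irregular one close to two bits.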
How ordinary consumers make complex economic decisions: financial literacy and retirement readiness
(2010)
This paper explores who is financially literate, whether people accurately perceive their own economic decision-making skills, and where these skills come from. Self-assessed and objective measures of financial literacy can be linked to consumers' efforts to plan for retirement in the American Life Panel, and causal relationships with retirement planning are examined by exploiting information about respondent financial knowledge acquired in school. Results show that those with more advanced financial knowledge are also more likely to be retirement-ready.
We examined financial literacy among the young using the most recent wave of the 1997 National Longitudinal Survey of Youth. We showed that financial literacy is low; fewer than one-third of young adults possess basic knowledge of interest rates, inflation, and risk diversification. Financial literacy was strongly related to sociodemographic characteristics and family financial sophistication. Specifically, a college-educated male whose parents had stocks and retirement savings was about 45 percentage points more likely to know about risk diversification than a female with less than a high school education whose parents were not wealthy. These findings have implications for consumer policy. JEL Classification: D91
This paper investigates the accuracy and heterogeneity of output growth and inflation forecasts during the current and the four preceding NBER-dated U.S. recessions. We generate forecasts from six different models of the U.S. economy and compare them to professional forecasts from the Federal Reserve’s Greenbook and the Survey of Professional Forecasters (SPF). The model parameters and model forecasts are derived from historical data vintages so as to ensure comparability to historical forecasts by professionals. The mean model forecast comes surprisingly close to the mean SPF and Greenbook forecasts in terms of accuracy even though the models only make use of a small number of data series. Model forecasts compare particularly well to professional forecasts at a horizon of three to four quarters and during recoveries. The extent of forecast heterogeneity is similar for model and professional forecasts but varies substantially over time. Thus, forecast heterogeneity constitutes a potentially important source of economic fluctuations. While the particular reasons for diversity in professional forecasts are not observable, the diversity in model forecasts can be traced to different modeling assumptions, information sets and parameter estimates. JEL Classification: C53, D84, E31, E32, E37 Keywords: Forecasting, Business Cycles, Heterogeneous Beliefs, Forecast Distribution, Model Uncertainty, Bayesian Estimation
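The notion of forecast heterogeneity used above — dispersion of point forecasts across models or professionals at a given date — can be sketched in a few lines. The forecast values and the use of the cross-sectional sample standard deviation as the dispersion measure are illustrative assumptions, not the paper's exact metric.

```python
import statistics

def forecast_mean_and_dispersion(forecasts):
    """Cross-sectional mean and dispersion (sample std) of point forecasts."""
    return statistics.mean(forecasts), statistics.stdev(forecasts)

# Hypothetical one-year-ahead output growth forecasts (percent) from six models
model_forecasts = [2.1, 1.8, 2.5, 1.9, 2.2, 1.7]
mean_forecast, heterogeneity = forecast_mean_and_dispersion(model_forecasts)

# The mean forecast plays the role of the consensus (cf. mean SPF/Greenbook
# forecast); the dispersion measures how strongly forecasters disagree.
```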
This paper analyzes loan pricing when there is multiple banking and borrower distress. Using a unique data set on SME lending collected from major German banks, we can instrument for effective coordination between lenders, carrying out a panel estimation. The analysis allows us to distinguish between rents that accrue due to single-bank lending, rents that accrue due to relationship lending, and rents that accrue due to the elimination of competition among multiple lenders. We find relationship lending to have no discernible impact on loan spreads, while both single lending and coordinated multiple lending significantly increase the spread. Thus, contrary to predictions in the literature, multiple lending does not insure the borrower against hold-up. JEL Classification: D74, G21, G33, G34
Summary and results: It is still too early to give a final assessment of developments on the financial markets over the last two years. In any case, however, all rules must be put to the test. Supervisory law as a whole has failed in its task of ensuring financial stability. Essential steps for a fundamental reform are:
- a strict understanding of supervisory law as special regulatory law (Sonderordnungsrecht)
- a drastic reduction in the complexity of the legal provisions
- the internationalization and Europeanization of supervision
- increased transparency of securitization, including a possible approval procedure and the prohibition of certain dangerous "products"
- a complete reorientation of the valuation of financial firms and their "products" ("ratings")
- the creation of suitable rules and procedures to expose even systemically relevant institutions to market discipline, i.e. to their failure
- the creation of a basis for short-term decisions on the continuation, break-up, or resolution of an institution as a measure of hazard prevention; a special insolvency law for banks is not indicated
- the inclusion of human behavior and the personality structure of the key individuals in financial institutions
SUMMARY AND RESULTS (1) The creation of the European Systemic Risk Board does not face decisive legal objections. (2) It is not certain that the establishment of the new European supervisory authorities is permissible without a corresponding amendment of primary law. (3) The decisive question is which legally binding powers to issue individual instructions are actually conferred on the authorities. (4) The powers of the authorities to issue individual instructions to private parties and to national supervisory authorities that remain after the compromise of 2 December 2009 rest on a weak legal footing. (5) If the sovereign powers are largely or completely removed, concerns arise as to the suitability and necessity of the institutions. (6) The far-reaching guarantees of independence cannot be reconciled with the requirements of democratic oversight and control. (7) Under German constitutional law, granting independence requires an explicit provision in the constitution, as in Art. 88 sentence 2 GG. (8) Transnational cooperation between administrative authorities requires a statutory authorization at least where de facto binding decisions are taken.
This paper shows the equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in the deterministic call-by-need lambda calculus with letrec. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations. Although this property may be a natural one to expect, to the best of our knowledge, this paper is the first one providing a proof. The proof technique is to transfer the contextual approximation into Abramsky's lazy lambda calculus by a fully abstract and surjective translation. This also shows that the natural embedding of Abramsky's lazy lambda calculus into the call-by-need lambda calculus with letrec is an isomorphism between the respective term models. We show that the equivalence property proven in this paper transfers to a call-by-need letrec calculus developed by Ariola and Felleisen.
This note shows that in non-deterministic extended lambda calculi with letrec, the tool of applicative (bi)simulation is in general not usable for contextual equivalence, by giving a counterexample adapted from data flow analysis. It is also shown that there is a flaw in a lemma and a theorem concerning finite simulation in a conference paper by the first two authors.
A logical framework consisting of a polymorphic call-by-value functional language and a first-order logic on the values is presented, which is a reconstruction of the logic of the verification system VeriFun. The reconstruction uses contextual semantics to define the logical value of equations. It equates undefinedness and non-termination, which is a standard semantical approach. The main results of this paper are: Meta-theorems about the globality of several classes of theorems in the logic, and proofs of global correctness of transformations and deduction rules. The deduction rules of VeriFun are globally correct if rules depending on termination are appropriately formulated. The reconstruction also gives hints on generalizations of the VeriFun framework: reasoning on nonterminating expressions and functions, mutual recursive functions and abstractions in the data values, and formulas with arbitrary quantifier prefix could be allowed.
Opting out of the great inflation: German monetary policy after the breakdown of Bretton Woods
(2009)
During the turbulent 1970s and 1980s the Bundesbank established an outstanding reputation in the world of central banking. Germany achieved a high degree of domestic stability and provided a safe haven for investors in times of turmoil in the international financial system. Eventually the Bundesbank provided the role model for the European Central Bank. Hence, we examine an episode of lasting importance in European monetary history. The purpose of this paper is to highlight how the Bundesbank's monetary policy strategy contributed to this success. We analyze the strategy as it was conceived, communicated and refined by the Bundesbank itself. We propose a theoretical framework (following Söderström, 2005) in which monetary targeting is interpreted, first and foremost, as a commitment device. In our setting, a monetary target helps anchor inflation and inflation expectations. We derive an interest rate rule and show empirically that it approximates the way the Bundesbank conducted monetary policy over the period 1975-1998. We compare the Bundesbank's monetary policy rule with those of the Fed and of the Bank of England. We find that the Bundesbank's policy reaction function was characterized by strong persistence of policy rates as well as a strong response to deviations of inflation from target and to the activity growth gap. In contrast, the response to the level of the output gap was not significant. In our empirical analysis we use real-time data, as available to policy-makers at the time. JEL Classification: E31, E32, E41, E52, E58
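The estimated reaction function described above — strong interest-rate smoothing plus responses to the inflation gap and the activity growth gap — can be sketched as a partial-adjustment rule. All coefficients below are illustrative placeholders, not the paper's estimates.

```python
def policy_rate(prev_rate, inflation, activity_growth_gap,
                inflation_target=2.0, neutral_rate=3.0,
                rho=0.8, alpha=1.5, beta=0.5):
    """Partial-adjustment interest-rate rule: the policy rate moves a
    fraction (1 - rho) of the way toward a target rate that responds to
    the inflation gap and the activity growth gap.
    All parameter values are hypothetical, not estimates from the paper."""
    target = (neutral_rate
              + alpha * (inflation - inflation_target)
              + beta * activity_growth_gap)
    return rho * prev_rate + (1 - rho) * target

# One step: previous rate 4.0%, inflation 3.0%, activity growth gap 1.0pp.
# With rho = 0.8 most of the previous rate carries over (rate persistence).
new_rate = policy_rate(4.0, 3.0, 1.0)
```

A high rho captures the "strong persistence of policy rates" found empirically; the rule deliberately omits the level of the output gap, whose response was insignificant in the estimates.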
Pion and strangeness puzzles
(1996)
Data on the mean multiplicity of strange hadrons produced in minimum bias proton-proton and central nucleus-nucleus collisions at momenta between 2.8 and 400 GeV/c per nucleon have been compiled. The multiplicities for nucleon-nucleon interactions were constructed. The ratios of strange particle multiplicity to participant nucleon as well as to pion multiplicity are larger for central nucleus-nucleus collisions than for nucleon-nucleon interactions at all studied energies. The data at AGS energies suggest that the latter ratio saturates with increasing masses of the colliding nuclei. The strangeness to pion multiplicity ratio observed in nucleon-nucleon interactions increases with collision energy in the whole energy range studied. A qualitatively different behaviour is observed for central nucleus-nucleus collisions: the ratio rapidly increases when going from Dubna to AGS energies and changes little between AGS and SPS energies. This change in behaviour can be related to the increase in entropy production observed in central nucleus-nucleus collisions in the same energy range. The results are interpreted within a statistical approach. They are consistent with the hypothesis that the Quark Gluon Plasma is created at SPS energies, the critical collision energy lying between AGS and SPS energies.
The data on average hadron multiplicities in central A+A collisions measured at the CERN SPS are analysed with the ideal hadron gas model. It is shown that the full chemical equilibrium version of the model fails to describe the experimental results. The agreement of the data with the off-equilibrium version allowing for partial strangeness saturation is significantly better. The freeze-out temperature of about 180 MeV seems to be independent of the system size (from S+S to Pb+Pb) and in agreement with that extracted from e+e-, pp and p-pbar collisions. The strangeness suppression is discussed at both the hadron and the valence quark level. It is found that the hadronic strangeness saturation factor gamma_S increases from about 0.45 for pp interactions to about 0.7 for central A+A collisions, with no significant change from S+S to Pb+Pb collisions. The quark strangeness suppression factor lambda_S is found to be about 0.2 for elementary collisions and about 0.4 for heavy-ion collisions, independently of collision energy and type of colliding system.
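For orientation, the two suppression factors discussed above are commonly defined as follows in the hadron-gas literature (the exact formulation used in the paper may differ slightly): the hadronic saturation factor gamma_S rescales the thermal yield of each species once per strange valence quark, and the quark-level factor lambda_S (the Wroblewski factor) compares newly produced strange quark pairs to light quark pairs.

```latex
% Hadron yields with partial strangeness saturation:
%   n_i^{eq} is the fully equilibrated thermal density of species i,
%   |s_i| the number of strange valence quarks and antiquarks in i.
n_i^{\mathrm{obs}} = \gamma_S^{\,|s_i|}\, n_i^{\mathrm{eq}}

% Quark-level strangeness suppression (Wroblewski factor):
\lambda_S = \frac{2\,\langle s\bar{s}\rangle}
                 {\langle u\bar{u}\rangle + \langle d\bar{d}\rangle}
```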
The transverse momentum and rapidity distributions of net protons and negatively charged hadrons have been measured for minimum bias proton-nucleus and deuteron-gold interactions, as well as central oxygen-gold and sulphur-nucleus collisions at 200 GeV per nucleon. The rapidity density of net protons at midrapidity in central nucleus-nucleus collisions increases both with target mass for sulphur projectiles and with the projectile mass for a gold target. The shape of the rapidity distributions of net protons forward of midrapidity for d+Au and central S+Au collisions is similar. The average rapidity loss is larger than 2 units of rapidity for reactions with the gold target. The transverse momentum spectra of net protons for all reactions can be described by a thermal distribution with temperatures between 145 ± 11 MeV (p+S interactions) and 244 ± 43 MeV (central S+Au collisions). The multiplicity of negatively charged hadrons increases with the mass of the colliding system. The shape of the transverse momentum spectra of negatively charged hadrons changes from minimum bias p+p and p+S interactions to p+Au and central nucleus-nucleus collisions. The mean transverse momentum is almost constant in the vicinity of midrapidity and shows little variation with the target and projectile masses. The average number of produced negatively charged hadrons per participant baryon increases slightly from p+p, p+A to central S+S,Ag collisions.
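The thermal description of the transverse momentum spectra mentioned above usually refers to an exponential fit in the transverse mass. A common parametrization (the paper's exact fit function may differ) is:

```latex
\frac{1}{m_T}\,\frac{dN}{dm_T} \propto \exp\!\left(-\frac{m_T}{T}\right),
\qquad m_T = \sqrt{p_T^2 + m^2}
```

with T the inverse-slope "temperature" parameter quoted in the abstract.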
A statistical model of the early stage of central nucleus-nucleus (A+A) collisions is developed. We suggest a description of the confined state with several free parameters fitted to a compilation of A+A data at the AGS. For the deconfined state a simple Bag model equation of state is assumed. The model leads to the conclusion that a Quark Gluon Plasma is created in central nucleus-nucleus collisions at the SPS. This result is in quantitative agreement with existing SPS data on pion and strangeness production and gives a natural explanation for their scaling behaviour. The localization and the properties of the transition region are discussed. It is shown that the deconfinement transition can be detected by observation of the characteristic energy dependence of pion and strangeness multiplicities, and by an increase of the event-by-event fluctuations. An attempt to understand the data on J/psi production in Pb+Pb collisions at the SPS within the same approach is presented.
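The "simple Bag model equation of state" assumed above for the deconfined state is conventionally written as an ideal gas of massless quarks and gluons with a bag constant B (a textbook form; the paper's parameter choices are not reproduced here):

```latex
p(T) = \frac{\pi^2}{90}\, g\, T^4 - B,
\qquad
\varepsilon(T) = \frac{\pi^2}{30}\, g\, T^4 + B
```

where g is the effective number of degrees of freedom of the quark-gluon plasma (fermionic contributions entering with the usual 7/8 weight).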