The decision to raise the standard retirement age from the current 65 to 67 has given new momentum to the debate on how the transition from working life into retirement should sensibly be organised. Are the affected age cohorts still capable enough to remain in employment? How do employers assess the employability of older workers? Are there enough jobs to absorb a larger number of older workers? Does the later exit of older workers reduce the entry chances of younger ones? Is the planned postponement of the standard retirement age merely a »trick« to ratchet up pension deductions?
Commercialization of consumers’ personal data in the digital economy poses serious conceptual and practical challenges to the traditional approach of European Union (EU) Consumer Law. This article argues that mass-spread, automated, algorithmic decision-making casts doubt on the foundational paradigm of EU consumer law: consent and autonomy. Moreover, it poses threats of discrimination and undermining of consumer privacy. It is argued that the recent legislative reaction by the EU Commission, in the form of the ‘New Deal for Consumers’, was a step in the right direction, but fell short due to its continued reliance on consent and autonomy and its failure to adequately protect consumers from indirect discrimination. It is posited that a focus on creating a contracting landscape where the consumer may be properly informed in material respects is required, which in turn necessitates blending the approaches of competition, consumer protection and data protection laws.
We propose a novel approach to the study of international trade based on a theory of country integration that embodies a broad systemic viewpoint on the relationship between trade and growth. Our model leads to an indicator of country openness that measures a country's level of integration through the full architecture of its connections in the trade network. We apply our methodology to a sample of 204 countries and find a sizable and significant positive relationship between our integration measure and a country's growth rate, while that of the traditional measures of outward orientation is only minor and statistically insignificant.
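A minimal sketch of the intuition behind such a network-based openness measure (the calculation below is a generic eigenvector-centrality exercise on invented toy data, not the authors' actual indicator): a country counts as more integrated when it trades intensively with countries that are themselves well connected.

```python
def integration_scores(trade, n_iter=100):
    """Eigenvector-style scores via power iteration on a trade-flow matrix."""
    n = len(trade)
    scores = [1.0 / n] * n
    for _ in range(n_iter):
        new = [sum(trade[i][j] * scores[j] for j in range(n)) for i in range(n)]
        norm = sum(new) or 1.0
        scores = [v / norm for v in new]
    return scores

# Toy symmetric trade matrix for three hypothetical countries;
# country 2 has the weakest connections to the rest of the network.
trade = [
    [0, 5, 1],
    [5, 0, 2],
    [1, 2, 0],
]
scores = integration_scores(trade)
assert scores[2] == min(scores)  # least connected country is least integrated
```

Unlike simple trade-to-GDP openness ratios, a score of this kind depends on the whole architecture of connections, which is the property the abstract emphasizes.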
We use census data to show that structural transformation reflects a fundamental reallocation of labour from goods to services, instead of a relabelling that occurs when goods-producing firms outsource their in-house service production. The novelty of our approach is that it categorizes labour by occupations, which are invariant to outsourcing. We find that the reallocation of labour from goods-producing to service-producing occupations is a robust feature in censuses from around the world and different time periods. To understand the underlying forces, we propose a tractable model in which uneven occupation-specific technological change generates structural transformation of occupation employment.
This paper uses historical monthly temperature data for a panel of 114 countries to identify the effects of within-year temperature variability on productivity growth in five macro regions: (1) Africa, (2) Asia, (3) Europe, (4) North America and (5) South America. We find two primary results. First, higher intra-annual temperature variability reduces (increases) productivity in Europe and North America (Asia). Second, higher intra-annual temperature variability has no significant effects on productivity in Africa and South America. Additional empirical tests also indicate the following: (1) rising intra-annual temperature variability reduces productivity (though less significantly) in both tropical and non-tropical regions, (2) inter-annual temperature variability reduces (increases) productivity in North America (Europe) and (3) winter and summer inter-annual temperature variability generates a drop in productivity in both Europe and North America. Taken together, these findings indicate that temperature variability shocks tend to have stronger adverse economic effects among richer economies. In a production economy featuring long-run productivity and temperature volatility shocks, we quantify these negative impacts and find welfare losses of 2.9% (1%) in Europe (North America).
While the COVID-19 pandemic had a large and asymmetric impact on firms, many countries quickly enacted massive business rescue programs which are specifically targeted at smaller firms. Little is known about the effects of such policies on business entry and exit, investment, factor reallocation, and macroeconomic outcomes. This paper builds a general equilibrium model with heterogeneous and financially constrained firms in order to evaluate the short- and long-term consequences of small firm rescue programs in a pandemic recession. We calibrate the stationary equilibrium and the pandemic shock to the U.S. economy, taking into account the factual Paycheck Protection Program (PPP) as a specific policy. We find that the policy has only a modest impact on aggregate output and employment because (i) jobs are saved predominantly in the smallest firms that account for a minor share of employment and (ii) the grant reduces the reallocation of resources towards larger and less impacted firms. Much of the reallocation effects occur in the aftermath of the pandemic episode. By preventing inefficient liquidations, the policy dampens the long-term declines of aggregate consumption and of the real wage, thus delivering small welfare gains.
Information asymmetry and its implications in online purchasing behaviour: a country case study
(2020)
The objective of this study is to analyse how certain variables in the online market affect the decision-making trajectory and actions toward reducing the information asymmetry faced in online purchasing. A survey and observation were conducted in order to understand the behavior and perceptions of online buyers toward the information given on online platforms. Descriptive and correlation analyses were employed in order to evaluate the data collected and test the correlations between variables of the research model. The results show that most participants take for granted that sellers have more information than they do when entering into a transaction agreement, which makes them feel inferior to the superior power sellers possess in such interactions. As a result, they prefer traditional markets; however, multiple sources such as reviews and ratings emerge as an alternative way of reducing the perceived information asymmetry.
The recent COVID-19 pandemic represents an unprecedented worldwide event to study the influence of related news on the financial markets, especially during the early stage of the pandemic when information on the new threat came rapidly and was complex for investors to process. In this paper, we investigate whether the flow of news on COVID-19 had an impact on forming market expectations. We analyze 203,886 online articles dealing with COVID-19 and published on three news platforms (MarketWatch.com, NYTimes.com, and Reuters.com) in the period from January to June 2020. Using machine learning techniques, we extract the news sentiment through a financial market-adapted BERT model that enables recognizing the context of each word in a given item. Our results show that there is a statistically significant and positive relationship between sentiment scores and S&P 500 market returns. Furthermore, we provide evidence that sentiment components and news categories on NYTimes.com were differently related to market returns.
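As a toy illustration of the kind of relationship the study tests (all figures below are invented, and the paper's actual analysis uses a finance-tuned BERT model and regression analysis, not a bare correlation), one can relate daily aggregated sentiment scores to daily market returns:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

sentiment = [0.2, -0.5, 0.1, 0.7, -0.3, 0.4]                # made-up daily scores
returns = [0.004, -0.012, 0.001, 0.015, -0.006, 0.008]      # made-up daily returns
r = pearson(sentiment, returns)
assert r > 0.9  # strongly positive in this constructed toy data
```

In the toy data the two series move together almost perfectly; the paper's claim is the weaker, statistically tested version of this pattern.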
We model the decisions of young individuals to stay in school or drop out and engage in criminal activities. We build on the literature on human capital and crime engagement and use the framework of Banerjee (1993) that assumes that the information needed to engage in crime arrives in the form of a rumour and that individuals update their beliefs about the profitability of crime relative to education. These assumptions allow us to study the effect of social interactions on crime. In our model, we investigate informational spillovers from the actions of talented students to less talented students. We show that policies that decrease the cost of education for talented students may increase the vulnerability of less talented students to crime. The effect is exacerbated when students do not fully understand the underlying learning dynamics.
This paper proposes tests for out-of-sample comparisons of interval forecasts based on parametric conditional quantile models. The tests rank the distance between actual and nominal conditional coverage with respect to the set of conditioning variables from all models, for a given loss function. We propose a pairwise test to compare two models for a single predictive interval. The set-up is then extended to a comparison across multiple models and/or intervals. The limiting distribution varies depending on whether models are strictly non-nested or overlapping. In the latter case, degeneracy may occur. We establish the asymptotic validity of wild bootstrap based critical values across all cases. An empirical application to Growth-at-Risk (GaR) uncovers situations in which a richer set of financial indicators are found to outperform a commonly-used benchmark model when predicting downside risk to economic activity.
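The core idea of ranking models by the distance between actual and nominal coverage can be sketched as follows (a deliberately simplified, unconditional version with invented data; the paper's tests are conditional on covariates and use wild-bootstrap critical values):

```python
def empirical_coverage(lower, upper, actual):
    """Share of realized outcomes falling inside the predicted intervals."""
    hits = sum(l <= y <= u for l, u, y in zip(lower, upper, actual))
    return hits / len(actual)

def coverage_distance(lower, upper, actual, nominal=0.90):
    """Distance between empirical and nominal coverage; smaller is better."""
    return abs(empirical_coverage(lower, upper, actual) - nominal)

actual = [1.2, 0.8, -0.5, 2.0, 1.1, 0.3, -1.0, 0.9, 1.5, 0.2]
# Model A issues wide intervals; Model B issues overly narrow ones.
a_lo, a_hi = [-2.0] * 10, [3.0] * 10
b_lo, b_hi = [0.0] * 10, [1.0] * 10

# Model A (coverage 1.0) is closer to the nominal 90% than Model B (coverage 0.4).
assert coverage_distance(a_lo, a_hi, actual) < coverage_distance(b_lo, b_hi, actual)
```

A formal test then asks whether such a ranking is statistically distinguishable from zero, which is where the paper's asymptotic and bootstrap machinery comes in.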
We estimate the causal effect of shared e-scooter services on traffic accidents by exploiting the variation in the availability of e-scooter services induced by the staggered rollout across 93 cities in six countries. Police-reported accidents involving personal injuries in the average month increased by around 8.2% after shared e-scooters were introduced. Effects are large during summer and insignificant during winter. Further heterogeneity analysis reveals the largest estimated effects for cities with limited cycling infrastructure, while no effects are detectable in cities with high bike-lane density. This difference suggests that public policy can play a crucial role in mitigating accidents related to e-scooters and, more generally, to changes in urban mobility.
We analyze limit order book resiliency following liquidity shocks initiated by large market orders. Based on a unique data set, we investigate whether high‐frequency traders are involved in replenishing the order book. To this end, we relate the net liquidity provision of high‐frequency traders, algorithmic traders, and human traders around these market impact events to order book resiliency. Although all groups of traders react, our results show that only high‐frequency traders reduce the spread within the first seconds after the market impact event. Order book depth replenishment, however, takes significantly longer and is mainly accomplished by human traders’ liquidity provision.
Vehicle registrations have been shown to strongly react to tax reforms aimed at reducing CO2 emissions from passenger cars, but are the effects equally strong for positive and negative tax changes? The literature on asymmetric reactions to price and tax changes has documented asymmetries for everyday goods but has not yet considered durables. We leverage multiple vehicle registration tax (VRT) reforms in Norway and estimate their impact on within car-model substitutions. We estimate stronger effects for cars receiving tax cuts and rebates than for those affected by tax increases. The corresponding estimated elasticity is −1.99 for VRT decreases and 0.77 for increases. As consumers may also substitute across car models, our estimates represent a lower bound.
Digital wealth and its necessary regulation have gained prominence in recent years. The European Commission has published several documents and policy proposals relating, directly or indirectly, to the data economy. A data economy can be defined as an ecosystem of different types of market players collaborating to ensure that data is accessible and usable, so that value can be extracted from it, for example by creating a variety of applications with great potential to improve daily life. According to the EU Commission, the value of the data economy could increase from EUR 257 billion (1.85% of EU Gross Domestic Product (GDP)) to EUR 643 billion (3.17% of EU GDP) by 2020. The legal implications of the increasing value of the data economy are clear; hence the need to address the challenges presented by its legal regulation.
The importance of agile methods has increased in recent years, not only to manage IT projects but also to establish flexible and adaptive organisational structures, which are essential to deal with disruptive changes and build successful digital business strategies. This paper takes an industry-specific perspective by analysing the dissemination, objectives and relative popularity of agile frameworks in the German banking sector. The data provides insights into expectations and experiences associated with agile methods and indicates possible implementation hurdles and success factors. Our research provides the first comprehensive analysis of agile methods in the German banking sector. The comparison with a selected number of fintechs reveals some differences between banks and fintechs. We found that almost all banks and fintechs apply agile methods in IT projects. However, fintechs have relatively more experience with agile methods than banks and use them more intensively. Scrum is the most relevant framework used in practice. Scaled agile frameworks are so far negligible in the German banking sector. Acceleration of projects is apparently the most important objective of deploying agile methods. In addition, agile methods can contribute to cost savings and lead to improved quality and innovation performance, though for banks it is evidently more challenging to reach their respective targets than for fintechs. Overall, our findings suggest that German banks are still in a maturing process of becoming more agile and that there is room for an accelerated adoption of agile methods in general and scaled agile frameworks in particular.
The competitiveness of the German economy faces enormous challenges. Traditionally strong sectors such as the automotive industry and mechanical engineering are in a phase of upheaval in the face of disruptive changes driven by new technologies, the fight against climate change and a shifting regulatory environment. Numerous industries are transforming into "smart industries" through the use of artificial intelligence. At the same time, competence in cross-cutting technologies such as cloud computing and cyber security is gaining importance, since these are what make the effective use of artificial intelligence possible in the first place. An analysis of the competitive position of the German economy shows that in some future-oriented fields there is considerable catching up to do.
In a unifying framework generalizing established theories, we characterize the conditions under which Joint Ownership of assets creates the best cooperation incentives in a partnership. We endogenise renegotiation costs and assume that they weakly increase with additional assets. A salient sufficient condition for optimal cooperation incentives among patient partners is that Joint Ownership is a Strict Coasian Institution, for which transaction costs impede an efficient asset reallocation after a breakdown. In contrast to Halonen (2002), the logic behind our results is that Joint Ownership maximizes the value of the relationship and the costs of renegotiating ownership after a broken relationship.
The purpose of the data presented in this article is to use it in ex post estimations of interest rate decisions by the European Central Bank (ECB), as it is done by Bletzinger and Wieland (2017) [1]. The data is of quarterly frequency from 1999 Q1 until 2013 Q2 and consists of the ECB's policy rate, inflation rate, real output growth and potential output growth in the euro area. To account for forward-looking decision making in the interest rate rule, the data consists of expectations about future inflation and output dynamics. While potential output is constructed based on data from the European Commission's annual macro-economic database, inflation and real output growth are taken from two different sources both provided by the ECB: the Survey of Professional Forecasters and projections made by ECB staff. Careful attention was given to the publication date of the collected data to ensure a real-time dataset only consisting of information which was available to the decision makers at the time of the decision.
Strict environmental regulation may deter foreign direct investment (FDI). The paper develops the hypothesis that regulation predominantly discourages FDI conducted as Greenfield investment rather than as mergers and acquisitions (M&A). The hypothesis is tested with German firm-level FDI data. Empirically, stricter regulation reduces new Greenfield projects in polluting industries but has a much smaller impact on the number of M&As. This significant difference is compatible with the fact that existing operations often benefit from grandfathering rules, which provide softer regulation for pre-existing plants, and with the expectation that for M&As part of the regulation is capitalized in the purchase price. The heterogeneous effects help explain mixed results in previous studies that have neglected the mode of entry.
The debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. In particular, this concerns estimates derived from a simple aggregate demand and Phillips curve model with time-varying components as proposed by Laubach and Williams (2003). For example, Summers (2014a) refers to these estimates as important evidence for a secular stagnation and the need for fiscal stimulus. Yellen (2015, 2017) has used such estimates to explain and justify why the Federal Reserve has held interest rates so low for so long. First, we re-estimate the United States equilibrium rate with the methodology of Laubach and Williams (2003). Second, we build on their approach and an alternative specification to provide new estimates for the United States, Germany, the euro area and Japan. Third, we subject these estimates to a battery of sensitivity tests. Due to the great uncertainty and sensitivity that accompany these equilibrium rate estimates, the observed decline in the estimates is not a reliable indicator of a need for expansionary monetary and fiscal policy. Yet, if they are employed to determine the appropriate monetary policy stance, such estimates are best used together with the consistent estimate of the level of potential output.
Literature dealing with the economic performance of agri-food businesses in general, or with the German agricultural sector in particular, mainly draws on strictly agriculture-related theory to explain the economic success of agri-food businesses. Against this background, the present paper aims to extend existing discourses to further areas of thought. First, four characteristics that correspond to the current picture of the German agricultural sector and are considered significantly responsible for recently outpacing the French agri-food sector, namely a) the increased size of agribusiness, b) pull strategies, c) the development of new markets and d) a focus on the processing industry, are explained in their success against the background of mainly non-agricultural literature. In this way, helpful and rather unnoticed perspectives can be contributed to existing discourses. Second, the paper presents scatter plots portraying correlations between a) the added value of agriculture and the regular labour force, b) the added value of agriculture and the number of agricultural holdings, and c) the added value of agriculture and the number of enterprises concerned with milk consumption. Scatter plots showing different developments in Germany and France can be related to the findings of the first part of the paper and likewise open up new perspectives in existing discourses.
We investigate the default probability, recovery rates and loss distribution of a portfolio of securitised loans granted to Italian small and medium enterprises (SMEs). To this end, we use loan level data information provided by the European DataWarehouse platform and employ a logistic regression to estimate the company default probability. We include loan-level default probabilities and recovery rates to estimate the loss distribution of the underlying assets. We find that bank securitised loans are less risky, compared to the average bank lending to small and medium enterprises.
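The scoring step the abstract describes can be sketched as follows (the coefficients, borrower features and loss-given-default figure below are invented for illustration, not estimates from the paper): a fitted logistic regression maps loan characteristics to a default probability (PD), which then feeds an expected-loss calculation.

```python
import math

def default_probability(features, coefficients, intercept):
    """Logistic link: PD = 1 / (1 + exp(-(b0 + b'x)))."""
    z = intercept + sum(b * x for b, x in zip(coefficients, features))
    return 1.0 / (1.0 + math.exp(-z))

def expected_loss(pd, lgd, exposure):
    """Expected loss = PD x loss-given-default x exposure at default."""
    return pd * lgd * exposure

# Hypothetical borrower features: [leverage ratio, interest coverage]
pd = default_probability([0.6, 2.5], coefficients=[3.0, -0.8], intercept=-2.0)
loss = expected_loss(pd, lgd=0.45, exposure=100_000)
assert 0.0 < pd < 1.0
```

Aggregating such loan-level PDs and recovery rates over the securitised pool is what yields the portfolio loss distribution studied in the paper.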
This contribution on the extent of income poverty among families focuses on two aspects. First, ahead of any distributional analysis, the method of weighting incomes in multi-person households must be clarified. After weighing various approaches to deriving an equivalence scale, a preference for an institutionally oriented weighting scheme, approximated by the old OECD scale, is justified. Second, the influence of women's labour-force participation on the income situation of families with children is examined empirically. Precarious incomes and income poverty primarily affect families in which the female partner is marginally employed or not employed at all, as well as single parents, the latter particularly strongly when they are not employed; the situation in the new federal states is considerably more acute than in the old ones. Policy measures should take into account both women's employment preferences and families' needs. Transfers within the family benefits system and the public funding of childcare facilities should therefore be discussed not as competing but rather as complementary concepts.
Theory building is underdeveloped not only in IT services management research but in IS in general. Given the paradigm shift that comes with the development away from a networked economy towards a network economy, the lack of attention devoted to theorizing in IS becomes even more obvious. In the light of other "megatrends" in IS research, such as the increasing professionalization and use of statistical methods and the exploitation of extremely large data sets (often harvested from social media sites), we might lose interest in theorizing in the presence of the tremendous amount of available empirical data. In this position paper, the author advocates that services science researchers should focus on rigor and relevance in their research approaches.
This research examines the impact of online display advertising and paid search advertising relative to offline advertising on firm performance and firm value. Using proprietary data on annualized advertising expenditures for 1651 firms spanning seven years, we document that both display advertising and paid search advertising exhibit positive effects on firm performance (measured by sales) and firm value (measured by Tobin's q). Paid search advertising has a more positive effect on sales than offline advertising, consistent with paid search being closest to the actual purchase decision and having enhanced targeting abilities. Display advertising exhibits a relatively more positive effect on Tobin's q than offline advertising, consistent with its long-term effects. The findings suggest heterogeneous economic benefits across different types of advertising, with direct implications for managers in analyzing advertising effectiveness and external stakeholders in assessing firm performance.
In the upcoming years, the internet of things (IoT) will enrich daily life. The combination of artificial intelligence (AI) and highly interoperable systems will bring context-sensitive multi-domain services to reality. This paper describes a concept for an AI-based smart living platform with openHAB, a smart home middleware, and Web of Things (WoT) as key components of our approach. The platform concept considers different stakeholders, i.e. the housing industry, service providers, and tenants. These activities are part of the ForeSight project, an AI-driven, context-sensitive smart living platform.
Even as online advertising continues to grow, a central question remains: Who to target? Yet, advertisers know little about how to select from the hundreds of audience segments for targeting (and combinations thereof) for a profitable online advertising campaign. Utilizing insights from a field experiment on Facebook (Study 1), we develop a model that helps advertisers solve the cold-start problem of selecting audience segments for targeting. Our model enables advertisers to calculate the break-even performance of an audience segment to make a targeted ad campaign at least as profitable as an untargeted one. Advertisers can use this novel model to decide whether to test specific audience segments in their campaigns (e.g., in randomized controlled trials). We apply our model to data from the Spotify ad platform to study the profitability of different audience segments (Study 2). Approximately half of those audience segments require the click-through rate to double compared to an untargeted campaign, which is unrealistically high for most ad campaigns. Our model also shows that narrow segments require a lift that is likely not attainable, specifically when the data quality of these segments is poor. We confirm this theoretical finding in an empirical study (Study 3): A decrease in data quality due to Apple’s introduction of the App Tracking Transparency (ATT) framework more negatively affects the click-through rate of narrow (versus broad) audience segments.
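The break-even logic can be illustrated with a deliberately simplified calculation (the formula and numbers below are illustrative assumptions, not the paper's actual model): targeting is worthwhile only if the click-through-rate (CTR) lift offsets the price premium charged for the targeted audience segment.

```python
def break_even_ctr(untargeted_ctr, untargeted_cpm, targeted_cpm):
    """CTR a targeted campaign must reach to match the untargeted
    campaign's cost per click, assuming equal value per click."""
    return untargeted_ctr * (targeted_cpm / untargeted_cpm)

# Hypothetical numbers: the targeted segment costs twice the untargeted CPM.
required = break_even_ctr(untargeted_ctr=0.01, untargeted_cpm=2.0, targeted_cpm=4.0)
assert abs(required - 0.02) < 1e-12  # the CTR must double to break even
```

This mirrors the abstract's finding that for roughly half of the audience segments studied, the required CTR lift (a doubling) is unrealistically high for most campaigns.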
This article discusses the counterpart of interactive machine learning, i.e., human learning while being in the loop in a human-machine collaboration. For such cases we propose the use of a Contradiction Matrix to assess the overlap and the contradictions of human and machine predictions. We show in a small-scale user study with experts in the area of pneumology (1) that machine-learning based systems can classify X-rays with respect to diseases with a meaningful accuracy, (2) that humans partly use contradictions to reconsider their initial diagnosis, and (3) that this leads to a higher overlap between human and machine diagnoses at the end of the collaboration situation. We argue that disclosure of information on diagnosis uncertainty can be beneficial to make the human expert reconsider her or his initial assessment, which may ultimately result in a deliberate agreement. In the light of the observations from our project, it becomes apparent that collaborative learning in such a human-in-the-loop scenario could lead to mutual benefits for both human learning and interactive machine learning. Bearing the differences in reasoning and learning processes of humans and intelligent systems in mind, we argue that interdisciplinary research teams have the best chances at tackling this undertaking and generating valuable insights.
Artificial Intelligence (AI) and Machine Learning (ML) are currently hot topics in industry and business practice, while management-oriented research disciplines seem reluctant to adopt these sophisticated data analytics methods as research instruments. Even the Information Systems (IS) discipline with its close connections to Computer Science seems to be conservative when conducting empirical research endeavors. To assess the magnitude of the problem and to understand its causes, we conducted a bibliographic review on publications in high-level IS journals. We reviewed 1,838 articles that matched corresponding keyword-queries in journals from the AIS senior scholar basket, Electronic Markets and Decision Support Systems (Ranked B). In addition, we conducted a survey among IS researchers (N = 110). Based on the findings from our sample we evaluate different potential causes that could explain why ML methods are rather underrepresented in top-tier journals and discuss how the IS discipline could successfully incorporate ML methods in research undertakings.
Optimal investment decisions by institutional investors require accurate predictions with respect to the development of stock markets. Motivated by previous research that revealed the unsatisfactory performance of existing stock market prediction models, this study proposes a novel prediction approach. Our proposed system combines Artificial Intelligence (AI) with data from Virtual Investment Communities (VICs) and leverages VICs’ ability to support the process of predicting stock markets. An empirical study with two different models using real data shows the potential of the AI-based system with VICs information as an instrument for stock market predictions. VICs can be a valuable addition but our results indicate that this type of data is only helpful in certain market phases.
Always in the top ranks: Leibniz Prize for the Frankfurt economist Roman Inderst
(2010)
When it comes to superlatives, 40-year-old Prof. Roman Inderst always stands at the very top of the podium: young, yet already ranked among the top ten European economists. And on 15 March he was also honoured, as the youngest of the ten laureates, with Germany's most important research award, the Gottfried Wilhelm Leibniz Prize 2010, endowed with 2.5 million euros.