The World Health Organization declared the emergence of the novel coronavirus (SARS-CoV-2) in January 2020. To trace infection chains, Germany launched its smartphone contact tracing app, the “Corona-Warn-App” (CWA), in June 2020. To be successful as a tool for fighting the pandemic, the app requires a high adoption rate in the population. We analyse the factors influencing app adoption based on the health belief model (HBM) with a cross-sectional online study including 1752 participants from Germany. The study was conducted with a certified panel provider from the end of December 2020 to January 2021. The HBM is primarily known from evaluations of medical treatments, such as breast cancer screenings, but has rarely been applied to a health-related information system such as the CWA. Our results indicate that intrinsic and extrinsic motivation to use the CWA are the strongest drivers of app use. In contrast, technical barriers, privacy concerns and lower income are the main inhibitors. Our findings contribute to the literature on the adoption of contact tracing apps by surveying actual users and non-users of the CWA, and we provide valuable insights for policymakers regarding drivers of adoption and potential user groups of disease prevention technologies in times of pandemics.
Background: The German Corona-Warn-App (CWA) is a contact tracing app to mitigate the spread of SARS-CoV-2. As of today, it has been downloaded approximately 45 million times.
Objective: This study aims to investigate the influence of (non)users’ social environments on the usage of the CWA during 2 periods with relatively lower death rates and higher death rates caused by SARS-CoV-2.
Methods: We conducted a longitudinal survey study in Germany with 833 participants in 2 waves to investigate how participants perceive their peer groups’ opinion about making use of the German CWA to mitigate the risk of SARS-CoV-2. In addition, we asked whether this perceived opinion, in turn, influences the participants with respect to their own decision to use the CWA. We analyzed these questions with generalized estimating equations. Further, 2 related-samples tests were performed to test for differences between users of the CWA and nonusers and between the 2 points in time (wave 1 with the highest death rates observable during the pandemic in Germany versus wave 2 with significantly lower death rates).
Results: Participants perceived that peer groups have a positive opinion toward using the CWA, with more positive opinions by the media, family doctors, politicians, and virologists/Robert Koch Institute and a lower, only slightly negative opinion originating from social media. Users of the CWA perceived their peer groups’ opinions about using the app as more positive than nonusers do. Furthermore, the perceived positive opinion of the media (P=.001) and politicians (P<.001) was significantly lower in wave 2 compared with that in wave 1. The perceived opinion of friends and family (P<.001) as well as their perceived influence (P=.02) among nonusers toward using the CWA was significantly higher in the latter period compared with that in wave 1. The influence of virologists (in Germany primarily communicated via the Robert Koch Institute) had the highest positive effect on using the CWA (B=0.363, P<.001). We only found 1 decreasing effect of the influence of politicians (B=–0.098, P=.04).
Conclusions: Opinions of peer groups play an important role when it comes to the adoption of the CWA. Our results show that the influence of virologists/Robert Koch Institute and family/friends exerts the strongest effect on participants’ decisions to use the CWA, while politicians had a slightly negative influence. Our results also indicate that it is crucial to accompany the introduction of such a contact tracing app with explanations and a media campaign, backed by political decision makers and subject matter experts, to support its adoption.
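The regression component of such an analysis can be sketched in a few lines. The snippet below is a toy illustration, not the study's estimator: it fits a logistic model relating app use to a perceived peer-group opinion score via Newton's method, which corresponds to a binomial GEE under an independence working correlation (a full GEE with an exchangeable structure would typically be fit with a library such as statsmodels). All variable names and coefficient values are invented.

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Newton-Raphson logistic fit -- the independence-working-correlation
    special case of a binomial GEE (one observation per respondent)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    return beta

# Hypothetical data: 800 respondents, standardized peer-opinion score.
rng = np.random.default_rng(0)
n = 800
peer_opinion = rng.normal(size=n)        # perceived peer-group opinion (invented)
logits = -0.5 + 0.8 * peer_opinion       # assumed true intercept and effect
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(float)

X = np.column_stack([np.ones(n), peer_opinion])
beta_hat = fit_logistic(X, y)
print(beta_hat)  # estimated intercept and peer-opinion effect
```

With longitudinal data as in the study, the same moment equations would be solved per subject cluster, which is what motivates the GEE machinery in the first place.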
This study investigates the socio-economic characteristics, behavioral preferences, and consumption of individuals who own crypto-assets. Our empirical analysis utilizes data from a German personal finance management app in which users connect their bank accounts and securities accounts. We conducted a survey and elicited behavioral factors for financial decision-making. By combining survey data with bank account and securities account data, we identify crypto investors’ preferences for financial decision-making and financial advice. Our results suggest that, in particular, students or self-employed, young, and male individuals who are risk-seeking and impatient are more likely to have invested in crypto-assets. Most crypto owners have less experience with financial advisory services. They see advisory as too time-consuming and qualitatively poor and instead prefer to decide on their own, as they self-report high financial literacy. Investigating their consumption in more detail, we find that crypto investors spend more often on travelling, electronics, and food delivery and less on health. Our findings support policymakers in identifying high-risk consumers and investors, and help financial institutions develop appropriate products.
If service providers can identify reasons users are in favor of or against a service, they have insightful information that can help them understand user behavior and what they need to do to change such behavior. This article argues that the novel text-mining technique referred to as information-seeking argument mining (IS-AM) can identify these reasons. The empirical study applies IS-AM to news articles and reviews about electric scooter-sharing systems (i.e., a service enabling the short-term rentals of electric motorized scooters). Its results point to IS-AM as a promising technique to improve service; the data enable the authors to identify 40 reasons to use or not use electric scooter-sharing systems, as well as their importance to users. Furthermore, the results show that news articles are better data sources than reviews because they are longer and contain more arguments and, thus, reasons.
Libra — a global virtual currency project initiated by Facebook — has been the subject of many controversial discussions since its announcement in June 2019. This paper provides a differentiated view on Libra, recognising that different development scenarios of Libra are conceivable. Libra could serve purely as an alternative payment system in combination with a dedicated payment token, the Libra coin. Alternatively, the Libra project could develop into a broader financial infrastructure for advanced financial services such as savings and loan products operating on the Libra Blockchain. Based on a comparison of the Libra architecture with other cryptocurrencies, the opportunities and challenges for the development of the respective Libra ecosystems are investigated from a commercial, regulatory and monetary policy perspective.
The importance of agile methods has increased in recent years, not only to manage IT projects but also to establish flexible and adaptive organisational structures, which are essential to deal with disruptive changes and build successful digital business strategies. This paper takes an industry-specific perspective by analysing the dissemination, objectives and relative popularity of agile frameworks in the German banking sector. The data provides insights into expectations and experiences associated with agile methods and indicates possible implementation hurdles and success factors. Our research provides the first comprehensive analysis of agile methods in the German banking sector. The comparison with a selected number of fintechs has revealed some differences between banks and fintechs. We found that almost all banks and fintechs apply agile methods in IT projects. However, fintechs have relatively more experience with agile methods than banks and use them more intensively. Scrum is the most relevant framework used in practice. Scaled agile frameworks are so far negligible in the German banking sector. Acceleration of projects is apparently the most important objective of deploying agile methods. In addition, agile methods can contribute to cost savings and lead to improved quality and innovation performance, though for banks it is evidently more challenging to reach their respective targets than for fintechs. Overall our findings suggest that German banks are still in a maturing process of becoming more agile and that there is room for an accelerated adoption of agile methods in general and scaled agile frameworks in particular.
The financial sector plays an important role in financing the green transformation. Various regulatory initiatives in the EU aim to improve transparency in relation to the sustainability of financial products and the sustainability of economic activities of non-financial and financial undertakings. For credit institutions, the Green Asset Ratio (GAR) has been established by the European regulatory authorities as a key performance indicator (KPI) for measuring the proportion of Taxonomy-aligned on-balance-sheet exposure in relation to the total assets. The breakdown of the total GAR by type of counterparty, environmental objective and type of asset provides in-depth information about the sustainability profile of a credit institution. This information, which has not been available to date, may also initiate discussions between management and shareholders or other stakeholders regarding the future sustainability strategy of credit institutions. This paper provides an overview of the regulatory background and the method of calculating the GAR along different dimensions. Finally, the potential benefits and limitations of the GAR are discussed.
Advances in distributed ledger technology are leading to a growing decentralisation of financial services (“decentralised finance”) that can be offered largely without intermediation by financial institutions. An important driver for this development is the ongoing tokenisation of assets, payments and rights, which enables the digital encryption of “crypto assets” on distributed ledgers. This article elaborates the foundations and fields of application of decentralised financial services with crypto assets that could challenge the established business models of financial institutions. This trend not only affects payment systems based on controversial crypto currencies such as Bitcoin, but also exchange platforms, capital markets solutions and corporate financing. A rapidly growing ecosystem of start-ups, tech companies and financial institutions is emerging, yet this ecosystem lacks a consistent regulatory framework. The European initiative MiCA (Markets in Crypto Assets) points in the right direction but needs to be adopted soon to ensure the future competitiveness of the European financial sector.
The financial sector plays an important role in supporting the green transformation of the European economy. A critical assessment of the current regulatory framework for sustainable finance in Europe leads to ambiguous results. Although the level of transparency on environmental, social and governance aspects of financial products has improved significantly, it is questionable whether the complex, mainly disclosure-oriented architecture is sufficient to mobilise more private capital into sustainable investments. It should be discussed whether a minimum taxonomy ratio or Green Asset Ratio has to be fulfilled to market a financial product as “green”. Furthermore, because of the high complexity of the regulation, it could be helpful for private investors to establish a simplified green rating, based on the taxonomy ratio, to facilitate the selection of green financial products.
With a notional amount outstanding of more than USD 500 trillion, the market for OTC derivatives is of vital importance for global financial stability. A growing proportion of these contracts are cleared via central counterparties (CCPs), which means that CCPs are gaining in importance as critical financial market infrastructures. At the same time, there is growing concern that a new “too big to fail” problem could arise, as the CCP industry is highly concentrated due to economies of scale. From a European perspective, it should be noted that the clearing of euro-denominated OTC derivatives mainly takes place in London, hence outside the EU in the foreseeable future. For some time there has been a controversial discussion as to whether this can remain the case post Brexit. CCPs, which clear a significant proportion of euro OTC derivatives and are systemically relevant from an EU perspective, should be subject to direct supervision by EU authorities and should be established in the EU. This would represent an important building block for a future Capital Markets Union in Europe, as regulatory or supervisory arbitrage in favour of systemically important third-country CCPs could be prevented. In addition, if a systemically relevant CCP handling a considerable portion of the euro OTC derivatives business were to run into serious difficulties, this may impact ECB monetary policy. This applies both to demand for central bank money and to the transmission of monetary policy measures, which can be significantly impaired, particularly in the event that the repo market or payment systems are disrupted. It is therefore essential for the ECB to be closely involved in the supervision of CCPs. Against this background, the draft amendment of EMIR (European Market Infrastructure Regulation) presented on 13 June 2017 is a step in the right direction.
In addition, there is an urgent need to introduce a recovery and resolution mechanism for CCPs in the EU to complement the existing single resolution mechanism (SRM) for banks in the eurozone. Only then can the diverse interdependencies between banks and CCPs be adequately taken into account in the recovery and resolution programmes required in a financial crisis.
The German federal government intended to alleviate the burden of increasing fuel prices by introducing a temporary reduction of energy taxes on gasoline and diesel. In order to evaluate the impact of this measure on consumer prices at the filling stations, the development of procurement costs for crude oil as well as the downstream development of refinery and distribution margins have to be taken into account. It turns out that about 80 % of the tax reduction was passed on to end consumers on and around the effective date of the tax relief. However, within the first month the impact of the tax reduction was wiped out for diesel completely, as the gross margins of the oil companies have substantially improved since then. On the other hand, for gasoline (E10) at least part of the impact can still be observed, as the initial margin improvement has come down in the meantime. For a detailed analysis, the German antitrust authority should look into the pricing algorithms of all 14,000 filling stations in Germany.
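The pass-through calculation behind such an assessment is simple arithmetic: compare the observed pump-price drop with the size of the tax cut, after netting out contemporaneous changes in procurement costs. The numbers below are purely hypothetical placeholders, not the figures from the study.

```python
def pass_through(price_drop, tax_cut, cost_change=0.0):
    """Share of a tax cut passed on to consumers.

    price_drop : observed decline in the pump price (EUR/litre)
    tax_cut    : size of the tax relief incl. VAT (EUR/litre)
    cost_change: contemporaneous change in procurement cost (EUR/litre);
                 a positive value means costs rose and is netted out.
    """
    return (price_drop + cost_change) / tax_cut

# Hypothetical example: the pump price falls by 28 ct/l on a 35 ct/l tax cut,
# with procurement costs unchanged.
share = pass_through(0.28, 0.35)
print(f"{share:.0%} of the tax cut passed on")  # prints "80% of the tax cut passed on"
```

The `cost_change` term is what makes the margin analysis necessary: a price drop smaller than the tax cut can still be full pass-through if crude procurement costs rose at the same time.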
More sustainability in Germany's benchmark DAX index: reform proposals in light of the Wirecard scandal
(2020)
As part of the reappraisal of the Wirecard scandal, changes to the criteria for inclusion in Germany's benchmark DAX index are also being discussed. The measures envisaged by Deutsche Börse so far point in the right direction but do not go far enough. A clear signal is needed that, in future, only companies achieving at least a satisfactory level of sustainability in their business activities, as measured by an ESG (Environment, Social, Governance) risk score, can qualify for the DAX. A simulation illustrates that companies long regarded critically by ESG criteria would no longer be part of the DAX. As a result, more capital could flow into sustainably operating companies and sectors.
Ad blockers allow users to browse websites without viewing ads. Online news publishers that rely on advertising income tend to perceive users’ adoption of ad blockers purely as a threat to revenue. Yet, this perception ignores the possibility that avoiding ads—which users presumably dislike—may affect users’ online news consumption behavior in positive ways. Using 3.1 million visits from 79,856 registered users on a news website, this research finds that ad blocker adoption has robust positive effects on the quantity and variety of articles users consume. Specifically, ad blocker adoption increases the number of articles that users read by 21.0%–43.2%, and it increases the number of content categories that users consume by 13.4%–29.1%. These effects are stronger for less-experienced users of the website. The increase in news consumption stems from increases in repeat visits to the news website, rather than in the number of page impressions per visit. These postadoption visits tend to start from direct navigation to the news website, rather than from referral sources. The authors discuss how news publishers could benefit from these findings, including exploring revenue models that consider users’ desire to avoid ads.
A common element of market structure analysis is the spatial representation of firms’ competitive positions on maps. Such maps typically capture static snapshots in time. Yet, competitive positions tend to change. Embedded in such changes are firms’ trajectories, that is, the series of changes in firms’ positions over time relative to all other firms in a market. Identifying these trajectories contributes to market structure analysis by providing a forward-looking perspective on competition, revealing firms’ (re)positioning strategies and indicating strategy effectiveness. To unlock these insights, we propose EvoMap, a novel dynamic mapping framework that identifies firms’ trajectories from high-frequency and potentially noisy data. We validate EvoMap via extensive simulations and apply it empirically to study the trajectories of more than 1,000 publicly listed firms over 20 years. We find substantial changes in several firms’ positioning strategies, including Apple, Walmart, and Capital One. Because EvoMap accommodates a wide range of mapping methods, analysts can easily apply it in other empirical settings and to data from various sources.
Regulators worldwide have been implementing different privacy laws. They vary in their impact on the value for advertisers, publishers and users, but not much is known about these differences. This article focuses on three important privacy laws (i.e., General Data Protection Regulation [GDPR], California Consumer Privacy Act [CCPA] and Personal Information Protection Law [PIPL]) and compares their impact on the value for the three primary actors of the online advertising market, namely, advertisers, publishers and users. This article first compares these three privacy laws by developing a legal strictness score. It then uses the existing literature to derive the effects of the legal strictness of each privacy law on each actor’s value. Finally, it quantifies the three privacy laws’ impact on each actor’s value. The results show that GDPR and PIPL are similar and stricter than CCPA. Stricter privacy laws bring larger negative changes to the value for actors. As a result, both GDPR and PIPL decrease the actors’ value more substantially than CCPA. These value declines are the largest for publishers and are rather similar for users and advertisers. Scholars and practitioners can use our findings to explore ways to create value for multiple actors under various privacy laws.
For many services, consumers can choose among a range of optional tariffs that differ in their access and usage prices. Recent studies indicate that tariff-specific preferences may lead consumers to choose a tariff that does not minimize their expected billing rate. This study analyzes how tariff-specific preferences influence the responsiveness of consumers’ usage and tariff choice to changes in price. We show that consumer heterogeneity in tariff-specific preferences leads to heterogeneity in their sensitivity to price changes. Specifically, consumers with tariff-specific preferences are less sensitive to price increases of their preferred tariff than other consumers. Our results provide an additional reason why firms should offer multiple tariffs rather than a uniform nonlinear pricing plan to extract maximum consumer surplus.
Digital technologies facilitate the use of dynamic pricing, that is, of prices that vary without notice for an essentially identical product. In the public debate, different forms of dynamic pricing are often conflated, which hampers a meaningful analysis of its advantages and disadvantages. The aim of this article is to present the economic foundations of dynamic pricing and to discuss and classify its possible forms. In addition, the advantages and disadvantages of dynamic pricing are assessed from the buyer's and the seller's perspectives. Finally, implications for business research are discussed.
Highlights
• The 1986 Immigration Reform and Control Act legalized millions of Hispanic migrants.
• IRCA receipt significantly increases state-to-county fiscal transfers.
• Electoral incentives of the state governor drive the fiscal response to the IRCA.
• Legalization increases Hispanic turnout and political engagement.
Abstract
We study the impact of immigrant legalization on fiscal transfers from state to local governments in the United States, exploiting variation in legal status from the 1986 Immigration Reform and Control Act (IRCA). State governments allocate more resources to IRCA counties, an allocation that is responsive to the electoral incentives of the governor. Importantly, the effect emerges prior to the enfranchisement of the IRCA migrants and we argue it is driven by the IRCA’s capacity to politically empower already legal Hispanic migrants in mixed legal status communities. The IRCA increases turnout in large Hispanic communities as well as Hispanic political engagement, without detectably triggering anti-migrant sentiment.
With adequate support for the learner, errors can have high learning potential. This study investigates rather unsuitable action patterns of teachers in dealing with errors. Teachers rarely investigate the causes that evoke the occurrence of individual students’ errors, but instead often change addressees immediately after an error occurs. Such behavior is frequent in the classroom, leaving unexploited, yet important, potential to learn from errors. It has remained unexplained why teachers act the way they do in error situations. Using video-stimulated recalls, I investigate the reasons for teachers’ behavior in students’ error situations by confronting them with recorded episodes from their own teaching. Error situations are analyzed (within-case) and teachers’ beliefs are classified in an explanatory model (cross-case) to illustrate patterns across teachers. Results show that teachers refer to an interaction of student attributes, their own attributes, and error attributes when reasoning about their own behavior. I find that reference to specific attributes varies depending on the situation, and so do the described reasons that led to a particular behavior as a spontaneous or more reflective decision.
The crowdfunding of altruism
(2022)
This paper introduces a machine learning approach to quantify altruism from the linguistic style of textual documents. We apply our method to a central question in (social) entrepreneurship: How does altruism impact entrepreneurial success? Specifically, we examine the effects of altruism on crowdfunding outcomes in Initial Coin Offerings (ICOs). The main result suggests that altruism and ICO firm valuation are negatively related. We, then, explore several channels to shed some light on whether the negative altruism-valuation relation is causal. Our findings suggest that it is not altruism that causes lower firm valuation; rather, low-quality entrepreneurs select into altruistic projects, while the marginal effect of altruism on high-quality entrepreneurs is actually positive. Altruism increases the funding amount in ICOs in the presence of high-quality projects, low asymmetric information, and strong corporate governance.
Detailed feedback on exercises helps learners become proficient but is time-consuming for educators and, thus, hardly scalable. This manuscript evaluates how well Generative Artificial Intelligence (AI) provides automated feedback on complex multimodal exercises requiring coding, statistics, and economic reasoning. Besides providing this technology through an easily accessible web application, this article evaluates the technology’s performance by comparing the quantitative feedback (i.e., points achieved) from Generative AI models with human expert feedback for 4,349 solutions to marketing analytics exercises. The results show that automated feedback produced by Generative AI (GPT-4) provides almost unbiased evaluations while correlating highly with (r = 0.94) and deviating only 6 % from human evaluations. GPT-4 performs best among seven Generative AI models, albeit at the highest cost. Comparing the models’ performance with costs shows that GPT-4, Mistral Large, Claude 3 Opus, and Gemini 1.0 Pro dominate three other Generative AI models (Claude 3 Sonnet, GPT-3.5, and Gemini 1.5 Pro). Expert assessment of the qualitative feedback (i.e., the AI’s textual response) indicates that it is mostly correct, sufficient, and appropriate for learners. A survey of marketing analytics learners shows that they highly recommend the app and its Generative AI feedback. An advantage of the app is its subject-agnosticism—it does not require any subject- or exercise-specific training. Thus, it is immediately usable for new exercises in marketing analytics and other subjects.
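The two headline comparison metrics, the correlation between AI and human points and the average deviation, can be reproduced on any paired score data. The snippet below uses fabricated scores purely to show the computation; it does not use the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)
human = rng.uniform(0, 10, size=200)                   # hypothetical human expert points
ai = np.clip(human + rng.normal(0, 0.8, 200), 0, 10)   # hypothetical AI-assigned points

# Pearson correlation between AI and human evaluations
r = np.corrcoef(human, ai)[0, 1]

# Mean absolute deviation of AI from human scores, relative to the 0-10 range
mean_dev = np.mean(np.abs(ai - human)) / 10

print(f"r = {r:.2f}, mean deviation = {mean_dev:.1%}")
```

A high `r` with a small relative deviation, as reported in the article (r = 0.94, 6 %), indicates that the automated scores track human scores closely in both rank order and level.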
This paper studies discrete time finite horizon life-cycle models with arbitrary discount functions and iso-elastic per period power utility with concavity parameter θ. We distinguish between the savings behavior of a sophisticated versus a naive agent. Although both agent types have identical preferences, they solve different utility maximization problems whenever the model is dynamically inconsistent. Pollak (1968) shows that the savings behavior of both agent types is nevertheless identical for logarithmic utility (θ = 1). We generalize this result by showing that the sophisticated agent saves in every period a greater fraction of her wealth than the naive agent if and only if θ ≥ 1. While this result goes through for model extensions that preserve linearity of the consumption policy function, it breaks down for non-linear model extensions.
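The consumption fractions of both agent types can be computed in closed form for one standard dynamically inconsistent case covered by the paper's setup: quasi-hyperbolic (beta-delta) discounting with no interest or income. The recursions below follow the usual backward-induction logic; the parameter values are illustrative, and the code is a sketch of the comparison, not the paper's proof.

```python
def sophisticated_fractions(T, beta, delta, theta):
    """Consumption-to-wealth fractions m_t of a sophisticated agent,
    obtained by backward induction over the future selves' policies."""
    m = [0.0] * T
    m[-1] = 1.0      # final period: consume all remaining wealth
    omega = 1.0      # continuation-utility coefficient for the last period
    for t in range(T - 2, -1, -1):
        m[t] = 1.0 / (1.0 + (beta * delta * omega) ** (1.0 / theta))
        omega = m[t] ** (1.0 - theta) + delta * (1.0 - m[t]) ** (1.0 - theta) * omega
    return m

def naive_fraction(T, beta, delta, theta):
    """First-period consumption fraction of a naive agent, who solves the
    full commitment problem and believes future selves will follow it."""
    s = sum(delta ** (j / theta) for j in range(T - 1))
    return 1.0 / (1.0 + (beta * delta) ** (1.0 / theta) * s)

beta, delta, T = 0.7, 0.95, 3
for theta in (0.5, 1.0, 2.0):
    m_soph = sophisticated_fractions(T, beta, delta, theta)[0]
    m_naive = naive_fraction(T, beta, delta, theta)
    print(theta, m_soph, m_naive)
# theta = 1: identical fractions (Pollak's result);
# theta > 1: the sophisticated agent consumes a smaller fraction (saves more);
# theta < 1: the sophisticated agent consumes a larger fraction (saves less).
```

This numerically reproduces the paper's if-and-only-if characterization in a minimal instance: the ordering of the two agents' savings fractions flips exactly at theta = 1.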
Homeownership rates differ widely across European countries. We document that part of this variation is driven by differences in the fraction of adults co-residing with their parents. Comparing Germany and Italy, we show that in contrast to homeownership rates per household, homeownership rates per individual are very similar during the first part of the life cycle. To understand these patterns, we build an overlapping-generations model where individuals face uninsurable income risk and make consumption-saving and housing tenure decisions. We embed an explicit intergenerational link between children and parents to capture the three-way trade-off between owning, renting, and co-residing. Calibrating the model to Germany we explore the role of income profiles, housing policies, and the taste for independence and show that a combination of these factors goes a long way in explaining the differential life-cycle patterns of living arrangements between the two countries.
When estimating misspecified linear factor models for the cross-section of expected returns using GMM, the explanatory power of these models can be spuriously high when the estimated factor means are allowed to deviate substantially from the sample averages. In fact, by shifting the weights on the moment conditions, any level of cross-sectional fit can be attained. The mathematically correct global minimum of the GMM objective function can be obtained at a parameter vector that is far from the true parameters of the data-generating process. This property is not restricted to small samples, but rather holds in population. It is a feature of the GMM estimation design and applies to both strong and weak factors, as well as to all types of test assets.
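The mechanism can be illustrated with a stylized cross-sectional fit: when the factor risk premium is pinned to the sample factor mean, a misspecified one-factor model prices poorly, but freeing that parameter mechanically shrinks pricing errors while drifting far from the sample mean. Everything below is a toy construction, not the paper's GMM estimator.

```python
import numpy as np

# Stylized cross-section: 25 assets with betas and misspecified expected returns
beta = np.linspace(0.5, 1.5, 25)
mispricing = 0.002 * np.sin(np.arange(25))     # deliberate non-beta component
exp_ret = 0.010 + 0.004 * beta + mispricing    # true E[R] is not spanned by beta

lam_sample = 0.006                             # illustrative "sample mean" of the factor

def sse(lam):
    """Sum of squared pricing errors of the one-factor model E[R] = beta * lam."""
    e = exp_ret - beta * lam
    return float(e @ e)

# Freeing the factor-mean parameter: the SSE-minimizing risk premium
lam_free = float(beta @ exp_ret / (beta @ beta))

print(lam_sample, lam_free)            # the free estimate drifts away from 0.006
print(sse(lam_sample), sse(lam_free))  # ...and mechanically improves the fit
```

The same logic, with GMM weights shifting mass between time-series and pricing moments, is what lets the estimated factor mean wander far from the sample average while the apparent cross-sectional fit improves.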
Market risks account for an integral part of insurers' risk profiles. We explore market risk sensitivities of insurers in the United States and Europe. Based on panel regression models and daily market data from 2012 to 2018, we find that sensitivities are particularly driven by insurers' product portfolios. The influence of interest rate movements on stock returns is 60% larger for US than for European life insurers. For the former, interest rate risk is a dominant market risk with an effect that is five times larger than that of corporate credit risk. For European life insurers, the sensitivity to interest rate changes is only 44% larger than toward credit default swaps on government bonds, underlining the relevance of sovereign credit risk.
Does political conflict with another country influence domestic consumers' daily consumption choices? We exploit the volatile US-China relations in 2018 and 2019 to analyze whether US consumers reduce their visits to Chinese restaurants when bilateral relations deteriorate. We measure the degree of political conflict through negativity in media reports and rely on smartphone location data to measure daily visits to over 190,000 US restaurants. A deterioration in US-China relations induces a significant decline in visits not only to Chinese but also to other foreign ethnic restaurants, while visits to typical American restaurants increase. We identify consumers' age, race, and cultural openness to moderate the strength of this ethnocentric effect.
External linkages allow nascent ventures to access crucial resources during the process of new product development. Forming external linkages can substantially contribute to a venture’s performance. However, little is known about the paths of external linkage formation, as well as the circumstances that drive the choice to pursue one rather than another path. This gap deserves further investigation, because we do not know whether insights developed for incumbent firms also apply to nascent ventures. To address this gap, we explore a novel dataset of 370 venture creation processes. Using sequence analyses based on optimal matching techniques and cluster analyses, we reveal that nascent ventures pursue one of four distinct paths of linkage formation activities during new product development. Contrary to the findings of the strategy literature, we find that if nascent ventures engage in external linkages at all, they do not combine exploration- and exploitation-oriented linkages but form either exploration- or exploitation-oriented linkages. Additional regression analyses highlight the circumstances that lead nascent ventures to pursue one rather than the other pathway. Taken together, our analyses point out that resource scarcity constitutes an important factor shaping the linkage formation activities of nascent ventures. Accordingly, we show that nascent ventures tend not to optimize by adding complementary knowledge to the firm’s knowledge base but rather to extend the existing knowledge base—a strategy which we call bricolage.
The 2011 Arab Spring marked the opening of the Central Mediterranean Route for irregular border crossings between Libya and Italy, which produced heterogeneous reductions of bilateral smuggling distances between country pairs in the Mediterranean region. We exploit this source of spatial and temporal variation in bilateral distance along land and sea routes to estimate the elasticity of irregular migration intentions for African and Near East countries. We estimate an elasticity of migration intentions to smuggling distances exceeding −3, mainly driven by countries with weak rule of law and high internet penetration. Our findings are consistent across irregular migration measures both at the aggregate and individual levels. We show that the irregular migration elasticity is higher for youth, relatively skilled individuals and those with an informational advantage (having a social network abroad or a mobile phone).
Nations are imposing unprecedented measures at a large scale to contain the spread of the COVID-19 pandemic. While recent studies show that non-pharmaceutical intervention measures such as lockdowns may have mitigated the spread of COVID-19, those measures also lead to substantial economic and social costs, and might limit exposure to ultraviolet-B radiation (UVB). Emerging observational evidence indicates the protective role of UVB and vitamin D in reducing the severity and mortality of COVID-19. This observational study empirically outlines the protective roles of lockdown and UVB exposure as measured by the ultraviolet index (UVI). Specifically, we examine whether the severity of lockdown is associated with a reduction in the protective role of UVB exposure. We use a log-linear fixed-effects model on a panel dataset of secondary data of 155 countries from 22 January 2020 until 7 October 2020 (n = 29,327). We use the cumulative number of COVID-19 deaths as the dependent variable and isolate the mitigating influence of lockdown severity on the association between UVI and growth rates of COVID-19 deaths from time-constant and time-varying country-specific potentially confounding factors. After controlling for these factors, we find that a unit increase in UVI and in lockdown severity are independently associated with declines of 0.85 percentage points (p.p.) and 4.7 p.p., respectively, in the growth rate of COVID-19 deaths, indicating their respective protective roles. The change of UVI over time is typically large (e.g., on average, the UVI in New York City increases by up to 6 units between January and June), indicating that the protective role of UVI might be substantial. However, the widely utilized and least severe lockdown (a governmental recommendation not to leave the house) is associated with a mitigation of the protective role of UVI by 81% (0.76 p.p.), which indicates a downside risk associated with its widespread use.
We find that lockdown severity and UVI are independently associated with a slowdown in the daily growth rates of cumulative COVID-19 deaths. However, we find evidence that an increase in lockdown severity is associated with a significant mitigation of the protective role of UVI in reducing COVID-19 deaths. Our results suggest that lockdowns in conjunction with adequate exposure to UVB radiation might have reduced the number of COVID-19 deaths even more strongly than lockdowns alone. For example, we estimate that there would have been 11% fewer deaths on average with sufficient UVB exposure during the period in which people were recommended not to leave their house. Therefore, our study outlines the importance of considering UVB exposure, especially while implementing lockdowns, and could inspire further clinical studies that may support policy decision-making in countries imposing such measures.
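The panel estimation described in this abstract absorbs country and calendar-day effects before regressing death growth rates on UVI and lockdown severity. The sketch below is a minimal illustration of that two-way fixed-effects logic on synthetic data, not the study's actual estimation code; all variable names and values are made up, and the sequential demeaning shortcut is exact only for balanced panels.

```python
import numpy as np

def twoway_fe(y, X, unit, time):
    """Two-way fixed-effects estimator by double demeaning.

    y: (n,) outcome (e.g. log growth of cumulative deaths),
    X: (n, k) regressors (e.g. UVI, lockdown severity, interaction),
    unit/time: integer group labels. Demeaning within units and then
    within periods absorbs unit and period fixed effects on a
    balanced panel before running OLS on the transformed data.
    """
    def demean(a, g):
        out = a.astype(float).copy()
        for lab in np.unique(g):
            m = g == lab
            out[m] -= out[m].mean(axis=0)
        return out

    y_t = demean(demean(y[:, None], unit), time).ravel()
    X_t = demean(demean(X, unit), time)
    beta, *_ = np.linalg.lstsq(X_t, y_t, rcond=None)
    return beta
```

With noiseless synthetic data the estimator recovers the true coefficients exactly, which is a quick sanity check that the demeaning removes both sets of fixed effects.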
Highlights
• Pathways for a circular economy towards the EU goals require policy support that, in turn, requires legitimacy.
• Legitimacy is often contested in the public discourse at all phases in the technological innovation system.
• Legitimacy remains poorly understood for ‘in-between’ technologies that struggle to move from the formative to the growth stage.
• The article explores legitimacy for chemical recycling primarily based on evidence from the UK, Germany, and Italy.
Abstract
The European Commission aims to increase the recycling of plastic packaging to 60% by 2025, requiring fundamental changes towards a more circular economy. Pathways for this transition require policy support that largely depends on their legitimacy in the public discourse. These normative aspects remain poorly understood for ‘in-between’ technologies, i.e., technologies that are no longer novel but struggle to move to the growth phase within the technological innovation system. Therefore, we ask: How do discourses shape technology legitimacy for in-between technologies? Drawing on the empirical example of chemical recycling, the analysis renders two principal findings. First, legitimising and delegitimising storylines present contesting views on in-between technologies regarding their technological aspects, environmental and social impacts, and economic and policy implications. Second, how discourses contribute to technology legitimacy depends on the actors and interests that drive the prevalent storylines in particular contexts.
Highlights
• Six Newton methods for solving matrix quadratic equations in linear DSGE models.
• Compared to QZ using 99 different DSGE models including Smets and Wouters (2007).
• Newton methods more accurate than QZ with comparable computation burden.
• Apt for refining solutions from alternative methods or nearby parameterizations.
Abstract
This paper presents and compares Newton-based methods from the applied mathematics literature for solving the matrix quadratic that underlies the recursive solution of linear DSGE models. The methods are compared using nearly 100 different models from the Macroeconomic Model Data Base (MMB) as well as different parameterizations of the monetary policy rule in the medium-scale New Keynesian model of Smets and Wouters (2007). We find that Newton-based methods compare favorably in solving DSGE models, providing higher accuracy as measured by the forward error of the solution at a comparable computational burden. The methods, however, suffer from their inability to guarantee convergence to a particular, e.g., the unique stable, solution, but their iterative procedures lend themselves to refining solutions obtained either from different methods or from nearby parameterizations.
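The matrix quadratic behind a linear DSGE solution has the form A X² + B X + C = 0, and a Newton step solves a generalized Sylvester equation in the update ΔX. The sketch below is an illustration of that iteration, not the paper's implementation: it vectorizes the step with Kronecker products, which is transparent but only practical for small systems.

```python
import numpy as np

def newton_matrix_quadratic(A, B, C, X0, tol=1e-12, max_iter=50):
    """Newton iteration for the matrix quadratic A X^2 + B X + C = 0.

    Each step solves the generalized Sylvester equation
        (A X_k + B) dX + A dX X_k = -(A X_k^2 + B X_k + C)
    by Kronecker vectorization: vec(M dX + A dX X_k) equals
    (I kron M + X_k^T kron A) vec(dX) with column-major vec.
    """
    n = A.shape[0]
    X = X0.copy()
    for _ in range(max_iter):
        R = A @ X @ X + B @ X + C          # current residual
        if np.linalg.norm(R) < tol:
            break
        M = A @ X + B
        K = np.kron(np.eye(n), M) + np.kron(X.T, A)
        dX = np.linalg.solve(K, -R.flatten(order="F"))
        X = X + dX.reshape((n, n), order="F")
    return X
```

Started near a root, the iteration converges quadratically; as the abstract notes, nothing in the scheme itself guarantees it reaches the unique stable solution, which is why it is best used to refine a solution from another method or a nearby parameterization.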
In a unifying framework generalizing established theories, we characterize under which conditions Joint Ownership of assets creates the best cooperation incentives in a partnership. We endogenise renegotiation costs and assume that they weakly increase with additional assets. A salient sufficient condition for optimal cooperation incentives among patient partners is that Joint Ownership is a Strict Coasian Institution, for which transaction costs impede an efficient asset reallocation after a breakdown. In contrast to Halonen (2002), the logic behind our results is that Joint Ownership maximizes the value of the relationship and the costs of renegotiating ownership after a broken relationship.
The hierarchical feature regression (HFR) is a novel graph-based regularized regression estimator, which mobilizes insights from the domains of machine learning and graph theory to estimate robust parameters for a linear regression. The estimator constructs a supervised feature graph that decomposes parameters along its edges, adjusting first for common variation and successively incorporating idiosyncratic patterns into the fitting process. The graph structure has the effect of shrinking parameters towards group targets, where the extent of shrinkage is governed by a hyperparameter, and group compositions as well as shrinkage targets are determined endogenously. The method offers rich resources for the visual exploration of the latent effect structure in the data, and demonstrates good predictive accuracy and versatility when compared to a panel of commonly used regularization techniques across a range of empirical and simulated regression tasks.
By computing a volatility index (CVX) from cryptocurrency option prices, we analyze this market’s expectation of future volatility. Our method addresses the challenging liquidity environment of this young asset class and allows us to extract stable market implied volatilities. Two alternative methods are considered to compute volatilities from granular intra-day cryptocurrency options data, which spans over the COVID-19 pandemic period. CVX data therefore capture ‘normal’ market dynamics as well as distress and recovery periods. The methods yield two cointegrated index series, where the corresponding error correction model can be used as an indicator for market implied tail-risk. Comparing our CVX to existing volatility benchmarks for traditional asset classes, such as VIX (equity) or GVX (gold), confirms that cryptocurrency volatility dynamics are often disconnected from traditional markets, yet, share common shocks.
Using a field study at a German brokerage, we investigate advised individual investors’ behavior and outcomes after self-selecting into a flat-fee scheme (percentage of portfolio value) for mutual funds. In a difference-in-differences setting, we compare 699 switchers to propensity-score-matched advisory clients who remained in the commission-based scheme. Switchers increase their portfolio values, improve portfolio diversification, and achieve better portfolio performance. They also demand more financial advice and follow more advisor recommendations. We argue that switchers attribute a higher quality to the unchanged advisory services.
We study the role mutual funds play in the recovery from fast intraday crashes based on data from the National Stock Exchange of India for a single large stock. During normal times, trading activity and liquidity provision by mutual funds is negligible compared to other traders at around 4% of overall activity. Nevertheless, for the two intraday market-wide crashes in our sample, price recovery took place only after mutual funds moved in. Market stability may require the presence of well-capitalized standby liquidity providers for recovery from fast crashes.
The recent COVID-19 pandemic represents an unprecedented worldwide event to study the influence of related news on the financial markets, especially during the early stage of the pandemic when information on the new threat came rapidly and was complex for investors to process. In this paper, we investigate whether the flow of news on COVID-19 had an impact on forming market expectations. We analyze 203,886 online articles dealing with COVID-19 and published on three news platforms (MarketWatch.com, NYTimes.com, and Reuters.com) in the period from January to June 2020. Using machine learning techniques, we extract the news sentiment through a financial market-adapted BERT model that recognizes the context of each word in a given item. Our results show that there is a statistically significant and positive relationship between sentiment scores and S&P 500 market returns. Furthermore, we provide evidence that sentiment components and news categories on NYTimes.com were differently related to market returns.
This paper explores entrepreneurs’ initially intended exit strategies and compares them to their final exit paths using an inductive approach that builds on the grounded theory methodology. Our data shows that initially intended and final exit strategies differ among entrepreneurs. Two groups of entrepreneurs emerged from our data. The first group comprises entrepreneurs who financed their firms through equity investors. The second group is made up of entrepreneurs who financed their businesses solely with their own equities. Our data shows that the first group originally intended a financial harvest exit strategy and settled with this harvest exit strategy. The second group initially intended a stewardship exit strategy but did not succeed. We used the theory of planned behavior and the behavioral agency model to analyze our data. By examining our results from these two theoretical perspectives, our study explains how entrepreneurs’ exit intentions lead to their actual exit strategies.
Product aesthetics is a powerful means for achieving competitive advantage. Yet most studies to date have focused on the role of aesthetics in shaping pre-purchase preferences and have failed to consider how product aesthetics affects post-purchase processes and consumers' usage behavior. This research focuses on the relationship between aesthetics and usage behavior in the context of durable products. Studies 1A to 1C provide evidence of a positive effect of product aesthetics on usage intensity using market data from the car and the fashion industries. Study 2 corroborates these findings and shows that the more intensive use of highly aesthetic products may lead to the acquisition of product-specific usage skills that form the basis for a cognitive lock-in. Hence, consumers are less likely to switch away from products with appealing designs, an effect that is labeled as the ‘aesthetic fidelity’ effect. Study 3 addresses an alternative explanation for the ‘aesthetic fidelity effect’ based on mood and motivation but finds that the ‘aesthetic fidelity’ effect is indeed determined by usage intensity. Finally, Study 4 identifies a boundary condition of the positive effect of product aesthetics on product usage, showing that it is limited to durable products. In sum, this research demonstrates that the effects of product aesthetics extend beyond the pre-consumption stage and have an enduring impact on people's consumption experiences.
This research examines the impact of online display advertising and paid search advertising relative to offline advertising on firm performance and firm value. Using proprietary data on annualized advertising expenditures for 1651 firms spanning seven years, we document that both display advertising and paid search advertising exhibit positive effects on firm performance (measured by sales) and firm value (measured by Tobin's q). Paid search advertising has a more positive effect on sales than offline advertising, consistent with paid search being closest to the actual purchase decision and having enhanced targeting abilities. Display advertising exhibits a relatively more positive effect on Tobin's q than offline advertising, consistent with its long-term effects. The findings suggest heterogeneous economic benefits across different types of advertising, with direct implications for managers in analyzing advertising effectiveness and external stakeholders in assessing firm performance.
Most event studies rely on cumulative abnormal returns, measured as percentage changes in stock prices, as their dependent variable. Stock price reflects the value of the operating business plus non-operating assets minus debt. Yet, many events, in particular in marketing, only influence the value of the operating business, but not non-operating assets and debt. For these cases, the authors argue that the cumulative abnormal return on the operating business, defined as the ratio between the cumulative abnormal return on stock price and the firm-specific leverage effect, is a more appropriate dependent variable. Ignoring the differences in firm-specific leverage effects inflates the impact of observations pertaining to firms with large debt and deflates those pertaining to firms with large non-operating assets. Observations of firms with high debt receive several times the weight attributed to firms with low debt. A simulation study and the reanalysis of three previously published marketing event studies show that ignoring the firm-specific leverage effects influences an event study's results in unpredictable ways.
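The ratio described in this abstract reduces to simple arithmetic once the balance-sheet identity is written out. The sketch below is an illustration of that logic under the assumption (mine, for this example) that equity value equals operating business plus non-operating assets minus debt, so the leverage effect is the ratio of operating business value to equity value; it is not the authors' code.

```python
def operating_car(car_stock, operating_value, nonoperating_assets, debt):
    """Convert a stock-price CAR into a CAR on the operating business.

    With equity E = OB + NOA - D, an event moving only the operating
    business changes equity by dOB, so the stock return equals
    (dOB / OB) * (OB / E). Dividing the stock CAR by the leverage
    effect OB / E therefore recovers the operating-business CAR.
    """
    equity = operating_value + nonoperating_assets - debt
    leverage_effect = operating_value / equity
    return car_stock / leverage_effect
```

For a hypothetical firm with an operating business worth 100, non-operating assets of 20, and debt of 60, equity is 60 and the leverage effect is 100/60, so a 5% stock-price CAR corresponds to a 3% operating-business CAR; a debt-free firm with idle cash would show the opposite deflation.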
This article uses information from two data sources, Compustat and Nexis Uni, and textual analysis to measure and validate the brand focus and customer focus of 109 U.S. listed retailers. The results from an analysis of their 853 earnings calls in 2010 and 2018 outline that on average, both foci increased over time. Although both foci vary substantially, brand focus varies more widely across retailers than their customer focus. Both foci are independent of each other. Specialty retailers have the highest brand focus, and internet & direct marketing retailers have the highest customer focus. A positive correlation exists between a retailer’s customer focus and its profitability, but not between a retailer’s brand focus and its profitability. The authors use the results to generate a research agenda that can direct future research in further systematically exploring firms’ brand and customer focus.
Knowledge of consumers' willingness to pay (WTP) is a prerequisite to profitable price-setting. To gauge consumers' WTP, practitioners often rely on a direct single question approach in which consumers are asked to explicitly state their WTP for a product. Despite its popularity among practitioners, this approach has been found to suffer from hypothetical bias. In this paper, we propose a rigorous method that improves the accuracy of the direct single question approach. Specifically, we systematically assess the hypothetical biases associated with the direct single question approach and explore ways to de-bias it. Our results show that by using the de-biasing procedures we propose, we can generate a de-biased direct single question approach that is accurate enough to be useful for managerial decision-making. We validate this approach with two studies in this paper.
In recent years, European regulators have debated restricting the time an online tracker can track a user to better protect consumer privacy. Despite the significance of these debates, there has been a noticeable absence of any comprehensive cost-benefit analysis. This article fills this gap on the cost side by suggesting an approach to estimate the economic consequences of lifetime restrictions on cookies for publishers. The empirical study on cookies of 54,127 users who received ∼128 million ad impressions over ∼2.5 years yields an average cookie lifetime of 279 days, with an average value of €2.52 per cookie. Only ∼13 % of all cookies increase their daily value over time, but their average value is about four times larger than the average value of all cookies. Restricting cookies’ lifetime to one year (two years) could potentially decrease their lifetime value by ∼25 % (∼19 %), which represents a potential decrease in the value of all cookies of ∼9 % (∼5 %). Most cookies, however, would not be affected by lifetime restrictions of 12 or 24 months, as 72 % (85 %) of the users delete their cookies within 12 (24) months. In light of the €10.60 billion cookie-based display ad revenue in Europe, such restrictions would endanger €904 million (€576 million) annually, equivalent to €2.08 (€1.33) per EU internet user. The article discusses the marketing strategy challenges and opportunities these results imply for advertisers and publishers.
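The accounting behind a lifetime cap is straightforward: whatever value a cookie would have accrued beyond the cap is lost. The sketch below is a stylized illustration of that calculation only; the article estimates cookie values from actual ad-impression data, whereas the daily values here are invented.

```python
def value_lost_to_cap(daily_values, cap_days):
    """Share of a cookie's lifetime value lost if it is forcibly
    expired after cap_days.

    daily_values: the value the cookie generates on each day of its
    (uncapped) life, in chronological order.
    """
    total = sum(daily_values)
    kept = sum(daily_values[:cap_days])
    return 1 - kept / total
```

A cookie that would have lived 400 days generating a flat €1 per day loses 35/400 = 8.75 % of its value under a 365-day cap; a cookie whose daily value grows over time loses disproportionately more, which mirrors the article's finding that the ∼13 % of appreciating cookies drive the aggregate cost.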
Even as online advertising continues to grow, a central question remains: Who to target? Yet, advertisers know little about how to select from the hundreds of audience segments for targeting (and combinations thereof) for a profitable online advertising campaign. Utilizing insights from a field experiment on Facebook (Study 1), we develop a model that helps advertisers solve the cold-start problem of selecting audience segments for targeting. Our model enables advertisers to calculate the break-even performance of an audience segment to make a targeted ad campaign at least as profitable as an untargeted one. Advertisers can use this novel model to decide whether to test specific audience segments in their campaigns (e.g., in randomized controlled trials). We apply our model to data from the Spotify ad platform to study the profitability of different audience segments (Study 2). Approximately half of those audience segments require the click-through rate to double compared to an untargeted campaign, which is unrealistically high for most ad campaigns. Our model also shows that narrow segments require a lift that is likely not attainable, specifically when the data quality of these segments is poor. We confirm this theoretical finding in an empirical study (Study 3): A decrease in data quality due to Apple’s introduction of the App Tracking Transparency (ATT) framework more negatively affects the click-through rate of narrow (versus broad) audience segments.
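The break-even idea in this abstract can be made concrete with a deliberately simplified profit model. The sketch below is not the authors' model; it assumes (my simplification) that profit per impression equals CTR times value per click minus the CPM cost, with targeted impressions priced at a premium.

```python
def break_even_ctr(ctr_untargeted, value_per_click,
                   cpm_untargeted, cpm_targeted):
    """Click-through rate a targeted campaign must reach so that its
    profit per impression matches an untargeted campaign's, under
    profit = ctr * value_per_click - cpm / 1000.
    """
    profit_untargeted = ctr_untargeted * value_per_click - cpm_untargeted / 1000
    return (profit_untargeted + cpm_targeted / 1000) / value_per_click
```

With a hypothetical 1 % untargeted CTR, €2 value per click, a €5 untargeted CPM, and a €15 targeting premium CPM, the targeted segment must reach a 1.5 % CTR — a 50 % lift — just to break even, which illustrates why the required lift for narrow, expensive segments can be unrealistically high.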
Small businesses face major challenges to becoming more innovative. These challenges are particularly prevalent in emerging economies where high uncertainties are a barrier to innovation. We know from previous studies that linkages to universities, on the one hand, and public procurement, on the other, support large and innovative firms in their efforts to become more innovative. However, we do not know whether these positive effects also hold true for small businesses. In this paper, we focus on how policy strategies reducing information, market and financial uncertainties shape small businesses’ innovation in China. Based on a sample of 926 small businesses derived from the World Bank Enterprises Survey in China (2012), we find that university-industry linkages enhance innovation, though only when it comes to minor forms of innovation. In line with the resource-based view of the firm, this effect is stronger for small businesses with higher capabilities. Moreover, we show that bidding for or delivering contracts to public sector clients has a positive effect on innovation, and in particular of major forms of innovation. In the bidding selection process, private firms and firms with higher capabilities are selected. Our findings show that both policy strategies have enhanced innovation, though with different effects on the degree of novelty. We attribute this finding to the different degrees of uncertainties they address.
In this article, we examine anti-refugee hate crime in the wake of the large influx of refugees to Germany in 2014 and 2015. By exploiting institutional features of the assignment of refugees to German regions, we estimate the impact of unexpected and sudden large-scale immigration on hate crime against refugees. Results indicate that it is not simply the size of local refugee inflows which drives the increase in hate crime, but rather the combination of refugee arrivals and latent anti-refugee sentiment. We show that ethnically homogeneous areas, areas which experienced hate crimes in the 1990s, and areas with high support for the Nazi party in the Weimar Republic, are more prone to respond to the arrival of refugees with incidents of hate crime against this group. Our results highlight the importance of regional anti-immigration sentiment in the analysis of the incumbent population’s reaction to immigration.
Vulnerability comes, according to Orio Giarini, with two risks: human-made risks, also called entrepreneurial risks, and natural or pure risks such as accidents and earthquakes. Both types of risk are growing in dimension and are increasingly interrelated. To control this vulnerability, sophisticated insurance products are called for. Here, mutual insurance is relevant, in particular when risks are large, probabilities uncertain or unknown, and events interrelated or correlated. In this paper, three examples are discussed to show the advantages of mutual insurance: unknown probabilities connected with unforeseeable events, correlated risks, and macroeconomic or demographic risks.
We estimate the causal effect of shared e-scooter services on traffic accidents by exploiting the variation in the availability of e-scooter services induced by the staggered rollout across 93 cities in six countries. Police-reported accidents involving personal injuries in the average month increased by around 8.2% after shared e-scooters were introduced. Effects are large during summer and insignificant during winter. Further heterogeneity analysis reveals the largest estimated effects for cities with limited cycling infrastructure, while no effects are detectable in cities with high bike-lane density. This difference suggests that public policy can play a crucial role in mitigating accidents related to e-scooters and, more generally, to changes in urban mobility.
This paper proposes tests for out-of-sample comparisons of interval forecasts based on parametric conditional quantile models. The tests rank the distance between actual and nominal conditional coverage with respect to the set of conditioning variables from all models, for a given loss function. We propose a pairwise test to compare two models for a single predictive interval. The set-up is then extended to a comparison across multiple models and/or intervals. The limiting distribution varies depending on whether models are strictly non-nested or overlapping. In the latter case, degeneracy may occur. We establish the asymptotic validity of wild bootstrap based critical values across all cases. An empirical application to Growth-at-Risk (GaR) uncovers situations in which a richer set of financial indicators is found to outperform a commonly-used benchmark model when predicting downside risk to economic activity.
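The wild bootstrap mentioned above perturbs residuals with random signs and recomputes the test statistic to approximate its null distribution. The sketch below shows only the generic Rademacher mechanics on a toy statistic; it is not the paper's test, whose statistic and validity conditions are specific to its interval-forecast setting.

```python
import numpy as np

def wild_bootstrap_critical_value(resid, stat_fn, B=999, alpha=0.05, seed=0):
    """Rademacher wild bootstrap: flip residual signs at random,
    recompute the statistic, and take the (1 - alpha) quantile of
    the bootstrap draws as the critical value."""
    rng = np.random.default_rng(seed)
    draws = []
    for _ in range(B):
        signs = rng.choice([-1.0, 1.0], size=resid.shape)
        draws.append(stat_fn(resid * signs))
    return np.quantile(draws, 1 - alpha)
```

Because the sign flips preserve each residual's magnitude, the scheme is robust to heteroskedasticity, which is one reason wild bootstrap critical values remain valid in the overlapping-models case where the limiting distribution may degenerate.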
This study explores the implications of rising markups for optimal Mirrleesian income and profit taxation. Using a stylized model with two individuals, the main forces shaping welfare-optimal policies are analytically characterized. Although a higher profit tax has redistributive benefits, it adversely affects market competition, leading to a greater equilibrium cost-of-living. Rising markups directly contribute to a decline in optimal marginal taxes on labor income. The optimal policy response to higher markups includes increasingly relying on the profit tax to fund redistribution. Declining optimal marginal income taxes assists the redistributive function of the profit tax by contributing to the expansion of the profit tax base. This response alone considerably increases the equilibrium cost-of-living. Nevertheless, a majority of the individuals become better off with the optimal policy. If it is not possible to tax profits optimally, due, for example, to profit shifting, increasing redistribution via income taxes is not optimal; every individual is worse off relative to the scenario with optimal profit taxation.
The debate on monetary and fiscal policy is heavily influenced by estimates of the equilibrium real interest rate. In particular, this concerns estimates derived from a simple aggregate demand and Phillips curve model with time-varying components as proposed by Laubach and Williams (2003). For example, Summers (2014a) refers to these estimates as important evidence for a secular stagnation and the need for fiscal stimulus. Yellen (2015, 2017) has made use of such estimates in order to explain and justify why the Federal Reserve has held interest rates so low for so long. First, we re-estimate the United States equilibrium rate with the methodology of Laubach and Williams (2003). Second, we build on their approach and an alternative specification to provide new estimates for the United States, Germany, the euro area and Japan. Third, we subject these estimates to a battery of sensitivity tests. Due to the great uncertainty and sensitivity that accompany these equilibrium rate estimates, the observed decline in the estimates is not a reliable indicator of a need for expansionary monetary and fiscal policy. Yet, if these estimates are employed to determine the appropriate monetary policy stance, such estimates are better used together with the consistent estimate of the level of potential output.
While the COVID-19 pandemic had a large and asymmetric impact on firms, many countries quickly enacted massive business rescue programs which are specifically targeted to smaller firms. Little is known about the effects of such policies on business entry and exit, investment, factor reallocation, and macroeconomic outcomes. This paper builds a general equilibrium model with heterogeneous and financially constrained firms in order to evaluate the short- and long-term consequences of small firm rescue programs in a pandemic recession. We calibrate the stationary equilibrium and the pandemic shock to the U.S. economy, taking into account the factual Paycheck Protection Program (PPP) as a specific policy. We find that the policy has only a modest impact on aggregate output and employment because (i) jobs are saved predominately in the smallest firms that account for a minor share of employment and (ii) the grant reduces the reallocation of resources towards larger and less impacted firms. Much of the reallocation effects occur in the aftermath of the pandemic episode. By preventing inefficient liquidations, the policy dampens the long-term declines of aggregate consumption and of the real wage, thus delivering small welfare gains.
Nowadays, digitalization has an immense impact on the landscape of jobs. This technological revolution creates new industries and professions, promises greater efficiency and improves the quality of working life. However, emerging technologies such as robotics and artificial intelligence (AI) are reducing human intervention, thus advancing automation and eliminating thousands of jobs and entire occupational profiles. To prepare employees for the changing demands of work, adequate and timely training of the workforce and real-time support of workers in new positions are necessary. Therefore, it is investigated whether user-oriented technologies, such as augmented reality (AR) and virtual reality (VR), can be applied “on-the-job” for such training and support—also known as intelligence augmentation (IA). To address this problem, this work synthesizes results of a systematic literature review as well as a practically oriented search on augmented reality and virtual reality use cases within the IA context. A total of 150 papers and use cases are analyzed to identify suitable areas of application in which it is possible to enhance employees' capabilities. The results of both the theoretical and practical work show that VR is primarily used to train employees without prior knowledge, whereas AR is used to expand the scope of competence of individuals in their field of expertise while on the job. Based on these results, a framework is derived which provides practitioners with guidelines as to how AR or VR can support workers at their job so that they can keep up with anticipated skill demands. Furthermore, it shows for which application areas AR or VR can provide workers with sufficient training to learn new job tasks. In doing so, this research provides practical recommendations for accompanying the imminent disruptions caused by AI and similar technologies and for alleviating the associated negative effects on the German labor market.
Goal setting is vital in learning sciences, but the scientific evaluation of optimal learning goals is underexplored. This study proposes a novel methodological approach to determine optimal learning goals. The data in this study comes from a gamified learning app implemented in an undergraduate accounting course at a large German university. With a combination of decision trees and regression analyses, the goals connected to the badges implemented in the app are evaluated. The results show that the initial badge set already motivated learning strategies that led to better grades on the exam. However, the results indicate that the levels of the goals could be improved, and additional badges could be implemented. In addition to new goal levels, new goal types are also discussed. The findings show that learning goals initially determined by the instructors need to be evaluated to offer an optimal motivational effect. The new methodological approach used in this study can be easily transferred to other learning data sets to provide further insights.
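The decision-tree part of the methodology above amounts to finding activity thresholds that best separate learning outcomes, which then suggest candidate badge goals. The sketch below is a one-split "decision stump" on invented data, a minimal stand-in for the study's combination of decision trees and regression analyses, not its actual pipeline.

```python
import numpy as np

def best_goal_threshold(activity, grade):
    """One-split regression tree ('decision stump'): find the activity
    level that best separates exam grades by minimizing the summed
    squared error of the two resulting groups. The split point is a
    candidate badge goal."""
    order = np.argsort(activity)
    a, g = activity[order], grade[order]
    best_t, best_sse = None, np.inf
    for i in range(1, len(a)):
        left, right = g[:i], g[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_t = sse, (a[i - 1] + a[i]) / 2
    return best_t
```

On synthetic data where grades jump once activity reaches 60 units, the stump recovers a threshold of 59.5; in the study's setting such a recovered threshold would be compared against the goal level the instructors originally set for the badge.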
Life insurers use accounting and actuarial techniques to smooth reporting of firm assets and liabilities, seeking to transfer surpluses in good years to cover benefit payouts in bad years. Yet these techniques have been criticized as they make it difficult to assess insurers’ true financial status. We develop stylized and realistically-calibrated models of a participating life annuity, an insurance product that pays retirees guaranteed lifelong benefits along with variable non-guaranteed surplus. Our goal is to illustrate how accounting and actuarial techniques for this type of financial contract shape policyholder wellbeing, along with insurer profitability and stability. Smoothing adds value to both the annuitant and the insurer, so curtailing smoothing could undermine the market for long-term retirement payout products.
We investigate how financial literacy shapes older Americans’ demand for financial advice. Using an experimental module fielded in the Health and Retirement Study, we show that financial literacy strongly improves the quality, but not the quantity, of financial advice sought. In particular, more financially literate people seek financial help from professionals. This effect is more pronounced among older people and those with more wealth and more complex financial positions. Our results imply that financial literacy and financial advisory services are complements, rather than substitutes, for each other.
This paper examines heterogeneity in time discounting among a representative sample of elderly Americans, as well as its role in explaining key economic behaviors at older ages. We show how older Americans evaluate simple (hypothetical) inter-temporal choices in which payments today are compared with payments in the future. Using the indicators derived from this measure, we then demonstrate that differences in discounting patterns are associated with characteristics of particular importance in elderly populations. For example, cognitive deficits are associated with greater impatience, whereas bequest motives are associated with less impatience. We then relate our discounting measure to key economic outcomes and find that impatience is associated with lower wealth, fewer investments in health, and less planning for end of life care.
The US Treasury recently permitted deferred longevity income annuities to be included in pension plan menus as a default payout solution, yet little research has investigated whether more people should convert some of the $18 trillion they hold in employer-based defined contribution plans into lifelong income streams. We investigate this innovation using a calibrated lifecycle consumption and portfolio choice model embodying realistic institutional considerations. Our welfare analysis shows that defaulting a modest portion of retirees’ 401(k) assets (over a threshold) into such annuities is an attractive way to enhance retirement security, raising welfare by up to 20% of retiree plan accruals.
Do required minimum distribution 401(k) rules matter, and for whom? Insights from a lifecycle model
(2023)
Tax-qualified vehicles have helped U.S. private-sector workers accumulate $33 trillion in retirement plans. An important but often overlooked institutional feature shaping decumulations from these plans is the “Required Minimum Distribution” (RMD) regulation, which requires retirees to withdraw a minimum fraction from their retirement accounts or pay excise taxes on withdrawal shortfalls. Our calibrated lifecycle model measures the impact of RMD rules on heterogeneous households’ financial behavior during their working lives and in retirement. The model shows that reforms delaying or eliminating the RMD rules have little effect on consumption profiles, but they would influence withdrawals and tax payments for households with bequest motives.
Artificial Intelligence (AI) and Machine Learning (ML) are currently hot topics in industry and business practice, while management-oriented research disciplines seem reluctant to adopt these sophisticated data analytics methods as research instruments. Even the Information Systems (IS) discipline, with its close connections to Computer Science, seems conservative when conducting empirical research endeavors. To assess the magnitude of the problem and to understand its causes, we conducted a bibliographic review of publications in high-level IS journals. We reviewed 1,838 articles that matched corresponding keyword queries in journals from the AIS senior scholar basket, Electronic Markets and Decision Support Systems (Ranked B). In addition, we conducted a survey among IS researchers (N = 110). Based on the findings from our sample, we evaluate different potential causes that could explain why ML methods are rather underrepresented in top-tier journals and discuss how the IS discipline could successfully incorporate ML methods in research undertakings.
Tail-correlation matrices are an important tool for aggregating risk measurements across risk categories, asset classes and/or business segments. This paper demonstrates that traditional tail-correlation matrices—which are conventionally assumed to have ones on the diagonal—can lead to substantial biases of the aggregate risk measurement’s sensitivities with respect to risk exposures. Due to these biases, decision-makers receive an odd view of the effects of portfolio changes and may be unable to identify the optimal portfolio from a risk-return perspective. To overcome these issues, we introduce the “sensitivity-implied tail-correlation matrix”. The proposed tail-correlation matrix allows for a simple deterministic risk aggregation approach which reasonably approximates the true aggregate risk measurement according to the complete multivariate risk distribution. Numerical examples demonstrate that our approach is a better basis for portfolio optimization than the Value-at-Risk implied tail-correlation matrix, especially if the calibration portfolio (or current portfolio) deviates from the optimal portfolio.
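For context, the deterministic aggregation that a tail-correlation matrix enables can be sketched as follows (a minimal illustration with made-up numbers; the paper's sensitivity-implied matrix replaces the entries of C, including the conventional ones on the diagonal):

```python
import math

def aggregate_risk(stand_alone, corr):
    """Square-root aggregation R = sqrt(s' C s) of stand-alone risk
    measurements s via a (tail-)correlation matrix C.  The function
    accepts arbitrary diagonal entries, so it also covers matrices
    that do not have ones on the diagonal."""
    n = len(stand_alone)
    return math.sqrt(sum(stand_alone[i] * corr[i][j] * stand_alone[j]
                         for i in range(n) for j in range(n)))

# Two risk categories with stand-alone risk measurements 100 and 50
s = [100.0, 50.0]
C = [[1.0, 0.25],
     [0.25, 1.0]]
print(aggregate_risk(s, C))  # ≈ 122.47, below 150 due to diversification
```

The sensitivity issue the paper addresses concerns how this aggregate changes as the exposures in `s` are varied, e.g. when searching for a risk-return optimal portfolio.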
We empirically examine how systemic risk in the banking sector leads to correlated risk in office markets of global financial centers. In so doing, we compute an aggregated measure of systemic risk in financial centers as the cumulated expected capital shortfall of local financial institutions. Our identification strategy is based on a double counterfactual approach by comparing normal with financial distress periods as well as office with retail markets. We find that office market interconnectedness arises from systemic risk during financial turmoil periods. Office market performance in a financial center is affected by returns of systemically linked financial center office markets only during a systemic banking crisis. In contrast, there is no evidence of correlated risk during normal times and among the within-city counterfactual retail sector. The decline in office market returns during a banking crisis is larger in financial centers compared to non-financial centers.
Having a gatekeeper position in a collaborative network offers firms great potential to gain competitive advantages. However, it is not well understood what kind of collaborations are associated with such a position. Conceptually grounded in social network theory, this study draws on the resource-based view and the relational factors view to investigate which types of collaboration characterize firms that are in a gatekeeper position, which ultimately could improve firm performance in subsequent periods. The empirical analysis utilizes a unique longitudinal data set to examine dynamic network formation. We used a data crawling approach to reconstruct collaboration networks among the 500 largest companies in Germany over nine years and matched these networks with performance data. The results indicate that firms in gatekeeper positions often engage in medium-intensity collaborations and are less likely to engage in weak-intensity collaborations. Strong-intensity collaborations are not related to the likelihood of being a gatekeeper. Our study further reveals that a firm's knowledge base is an important moderator and that this knowledge base can increase the benefits of having a gatekeeper position in terms of firm performance.
Questionable research practices have generated considerable recent interest throughout and beyond the scientific community. We subsume such practices involving secret data snooping that influences subsequent statistical inference under the term MESSing (manipulating evidence subject to snooping) and discuss, illustrate and quantify the possibly dramatic effects of several forms of MESSing using an empirical and a simple theoretical example. The empirical example uses numbers from the most popular German lottery, which seem to suggest that 13 is an unlucky number.
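A minimal Monte Carlo sketch of why snooping distorts inference (synthetic noise, not the lottery data): selecting the most extreme of many test statistics after seeing the data makes "significant" findings near-certain even when nothing is there.

```python
import random

def snooped_rejection_rate(n_tests=49, n_sims=2000, z_crit=1.96):
    """Data are pure noise, yet picking the most extreme of n_tests
    statistics *after* looking at the data ('snooping') rejects far
    more often than the nominal 5% level of a single pre-specified test."""
    hits = 0
    for _ in range(n_sims):
        stats = [random.gauss(0, 1) for _ in range(n_tests)]
        if max(abs(s) for s in stats) > z_crit:  # test chosen post hoc
            hits += 1
    return hits / n_sims

random.seed(42)
# ≈ 0.92 (theoretically 1 - 0.95**49), far above the nominal 0.05
print(snooped_rejection_rate())
```

With 49 lottery numbers to snoop over, some number will almost always look conspicuously "unlucky" by chance alone.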
This paper analyzes the scope of the private market for pandemic insurance. We develop a framework that explains theoretically how the equilibrium price of pandemic insurance depends on accumulation risk, the covariance between pandemic claims and other claims, and the covariance between pandemic claims and stock market performance. Using the natural catastrophe (NatCat) insurance market as a laboratory, we estimate the relationship between the insurance price markup and the tail characteristics of the loss distribution. Then, using high-frequency data tracking the economic impact of the COVID-19 pandemic in the United States, we calibrate the loss distribution of a hypothetical insurance contract designed to alleviate the impact of the pandemic on small businesses. The pandemic insurance contract price markup corresponds to the top 20% of markups observed in the NatCat insurance market. Finally, we analyze an intertemporal risk-sharing scheme that can reduce the expected shortfall of the loss distribution by 50%.
Data is considered the new oil of the economy, but privacy concerns limit its use, leading to a widespread sense that data analytics and privacy are contradictory. Yet such a view is too narrow, because firms can implement a wide range of methods that satisfy different degrees of privacy and still enable them to leverage varied data analytics methods. Therefore, the current study specifies different functions related to data analytics and privacy (i.e., data collection, storage, verification, analytics, and dissemination of insights), compares how these functions might be performed at different levels (consumer, intermediary, and firm), outlines how well different analytics methods address consumer privacy, and draws several conclusions, along with future research directions.
The present study investigates the moderating effect of usage intensity of the social networking site (SNS) Instagram (IG) on the influence of advertisement disclosure types on advertising performance. A national sample (N = 566) participated in a randomized online experiment including a real influencer and followers in order to investigate how different advertisement disclosure types affect advertising performance and how usage intensity moderates this effect. We find that disclosing an influencer’s postings with “#ad” increases the trustworthiness of the influencer and the general credibility of the posting for heavy users, but not for light users. Followership has been found to strongly improve all researched variables (attitude toward product placement, trustworthiness of the spokesperson, and general credibility of the posting). This study adds to the literature the first distinction between heavy and light usage intensity, and between followers and non-followers of an IG user, when examining the effects of advertisement disclosure types on advertising performance. To conclude, we present a number of recommendations regarding how advertisers, influencers, and SNS providers should develop strategies for monitoring, understanding, and responding to different social media users, e.g., to closely monitor an influencer’s audience to identify heavy users and optimally target them.
The current economic landscape is complex and globalized, and it imposes on individuals the responsibility for their own financial security. This situation has been intensified by the COVID-19 crisis, since short-time work and layoffs significantly limit the availability of financial resources for individuals. Due to the long duration of the lockdown, these challenges will have a long-term impact and affect the financial well-being of many citizens. Moreover, it can be assumed that the consequences of this crisis will once again particularly affect groups of people who have already frequently been identified as having low financial literacy. Financial literacy is therefore an important target for educational measures and interventions. However, it cannot be considered in isolation but must take into account the many potential factors that influence financial literacy alone or in combination. These include personality traits and socio-demographic factors as well as the (in)ability to defer gratification. Against this background, individualized support can be offered. With this in mind, in the first step of this study, we analyze the complex interaction of personality traits, socio-demographic factors, the (in)ability to delay gratification, and financial literacy. In the second step, we differentiate the identified effects with regard to different groups to identify moderating effects, which, in turn, allow conclusions to be drawn about the need for individualized interventions. The results show that gender and educational background moderate the effects occurring between self-reported financial literacy, financial learning opportunities, delay of gratification, and financial literacy.
A person's intelligence level positively influences his or her professional success. Gifted and highly intelligent individuals should therefore be successful in their careers. However, previous findings on the occupational situation of gifted adults stem mainly from popular-science sources in the fields of coaching and self-help groups and confirm prevailing stereotypes that gifted people have difficulties at work. Reliable studies are scarce. This systematic literature review examines 40 studies covering a total of 22 job-related variables. Results are shown in general for (a) the employment situation and more specifically for the occupational aspects of (b) career, (c) personality and behavior, (d) satisfaction, (e) organization, and (f) the influence of giftedness on the profession. Moreover, possible differences between female and male gifted individuals and between gifted and non-gifted individuals are analyzed. Based on these findings, implications for practice as well as further research are discussed.
The importance of agile methods has increased in recent years, not only to manage IT projects but also to establish flexible and adaptive organisational structures, which are essential to deal with disruptive changes and build successful digital business strategies. This paper takes an industry-specific perspective by analysing the dissemination, objectives and relative popularity of agile frameworks in the German banking sector. The data provides insights into expectations and experiences associated with agile methods and indicates possible implementation hurdles and success factors. Our research provides the first comprehensive analysis of agile methods in the German banking sector. The comparison with a selected number of fintechs has revealed some differences between banks and fintechs. We found that almost all banks and fintechs apply agile methods in IT projects. However, fintechs have relatively more experience with agile methods than banks and use them more intensively. Scrum is the most relevant framework used in practice. Scaled agile frameworks are so far negligible in the German banking sector. Acceleration of projects is apparently the most important objective of deploying agile methods. In addition, agile methods can contribute to cost savings and lead to improved quality and innovation performance, though for banks it is evidently more challenging to reach their respective targets than for fintechs. Overall our findings suggest that German banks are still in a maturing process of becoming more agile and that there is room for an accelerated adoption of agile methods in general and scaled agile frameworks in particular.
This paper examines rent sharing in private investments in public equity (PIPEs) between newly public firms and private investors. The evidence suggests highly asymmetric rent sharing. Newly public firms earn a negative return of up to −15% in the first post-PIPE year, while investors benefit due to the ability to dictate transaction terms. The results are economically relevant because newly public firms are, at least in recent years, more likely to tap private rather than public markets for follow-on financing shortly after the initial public offering (IPO), and because the results for newly public firms contrast with those for the broad PIPE market in Lim et al. (2021). The study also contributes to the PIPE literature by offering an integrative view of competing theories of the cross-section of post-PIPE stock returns. We simultaneously test proxies for corporate governance, asymmetric information, bargaining power, and managerial entrenchment. While all explanations have univariate predictive power for the post-PIPE performance, only the proxies for corporate governance and asymmetric information are robust in ceteris-paribus tests.
We use census data to show that structural transformation reflects a fundamental reallocation of labour from goods to services, instead of a relabelling that occurs when goods-producing firms outsource their in-house service production. The novelty of our approach is that it categorizes labour by occupations, which are invariant to outsourcing. We find that the reallocation of labour from goods-producing to service-producing occupations is a robust feature in censuses from around the world and different time periods. To understand the underlying forces, we propose a tractable model in which uneven occupation-specific technological change generates structural transformation of occupation employment.
We propose a novel approach to the study of international trade based on a theory of country integration that embodies a broad systemic viewpoint on the relationship between trade and growth. Our model leads to an indicator of country openness that measures a country's level of integration through the full architecture of its connections in the trade network. We apply our methodology to a sample of 204 countries and find a sizable and significant positive relationship between our integration measure and a country's growth rate, while that of the traditional measures of outward orientation is only minor and statistically insignificant.
This paper defends The Transformation of Values into Prices on the Basis of Random Systems, published in EIER, by answering the Comments made in the same journal by Professors Mori, Morioka and Yamazaki. The clarifications mainly concern the justification of the randomness assumptions, the conditions needed to obtain the equality of total profit with total surplus value in the simplified one-industry system, and the invariance of the results to changes in the units of measurement.
Sample-based longitudinal discrete choice experiments: preferences for electric vehicles over time
(2021)
Discrete choice experiments have emerged as the state-of-the-art method for measuring preferences, but they are mostly used in cross-sectional studies. In seeking to make them applicable for longitudinal studies, our study addresses two common challenges: working with different respondents and handling altering attributes. We propose a sample-based longitudinal discrete choice experiment in combination with a covariate-extended hierarchical Bayes logit estimator that allows one to test the statistical significance of changes. We showcase this method’s use in studies about preferences for electric vehicles over six years and empirically observe that preferences develop in an unpredictable, non-monotonous way. We also find that inspecting only the absolute differences in preferences between samples may result in misleading inferences. Moreover, surveying a new sample produced similar results as asking the same sample of respondents over time. Finally, we experimentally test how adding or removing an attribute affects preferences for the other attributes.
We designed and implemented an experimental module in the 2014 Health and Retirement Study to measure older persons' willingness to defer claiming of Social Security benefits. Under the current system’s status quo, where delayed claiming boosts eventual benefits, we show that 46% of the respondents would delay claiming and work longer. If respondents were offered an actuarially fair lump sum payment instead of higher lifelong benefits, about 56% indicate they would delay claiming. Without a work requirement, the average amount needed to induce delayed claiming is only $60,400, while when part-time work is stipulated, the amount is slightly higher, at $66,700. This small difference implies a low utility value of leisure foregone, of under 20% of average household income.
The modern tontine: an innovative instrument for longevity risk management in an aging society
(2020)
We investigate whether a historical pension concept, the tontine, yields enough innovative potential to extend and improve the prevailing privately funded pension solutions in a modern way. The tontine basically generates an age-increasing cash flow, which can help to match the increasing financing needs at old ages. In contrast to traditional pension products, however, the tontine generates volatile cash flows, which means that the insurance character of the tontine cannot be guaranteed in every situation. By employing Multi Cumulative Prospect Theory (MCPT), we answer the question to what extent tontines can be a complement to or a substitute for traditional annuities. We find that it is only optimal to invest in tontines for a certain range of initial wealth. In addition, we investigate to what extent the tontine size, the volatility of individual liquidity needs and expected mortality rates contribute to the demand for tontines.
Crowdfunding platforms offer project initiators the opportunity to acquire funds from the Internet crowd and, therefore, have become a valuable alternative to traditional sources of funding. However, some processes on crowdfunding platforms cause undesirable external effects that influence the funding success of projects. In this context, we focus on the phenomenon of project overfunding. Massively overfunded projects have been discussed to overshadow other crowdfunding projects which in turn receive less funding. We propose a funding redistribution mechanism to internalize these overfunding externalities and to improve overall funding results. To evaluate this concept, we develop and deploy an agent-based model (ABM). This ABM is based on a multi-attribute decision-making approach and is suitable to simulate the dynamic funding processes on a crowdfunding platform. Our evaluation provides evidence that possible modifications of the crowdfunding mechanisms bear the chance to optimize funding results and to alleviate existing flaws.
Correction to: Computational Economics https://doi.org/10.1007/s10614-020-10061-x
The original publication has been updated. In the original publication of this article, under the Introduction heading section, the corrections to the second paragraph’s inline equation were not incorporated. The author’s additional corrections have also been incorporated. The publisher apologizes for the error made during production.
India has recorded 142,186 deaths over 36 administrative regions, placing India third in the world after the US and Brazil for COVID-19 deaths as of 12 December 2020. Studies indicate that the south-west monsoon season plays a role in the dynamics of contagious diseases, which tend to peak post-monsoon season. Recent studies show that vitamin D and its primary source, Ultraviolet-B (UVB) radiation, may play a protective role in mitigating COVID-19 deaths. However, the combined roles of the monsoon season and UVB radiation in COVID-19 in India remain unclear. In this observational study, we empirically study the respective roles of the monsoon season and UVB radiation, while further exploring whether the monsoon season negatively impacts the protective role of UVB radiation in COVID-19 deaths in India. We apply a log-linear Mundlak model to a panel dataset of 36 administrative regions in India from 14 March 2020 to 19 November 2020 (n = 6751). We use cumulative COVID-19 deaths as the dependent variable. We isolate the association of the monsoon season and UVB radiation, as measured by the Ultraviolet Index (UVI), from other confounding time-constant and time-varying region-specific factors. After controlling for various confounding factors, we observe that a unit increase in UVI and the monsoon season are separately associated with declines of 1.2 percentage points and 7.5 percentage points, respectively, in the long-run growth rates of COVID-19 deaths. These associations translate into substantial relative changes. For example, a permanent unit increase of UVI is associated with a decrease in the growth rate of COVID-19 deaths of 33% (−1.2 percentage points). However, the monsoon season mitigates the protective role of UVI by 77% (0.92 percentage points). Our results indicate a protective role of UVB radiation in mitigating COVID-19 deaths in India. Furthermore, we find evidence that the monsoon season is associated with a significant reduction in the protective role of UVB radiation.
Our study outlines the roles of the monsoon season and UVB radiation in COVID-19 in India and supports health-related policy decision making in India.
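The reported absolute (percentage-point) and relative effects can be reconciled with simple arithmetic. A minimal sketch; note that the baseline growth rate below is a hypothetical value implied by the reported −33%, not a figure taken from the study:

```python
# Reported long-run effects (percentage points, from the abstract)
uvi_effect_pp = -1.2      # per unit increase of UVI
monsoon_offset_pp = 0.92  # reduction of the UVI effect during monsoon

# Hypothetical baseline growth rate of cumulative COVID-19 deaths,
# chosen to be consistent with the reported relative effect of -33%
baseline_growth_pp = 3.6

relative_uvi_effect = uvi_effect_pp / baseline_growth_pp     # -1.2 / 3.6
monsoon_mitigation = monsoon_offset_pp / abs(uvi_effect_pp)  # 0.92 / 1.2

print(f"{relative_uvi_effect:.0%}, {monsoon_mitigation:.0%}")  # -33%, 77%
```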
Shares of open-end real estate funds are typically traded directly between the investor and the fund management company. However, we provide empirical evidence for the growth of secondary market activities, i.e., the trading of shares on stock exchanges. We find high trading levels in situations where the fund management company suspends the issue or redemption of shares. Shares trade at a discount when the fund management company suspends the redemption, whereas shares trade at a premium when the fund management company suspends the issue. We also find evidence that secondary market trading activity has been increasing since German regulation introduced a minimum holding period and a mandatory notice period for open-end real estate funds.
Consider two independent random walks. By chance, there will be spells of association between them where the two processes move in the same direction, or in opposite direction. We compute the probabilities of the length of the longest spell of such random association for a given sample size, and discuss measures like mean and mode of the exact distributions. We observe that long spells (relative to small sample sizes) of random association occur frequently, which explains why nonsense correlation between short independent random walks is the rule rather than the exception. The exact figures are compared with approximations. Our finite sample analysis as well as the approximations rely on two older results popularized by Révész (Stat Pap 31:95–101, 1990, Statistical Papers). Moreover, we consider spells of association between correlated random walks. Approximate probabilities are compared with finite sample Monte Carlo results.
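The finite-sample exercise can be sketched with a small Monte Carlo (the sign-agreement definition of a "spell of association" used here is one simple operationalization, assumed for illustration):

```python
import random

def longest_association_spell(n, rho=0.0):
    """Length of the longest run over n steps in which two random walks
    consistently move in the same direction, or consistently in opposite
    directions.  rho correlates the walks' increments; rho = 0 gives
    independent walks."""
    longest = current = 0
    prev_same = None
    for _ in range(n):
        a = random.choice([-1, 1])
        # with probability (1 + rho) / 2 the second walk copies the first
        b = a if random.random() < (1 + rho) / 2 else -a
        same = (a == b)
        current = current + 1 if same == prev_same else 1
        prev_same = same
        longest = max(longest, current)
    return longest

random.seed(7)
# Even for independent walks, long spells of 'random association'
# are common relative to a short sample of 50 steps:
print([longest_association_spell(50) for _ in range(5)])
```

Running many such replications and tabulating the results approximates the exact finite-sample distribution discussed in the paper.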
Vehicle registrations have been shown to react strongly to tax reforms aimed at reducing CO2 emissions from passenger cars, but are the effects equally strong for positive and negative tax changes? The literature on asymmetric reactions to price and tax changes has documented asymmetries for everyday goods but has not yet considered durables. We leverage multiple vehicle registration tax (VRT) reforms in Norway and estimate their impact on substitution within car models. We estimate stronger effects for cars receiving tax cuts and rebates than for those affected by tax increases. The corresponding estimated elasticity is −1.99 for VRT decreases and 0.77 for increases. As consumers may also substitute across car models, our estimates represent a lower bound.
This paper uses historical monthly temperature data for a panel of 114 countries to identify the effects of within-year temperature variability on productivity growth in five macro regions: (1) Africa, (2) Asia, (3) Europe, (4) North America and (5) South America. We find two primary results. First, higher intra-annual temperature variability reduces (increases) productivity in Europe and North America (Asia). Second, higher intra-annual temperature variability has no significant effects on productivity in Africa and South America. Additional empirical tests also indicate the following: (1) rising intra-annual temperature variability reduces productivity (even though less significantly) in both tropical and non-tropical regions, (2) inter-annual temperature variability reduces (increases) productivity in North America (Europe) and (3) winter and summer inter-annual temperature variability generates a drop in productivity in both Europe and North America. Taken together, these findings indicate that temperature variability shocks tend to have stronger adverse economic effects among richer economies. In a production economy featuring long-run productivity and temperature volatility shocks, we quantify these negative impacts and find welfare losses of 2.9% (1%) in Europe (North America).
Solving High-Dimensional Dynamic Portfolio Choice Models with Hierarchical B-Splines on Sparse Grids
(2021)
Discrete-time dynamic programming to solve dynamic portfolio choice models has three inherent issues: first, the curse of dimensionality prohibits more than a handful of continuous states. Second, in higher dimensions, even regular sparse grid discretizations need too many grid points for sufficiently accurate approximations of the value function. Third, the models usually require continuous control variables, and hence gradient-based optimization with smooth approximations of the value function is necessary to obtain accurate solutions to the optimization problem. For the first time, we enable accurate and fast numerical solutions with gradient-based optimization while still allowing for spatial adaptivity, using hierarchical B-splines on sparse grids. Compared to standard linear bases on sparse grids or finite difference approximations of the gradient, our approach saves an order of magnitude in total computational complexity for a representative dynamic portfolio choice model with varying state space dimensionality, stochastic sample space, and choice variables.
The mobile games business is an ever-increasing sub-sector of the entertainment industry. Due to its high profitability but also high risk and competitive atmosphere, game publishers need to develop strategies that allow them to release new products at a high rate, but without compromising the already short lifespan of the firms' existing games. Successful game publishers must enlarge their user base by continually releasing new and entertaining games, while simultaneously motivating the current user base of existing games to remain active for more extended periods. Since the core-component reuse strategy has proven successful in other software products, this study investigates the advantages and drawbacks of this strategy in mobile games. Drawing on the widely accepted Product Life Cycle concept, the study investigates whether the introduction of a new mobile game built with core-components of an existing mobile game curtails the incumbent's product life cycle. Based on real and granular data on the gaming activity of a popular mobile game, the authors find that by promoting multi-homing (i.e., by smartly interlinking the incumbent and new product with each other so that users start consuming both games in parallel), the core-component reuse strategy can prolong the lifespan of the incumbent game.
Digital wealth and its necessary regulation have gained prominence in recent years. The European Commission has published several documents and policy proposals relating, directly or indirectly, to the data economy. A data economy can be defined as an ecosystem of different types of market players collaborating to ensure that data is accessible and usable in order to extract value from data through, for example, creating a variety of applications with great potential to improve daily life. The value of data can increase from EUR 257 billion (1.85% of EU Gross Domestic Product (GDP)) to EUR 643 billion by 2020 (3.17% of EU GDP), according to the EU Commission. The legal implications of the increasing value of the data economy are clear; hence the need to address the challenges presented by its legal regulation.
The health and genetic data of deceased people are a particularly important asset in the field of biomedical research. However, in practice, using them is complicated, as the legal framework that should regulate their use has not been fully developed yet. The General Data Protection Regulation (GDPR) is not applicable to such data and the Member States have not been able to agree on an alternative regulation. Recently, normative models have been proposed in an attempt to face this issue. The most well-known of these is posthumous medical data donation (PMDD). This proposal supports an opt-in donation system of health data for research purposes. In this article, we argue that PMDD is not a useful model for addressing the issue at hand, as it does not consider that some of these data (the genetic data) may be the personal data of the living relatives of the deceased. Furthermore, we find the reasons supporting an opt-in model less convincing than those that vouch for alternative systems. Indeed, we propose a normative framework that is based on the opt-out system for non-personal data combined with the application of the GDPR to the relatives’ personal data.
The quality of life: protecting non-personal interests and non-personal data in the age of big data
(2021)
Under the current legal paradigm, the rights to privacy and data protection provide natural persons with subjective rights to protect their private interests, such as those related to human dignity, individual autonomy and personal freedom. In principle, when data processing is based on non-personal or aggregated data, or when such data processes have an impact on societal rather than individual interests, citizens cannot rely on these rights. Although this legal paradigm has worked well for decades, it is increasingly put under pressure because Big Data processes are typically based on indiscriminate rather than targeted data collection, because the high volumes of data are processed on an aggregated rather than a personal level, and because the policies and decisions based on the statistical correlations found through algorithmic analytics are mostly addressed at large groups or society as a whole rather than specific individuals. This means that large parts of the data-driven environment are currently left unregulated and that individuals are often unable to rely on their fundamental rights when addressing the more systemic effects of Big Data processes. This article will discuss how this tension might be relieved by turning to the notion of ‘quality of life’, which has the potential of becoming the new standard for the European Court of Human Rights (ECtHR) when dealing with privacy-related cases.
Ownership of databases: personal data protection and intellectual property rights on databases
(2021)
When we think about initiatives on access to and reuse of data, we must consider both European Intellectual Property Law and the General Data Protection Regulation (GDPR). The first provides a special intellectual property (IP) right – the sui generis right – for those makers that made a substantial investment when creating the database, whether it contains personal or non-personal data. That substantial investment can be made by just one person, but, in many cases, it is the result of the activities of many people and/or some undertakings processing and aggregating data. In the modern digital economy, data are being dubbed the ‘new oil’ and the sui generis right might be considered a right to control any access to the database, thus having an undeniable relevance. Besides, there are still important inconsistencies between IP Law and the GDPR, which must be removed by the European legislator. The genuine and free consent of the data subject for the use of his/her data must remain the first step of the legal analysis.
Commercialization of consumers’ personal data in the digital economy poses serious challenges, both conceptual and practical, to the traditional approach of European Union (EU) Consumer Law. This article argues that mass-spread, automated, algorithmic decision-making casts doubt on the foundational paradigm of EU consumer law: consent and autonomy. Moreover, it poses threats of discrimination and of undermining consumer privacy. It is argued that the recent legislative reaction by the EU Commission, in the form of the ‘New Deal for Consumers’, was a step in the right direction, but fell short due to its continued reliance on consent and autonomy and its failure to adequately protect consumers from indirect discrimination. It is posited that a focus on creating a contracting landscape where the consumer may be properly informed in material respects is required, which in turn necessitates blending the approaches of competition, consumer protection and data protection laws.
What are the effects of the GDPR on consumer apps? This article presents an analysis of app behavior before and after the regulatory change in data protection in Europe. Based on long-term data collection, we present differences in app permission use and expressed user concerns and discuss their implications. In May 2018, the General Data Protection Regulation (GDPR) substantially changed the data protection obligations of the information industry towards European Union users. One should therefore expect to find changes in code, program behavior and data collection activities. To investigate this expectation, we analyzed data on Android apps' requests for and use of permissions to access sensitive groups of data on smartphones, and collected user reviews. Our data show an overall reduction both in permissions used and in expressed user concern. However, in some areas apps have increased access or user complaints, and in addition many apps still carry several unused access privileges.
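The kind of before/after comparison described here can be sketched in a few lines. This is an illustrative toy, not the authors' pipeline; the app identifiers and permission names below are hypothetical examples in the style of Android's permission model.

```python
from collections import Counter

# Hypothetical snapshots: permissions declared by each app before and
# after the GDPR came into force (app names and data are made up).
pre = {
    "app.a": {"CAMERA", "READ_CONTACTS", "ACCESS_FINE_LOCATION"},
    "app.b": {"READ_CONTACTS", "RECORD_AUDIO"},
}
post = {
    "app.a": {"CAMERA", "ACCESS_FINE_LOCATION"},
    "app.b": {"READ_CONTACTS"},
}

def permission_counts(snapshot):
    # How many apps declare each permission in a given snapshot.
    counts = Counter()
    for perms in snapshot.values():
        counts.update(perms)
    return counts

# Permissions each app no longer requests after the regulatory change.
dropped = {app: pre[app] - post.get(app, set()) for app in pre}
```

A real study would of course draw on many app versions over time and pair such permission diffs with the user-review data the abstract mentions.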