Refine
Document Type
- Article (3)
- Conference Proceeding (2)
Language
- English (5)
Has Fulltext
- yes (5)
Is part of the Bibliography
- no (5)
Keywords
- Pokémon Go (2)
- privacy concerns (2)
- APCO (1)
- Anonymity Services (1)
- Artificial intelligence (1)
- Augmented reality (1)
- Changes in labor markets (1)
- Future of work (1)
- Human-enhancing technologies (1)
- IUIPC (1)
Pokémon Go is one of the most successful mobile games of all time. Millions played and still play this mobile augmented reality (AR) application, although severe privacy issues pervade the app due to its access to several sensors and data sources, such as location and the camera. In general, individuals regularly use online services and mobile apps even though they might know that this use is associated with high privacy risks. This seemingly contradictory user behavior is analyzed from a variety of perspectives in the information systems domain. One of these perspectives evaluates privacy-related decision-making processes based on concepts from behavioral economics. We follow this line of work by empirically testing one exemplary extraneous factor within the “enhanced APCO model” (antecedents–privacy concerns–outcome). Specific empirical tests of such biases are rare in the literature, which is why we propose and empirically analyze the extraneous influence of a positivity bias. In our case, we hypothesize that the bias is induced by childhood brand nostalgia towards the Pokémon franchise. We analyze our proposition in the context of an online survey with 418 active players of the game. Our results indicate that childhood brand nostalgia influences the privacy calculus by exerting a large effect on the benefits within the trade-off, thereby causing a higher use frequency. Our work has two important implications. First, the behavioral economics perspective on privacy provides additional insights relative to previous research; however, the effects of several other biases and heuristics remain to be tested in future work. Second, relying on nostalgia represents an important, but also double-edged, instrument for practitioners to market new services and applications.
Privacy concerns as well as trust and risk beliefs are important factors that can influence users’ decision to use a service. One popular model integrating these factors relates the Internet Users’ Information Privacy Concerns (IUIPC) construct to trust and risk beliefs. However, studies have not yet applied it to a privacy-enhancing technology (PET) such as an anonymization service. Therefore, we conducted a survey among 416 users of the anonymization service JonDonym [1] and collected 141 complete questionnaires. We rely on the IUIPC construct and the related trust–risk model and show that it needs to be adapted for the case of PETs. In addition, we extend the original causal model by including trust beliefs in the anonymization service provider and show that they have a significant effect on the actual use behavior of the PET.
Nowadays, digitalization has an immense impact on the job landscape. This technological revolution creates new industries and professions, promises greater efficiency, and improves the quality of working life. However, emerging technologies such as robotics and artificial intelligence (AI) are reducing human intervention, thus advancing automation and eliminating thousands of jobs and entire occupational profiles. To prepare employees for the changing demands of work, adequate and timely training of the workforce and real-time support of workers in new positions are necessary. Therefore, this work investigates whether user-oriented technologies such as augmented reality (AR) and virtual reality (VR) can be applied “on the job” for such training and support, also known as intelligence augmentation (IA). To address this problem, it synthesizes the results of a systematic literature review as well as a practically oriented search for augmented reality and virtual reality use cases within the IA context. A total of 150 papers and use cases are analyzed to identify suitable areas of application in which employees’ capabilities can be enhanced. The results of both the theoretical and the practical work show that VR is primarily used to train employees without prior knowledge, whereas AR is used to expand individuals’ scope of competence in their field of expertise while on the job. Based on these results, a framework is derived that provides practitioners with guidelines on how AR or VR can support workers at their jobs so that they can keep up with anticipated skill demands. Furthermore, it shows for which application areas AR or VR can provide workers with sufficient training to learn new job tasks. In doing so, this research provides practical recommendations to accompany the imminent disruptions caused by AI and similar technologies and to alleviate associated negative effects on the German labor market.
We investigate privacy concerns and the privacy behavior of users of the AR smartphone game Pokémon Go. Pokémon Go accesses several functionalities of the smartphone and, in turn, collects a plethora of data from its users. To assess the privacy concerns, we conduct an online study in Germany with 683 users of the game. The results indicate that the majority of active players are concerned about the privacy practices of companies. This result hints at the existence of a cognitive dissonance, i.e. the privacy paradox. Since this result is common in the privacy literature, we complement the first study with a second one among 199 users, which assesses which measures users take to protect their privacy. The results are highly mixed and depend on the respective measure; for example, relatively many participants use privacy-preserving measures when interacting with their smartphone. This implies that many users know about the risks and may take actions to protect their privacy, but deliberately trade off their information privacy for the utility generated by playing the game.
Augmented reality (AR) has gained much public attention since the success of Pokémon Go in 2016. Technology companies like Apple or Google currently focus primarily on mobile AR (MAR) technologies, i.e. applications on mobile devices such as smartphones or tablets. Associated privacy issues have to be investigated early to foster market adoption. This is especially relevant since past research has found several threats associated with the use of smartphone applications. Thus, we investigate two of the main privacy risks for MAR application users based on a sample of 19 of the most downloaded MAR applications for Android. First, we assess threats arising from bad privacy policies using a machine-learning approach. Second, we investigate which smartphone data resources the MAR applications access. Third, we combine both approaches to evaluate whether the privacy policies cover the observed data accesses. We provide theoretical and practical implications and recommendations based on our results.
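The coverage check described in the last abstract, comparing the data resources an app actually accesses with the data types its privacy policy mentions, can be illustrated with a minimal sketch. This is not the study's implementation: the permission-to-category mapping, the keyword matching, and all example data below are simplified assumptions for illustration only (the paper uses a machine-learning approach rather than keyword lookup).

```python
# Hypothetical sketch: flag data categories that an Android app accesses
# (inferred from its requested manifest permissions) but that its privacy
# policy never mentions. Mapping and examples are illustrative only.

# Maps real Android permission strings to a coarse data category.
PERMISSION_TO_CATEGORY = {
    "android.permission.ACCESS_FINE_LOCATION": "location",
    "android.permission.CAMERA": "camera",
    "android.permission.READ_CONTACTS": "contacts",
    "android.permission.RECORD_AUDIO": "microphone",
}

def uncovered_accesses(requested_permissions, policy_text):
    """Return data categories the app accesses but the policy omits."""
    accessed = {
        PERMISSION_TO_CATEGORY[p]
        for p in requested_permissions
        if p in PERMISSION_TO_CATEGORY
    }
    policy = policy_text.lower()
    # Naive keyword check: a category counts as covered if its name
    # appears anywhere in the policy text.
    return sorted(cat for cat in accessed if cat not in policy)

# Illustrative case: the policy mentions location and camera use,
# but is silent about the contact data the app can read.
permissions = [
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.CAMERA",
    "android.permission.READ_CONTACTS",
]
policy = "We collect location data and camera images to provide AR features."
print(uncovered_accesses(permissions, policy))  # -> ['contacts']
```

In practice, simple keyword matching misses paraphrases (e.g. a policy saying "address book" instead of "contacts"), which is one reason the study relies on a trained classifier for the policy side of the comparison.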