University Publications
Regulators worldwide have been implementing different privacy laws. They vary in their impact on the value for advertisers, publishers and users, but not much is known about these differences. This article focuses on three important privacy laws (i.e., General Data Protection Regulation [GDPR], California Consumer Privacy Act [CCPA] and Personal Information Protection Law [PIPL]) and compares their impact on the value for the three primary actors of the online advertising market, namely, advertisers, publishers and users. This article first compares these three privacy laws by developing a legal strictness score. It then uses the existing literature to derive the effects of the legal strictness of each privacy law on each actor’s value. Finally, it quantifies the three privacy laws’ impact on each actor’s value. The results show that GDPR and PIPL are similar and stricter than CCPA. Stricter privacy laws bring larger negative changes to the value for actors. As a result, both GDPR and PIPL decrease the actors’ value more substantially than CCPA. These value declines are the largest for publishers and are rather similar for users and advertisers. Scholars and practitioners can use our findings to explore ways to create value for multiple actors under various privacy laws.
Enabling cybersecurity and protecting personal data are crucial challenges in the development and provision of digital service chains. Data and information are the key ingredients in the creation process of new digital services and products. While legal and technical problems are frequently discussed in academia, ethical issues of digital service chains and the commercialization of data are seldom investigated. Thus, based on outcomes of the Horizon2020 PANELFIT project, this work discusses current ethical issues related to cybersecurity. Utilizing expert workshops and encounters as well as a scientific literature review, ethical issues are mapped on individual steps of digital service chains. Not surprisingly, the results demonstrate that ethical challenges cannot be resolved in a general way, but need to be discussed individually and with respect to the ethical principles that are violated in the specific step of the service chain. Nevertheless, our results support practitioners by providing and discussing a list of ethical challenges to enable legally compliant as well as ethically acceptable solutions in the future.
This paper provides an assessment framework for privacy policies of Internet of Things (IoT) services based on particular GDPR requirements. The objective of the framework is to serve as a supportive tool for users to make informed privacy-related decisions. For example, when buying a new fitness tracker, users could compare different models with respect to privacy friendliness, or to more specific aspects of the framework such as whether data is passed on to a third party. The framework consists of 16 parameters with one to four yes-or-no questions each and allows users to apply their own weights to the different parameters. We assessed 110 devices covered by 94 distinct policies. Furthermore, we performed a legal assessment for the parameters to handle cases in which a policy makes no statement at all regarding a certain parameter. The results of this comparative study show that most of the examined privacy policies of IoT devices/services are insufficient to address particular GDPR requirements and beyond. We also found a correlation between the length of a policy and its privacy transparency score.
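The scoring idea in the abstract above — per-parameter yes/no questions aggregated under user-chosen weights — can be sketched as follows. This is a minimal illustration, not the paper's actual framework: the parameter names, question counts, and weights are invented for the example, and the aggregation rule (weighted mean of per-parameter yes-ratios) is one plausible reading of the description.

```python
def transparency_score(answers, weights):
    """Aggregate per-parameter yes-ratios into a weighted score in [0, 1].

    answers: dict mapping parameter name -> list of booleans
             (one entry per yes/no question, True = privacy-friendly answer)
    weights: dict mapping parameter name -> non-negative user weight
    """
    total_weight = sum(weights.values())
    score = 0.0
    for param, yes_no in answers.items():
        ratio = sum(yes_no) / len(yes_no)       # fraction of "yes" answers
        score += weights.get(param, 0.0) * ratio
    return score / total_weight if total_weight else 0.0


# Illustrative use with made-up parameters (not from the paper):
answers = {
    "third_party_sharing": [True, False],          # 2 questions
    "data_retention": [True],                      # 1 question
    "user_rights": [True, True, False, True],      # 4 questions
}
weights = {"third_party_sharing": 2.0, "data_retention": 1.0, "user_rights": 1.0}
print(transparency_score(answers, weights))
```

A user who cares most about third-party sharing simply raises that parameter's weight, which is the user-supplied weighting the abstract describes.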
With the rapid growth of technology in recent years, we are surrounded by, and even dependent on, technological devices such as smartphones, which are now an indispensable part of our lives. Smartphone applications (apps) provide a wide range of utilities such as navigation, entertainment and fitness. To provide such context-sensitive services, apps need to access users' data, including sensitive data, which in turn can lead to privacy invasions. To protect users in such a vulnerable ecosystem, legislation such as the European Union General Data Protection Regulation (EU GDPR) demands best privacy practices. App developers are therefore required to make their apps compliant with the legal privacy principles enforced by law. However, comprehending purely legal principles to understand what needs to be implemented is not an easy task for app developers. Similarly, bridging the gap between legal principles and technical implementations to understand how legal principles need to be implemented is another barrier to developing privacy-friendly apps. To this end, this paper proposes a privacy and security design guide catalog for app developers to assist them in understanding and adopting the most relevant privacy and security principles in the context of smartphone apps. The catalog maps the identified legal principles to practical privacy and security solutions that developers can implement to ensure enhanced privacy aligned with existing legislation. A case study confirms that there is a significant gap between what developers do in reality and what they promise to do. This paper provides researchers and developers of privacy-related technologies an overview of the existing privacy requirements that need to be implemented in smartphone ecosystems, on which they can base their work.
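The catalog described above — a mapping from legal principles to implementable measures — can be sketched as a simple lookup structure. This is a hypothetical illustration only: the principle names follow Art. 5 GDPR, but the mapped measures are invented examples, not the paper's actual catalog entries.

```python
# Illustrative principle-to-measure catalog; entries are examples,
# not the catalog proposed in the paper.
CATALOG = {
    "data minimisation": [
        "request only the permissions a feature actually needs",
        "strip or coarsen location data before upload",
    ],
    "storage limitation": [
        "set retention periods and purge expired records",
    ],
    "integrity and confidentiality": [
        "encrypt data at rest and in transit",
    ],
}


def measures_for(principle):
    """Return the technical measures mapped to a legal principle."""
    return CATALOG.get(principle.strip().lower(), [])


for measure in measures_for("Data Minimisation"):
    print("-", measure)
```

A developer could consult such a structure per principle during design review, which mirrors the bridging role the abstract assigns to the catalog.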