004 Datenverarbeitung; Informatik
Document Type
- Article (21)
- Conference Proceeding (7)
- Report (5)
- Part of Periodical (4)
- Contribution to a Periodical (2)
- Doctoral Thesis (2)
- Working Paper (2)
- Part of a Book (1)
Has Fulltext
- yes (44)
Is part of the Bibliography
- no (44)
Keywords
- Data protection (3)
- GDPR (3)
- Privacy (3)
- Datenschutz (2)
- Machine learning (2)
- Software Engineering (2)
- machine learning (2)
- (mobile) Internet (1)
- AI fairness (1)
- Abrechnung (1)
Institute
- Wirtschaftswissenschaften (44)
Enabling cybersecurity and protecting personal data are crucial challenges in the development and provision of digital service chains. Data and information are the key ingredients in the creation process of new digital services and products. While legal and technical problems are frequently discussed in academia, ethical issues of digital service chains and the commercialization of data are seldom investigated. Thus, based on outcomes of the Horizon2020 PANELFIT project, this work discusses current ethical issues related to cybersecurity. Utilizing expert workshops and encounters as well as a scientific literature review, ethical issues are mapped onto individual steps of digital service chains. Not surprisingly, the results demonstrate that ethical challenges cannot be resolved in a general way, but need to be discussed individually and with respect to the ethical principles that are violated in the specific step of the service chain. Nevertheless, our results support practitioners by providing and discussing a list of ethical challenges to enable legally compliant as well as ethically acceptable solutions in the future.
Augmented reality (AR) has gained much public attention since the success of Pokémon Go in 2016. Technology companies like Apple or Google are currently focusing primarily on mobile AR (MAR) technologies, i.e., applications on mobile devices like smartphones or tablets. Associated privacy issues have to be investigated early to foster market adoption. This is especially relevant since past research found several threats associated with the use of smartphone applications. Thus, we investigate two of the main privacy risks for MAR application users based on a sample of 19 of the most downloaded MAR applications for Android. First, we assess threats arising from bad privacy policies based on a machine-learning approach. Second, we investigate which smartphone data resources are accessed by the MAR applications. Third, we combine both approaches to evaluate whether privacy policies cover certain data accesses or not. We provide theoretical and practical implications and recommendations based on our results.
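The third step described above — checking whether an app's data accesses are covered by its privacy policy — can be sketched as a simple coverage check. This is an illustrative assumption only: the permission names and the keyword mapping are invented here, and the paper's actual approach relies on machine learning rather than keyword matching.

```python
# Hypothetical sketch: flag permissions an app uses that its privacy
# policy never mentions. The permission-to-keyword mapping below is
# invented for illustration; a real analysis would be far richer.

PERMISSION_KEYWORDS = {
    "CAMERA": ["camera", "photo"],
    "ACCESS_FINE_LOCATION": ["location", "gps"],
    "READ_CONTACTS": ["contact", "address book"],
}

def uncovered_permissions(used: set[str], policy_text: str) -> set[str]:
    """Return the permissions used by the app but not covered in the policy."""
    text = policy_text.lower()
    return {p for p in used
            if not any(kw in text for kw in PERMISSION_KEYWORDS.get(p, []))}

policy = "We collect your location to show nearby content."
print(uncovered_permissions({"CAMERA", "ACCESS_FINE_LOCATION"}, policy))
# prints {'CAMERA'} — the camera access is never mentioned in the policy
```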
Advanced machine learning has achieved extraordinary success in recent years. Beyond ex post analysis of measured data, machine learning could provide help for "active" operational risk management beyond the regime of traditional statistical analysis when it comes to the "known unknown" or even the "unknown unknown." While machine learning has been tested successfully in the regime of the "known," heuristics typically provide better results for active operational risk management (in the sense of forecasting). However, precursors in existing data can open a chance for machine learning to provide early warnings even in the regime of the "unknown unknown."
This paper provides an assessment framework for privacy policies of Internet of Things services which is based on particular GDPR requirements. The objective of the framework is to serve as a supportive tool for users to make privacy-related informed decisions. For example, when buying a new fitness tracker, users could compare different models with respect to privacy friendliness or to more particular aspects of the framework, such as whether data is given to a third party. The framework consists of 16 parameters with one to four yes-or-no questions each and allows users to bring in their own weights for the different parameters. We assessed 110 devices which had 94 different policies. Furthermore, we did a legal assessment for the parameters to deal with the case that there is no statement at all regarding a certain parameter. The results of this comparative study show that most of the examined privacy policies of IoT devices/services are insufficient to address particular GDPR requirements and beyond. We also found a correlation between the length of a policy and its privacy transparency score.
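The weighted scoring scheme described above — parameters answered by yes-or-no questions and weighted by the user — can be sketched as follows. The parameter names, weights, and answers are illustrative assumptions, not the paper's actual instrument.

```python
# Illustrative sketch of a user-weighted privacy score: each parameter
# holds the answers to its yes-or-no questions (True = privacy-friendly),
# and users supply their own per-parameter weights. All names and values
# below are made up for illustration.

def privacy_score(answers: dict[str, list[bool]],
                  weights: dict[str, float]) -> float:
    """Return a weighted score in [0, 1] across all parameters."""
    total_weight = sum(weights[p] for p in answers)
    score = 0.0
    for param, yes_no in answers.items():
        fraction_yes = sum(yes_no) / len(yes_no)  # share of "yes" answers
        score += weights[param] * fraction_yes
    return score / total_weight

answers = {
    "third_party_sharing": [True, False],  # 1 of 2 questions answered "yes"
    "data_deletion":       [True],         # 1 of 1
}
weights = {"third_party_sharing": 2.0, "data_deletion": 1.0}
print(round(privacy_score(answers, weights), 3))  # prints 0.667
```

Letting users choose the weights mirrors the framework's design goal: the same answers yield different scores depending on which parameters a given user cares about most.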
Platforms for social communities on the Internet, such as Facebook, StudiVZ, and XING, have gained popularity rapidly in recent years. Millions of users worldwide already gather on them. They connect via virtual friend lists and exchange views about shared interests and activities. Mobile devices such as phones are increasingly used for this, as they allow users to stay in constant contact with the community. However, many users by no means want to disclose everything to every member of a community. So how can privacy in such communities be better protected? The research project PICOS pursues this question.
Correction to: Computational Economics https://doi.org/10.1007/s10614-020-10061-x
The original publication has been updated. In the original publication of this article, in the Introduction section, the corrections to the second paragraph's inline equation were not incorporated. The author's additional corrections have also been incorporated. The publisher apologizes for the error made during production.
- Online Privacy: Towards Informational Self-Determination on the Internet (Dagstuhl Perspectives Workshop 11061): Simone Fischer-Hübner, Chris Hoofnagle, Kai Rannenberg, Michael Waidner, Ioannis Krontiris and Michael Marhöfer
- Self-Repairing Programs (Dagstuhl Seminar 11062): Mauro Pezzé, Martin C. Rinard, Westley Weimer and Andreas Zeller
- Theory and Applications of Graph Searching Problems (Dagstuhl Seminar 11071): Fedor V. Fomin, Pierre Fraigniaud, Stephan Kreutzer and Dimitrios M. Thilikos
- Combinatorial and Algorithmic Aspects of Sequence Processing (Dagstuhl Seminar 11081): Maxime Crochemore, Lila Kari, Mehryar Mohri and Dirk Nowotka
- Packing and Scheduling Algorithms for Information and Communication Services (Dagstuhl Seminar 11091): Klaus Jansen, Claire Mathieu, Hadas Shachnai and Neal E. Young
It is about advertising, fraud, or the optimization of business models: consumer data is a precious commodity that interests lenders and insurers just as much as retailers and criminals. Kai Rannenberg, Professor of Mobile Business & Multilateral Security at Goethe University, researches cybersecurity. Dirk Frank spoke with the business informatics scholar about data protection, hacker attacks, and the car as a "mobile phone on wheels."
What are the effects of the GDPR on consumer apps? This article presents an analysis of app behavior before and after the regulatory change in data protection in Europe. Based on long-term data collection, we present differences in app permission use and expressed user concerns and discuss their implications. In May 2018, the General Data Protection Regulation (GDPR) substantially changed the data protection obligations of the information industry toward European Union users. One should expect to find changes in code, program behavior, and data collection activities. To investigate this expectation, we analyzed data about Android apps' requests for and use of permissions to access sensitive groups of data on smartphones, and collected user reviews. Our data shows an overall reduction both of permissions used and of expressed user concern. However, in some areas apps have increased access or user complaints, and many apps still carry several unused access privileges.
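The before/after comparison of permission use described above can be sketched as a per-app set difference between two crawls. The app identifier and permission names below are invented for illustration; the study's actual pipeline and data are of course more involved.

```python
# Minimal sketch: given per-app permission sets from a pre-GDPR and a
# post-GDPR crawl, report which permissions each app added or removed.
# App and permission names are made up for illustration.

def permission_diff(before: dict[str, set[str]],
                    after: dict[str, set[str]]) -> dict[str, dict[str, set[str]]]:
    """For each app present in both crawls, list added/removed permissions."""
    diff = {}
    for app in before.keys() & after.keys():  # apps observed in both crawls
        diff[app] = {
            "added": after[app] - before[app],
            "removed": before[app] - after[app],
        }
    return diff

before = {"app.example": {"CAMERA", "READ_CONTACTS"}}
after = {"app.example": {"CAMERA"}}
print(permission_diff(before, after))
# prints {'app.example': {'added': set(), 'removed': {'READ_CONTACTS'}}}
```

Aggregating the `added` and `removed` sets across all apps would yield the kind of overall reduction (with pockets of increased access) that the abstract reports.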