Data – a constituent element of today’s quantified society – holds myriad promises. As manifold as these promises might be, however, they all link back to the same basic notion: that data will be able to make things, expressions, interactions – data enthusiasts might say everything – visible and understandable, at least to a certain degree for now. While ongoing work seeks to expand its possibilities, new socio-technical constellations emerge that make data even more embracing and permeating. Measuring the secret fibers of our very being, by ourselves and by others, has become part of data reality. Just think of Fitbits and other wearables drawing precise maps of our sleep phases, our sexual performance and our eating behavior, or of social networks analyzing and categorizing the beliefs and visions we post, as well as our behavior and preferences in social interaction.
In this new mode of existence – defined by new data sources, new infrastructures and new logics of analysis – it seems more vital than ever that data is “cooked with care” (Bowker 2005: 184). While the discourse about data, security and society has become highly controversial and often critical in recent years, appropriate concepts for putting data into an effective perspective are still developing (ibid.). A number of fundamental yet unanswered questions therefore arise, such as: What constitutes our techno-society nowadays? What kind of paradigm shift are we facing in the turn from a knowledge-driven to a data-driven society? How can we address the dynamics of structural change? Can new technologies be regulated by data protection rules at all? And what does this mean for the fundamental and therefore inalienable rights and freedoms we currently hold?
One approach to handling the challenges that new data practices bring is the notion of data protection. The ideology behind this perspective derives from an understanding of data as something sensitive that has to be protected because it is ultimately connected to – and a representation of – individuals, groups and populations. Data protection in this regard offers a new catalog of criteria and terms, as well as tools, which might be useful for describing the present situation (of insufficiently regulated data fabrication and processing) and perhaps even for regulating it (by establishing privacy by default and privacy by design as standards).
On 2 and 3 November, Forum Privatheit, in association with the Fraunhofer Institute for Systems and Innovation Research ISI, set out to ask and answer these and further questions. The interdisciplinary conference, titled Die Fortentwicklung des Datenschutzes (The Further Development of Data Protection), brought together a wide range of perspectives and approaches from law, the social sciences and engineering. With four aims in mind – balancing challenges against possibilities for innovation, fostering critical discourse, encouraging competition between member states for the most suitable solutions and achieving better data protection through interdisciplinary dialogue – the conference was divided into two parallel tracks, complemented by keynotes and panel discussions. I participated to some extent with the idea of immersing myself and observing how experts talk about data and data practices, and what perspectives, questions and answers they come up with.
The first keynotes elaborated on regulatory and technical issues. While Frank Pallas examined ubiquitous networking and the interaction between technology and regulation, Gerrit Hornung asked whether new technologies can be regulated by data protection laws at all. It was interesting to follow the arguments, and Gerrit Hornung in particular presented a well-founded case that protecting data by relying on existing and upcoming laws will be a challenge. One example to illustrate this: Data Protection Directive 95/46/EC was adopted in 1995 and eventually implemented in 1998. More than twenty years later – in May 2018 – the new General Data Protection Regulation will become legally effective. Within that time frame, ‘traditional’ computing changed significantly, while regulation hardly addressed the new developments adequately. If law is supposed to be the solution, it would have been a worthwhile addition to reflect on whether law has the right criteria to address recent everyday practices and processes.
Alexander Roßnagel offered a perspective on responsibilities becoming increasingly blurred in times of omnipresent and interconnected devices, known as fog or edge computing. He concluded that it is no longer clear who – or what – has been and still is processing which data and when, that multiple parties are involved, and that these parties can hardly be held accountable. He further addressed the issue that technologies such as data-driven tools and processes are not free of bias. Consequently, he called for stronger reflection on the need for neutrality in technology, which he considers essential. This demand is interesting from the perspective of critical data studies and value-sensitive design, where it is argued that technology and data are neither good nor bad, nor even neutral, but always active and inscribed with specific expectations, understandings, ethics, norms and values (Simon 2016: 223; Iliadis and Russo 2016: 1).
Another interesting question raised during the conference came from Martin Degeling and Thomas Herrmann, who asked about the possibility of modulating socio-technical systems. Using technology-oriented practical examples (e.g. browser plugins), they explained what possibilities users have to regulate their digital or data double and what procedural and organisational means exist for users to protect themselves. Some important questions, however, were not debated: Should it be the subject’s own responsibility to intervene in data practices and manage them? How could users discern and anticipate how their data could be used, now and in the future? Is this displacement of responsibility onto users a path society should take, and which ideology is driving this development? And what about data literacy?
This interesting conference focused on essential data-related questions from a broad but commonly shared perspective: data is linked to fundamental rights, which are rights of defense. Data protection, then, is supposed to safeguard and enforce these rights.
Bowker, Geoffrey C. (2005): Memory Practices in the Sciences. London, MIT Press.
Iliadis, Andrew/Russo, Federica (2016): Critical data studies: An introduction. In: Big Data & Society July–December 2016, DOI: 10.1177/2053951716674238.
Simon, Judith (2016): Value-Sensitive Design and Responsible Research and Innovation. In: Hansson, Sven Ove (ed.): The Ethics of Technology – Methods and Approaches. London, Rowman & Littlefield.