Artificial intelligence and the right to informational self-determination

This A.Ideas series presents opinion pieces based on the discussions from the OECD Conference "AI: Intelligent Machines, Smart Policies"

This article is part of the Forum Network series on Digitalisation

Artificial Intelligence (AI) will be a core driver of innovation, growth and productivity. It will help to effectively address some of the most pressing societal challenges in areas such as healthcare, traffic congestion and disaster management. At the same time, AI raises legitimate social and ethical concerns, such as its impact on jobs and privacy and the possible loss of human control.

Benefits of new technologies vs data protection and privacy risks 

While most people enjoy the benefits of digital technology, of being able to connect and communicate anytime and anywhere, there is also growing unease among individuals about the possible misuse of their personal data and the rise of cyber threats. Data protection and privacy laws were conceived to protect people against the threats of a digital world. Today their purpose is more relevant than ever. Unfortunately, achieving a well-functioning data protection framework is challenging. Striking the right balance between the benefits of new technologies and managing data protection and privacy risks may very well be one of the biggest challenges of our times.

AI only becomes meaningful when it has access to a large amount of high-quality data, including, in many cases, personal data. The European Union (EU) adopted the new General Data Protection Regulation (GDPR), underpinned by traditional data protection principles regarding the legal grounds for data processing, data minimisation and purpose limitation. The former EU data protection law, known as the Data Protection Directive, was conceived in the pre-Internet age. While it has proven to be remarkably resilient, and flexible enough to retain relevance even in today’s globally networked world, the emergence of new data-driven technologies and business models has put increasing pressure on its underlying principles.

Infographic: 6 things GDPR is

What is personal data?

The dilemma starts with the definition of personal data. The logic of an expansion in the scope of ‘personal data’ is appealing and sounds simple: the broader the definition of personal data, the more data comes into scope and the more data is protected. However, if all data that can be linked back to an individual comes under the full scope of data protection laws, access to the many beneficial uses of data becomes uncertain. Companies and authorities are faced with the unmanageable reality that, in effect, all data could be considered personal.

This raises the question whether the concept that has underpinned data protection laws in Europe for the past 20 years – to limit data processing and to keep the digital footprint of a person as low as possible – has failed or, on the contrary, helped prevent the worst. I believe that neither is the case. Data protection and privacy are still relevant; in fact, they are more relevant than ever. Now, however, it may be time to reconsider their core concepts. If one does not want to go as far as reversing today’s general practice of prohibition (the processing of personal data is prohibited unless expressly allowed) into a general permission to process personal data unless expressly prohibited, then we should at least consider introducing statutory permissions that define the boundary conditions for what is socially acceptable. We should start concentrating more on what matters to people and less on what we believe should matter to them.

"Trying to eliminate every remote privacy risk may jeopardise valuable data uses in return for small privacy gains".

Now that the GDPR has been enacted and is, according to some sources, meant to remain relevant for the next 20 years, these conceptual changes may be reflections for the future. Much will depend on how the new law is put into practice: whether the EU and its citizens will be able to participate in data-driven, next-generation innovation, or not. At the same time, a modern way of interpreting and implementing the regulation should not assume a “one size fits all” approach. Going forward, we should concentrate on what is important: where the individual’s right to informational self-determination (the right to determine the governance of his/her personal information) is significantly impacted. Not all data processing is equally intrusive, and not every piece of data is equally sensitive. We need to recognise the importance of context and how it affects potential consequences for users. Trying to eliminate every remote privacy risk may jeopardise valuable data uses in return for small privacy gains.

Privacy settings

Several tools and approaches foreseen under the GDPR, including pseudonymisation, privacy impact assessments and privacy by design, can, when properly applied, help reduce or minimise the impact on privacy. Companies can implement technical safeguards such as pseudonymising and encrypting data, automating data logging, restricting data analytics, managing access rights and automating data validation. A legal system that is closely attuned to these additional safeguards will enable organisations to maximise data utility while minimising privacy risks. If companies set tighter controls on access to such data and provide consumers with meaningful controls to start with, this should enable more liberal legal treatment and softer obligations.
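As a purely illustrative sketch of one such safeguard, the example below shows how a direct identifier might be pseudonymised with a keyed hash (HMAC), so that a record keeps its analytical value while the secret key stays under separate access-rights management. The field names and key handling are hypothetical assumptions for illustration, not a prescribed implementation.

```python
# Illustrative sketch of pseudonymisation via keyed hashing (HMAC).
# Field names and key handling are hypothetical; a real deployment would keep
# the secret key in a dedicated key store with restricted access rights.
import hmac
import hashlib
import secrets

# Secret pseudonymisation key, stored separately from the pseudonymised data set.
PSEUDONYMISATION_KEY = secrets.token_bytes(32)

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable, non-reversible pseudonym."""
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"email": "jane.doe@example.com", "purchase_amount": 42.50}

# The analytical value (the amount) is preserved; the direct identifier is not.
pseudonymised_record = {
    "user_pseudonym": pseudonymise(record["email"]),
    "purchase_amount": record["purchase_amount"],
}

print(pseudonymised_record)
```

Because the same identifier always maps to the same pseudonym under a given key, such data can still be analysed across records, while re-identification requires access to the separately protected key.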

All fundamental rights must be ensured

In addition, data protection rules do not exist in a legal vacuum. There is no question that privacy and the right to data protection are fundamental rights. However, they must be balanced against other fundamental rights, such as the right to liberty and security, the freedom to conduct business, the right to choose an occupation and engage in work, the freedom of expression and the freedom of the arts and sciences, to name but a few. Informational self-determination is a fundamental element of human dignity, but so are the rights to physical well-being and economic prosperity. Therefore, we need to excel in research and education if we want to safeguard these values for the generations to come.

We must foster an approach with innovation, technology, business and European competitiveness in mind, while putting the necessary checks and balances in place to ensure that society is not put at a crossroads. In this context, we must ensure fruitful dialogue, avoiding a polarised debate in which each side dismisses the concerns of the other. However little we like the idea of having machines make decisions on our behalf, we must make conscious decisions about the right balance for the future of Europe. I am not suggesting that key issues around our human individuality and dignity should be subjected to automated, algorithmic decision making. Clearly, the individual is at the centre of society and critical decisions must always remain under the control of a human being. However, in a world ruled by economic principles, our European values will only be able to prevail if we manage to translate them into clear and easy-to-follow rules that people understand and accept and that our businesses can easily implement and comply with.

GOT A FEW MORE MINUTES?

OECD Conference on Artificial Intelligence – "AI: Intelligent Machines, Smart Policies" – Paris, 26-27 October 2017

EU General Data Protection Regulation (GDPR) website

EU Charter of Fundamental Rights

ABOUT THE AUTHOR

Mathias Cellarius studied law at the University of Freiburg, Germany. He joined SAP in 2005 and has worked in the legal department since then. Since 2013 he has headed a new team within SAP SE Global Legal focusing on regulatory matters and their impact on SAP’s current and future business models. Additionally, in July 2014 he became SAP’s Data Protection Officer and now heads SAP's Data Protection and Privacy Office.

OECD Going Digital project

