
GDPR: online privacy is a collective issue!

S.I.Lex – calimaq, 25/05/2018


Today, the General Data Protection Regulation (GDPR) comes into force. On this occasion, Laura Aufrère and I are publishing an op-ed in the newspaper Le Monde. It is the short version of a more developed text, which we post below.

GDPR: a new hope for the Galaxy? Yes, but…

 

***

GDPR: online privacy is a collective issue!

GDPR. Great hope rides on these four letters. The EU’s General Data Protection Regulation will come into force on 25 May. The regulation aims to reinforce individual rights and protections by restricting the conditions under which data can be collected and processed. Moreover, it applies worldwide to any platform that processes the data of European citizens, ensuring more consistent implementation. Digital companies contravening the new law risk hefty sanctions of up to 4% of their worldwide turnover. After years of misuse and impunity, it now seems possible to take back control of our digital lives, in particular from GAFAM (Google, Apple, Facebook, Amazon, and Microsoft).

The law’s ambition – to protect personal data seen as attributes of the individual – is both noble and essential. But what kind of misuses do we need protection from, and what precisely is meant by “data”? Companies like Facebook and Google primarily benefit from exploiting our “social graph” (i.e. the network of relationships linking individuals to each other) for commercial gain. In fact, the bulk of exploited data is produced by consolidating our digital data and footprints to reveal our social behaviours. Our collective data is therefore the primary source of potential value. As such, it is being monetized.

While the law protects personal data, online platforms are actually interested in social data, and there are no specific legal protections covering this collective aspect of our digital lives. The Cambridge Analytica scandal that has embroiled Facebook over the past several months is a perfect illustration of this gap. What the company obtained via Facebook was the individual consent of 270,000 people who filled out a personality test. But what it gathered through these individuals’ contacts was data pertaining to at least 50 million people – a large proportion of Facebook’s social graph. This – the “social coordinates” that position us on the graph and the technical representation of our social relations – is exactly the kind of data that currently goes unprotected.

Despite the GDPR’s real and novel contributions, this collective aspect largely escapes the new regulation, which was written with the same focus on individuals as the 1978 French law on information technology and freedoms, which created the CNIL. Under this approach, data is understood from an individualist perspective, and seen only as relating to and identifying a specific person. What could seem more natural than to grant individuals the right to protect their private lives? To this end, the legislation aims to make sure companies obtain users’ “free and informed” consent before they can legally process their data. While this is one of the text’s strengths, this approach actually rests on a legal fiction that is already being exploited by online platforms: the isolated individual, capable of “self-determination”, on whom the regulation is built.

Of course, the GDPR introduces new guarantees, such as privacy by design and privacy by default, which take into account the power imbalance between the parties in question. These concepts suggest that apps should structurally incorporate protection for users’ private lives by setting up initial parameters that minimize data collection with regard to the end goal. Consent could therefore be expressed in a context where the individual is not placed by default in a vulnerable position. But leading digital companies have already become masters at manufacturing consent, right down to the design of their interfaces. New terms of use for Twitter, Google and Facebook have been introduced in such a way as to subtly guide users. For example, Google now gives them the option to deactivate the bulk of its profiling functions, in the full knowledge that only a small minority will bother to navigate the dashboard and avail themselves of this option. For its part, Facebook has deactivated facial recognition by default, but “informs” users that this function will prevent ill-intentioned parties from posting photos without their knowledge, and help blind users understand the images…

And what of our digital friends who accept these conditions? What will become of their data that relates to our private lives, and our collective digital footprints? Which parts of our lives may be sold without users ever knowing exactly how their relationships will be used to bolster the financial machine? To put it another way, due to the inherently social nature of the data, consent is always collective. What is at stake is not only individual dignity in the face of these demonstrable manipulations, but also our “collective dignity”. Solidarity between users to mutually protect the privacy of our relations is therefore key. GAFAM are designing platforms that put users in a position of subordination[1], caught up in a tidal wave of incitements arising from the excessive power imbalance between themselves and digital platforms. Yet the GDPR has just made individual consent sufficient to circumvent a collective debate on what in our lives can reasonably be sold.

The GDPR provides platforms – which might have reason to fear the exercise of individual consent – with a formidable means of preserving certain basic aspects of their operations. The text allows a company to invoke its “legitimate interest” in order to process data without the consent of users. The law states that rights and freedoms should not be compromised on this basis, but does not give a precise definition, leaving companies significant wiggle room. Google’s new terms of use claim that its advertising model is part of its legitimate interest. Facebook goes even further, claiming as legitimate interest the ability to “conduct research and innovate for the good of all” in order to “change our company and the world for the better”. This is a clever move, in that it implies that its legitimate interest goes beyond its private interest, and is indeed contingent on the interest of the general public.

While the regulation has several flaws, it is not necessarily doomed, since an alternative interpretation is possible. The notion of “free consent” in particular has great potential if its full meaning is given expression. The Article 29 Working Party, made up of European regulatory bodies, made a move in this direction, stating that consent cannot be valid if it is “conditional” – i.e. if users face negative consequences by refusing certain types of data processing. This would ban companies from coercing users into accepting specific kinds of data processing to avoid losing access to the service after 25 May, as Facebook, BlaBlaCar and AirBnB, to name a few, attempted.

The aim is to prevent consent from being used against individuals by coercing them into weakening their own rights, and by extension, our collective rights over our data. Protecting individuals in a position of weak bargaining power is traditionally the purpose of labour law and social legislation, both of which assert the collective nature of consent, whose legitimacy is rooted in the consideration of the goals and stakes of the negotiation at hand. Their aim is to guarantee individual dignity and prevent society from being broken down into individuals whose vulnerabilities can be exploited separately. The GDPR could play a similar role: individual data protection would then evolve towards a kind of social protection that could collectively protect individual dignity.

The GDPR may yet have teeth, but we must be realistic: its potential will never be realized if citizens do not make sure this strong interpretation is enforced by the privacy authorities and the courts. One innovative aspect of the GDPR is that it opens up the possibility of class actions. In fact, La Quadrature du Net will be filing class actions against GAFAM for breaching their obligation to obtain their users’ free and informed consent, and all French citizens can join the action until 28 May. This is the only way to force internet giants to go beyond superficial changes to their Terms of Service, and structurally amend their models in a way that respects our fundamental rights.

This battle over interpretation will determine the GDPR’s ability to effect real change and bring us out of collective submission to reclaim our rights and freedoms.

Translated from the French by Alice Heathwood for Fast for Word

[1] In reference to the concept of subordination in the French Labour Code.

Check the French version of this article. 
