Data isn’t the new oil. And we should be careful.

Big data and its potential are all the rage. But data might actually be the new CO2, as previously analyzed in a Luminate blog post. More recently, the covid-19 crisis has added surveillance risks, as described by Yuval Harari. The European regulator has acknowledged how drastically the situation has changed:

We could not even imagine that reasonable people would start asking internet & telecom operators to possibly track each and every person in Europe using his or her mobile location data in real time, and to create a diagram representing all physical interactions between people. — Wojciech Wiewiórowski

Privacy is a fundamental but not absolute right. Interferences may be justified when (and only when) they are prescribed by law, necessary to achieve a legitimate aim, and proportionate to that aim. In the current situation, the legitimate aim is to limit the spread of a contagious disease until a cure is available.

The proportionality requirement is more complex, especially in emergency situations that leave no time for careful design and evaluation. I would argue that the legitimate aim can just as well be achieved through other means: in short, humanity did not wait for iPhone and Android to fight pandemics, and pedagogy is a safer bet than technology-controlled behaviour. Most contact-tracing applications, even when they claim anonymization, cannot technically guarantee it is completely safe (it depends, of course, on the types of data and the methods used). Re-identification is usually possible (see, for example, these two articles in Nature and on arXiv).
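
To make the re-identification point concrete, here is a minimal sketch in Python (with entirely made-up records, not any real dataset) of a linkage attack: an "anonymized" table with names removed is joined with a public auxiliary table on quasi-identifiers such as ZIP code, birth date and gender, which is often enough to put names back on sensitive records.

```python
# Hypothetical illustration of a linkage attack on "anonymized" data.
import pandas as pd

# "Anonymized" health records: direct identifiers removed, quasi-identifiers kept.
anonymized = pd.DataFrame({
    "zip": ["75011", "13002", "75011"],
    "birth_date": ["1985-03-14", "1990-07-02", "1985-03-14"],
    "gender": ["F", "M", "M"],
    "diagnosis": ["diabetes", "asthma", "hypertension"],
})

# Public auxiliary data (e.g. a voter roll or a scraped social profile dump).
public = pd.DataFrame({
    "name": ["Alice Martin", "Bob Durand"],
    "zip": ["75011", "13002"],
    "birth_date": ["1985-03-14", "1990-07-02"],
    "gender": ["F", "M"],
})

# Joining on the quasi-identifiers: any unique combination re-identifies a person.
reidentified = public.merge(anonymized, on=["zip", "birth_date", "gender"])
print(reidentified[["name", "diagnosis"]])
```

Removing names is therefore not anonymization by itself; whether stronger techniques (aggregation, differential privacy) are actually used is exactly the kind of detail these applications rarely make verifiable.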

The rest of this article focuses more specifically on the data practices of Big Tech. Before the current pandemic, two important books were published in 2019 that everyone should know about and which complement each other:

  • The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, by Shoshana Zuboff

  • Between Truth and Power: The Legal Constructions of Informational Capitalism, by Julie E. Cohen

In case you don’t read to the end: data is not oil but a product of social creation. As a citizen, you should care, because your current choices will shape the future.

The first book, based mostly on an analysis of Google and Facebook, provides insightful descriptions of what the author, a professor emerita at Harvard Business School, calls “surveillance capitalism” and of how it constitutes a threat to democracy. The meteoric rise of Google in the aftermath of 9/11 is well explained: one search engine among many in the early internet, its breakthrough came with the AdWords business model.

Google’s unique auction methods and capabilities earned a great deal of attention, which distracted observers from reflecting on exactly what was being auctioned: derivatives of behavioral surplus [and Zuboff spends a great deal of effort explaining how this is different from the traditional argument “If You’re Not Paying For It, You Become The Product”]

Probably everyone knows about 1984 and Big Brother. Zuboff reminds us of another pre-apocalyptic novel, Walden Two, written in 1948 by a Harvard scholar named Skinner, as an illustration of what behaviorism could achieve: automate us. Only this time it has come true, she argues. Thanks to the vast troves of geolocated data about their users, games such as Pokémon Go actually change our behavior without us even noticing (the Arte mini-series “Dopamine” was also very good on the subject). According to Zuboff, the technologies and apps provided by the GAFAs are diffusing a sort of muted and sanitized tyranny.

A good and lengthy critique of the book has been provided by Evgeny Morozov. One relevant issue is that alternative theories are not discussed, and Zuboff’s proposed solutions fall short of the challenge.

But what is really disturbing is when esteemed intellectuals don’t put their money where their mouths are, as in a recent interview with the Italian newspaper La Repubblica:

Source: https://rep.repubblica.it/pwa/intervista/2020/04/09/news/shoshana_zuboff_altro_che_privacy_le_app_per_il_controllo_della_pandemia_devono_essere_obbligatorie_come_i_vaccini_-253587046/

Of all people, Zuboff setting aside privacy concerns and advocating mandatory pandemic tracking (whatever you as a reader think of it) is deeply dissonant. The defense of privacy cannot be conditional on the type of dystopia. Whether the risk is Big Tech or Big Brother, the discourse should not change:

I need privacy. Not because my actions are questionable, but because your judgment and intentions are.

Whatever its limitations, the book’s analysis of how our selves may get lost in the process is worth a read. Zuboff tends to present Google as a nearly omnipotent entity that will prevail no matter what, which is not true: Google is well known for its many failures, as illustrated by its cemetery of discontinued products. The Google we know today could be better, but it is not evil either.

But Google is damn powerful in its capability to drive change, not only in technology but also in law and society, and it sets the tone for the rest of the industry. Likewise, data hunger has become a widespread business strategy.

Cohen’s book better explains the power effects at play:

The bigger problem with Zuboff’s account is that her fixation on threats to our autonomy screens out broader and arguably more important problems of private power in the information age — for example, the ways in which network effects feed platform power, informationalism generates winner-take-all dynamics, and digital technology has impacted labor — Amy Kapczynski

Zuboff also claims that surveillance capitalism is built on “lawlessness”, owing to legal terms that users of the platforms cannot comprehend or even read. Law experts such as Julie E. Cohen, of course, provide much more insight into the matter. Amy Kapczynski gives a more thorough analysis in the Yale Law Journal, and since I won’t do any better, the best I can do is point to it. I believe it is of particular importance for European readers, who are less familiar with the institutional settlements largely driven by the US legal system (where the largest software companies are headquartered).

The path towards regulating Big Tech

As Big Tech keeps using its current demand and social glow to lobby against regulation, for better or worse, building an intellectual framework to better grasp the challenges is a critical requirement, ideally leading to actionable insight, as in the proposal of Sebastien Soriano, the head of the French regulatory agency ARCEP.

The European Commission is considering imposing legal obligations on the gatekeepers of digital platforms to remedy or prevent “commercial imbalances”. This includes digital IDs, since customers using a single ID to log in to a range of unrelated third-party services could be locked in: “Restrictions or separations of digital ID services from platforms’ commercial operations may be necessary”. That would be a game changer.

There are now movements supporting that change at scale. The society centered design manifesto is worth a read (and a signature). Ethan Zuckerman, an academic researcher at UMass, has also recently announced his focus on proposals for public digital infrastructures, a subject I have discussed from an open-source viewpoint in another blog post.