Surveillance capitalism

Patrick Zandl · 20 October 2019


Liberal capitalism has one charm: everyone in it can do what they want, as long as they do not restrict anyone else, and it is on this rule, with minor adjustments, that the modern world tries to stand. For a long time now, theorists on all sides of the barricades, from economists and sociologists to political scientists and prophets of the apocalypse, have been pondering what might upset this magic formula for success. Isn't liberal capitalism all-absorbing by its very nature?

The key to replacing liberal capitalism (leave aside why replace it at all; everyone sees it differently) obviously lies in that "do what they want". Totalitarian regimes tried to breed the new man by dictating to a long enough line of generations what they should want, and by using various forms of planning to satisfy those modified needs. This ran into many difficulties, from the inability to plan on such a scale to the limited success of changing habits.

Recently, it seems that another developmental direction of capitalism has found its way: surveillance capitalism. With the help of surveillance and powerful, smart computers, it is possible to target individuals and mould them into a mass that desires what it is presented with to desire. Only a few years ago, the means to bring about surveillance capitalism were unavailable or inefficient (mass media), but today they are commonplace. And that is why we pause on the subject.

The term "surveillance capitalism" has been promoted for several years by the American sociologist Shoshana Zuboff. Her book The Age of Surveillance Capitalism has already been published in Czech. According to Zuboff, the current stage of the economy can be characterised by three theses:

  • Everything that can be automated will be automated.
  • Everything that can be digitized will be digitized.
  • Any technology that can be used to digitise and control, if not regulated, will be used to do so.

Let’s leave the first thesis, from which Zuboff digresses to discuss the depression of contemporary society, for a separate story.

The latter two theses are essential for us now, and in practice we see them coming true. All information, including historical information, is gradually being digitised, and every technology that can serve digitisation is gradually being put to that use, including technologies as expensive and as unthinkable twenty years ago as satellite photography. What remains an open question is how much surveillance and control of individuals and communities will be enforced on the basis of this data. There are early harbingers, such as the tracking systems in China or Singapore, but there is no need to reach for state examples: influencing voter preferences through Facebook, or marketing to people through Google advertising, serves just as well.

If anyone still thinks this is a marginal matter that a discerning person will see through, they must be disabused of that fallacy at once. Again, let us leave aside how many people today have the information and analytical skill to spot a behavioural trap, and look at a practical example: the game Pokémon Go.

Pokémon Go as a data collector

Yes, it's "just a game". The player downloads it to their mobile phone, walks around in the real world, and sees the game elements, the various Pokémon they are battling, on the screen. Pokémon, like humans, congregate at prominent locations: McDonald's restaurants, for example, or Starbucks coffee shops. In fact, the developer Niantic signed sponsorship deals with these businesses in 2016 and keeps signing more. It is certainly more comfortable for young people to sit in a coffee shop hunting Pokémon than to wander around town. Niantic charges between 15 and 50 US cents per visitor to a "Pokéstop", which makes the game, itself free to play, a highly profitable affair. Niantic's primary interest, of course, is not to change user behaviour but to make money; its customers gladly pay when it succeeds in changing user behaviour in the direction they desire.

And that is the social camouflage of tracking technologies with digital data collection: they were not primarily created to change user behaviour. It is just too tempting and effective to use them for that purpose, and eventually someone will. In this case, commercial companies use them to forge a link between entertainment and sitting in their restaurants.

Is there room for free choice? Can you simply choose not to play such a game, or to visit a different Pokéstop? Hardly. Pokémon Go is a phenomenon in its user group, and not playing it means disqualifying yourself from the collective. And why would you seek out another Pokéstop when this one is so convenient and offers so many great gameplay elements? Will you deprive yourself of success in the game just to prove right the theorists who proclaim that such seduction can be resisted? At sixteen, hardly. The publisher sees to that: it can invest so much of the game's profits in development and promotion that the winner takes all. That is how the digital, globalised world works; there is no room for a "local version", and why settle for the industry's number two when you can have its number one?

Direct, targeted influence on habits through manipulation based on detailed knowledge of the manipulated subject: this is one method by which surveillance and data change people's behaviour. The other is the bare fact that "someone is watching", that someone else knows what you are doing. You are hardly going to engage in behaviour that others, an institution, or even the state will label and pillory as inappropriate. Never mind stealing; what about "quite unnecessarily" slandering the head of an institution, or even of the state? Do you need to do that when you know it will be found out?

The growth in the amount of analysable information that machines hold about us, together with the growth of their computational capacity and analytical algorithms, means that our behaviour as a human community is analysable and consequently modifiable. A sufficiently massive deployment of such technologies means that we will change our behaviour. Even if machines do not push all of us to change, the rest will be pushed by their environment. And if a few "outcasts" remain? That will always happen; no need to worry about that…

Companies versus the State

Another problem today is that state administrations are far less active in digitising information than private companies are. Companies such as Google, Facebook or Apple now control interpersonal communication far better and in more detail than states ever dreamed of. Gmail, WhatsApp, Messenger, Facebook, Waze and iCloud all keep adding digital information to the pile of knowledge about the target individual.

For now, it may be reassuring that internet companies are not cooperating with the state on sharing this information; even in China they are not very keen on it. There are reasons for this. First, internet firms have learned that the state does not behave very sensibly on the internet and lacks a long-term strategy, so they prefer to keep their distance, because past attempts at rapprochement have tended to end in state regulation. Secondly, states today are not interested in much of this information; they are not structured for it and do not know how to process it. At least we hope…

And so companies are gradually building structures parallel to the state. Subtly at first: they arrogate to themselves the right to restrict users' rights within their platforms beyond what the laws dictate. Liberal capitalism understands this: they manage their assets, they have a vision of how to carry out this management, and they have the "right" to do so. Facebook does not restrict public debate, it does not censor; from the perspective of liberal capitalism, it simply deletes content that it deems inappropriate for other users of its platform to see. The content would upset those users, who might leave or reduce their traffic, and that would hurt Facebook financially. Deleting content is not censorship, it is protecting an investment, says liberal capitalism.

That this approach to content has the same, and often worse, impact than censorship is something few people realise. At least in its own space, which is admittedly huge, Facebook has usurped a judicial power from which there is no appeal and against which it is often difficult to seek redress in the state courts. Now it is launching a virtual currency project, Libra, which encroaches on another marker of state sovereignty: the state's currency. It will be interesting to see how Facebook would hold up in a power struggle with state power, for example in an escalating military conflict. I would not bet on the destruction of its servers going smoothly, certainly not by digital attack and probably not by simple physical confiscation of server rooms. Any punishment would be economic rather than forceful, and even there the result would likely be an agreement rather than an escalation of the conflict.

In the end, this means a certain erosion of the state's influence, which politicians, and not only in the Czech Republic, cope with by buying their own media as a "nuclear briefcase", or rather a carpetbagger's suitcase. But that is another story, and it makes the problem obvious: it is individual politicians who are coping with the situation, not the states themselves. This is one of the sources of the crisis of liberal democracy.

The question is whether the massive spread of surveillance technologies will definitively change the world as we know it. To some extent it undoubtedly will. The element of danger and of the unknown is disappearing from the world as we used to perceive it. It is increasingly strange not to know very precisely where, for example, children or more expensive objects are. Everything is being fitted with tracking technologies in the name of "loss prevention". But this deepens our social aversion to loss: by preventing smaller losses, we lose the ability to cope with larger ones, whether material or psychological. And this will have consequences, all the more profound the more unbalanced the globalisation process becomes, as we are already seeing today.

Mass surveillance will change our society; there can be no doubt about that, just as there can be no doubt that its advent will not be stopped by global regulation. But regulation can blunt its biggest negatives. That is why it makes sense to talk about regulating surveillance technologies and, in particular, the processing of the data they produce.
