
Cultural Studies

The Dream of Algorithms: Our Calculated Existences

by Thorsten Botz-Bornstein

Dominique Cardon is a French sociologist who studies the social dimensions of the internet. In What Are Algorithms Dreaming Of?, his third book, he unmasks the world of algorithms that increasingly dominates our everyday lives. An algorithm is a self-contained mathematical procedure used in data processing and automated reasoning. The book’s title suggests that algorithms seem to be “dreaming” of something: that is, the “reasoning” of algorithms imitates neither Kant’s pure reason nor Aristotle’s practical reason. What raises more concern is that we are supposed to accept their dream as reality.

Image: Camelia_Boban; Source: Wikimedia Commons, used under CC BY-SA 3.0 License


There are few aspects of our lives that are not influenced by calculating infrastructures: shopping, travelling, sex, personal and professional decisions, and so on. While we do not always take these calculations seriously, we also make no serious attempt to escape them. Underlying this algorithmization of everything is the thought that our lives – our bodies, conversations, food, and sleep – are measurable, and that the right calculation of the relevant data will make our lives more efficient. The problem is that this activity of measurement, to which in the past only specialists – mostly employed by governments – had access, can now be carried out by anyone. Another problem is that the personalization of data creates a new category of the social, leading to the “behaviour society,” in which the relationship between a central authority and the autonomous individual is increasingly being redefined, such that everyday reality is reinvented in terms of statistics. In the end, humans begin to adapt to this reality, which means that algorithms do not merely calculate but transform reality. Still another problem is that, most of the time, the calculations themselves remain opaque.

For example, it is very easy to manipulate online audience measurements, be it through “clickbait” or through the selling of “third-party cookies” to ad networks like Weborama, DoubleClick, or Right Media, whose task is to link online advertisements to browsers. Fake sites are routinely created in order to accommodate links, and there are even site farms (p. 27): “Facebook and Twitter do nothing to chase away false accounts and cheaters” (p. 81). This artificial fabrication of authority has become the elephant in the room.

Though the preachers of algorithms claim, in the name of decentralization, to have overcome the traditional systems of authority (and their professional representatives: journalists, editors, marketing specialists, and so on), in reality they have created a new authority that will sooner or later exclude those who are not already at its centre. Consider internet traffic: currently, one percent of internet users capture ninety percent of the overall visibility (p. 95). What kind of decentralization is this? Attention attracts more attention. In the traditional world of authority, attention was a matter of merit; in the brave new world of “numerical affinities” it is merely fabricated (p. 28). Cardon explains this through Gabriel Tarde’s theory of “gloriometers,” which has recently attracted the attention of sociologists like Bruno Latour. Social networks, on this account, are immense “factories of expressive signs.” These signs are quantified and classified according to the guidelines of an abstract form of excellence: you have to be “the best” in terms of clicks and links. The inevitable consequence is that the numerical reality invented by algorithms becomes more and more incompatible with the lived reality of our everyday existence.

Server farm at a traditional data centre (University of Minnesota; photo: Cly22; source: Wikimedia Commons; used under CC BY-SA 3.0 License)


These are Cardon’s core arguments, which he unfolds over little more than a hundred pages, backed up with empirical studies and presented through an interesting didactic scheme. Cardon believes that, in relation to the web, calculation occupies four different positions: on top of, underneath, next to, and inside the web, with the “calculator” constantly migrating from one place to another, thereby weakening the standard model of social statistics (pp. 40-41). Located on top of the web, the calculator counts marks of “recognition” such as the frequency of links. Located underneath the web, it records the traces left by users. Situated next to the web, it counts clicks in order to grasp the public in its totality. Finally, from inside the web, it attempts to produce visibility itself.


False Freedom

Algorithms are taken to be authoritative not because they have acquired a particular understanding of a webpage, but because they measure the social force of the webpage inside the structure of the internet: that is, they measure how many other pages link to it, and how authoritative those linking pages themselves are. This revolutionary idea led Sergey Brin and Larry Page to found Google, the first search engine based on PageRank. More generally, algorithm culture claims, in the spirit of positivism, to have overcome the ideologies and traditional authorities that, it holds, make individuals unfree. By contrast, an individual who follows an algorithm is, in a sense, merely following himself. The internet user who follows algorithms does not have to adhere to the “essential” tendencies of his or her ethnicity, nationality, religion, or even gender. This “self” is truly individual, since it is not linked to any larger group. What could be freer than that? When Amazon.com recommends a new CD to a user, this is not because the user is a Muslim or an American woman; the product is suggested because the algorithm has evaluated the person’s previous behaviour. This freedom, however, stops once algorithms lock us into their own dream of a quantified person. By basing all knowledge on the quantitative analysis of interests, the algorithmic approach refashions individuals and entire societies. It reduces individuals to their behaviours, from which they will have difficulty escaping. In the end, the user is captured in a bubble that is explained to him as the result of maximized freedom. Real freedom, however, has been stifled by the calculators’ ambition to construct an image of statistical regularity that we take for granted and have begun to accept as the result of our free choice. The famous “nudges” that Cass Sunstein analyzed in Why Nudge? can very well be construed as the kind of “voluntary paternalism” that cults and religions use to make their members believe that their forced “choices” were actually voluntary.
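
Cardon gives no technical details, but the idea is simple enough to sketch. Below is a minimal, illustrative power-iteration version of PageRank in Python; the four-page link graph is invented for the example, and the damping factor of 0.85 is the conventional value from Brin and Page’s original paper:

# A minimal PageRank sketch (an illustration, not Cardon's text): a page's
# authority is the probability that a "random surfer" ends up on it.
links = {            # hypothetical toy web: page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
damping = 0.85       # conventional damping factor from Brin and Page
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # power iteration: repeat until the ranks stabilize
    new_rank = {}
    for p in pages:
        # rank that every page q linking to p passes along, split evenly
        inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # "C" comes out on top

Even this toy version exhibits the circularity described above: “C” is authoritative because well-ranked pages point at it, and those pages are well ranked in turn because of who points at them. Authority is read off the link structure alone, with no understanding of any page’s content.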

A classic existentialist theme comes to mind when Cardon argues that people submit their destiny to the “funnel of the probable” (p. 16), simply because of the disproportionate size of the sample of personal information from which an algorithm is built. Through algorithms the probable becomes an essence to which we are supposed to submit our existence – instead of, as Sartre urged, imposing our own existence upon the world around us (and only afterwards determining any potential essences). We accept our own essentialization through quantification and rationalize it as an act of freedom, because what has been quantified are our own choices. Yet, asks Cardon, “can the profile that has been derived from my traces not lead to aberrant and discriminatory decisions?” (p. 80). Given that this regularity is produced automatically, we should be suspicious of any freedom-based argument. Are automatically generated choices compatible with a more existentially determined idea of freedom? Certainly not. People can have “strange” existences that do not fit into schemes of regularity.

To be free means to have the right to make mistakes one can learn from, as Cardon points out in some very well-written passages (pp. 105-6). Being lost in a city, losing time and money, taking risks, marvelling at something one does not really understand, assuming responsibility for one’s mistakes… all these are existential experiences an individual is entitled to in the name of freedom. In the world of algorithms we no longer have this freedom.


The End of Hypotheses

A free individual also has the right to make a guess based on a limited amount of data, and to hope that this guess is right. Data science, of course, has no truck with guesswork. The persistently optimistic data scientist takes the exact sciences as a model, and since computers can calculate enormous amounts of data instantly, it has become virtually unnecessary to make guesses or even to formulate hypotheses (p. 53). Since, on this view, we have data about everything, all we need to do is compare different data sets; the results will emerge immediately. The only hypothesis still permitted is that a regular and predictable set of characteristics does exist.
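
To make this “end of hypotheses” concrete, here is a small sketch of my own (not from the book): the variable names and data are invented, and one association is deliberately planted. Instead of testing a prior theory, the machine computes every pairwise correlation and simply reports the strongest:

# Hypothesis-free "discovery": correlate everything with everything and see
# what surfaces, with no prior theory about which variables should matter.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 1000
data = {                                    # invented behavioural traces
    "clicks":    rng.normal(size=n),
    "purchases": rng.normal(size=n),
    "sleep":     rng.normal(size=n),
}
data["purchases"] += 0.8 * data["clicks"]   # plant one real association

pairs = []
for a, b in itertools.combinations(data, 2):
    r = np.corrcoef(data[a], data[b])[0, 1]
    pairs.append((abs(r), a, b))

# The clicks-purchases link "emerges" at the top with no hypothesis needed.
for strength, a, b in sorted(pairs, reverse=True):
    print(f"{a} ~ {b}: |r| = {strength:.2f}")

The only assumption built in is the one Cardon names: that some regular, predictable pattern is there to be found.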

When you have a GPS you do not need to guess which route will best get you to your destination. Knowledge is automatized. All this is efficient, but in the process of automatization the existential act of choosing has been purged. A free individual must have the right to reflect and to contemplate from afar for as long as he deems necessary. Hence, a truly free individual must also be able to scrap his own internet profile at any moment. Unfortunately, this option is no longer available unless we decide to go completely “off the grid”. Only then could we hope to govern ourselves without being essentialized in terms of our behaviour, and separate ourselves from the automatized existences suggested by algorithms.

Paradoxically, the algorithmization of life goes against what the free citizen values most highly. People are by and large suspicious of centralized powers, be it the power of politicians, journalists, or unions. People also abhor being sorted into broad categories, believing instead that their individuality fits into no “box.” Yet these very same individuals allow themselves to be locked into the bubble of algorithms – partly because this new algorithmic authoritarianism has successfully been camouflaged as non-authoritarianism, partly because they are impressed by the speed and the effects of algorithmic coordination.

Existence, just like experience, cannot be quantified. What can be quantified are clicks, purchases, performances, and interactions. Since mere quantification is not interesting in itself, the data must be fed to algorithms that compare them with other clicks, purchases, performances, and interactions. As Cardon puts it:

“The calculators replace a unified theory of behaviours with a constantly revisable mosaic of contingent micro-theories that formulate local pseudo-explanations of probable behaviours. These calculations are supposed to lead us towards the most probable objects. They do not need to be, and often cannot be, understood. This inverted way of fabricating social life denotes a reversal of the causality underlying statistical calculation. The purpose is to take into account the individualization of our societies and the increasing indeterminacy of the forces that determine our actions.” (p. 53)

Ironically, then, social indeterminism has led to a new determinism (or behaviourism, “comportementalisme”) embodied in new calculation techniques (p. 68). The freedom that was won in postmodern societies has now produced new constraints, imposed upon us this time not by dictators or by capitalism but by technology. Cardon’s book offers arguments drawn from a body of research by French scholars that has yet to be discovered by the international research community. The works of Alain Desrosières, Pierre Lascoumes, Patrick Le Galès, Isabelle Bruno, Emmanuel Didier, Gilbert Simondon, and many others have created a philosophical culture of techno-criticism that has so far gone largely unnoticed outside France. Written in non-technical language, the book may well change your view of the internet.

Dominique Cardon: A quoi rêvent les algorithmes: Nos vies à l’heure des big data.
Paris: Seuil, 2015.
ISBN: 9782021279962
Paperback, 112 pp., EUR 11.80.

Thorsten Botz-Bornstein is an Associate Professor of Philosophy at the Gulf University for Science and Technology in Kuwait. He is the author of Films and Dreams: Tarkovsky, Bergman, Sokurov, Kubrick, Wong Kar-wai (2007) and has written a number of books on topics ranging from intercultural aesthetics to the philosophy of architecture.
