Thinking With Machines
At the End of the Future
1. Messianic Negatives
In 1964 Marshall McLuhan famously stated that the medium is the message. The way we communicate, not necessarily the specific content which is communicated, alters the ratio of our senses.
For example, the radio collectively prioritizes the ear, television the eye. In this sense, media are, for McLuhan, a kind of social surgery. Far from merely affecting the site of an incision, media alter the whole functioning of the social body.
However, with his optimistic commitment to Christian messianism, McLuhan believed that the instantaneity of electric communication would bring us back to an original, primal state. Or so he told Playboy in 1969.
The global village – the technological realization of a primitive immediacy and connectivity – would be the body of an electric Jesus composed of each one of us as we plugged into His circuits. With increased connectivity and instantaneity, the flesh would be indistinguishable from the word, the medium indistinguishable from the message.
McLuhan, of course, did not live to see the development of the internet and the unique message of its medium.
The transformation of human behaviour, of life itself, into programmable code has been the primary consequence of online media platforms. We submit our desires, preferences, and the minutiae of our attention spans to the surgical table of big data. There, the insights of machine learning extract us from ourselves. Our data doubles fuel predictive models, mathematically secure predictions of our future utilized for everything from platform optimization to political messaging and commercial marketing.
These data doubles, however, do more than predict. They utilize snapshots of the past to inform what we will continue to see on our screens. They alter the horizon of the visible and proselytize a future derived entirely from a logic of efficiency and statistical prediction. Our capacity for truly private self-reflection is wrenched from us by machines to whose terms and conditions we happily provide consent.
As a result, the future disappears. Specifically, a transcendent and more humane future, which necessarily requires an ambiguous space to negotiate what is not yet possible. With every second of our attention collated and colonized, we no longer have the ability to bear responsibility for our own thoughts; we can no longer think otherwise, nor dream.
That is to say, the ability to formulate what has hitherto been inexpressible, unthought, and unimaginable is thoroughly eroded. Plagued by anxieties and fears of uncertainty, we already bear an ingrained resistance to self-knowledge. This resistance is merely strengthened and automated. The hard cultural and individual work of recontextualizing, reinterpreting, reintegrating, and transcending rigid systems of thought is abandoned.
There is no future.
But perhaps McLuhan’s vision of an electric Jesus was not far off the mark – a terminal, vacant space that swallows up all human potential under the guise of its ultimate flourishing.
The certainties and absolutes of religion are products of a regressive inability to distinguish illusory, wishful fantasies from the anxious realities they placate. The transcendent messianism of Christ places an immovable taboo on autonomous, introspective thinking, on growth and transformation. The digital messiah which McLuhan attempted to predict maintains the soothing death of thought, once enshrined in Christ and now encoded in mathematical models: the holy pacifier of nations. Like Christ, our new digital messiah ensures we are happy slaves who exchange freedom for certainty, exchange the future for the rote recombination of the past.
2. Snapshots
But the story doesn’t start digitally.
Just as the photographic camera scientifically reduces a slice of lived experience to the binary logic of visibility (light and shadow), the horizon of digital sight continues to reduce lived experience to the binary logic of computational prediction.
The images of the past – as in, our past behavioural data – are rigidly codified and calculated in an index of mathematical probabilities that exist outside of time. The past, inscribed by machines for the sole purpose of effective prediction, not understanding, comes to be othered from humanity to such a degree that it can no longer be a site of reflective transcendence.
History has no more lessons, so to speak. For history’s datafied inscription, where every record and imprint of a past moment is weighed in relation to its optimal use in generating a behavioural response, is tantamount to history’s forgetting.
The past is damned to be repeated, as the adage goes, and its damnation is now automated.
The future is foreclosed when the scraps of history are captured, like insects, in the sickly amber of platform optimization. When the past is communicated – i.e. remembered – solely as a tool for the excitation of a spellbound public, the living presence of the past in the present is obscured.
Enthusiastically consenting to the terms and conditions, the individual is saved from undertaking the difficult task of being responsible for their emotions – their arousal, rage, and inferiority. The individual also surrenders their ability to address their own role in projecting otherness, in sustaining and constructing rigid identities of race, sex, and class. Most importantly, the individual abrogates any ability for transformation. Slogans for change, for supposed futures, are bought, sold, and marketed as detonators of emotional responses, i.e. data. Slogans for change become ends in themselves, as opposed to merely being means that might lead towards as-yet-unimagined ends.
Our outrage and support are collated as an affirmation of the effectiveness of particular content. Just as the photograph is a snapshot of ephemeral light and shadow, with no regard for the human life that stands in front of the lens, so the predicted future of data is assembled out of impartial calculations of what retained eyeballs and struck visceral nerves.
In 1965, Theodor Adorno made a prescient observation concerning “cybernetic machines.” He suggested that the widespread adoption of the computer might lead to two possible outcomes. Individuals would either be unburdened of rote cognitive tasks, and thus offered new freedoms for self-determination; or the computer’s formalized patterns of thinking would be internalized as society’s guiding principle, a rigid plan for its most efficient organization. Within such a society, however, individual human beings would cease to be autonomous subjects. They would exist as instruments of a logic which is inherently external to human creativity, as objectified extensions of an otherwise inhuman ideal of optimized functioning.
Adorno argued that the allure of technology’s efficiency, which comes to stand in for ‘progress,’ produces “technological people” – people who, instead of directing individual and collective actions in accordance with humane needs and interests, adhere to technological optimization and corporate interests as the guiding principle of society. This deferral and automation of society’s rudder – so to speak – ensures that technology and its ideals are received as an external force in and of itself, as a justifiable end instead of merely a means towards a human dignity which is increasingly eroded and forgotten.
As a result, it is not merely that innocent users are cheated of some coherent picture of the world by the organizational efforts of algorithms. Individual humans begin to act algorithmically – as technological people.
3. Digital Drive
Consider the troll. With a high rate of affective reactions, the troll flourishes in an algorithmic environment and is perhaps the ideal technological person.
A mythical creature, the troll is also a monster under the bridge who resides at the limits of livable space. Monsters, as scholars have observed, de-monstrate. They point to uncomfortable, sharp edges found at the boundaries of human experience. They are visible social outcasts precisely because they represent incompatible aspects contained within society. Their presence affirms and denies what we have to live with yet cannot bear.
The troll is a monster, and what is monstrous in our technological society is our transformation into technological people. To “troll,” to act out, shock, and harass, is to mimic the established and accepted organization of information by order of its intensity of response and attention. Thus, the greatest concern of ubiquitous surveillance and behavioural prediction is not privacy. The greatest concern is the dispossession of our autonomy as we internalize and enthusiastically perpetuate an algorithmic pattern of thinking, offline as well as on.
The Frankfurt School often described both mass media and authoritarian propaganda as a kind of ‘psychoanalysis in reverse.’ Much as the psychoanalyst might follow the neurotic symptom of his patient back to its unconscious roots, the propagandist traces the anxieties of the masses with the sole purpose of indulging them further. In this way, the propagandist ensures that his public’s fears are never dismantled, so that he might continue to sell them illusory solutions.
By locating real affective disturbances and actively discouraging an authentic reflection upon their complex causes, entire populations can be absolved of introspection and convinced that their attention should solely be turned outwards – to an external ethnic group, social class, or cultural movement which quickly takes on the imagined role of adversary.
Algorithmic prediction, too, bears the mark of fascism through the assurance that an excess of efficient and noncontradictory data equals an excess of objective truth. Contemporary media platforms harness and exploit our tendency to indulge in projection, to repeatedly and willfully fail at self-reflection in order to assure the conformity and, thus, efficiency of predictions.
Indeed, Adorno was clear that one condition that gave rise to fascism in the 20th century was technologically facilitated power combined with a lack of accountability for where that power could or should be exercised. “Technology is making gestures precise and brutal, and with them men,” he wrote. “And which driver is not tempted, merely by the power of his engine, to wipe out the vermin of the street, pedestrians, children and cyclists? The movements machines demand of their users already have the violent, hard hitting, unresting jerkiness of Fascist maltreatment.”
That is to say, we are finding it easier and easier to explain difficult emotions by refusing to take ownership of our anxiety and inferiority. Instead, in an increasingly narcissistic turn we persistently label any insecurity as an imposition – as the alien work of a nefarious outside individual, group, or system. Perversely, a contradictory effect of this narcissism is the willful reduction of ourselves, as well as others, to the status of things – of data.
The computer, far from merely automating rote thinking, is today an engine whose sole purpose is the elimination of emotional work. The computer is a reverse psychoanalyst, a fascist propagandist whose task is to optimize, delete, and replace the mess, unpredictability, and contradiction of our ‘everyday unhappiness,’ as Freud put it, with the eradication of uncertainty and a totalizing emotional predictability.
But, although a technological development, this is a cultural situation which is far from new. Humanity has always struggled with an impulse, a drive for gratification, that skews our image of the world. Although we might wish to imagine a world of dignity and autonomy, in our efforts to eradicate uncertainty the world we often imagine is one of debasement and heteronomy.
But at least blissful certitude.
Thus, if humanity’s regressive, tribal past is any indication, perhaps we will continue to be content in treating life as raw material for the mill of placating illusions. Perhaps we are content to be dispossessed objects, carved out and harvested from the inside out.
The difficult burden of thinking, of active self-reflection and creative self-determination, is certainly alleviated in such a society – a veritable kingdom of clouds, signals, data, and illusions. A kingdom of no future.
And, as McLuhan suggested, a Kingdom of God.
Bibliography
Adorno, Theodor. [1951] 1991. “Freudian Theory and the Pattern of Fascist Propaganda” in The Culture Industry: Selected Essays on Mass Culture. London: Routledge. 132-157.
Adorno, Theodor. [1951] 2005. Minima Moralia: Reflections from Damaged Life. Trans. E.F.N. Jephcott. London: Verso.
Adorno, Theodor. [1965] 2005. “Notes on Philosophical Thinking” in Critical Models: Interventions and Catchwords. Trans. Henry W. Pickford. New York: Columbia University Press. 127-134.
Adorno, Theodor. [1967] 2005. “Education After Auschwitz” in Critical Models: Interventions and Catchwords. Trans. Henry W. Pickford. New York: Columbia University Press. 191-204.
Cohen, Jeffrey Jerome. 1996. “Monster Culture (Seven Theses)” in Monster Theory: Reading Culture. Ed. Jeffrey Jerome Cohen. Minneapolis: University of Minnesota Press. 3-25.
Löwenthal, Leo & Norbert Guterman. [1949] 2021. Prophets of Deceit: A Study of the Techniques of the American Agitator. London: Verso.
McLuhan, Marshall. 1964. Understanding Media: The Extensions of Man. New York: Signet.
McLuhan, Marshall. 1969, March. “Playboy Interview: Marshall McLuhan – Candid Conversation.” Playboy: Entertainment for Men 16, 3: 53-74, 158.