Surveillance capitalism is a large undertaking. A technical one. Sensors in our homes and on our bodies connect to towers and cables running to massive computer centres doing the data processing. A built world meant to collect, command and control our habits, and vested in a few companies.
Being connected to this machine is a scenario ripped from the pages of cyberpunk science fiction, yet it is a mundane detail today. Every day we live close to bots, algorithms, AI and internet daemons—ubiquitous programs and code-based creatures that have colonized the web. Their updates greet us in the morning. They dim our screens to remind us to sleep.
The scale of this machine can be unsettling, especially when understood in the service of surveillance and control. Resistance demands constant suspicion and distrust. Paranoia is a best practice, but the state of alert is hard to maintain.
We don't think much of this world because we don't have to. An entire industry keeps this system running. Known as information security, or InfoSec, it protects the data of surveillance capitalists. The industry is already valued at an estimated $149 billion. It is the price some pay for others to worry about the machine.
InfoSec professionals are growing concerned about the morality of their work. These workers now question the ends of this machine and the exploitation of their labour, data and programs in servitude to surveillance. This consciousness threatens the operation of surveillance capitalism.
Workers at Google have refused to work on U.S. military contracts, including Project Maven (AI that interprets video images to improve drone bombing accuracy) and a censored search engine for China called Dragonfly. Edward Snowden and Chelsea Manning struggled with their involvement in the surveillance complex, becoming whistleblowers who revealed the extent of government surveillance of the internet.
Developers also work on privacy-enhancing technologies like Signal to encrypt text messages, or Firefox to browse the web more securely (perhaps combined with anti-tracking plugins such as Privacy Badger). Many of these projects are old, a reminder that programmers, academics and activists have long worked against popular computing being an instrument of command and control.
It will take more than the ideals of security to crystallize this critical consciousness into something more transformational. Too often, security is the flipside of surveillance—how personal information and data remain a secure private asset. Security as a virtue promises a certainty and a stability that ignores the fragility of the machines, and their ecological and economic impacts. A desire to remake the world in an ordered grid, however unachievable, will always require further investment in the machinery of surveillance capitalism.
No, we need to reconsider information security entirely. Not a change in practice so much as a change in perspective. Computing unbundled from the slavish drive of data collection and control. The field must continue to keep data safe and protect privacy, but it can also create a better life for all. One where technology leads to fewer working hours, greater equality and justice. Automation for human dignity.
My proposal, one of many like it, is that we reconsider our relations to all those machines watching us. Daemons, bots and algorithms can be allies, not enemies, as I have learned from Indigenous approaches to technology. What would it mean to live with machines without feeling under their control? To live in peace with them rather than in constant paranoia of being watched?
Such an alternative requires bringing in the environmental impact of this mega-machine. Data and information technology are already part of global ecology. Gigabytes are geology. Signals are sustainability. Today’s logic of total information awareness requires tremendous energy and natural resources, so much so that unlimited data storage and constant surveillance might be too much for this world.
One way then to dismantle surveillance capital is to highlight its absurdity in placing unrealistic and out-of-touch demands on humans, machines and Earth. All suffer from the fantasy of total control and perfect information. In its place, we must imagine ways of being and knowing this mega-machine that appreciate its environmental and technical fragility, its uncertainty and ultimately its humanity among all its parts.
Fenwick McKelvey is Associate Professor in Information and Communication Technology Policy in the Department of Communication Studies at Concordia University. His book, Internet Daemons: Digital Communications Possessed, was published by University of Minnesota Press in October.