Halfway through James Bridle’s foreboding, at times terrifying, but ultimately motivating account of our technological present, he recounts a scene from a magazine article about developments in artificial intelligence. The journalist asks a Google engineer to give an image of the AI system developed at Google. The engineer’s response: ‘I do not generally like trying to visualise thousand-dimensional vectors in three-dimensional space.’ A few pages later, discussing the famous example of grandmaster Garry Kasparov losing a six-game chess match to the IBM supercomputer Deep Blue, Bridle quotes Fan Hui, an experienced Go player, describing the Google-developed AlphaGo software’s defeat of professional South Korean Go player Lee Sedol at the 2,500-year-old strategy game: ‘“It’s not a human move. I’ve never seen a human play this move.” And then he added, “So beautiful.”’
One of the first tests of a system’s intelligence is visual: AI systems are trained to recognise faces or to scan satellite imagery. Still, technology is not primarily considered a visual problem, even if the effect of new technologies on our lives is the subject of countless films which are, to echo Bridle’s title, often quite dark. Bridle, a visual artist whose artworks consider the intersection of technology and representation, from the shadows cast by drones to the appearance of stock images in public space, does not focus his book on representations of technology, but rather on a different visual problem: invisibility. In his introduction, Bridle warns that society is powerless to understand and map the interconnections between the technological systems that it has built. What is needed, the artist claims, is an understanding that ‘cannot be limited to the practicalities of how things work: it must be extended to how things came to be, and how they continue to function in the world in ways that are often invisible and interwoven. What is required is not understanding, but literacy.’
Literacy, in Bridle’s use, goes beyond understanding; it is what emerges from our struggle to conceive — to imagine, or describe — the scale of new technologies. Many of the examples in the book are visual and descriptive, providing new imagery to help his readers picture issues that should concern them but are hard to imagine because they happen far from the eye. In a chapter dedicated to complex systems, Bridle describes Amazon warehouses that employ a logistics technique called ‘chaotic storage’, in which goods are arranged on the floor not according to any order a human can grasp — alphabetised books, homeware in its own department — but according to an algorithmic logic that makes the system incomprehensible to its employees. The workers carry handheld devices that direct them across the facility: they are incapable of intervening in the machine’s choices, incapable of seeing its reasoning. And even visibility can be a reflection of darkness: when IBM developed the Selective Sequence Electronic Calculator, it was installed in a ground-floor shopfront on East 57th Street in Manhattan. The president of IBM at the time, Thomas J. Watson, wanted the public to see the SSEC, so that they would feel assured that the machine was not meant to replace them. The publicity photos of the IBM calculator, operated by a woman in a former shoe store, did not expose what was actually happening: the SSEC was being used to run simulations of hydrogen bomb explosions, carried out in full view in a storefront in New York City.
New Dark Age is neatly divided into ten chapters, each titled with a single word beginning with the letter C. ‘Chasm’ is the introduction, and one of the most valuable sections of the book, discussing how technological acceleration has changed society and charting the impossibility of seeing clearly how these changes affect every aspect of our day-to-day lives: ‘new technologies,’ writes Bridle, ‘do not merely augment our abilities, but actively shape and direct them, for better and worse. It is increasingly necessary to be able to think new technologies in different ways, and to be critical of them, in order to meaningfully participate in that shaping and directing.’ The next chapter, ‘Computation’, is a short history of computers, in which Bridle explores the entanglement of computational development and warfare, especially atomic warfare during the Cold War. A chapter called ‘Cognition’ is dedicated to artificial intelligence, and one titled ‘Complicity’ discusses surveillance and systems of control via technology. ‘Concurrency’ takes up an example Bridle has written about before — one that was picked up by major newspapers and television news — and expands it. The initial essay was titled ‘Something is wrong on the internet’; Bridle published it on Medium because, he explained in a short paragraph preceding the piece, he didn’t want the material he was writing about ‘anywhere near’ his own website. Looking at YouTube, Bridle pointed to a number of disturbing, weird, dark clips purportedly served up to toddlers: things like the ‘wrong head’ trope, in which the severed heads of Disney characters float onscreen to the sound of nursery rhymes until they are matched with the right bodies, or a bloody video of Peppa Pig going to the dentist. Bridle describes ‘a growing sense of something inhuman’ in the proliferation of these clips — a sense that isn’t necessarily related to the origin of the videos but rather to the way they are distributed to children: via an algorithm that serves them disturbing content because it is set to autoplay. Bridle links this example with a discussion of Russian interference in foreign elections via the distribution of misinformation, and he also brings in the Ashley Madison hack, which exposed that the dating site for married people ran tens of thousands of fake, automated female accounts that interacted with men: paying subscribers who shelled out money to chat with a piece of software attached to a photograph of a woman. The content directed at us, whether created by state propagandists, by corporations in search of advertising dollars and paid subscriptions, or simply by spammers, produces the same results — confusion, deception, and a relationship to power (state or corporate) that is constantly reasserted by the information we are served. ‘This is how the world actually is,’ Bridle says, ‘and now we are going to have to live in it.’ (And raise our children in it.)
In ‘Climate’, a summary of technology’s effect on climate change and of climate change’s impact on technology, Bridle outlines the endless cycle in which the abuse of resources damages a system that uses those same resources to study and monitor the climate. For example, cable landing sites, where the submarine cables connecting the internet reach the shore, are especially vulnerable to sea level rise — which is ironic, since the internet is itself a major player in climate change. The power that data centres require accounted for 2 per cent of global emissions in 2015, roughly the same carbon footprint as commercial aviation. Cryptocurrencies and blockchain software, so often discussed in emancipatory terms because of their potential to decentralise financial systems, consume as much energy per transaction as nine American homes use in a day; by some projections, blockchain will use as much electricity as the entire United States by the end of 2019. In Japan, it is predicted that by 2030 digital services will require more power than the nation can generate. And the network’s voracious consumption of power isn’t just the responsibility of the NSA’s data centres, but of end users too. ‘We need to be more responsible about what we use the internet for,’ Bridle quotes Ian Bitterlin, a UK expert on data centres, as saying: ‘Data centres aren’t the culprits – it’s driven by social media and mobile phones. It’s films, pornography, gambling, dating, shopping – anything that involves images.’
Which suggests the missing chapter — or approach — in the book: Culture. Bridle is an artist, and the visual examples he puts forth are some of the highlights of the book, especially in light of its subtitle: ‘the end of the future’. The end, that is, of something we’ve always imagined. There is a lovely short section where Bridle writes about the Concorde, the supersonic passenger plane that British Airways and Air France stopped flying in 2003. Bridle describes growing up in the London suburbs under the flight path to Heathrow Airport and hearing, every evening at 6.30 p.m., the rumble of the plane, its futuristic, sleek, triangular design an image of the future that died with the end of the Concorde flights. These stunning few paragraphs on design and its hold on the popular psyche follow a discussion of clear-air turbulence (whose increase is another terrifying result of climate change: flights hitting extreme turbulence in unforeseen areas) and precede a simple conclusion: that futuristic inventions and designs like the Concorde are the exception, and the rule is small in-flight adjustments, like a slightly better wingspan leading to slightly better fuel efficiency. These two pages set up an idea about what we cannot see: Bridle cites philosopher Timothy Morton’s idea of the ‘hyperobject’, a thing too big to see in its entirety and thus to comprehend. Climate, for Bridle, is a hyperobject — one we perceive only through its effects: a melting ice sheet, for example — ‘but so are nuclear radiation, evolution, and the internet.’
The things we cannot see are not always imperceptible because they are too large to comprehend; sometimes they are intentionally obfuscated. The simple example is the language we use when discussing technology: the ‘cloud’ for a network of linked servers; ‘open’ for a decentralised resource, even as open source is also a method of building free software out of business-friendly, hivemind labour. The ‘democratising’ potential of the internet is hailed by multinational corporations, those same corporations that stand to benefit from the positive PR of the ‘freedom’ that platforms like Twitter promote. Used without scare quotes, these ethereal, abstract terms press on us an understanding of the internet as an ecosystem with its own rules, one that is presented as intangible and ubiquitous. The far-from-simple example is Bridle’s discussion of high-frequency trading. In a chapter titled ‘Complexity’, Bridle describes a bicycle ride from Slough, just west of London, to Basildon, east of the city in Essex. The 60-plus-mile journey cuts through the heart of the City, London’s financial hub. The City and its cluster of glass towers is the public face of the UK’s finance sector, but the transactions that fuel it are made out of sight, in warehouses like the Euronext Data Center (the European outpost of the New York Stock Exchange) in Basildon and the LD4 Data Centre (the London Stock Exchange) in Slough. The glass towers, the stock exchanges designed like Greek temples, are now symbolic, empty signs: they stand for something that is entirely invisible, something that happens in warehouses on the outskirts of the city. Conjure an image from 1980s films about Wall Street and its culture: men on the trading floor shouting, fighting, running with slips of paper in their hands. Replace it with an image of men sitting in offices, hitting refresh again and again on their desktop computers. Then replace that image, too. Financial transactions have always depended on speed, but as computing power and network speeds have increased, the pace of these exchanges has accelerated to the point of leaving those men behind.
Now computers trade with other computers in countryside locations where space and power are available, and there is no symbolic imagery left at all. ‘Financial systems have been rendered obscure, and thus even more unequal,’ Bridle writes. The chapter on complexity is also the one that says the most about the effects of the meeting of capital — another C word — and technology on the societies we live in, especially in terms of labour. It is the chapter that includes a long discussion of Uber’s relationship to its drivers as contractors (whom the company forces to listen to anti-union podcasts) and the charting of Amazon’s storage facilities. These networks are not invisible; they are made to look invisible. And what is at stake in that opacity is the possibility of organising, both as employees and as citizens. Could there be an Occupy movement around obfuscated spaces like the data centres on the peripheries of cities?
Bridle’s conclusion begins with an event: the 2013 Google Zeitgeist conference. Held annually at an exclusive hotel in Hertfordshire, England, it is a private gathering for executives and politicians — though some of the talks are posted, TED-talk style, on Google’s ‘zeitgeistminds’ page. At the 2013 conference, Google’s executive chairman Eric Schmidt discussed the emancipatory power of technology. Schmidt talked about how technology, and particularly cell phones and their built-in cameras, could prevent atrocities by exposing them — ‘somebody would have figured out and somebody would have reacted to prevent this terrible carnage.’ His example was the Rwandan genocide, which, he said, had to be planned: ‘people had to write it down’. An image of those plans would have leaked, Schmidt was certain, and ‘somebody would have reacted’. Bridle summarises it neatly: ‘Schmidt’s — and Google’s — worldview is one that is entirely predicated on the belief that making something visible makes it better, and that technology is a tool to make things visible.’ But of course the UN, the USA, Belgium and France all had access to intelligence from Rwanda, including radio broadcasts and satellite imagery, and ‘somebody’ didn’t react. Bridle cites a report on Rwanda whose finding, he notes, could have been the conclusion of his book, too: ‘any failure to fully appreciate the genocide stemmed from political, moral, and imaginative weaknesses, not informational ones.’
The inability to understand the scale and impact of technology on human lives is not a visual problem; it is a problem of imagination. One of the significant achievements of Bridle’s book is that it challenges the idea that participating in the conversation about technology requires prior technical knowledge. Rather, Bridle points out, the fight is against the intentional obfuscation of systems, and that is before we even consider machine vision: to counter Schmidt’s idea of technology as a tool for making things visible, we need to criticise the role of technology in the creation of that image. Considering these complex questions of representation, perhaps we should look to visual artists for a reflection of the world we live in, and recognise that to point to the darkness is a way of shining a light. For the informed reader of technology criticism, New Dark Age will not be a revelation. But Bridle’s research is impressive, and the knowledge, examples and concerns he lays out are presented in an organised, systematic fashion. As a summary of discussions spanning many disciplines, from finance to entertainment to climate change, Bridle’s book is not a primer but a crucial illustration of just how intertwined these concerns are.
New Dark Age takes its title from H.P. Lovecraft’s ‘The Call of Cthulhu’ — ‘that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age’ — but then goes on to cite a line from Virginia Woolf’s diaries: ‘the future is dark, which is the best thing the future can be.’ This book is not a collection of prophecies; it is a commitment to the present. ‘Nothing here is an argument against technology: to do so would be to argue against ourselves,’ writes Bridle. He insists that what is needed is not understanding, but a new language, new metaphors — a new image — that would allow us to look at the darkness directly and — hopefully — begin to see.