‘What Hath God Wrought!’

Robin Boast, The Machine in the Ghost: Digitality and its Consequences

Reaktion Books, 208pp, £16.9, ISBN 978-1780237398

reviewed by James Draney

What are we to make of this machine, the computer? Indeed, what is this device, whose operations are forever obscured behind its slim outer casing? Perhaps this is what the literary critic Fredric Jameson meant when, in 1991, he wrote that our digital technology does not possess the same capacity for representation as the older, analogue machines. Turbines and steam engines, of course, possess a certain visual power. Consider the ‘kinetic energy’ of Futurist sculpture, or the ‘mimetic idolatry’ of Diego Rivera’s thundering pistons. But the computer, which lacks this ‘emblematic or visual power’, resists such idolatry. It is nothing more than an outer shell, and we experience its inner processes at a distant remove. Our devices function: that is all we know. Digitally encoded instructions pass through the computer’s hardware and, like magic, they appear on our screens as surface effects.

Perhaps this is why the computer is so often imbued with a mystical significance. Where Rivera and Marinetti idolised their speed-and-energy machines, we see the computer and its digital operations as something altogether different. We live in an age in which Transhumanism and its cultic offshoots preach eternal life in the form of the technological Singularity, but this is only the most extreme instance of techno-worship. Meanwhile, in the far more prosaic realm of academic criticism, the phenomenon known as ‘vapour theory’ has polluted critical engagement with these technologies, replacing factual knowledge with vague metaphors and techno-mysticism. In our hyper-rational age, we’ve managed to imbue our technology with an inordinate amount of spirit. Too many ghosts, not enough machines.

So this is the paradox of the digital: that the most misunderstood concept of our time is the one that bears its name. We live, so we’re told, in ‘the digital age’, but what exactly does this mean? Digital is too often used as a lazy synonym for ‘high tech’, fetishised as some futuristic sublime. But, as Robin Boast argues in his excellent book, digitality has little to do with computers, nor with the mystical ‘ghost’ in our machines. Rather, as Boast puts it, ‘the digital is corporeal.’ The machine is in the ghost.

In this important and lucid book, Boast explains the emergence of digital encoding and helps the lay reader bridge the gap between what actually occurs inside our hardware and what we experience in the ubiquitous world of the interface. To do this, Boast begins by distancing ‘the digital’ from computation – a process with its own history. Digitality, rather, is primarily a matter of encoding, a form of communication, which is why Boast begins his story with one of the best-known encoded messages in history, the first to travel down Morse’s new line from Washington, D.C. to Baltimore in a matter of seconds. ‘What hath God wrought!’ tapped Samuel Morse on 24 May 1844, giving birth to his eponymous code. Morse did not invent digitality, of course. His encoding was neither binary nor digital, but quinary, that is, based on five elements. Yet Morse laid the groundwork for the first digital code, which was brought into use thirty years later.
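
To get a feel for what ‘quinary’ means here, consider a small sketch of my own (it is not in Boast’s book), which spells out Morse’s famous message using the five signalling elements usually attributed to his code: the dot, the dash, and three distinct lengths of silence – within a letter, between letters and between words.

    # Illustrative Python sketch: Morse's code analysed as five elements --
    # dot, dash, and three lengths of silence. The mapping covers only the
    # letters needed for this example.
    DOT, DASH = '.', '-'
    INTRA_GAP, LETTER_GAP, WORD_GAP = '', ' ', ' / '

    MORSE = {
        'W': [DOT, DASH, DASH], 'H': [DOT, DOT, DOT, DOT], 'A': [DOT, DASH],
        'T': [DASH], 'G': [DASH, DASH, DOT], 'O': [DASH, DASH, DASH],
        'D': [DASH, DOT, DOT], 'R': [DOT, DASH, DOT], 'U': [DOT, DOT, DASH],
    }

    def encode(message: str) -> str:
        words = message.upper().split()
        return WORD_GAP.join(
            LETTER_GAP.join(INTRA_GAP.join(MORSE[ch]) for ch in word)
            for word in words
        )

    print(encode('what hath god wrought'))
    # .-- .... .- - / .... .- - .... / --. --- -.. / .-- .-. --- ..- --. .... -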

The year 1878 is usually remembered as belonging to the telephone – that great analogue invention – but Boast points out that another, perhaps more consequential, invention was featured at the Paris Exposition Universelle that year. Not long after Alexander Graham Bell, Elisha Gray, and Thomas Edison received their awards for their pioneering work on the telephone, a humble employee of the French Post & Telegraph administration approached the podium to accept an award for something called the Printing Telegraph. And with this, Jean-Maurice-Émile Baudot laid the first brick in the road to our digital universe. Baudot’s Printing Telegraph was an encoding system that ran on five-bit binary code. It was not the first binary code, of course, but it was the first that can properly be considered digital, and its essence still exists in our computers, tablets, and mobiles today. As Boast remarks, ‘all binary communications, and computer information, is still encoded by fixed length binary code – digital code.’
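
The point about fixed-length encoding is easy to demonstrate. The sketch below is mine, not Boast’s, and it does not reproduce the historical Baudot tables (which relied on letter and figure shifts); it simply shows what it means for every character to occupy exactly five bits, giving thirty-two possible patterns.

    # Illustrative only: a fixed-length, five-bit character code in the
    # spirit of Baudot's Printing Telegraph (the historical code tables
    # differ). Every character becomes exactly five bits on the wire.
    ALPHABET = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ '   # 27 symbols fit in the 32 slots

    ENCODE = {ch: format(i, '05b') for i, ch in enumerate(ALPHABET)}
    DECODE = {bits: ch for ch, bits in ENCODE.items()}

    def to_wire(text: str) -> str:
        return ''.join(ENCODE[ch] for ch in text.upper())

    def from_wire(bits: str) -> str:
        return ''.join(DECODE[bits[i:i + 5]] for i in range(0, len(bits), 5))

    signal = to_wire('BAUDOT')
    print(signal)             # thirty bits: six characters of five bits each
    print(from_wire(signal))  # BAUDOT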

But Baudot’s code was not yet decoupled from human perception. Real human beings still had to code and decode in order to send and receive messages. It took the ingenuity of a New Zealander named Donald Murray (perhaps even less well known than Baudot) to bring the Printing Telegraph to its logical conclusion. In 1905, Murray patented a typewriter combined with a telegraph machine, binding the QWERTY keyboard to digital code for good. But this brilliant convergence also marked a separation. As Boast explains: ‘Murray’s system separated, forever, the operator from the code.’

This was but the first fissure in the grand separation of interface from code. From Murray onwards, humans would no longer read and write in code, translating or decoding messages into natural language. Thanks to Murray, a machine would forever do it for us. ‘What Murray did, probably unwittingly,’ writes Boast, ‘was to establish one of the key qualities of digital encoding: the separation of the digital code from its referent.’ As the literary critic and computer programmer Friedrich Kittler once put it, ‘We simply do not know what our writing does.’

Soon, the evolution of digitally encoded messages converged with that of the electronic computer, thanks to the invention of the punched card. Prior to their coupling with digital code, computers were actually quite limited in their functions. They are called computers because they were designed to compute: to do sums and solve functions with an input and an output. The advent of punched cards, however, allowed a digital character code to flow easily between different types of systems. Cards made information into a material substance that could be counted and calculated. Only when the card and the code began to converge, in the mid-twentieth century, do we see the emergence of what we now call the ‘digital age’.

Famously, it was Alan Turing who designed the first universal machine. But Turing’s machine did not do sums – it executed algorithms. In this sense, it has nothing whatsoever to do with digitality as we understand it. It wasn’t until 1937, when the engineer Claude Shannon discovered that circuits made up of relays ‘could be organized so that they could solve algorithms’, that computers could be programmed to make decisions. On and Off, in this case, could be understood as True and False. This logic is the foundation of our electronic digital computers, the mythical ‘ones and zeroes’ to which we are subject. Shannon’s discovery, set out in what Kittler describes as the most consequential MA thesis ever written, begot all the chips (logic circuits) in our devices.
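
Shannon’s insight can be glossed in a few lines of code (the example is mine, not Boast’s): treat a closed circuit as True and an open one as False, and relays wired in series behave like AND, relays in parallel like OR, and a normally closed contact like NOT – enough to wire up, say, a one-bit adder.

    # A minimal sketch of the relay-to-logic idea: series contacts act as
    # AND, parallel contacts as OR, a normally closed contact as NOT.
    # From these, a one-bit 'half adder' can be assembled.
    def series(a: bool, b: bool) -> bool:    # two relays in series
        return a and b

    def parallel(a: bool, b: bool) -> bool:  # two relays in parallel
        return a or b

    def inverted(a: bool) -> bool:           # normally closed contact
        return not a

    def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
        carry = series(a, b)
        sum_bit = series(parallel(a, b), inverted(carry))  # XOR from AND/OR/NOT
        return sum_bit, carry

    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(int(a), '+', int(b), '=', int(s), 'carry', int(c))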

And thus, digitality finally enters the world of the machine. The convergence of the digital code and the punched card would lead to other, further convergences. Such is the nature of digitality: eventually everything falls under its banner. For instance, the advent of the first standardised character codes, such as the American Standard Code for Information Interchange (better known as ASCII), helped convert computers from specific-purpose computing systems into ‘general, programmable, information processing systems,’ as Boast writes. Digitality provided the fulcrum that converted computers into the media machines we know today. ASCII was the code on which our devices grew up, one of the first codes specifically designed for the new digital information processing systems and a direct descendant of Baudot’s invention.
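
ASCII’s legacy is easy to inspect from any modern machine; the snippet below (mine, not Boast’s) shows each character of a string mapped to its fixed seven-bit number and back again.

    # ASCII in practice: every character corresponds to a fixed seven-bit
    # number, which Python exposes via ord(), chr() and the built-in
    # 'ascii' codec.
    text = 'ASCII'
    codes = [ord(ch) for ch in text]
    print(codes)                              # [65, 83, 67, 73, 73]
    print([format(c, '07b') for c in codes])  # the seven-bit patterns
    print(bytes(codes).decode('ascii'))       # back to 'ASCII'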

Yet with each further convergence, there is an uncanny way in which the human user becomes even further removed from the operations of such devices. Consider Unicode, which was invented in the 1980s as the direct heir to ASCII and is still the code on which our iPads and iPhones run. Unicode can encode all of the character scripts, symbols, and numerals in the world. Yet the invention of Unicode takes us a step further away from the digital operations at its heart, almost fully removing human agency in the process. ‘Where Murray detached the encoding from the letter,’ writes Boast, ‘Unicode would remove any representational role of the encoding, leaving this to decisions of a process, or an application to decide. From then on, characters were merely processed images, connected merely by conventions imposed by programs. Digitality had become nothing but a code.’
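
That last point – characters as ‘merely processed images, connected merely by conventions imposed by programs’ – can be seen from any Python prompt. In this example of my own, the same short string yields one sequence of abstract code points but entirely different byte sequences depending on which convention the program applies.

    # Unicode separates the abstract code point from any particular byte
    # layout or rendering; the encoding is a convention chosen by software.
    text = 'Çödé £5'
    print([hex(ord(ch)) for ch in text])  # abstract code points, e.g. 0xc7
    print(text.encode('utf-8'))           # one byte-level convention...
    print(text.encode('utf-16-le'))       # ...another, for the same characters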

Marx wrote that a properly critical history of technology would show how little technological inventions are the work of any single individual. Boast’s book does just that: what is most striking is the manner in which digitality – perhaps the paramount technical concept of our age – was never expressly invented, but emerged from an almost natural process of production. We may not know what our writing does, as Kittler would have it, but this does not mean that we must imbue our information systems with vapoury mysticism. Digitality is, indeed, nothing but a code, and Boast’s book is a marvellously engaging critical history of that code.

James Draney is a postgraduate student in the Department of English at King's College London. His writing has appeared in Bookslut and the Los Angeles Review of Books.