Blue Screens

David Berry, Critical Theory and the Digital

Bloomsbury, 272pp, £65.00, ISBN 9781441166395

reviewed by Dominic Fox

What Derrida wrote about language at the opening of Of Grammatology (1967) could perhaps today be said of computation: ‘never as much as at present has it invaded, as such, the global horizon of the most diverse researches and the most heterogeneous discourses … it indicates, as if in spite of itself, that a historico-metaphysical epoch must finally determine as language the totality of its problematic horizon.’ David Berry’s Critical Theory and the Digital is an informed and wide-ranging attempt to confront the computational epoch with the tools of Frankfurtian social analysis.

Its chapters consider computational ontology, computational aesthetics, critical praxis and the computational. There is insight in every chapter – into the new digital panopticism of Facebook and the NSA; the effects on the economy of attention of social media, which format micro-events into continuous ‘streams’; the ways in which the ‘computal’ composes a new technology of power, and in which a new digital literacy – ‘iteracy’, as Berry styles it – might equip social actors to respond critically to the effects of that technology. I think he gets all of these things at least broadly right.

However, I felt that the computational as such was barely touched upon in Berry’s text. I admit that, as a working programmer, I began reading with the prejudice that this was likely to be the case. Berry is a practitioner of social theory, and his domain of interest and expertise is the social ramifications of digital technology, not the deep philosophical meaning of computational forms themselves. Yet there is also an attempt to grapple with the ‘deep meaning’ in this book – Berry at one point considers the ontology of computation as an ‘onto-theology’ in a Heideggerian sense – and it was here that my disappointment became more acute.

The problem is not that Berry lacks the technical credentials to speak, but that, as a social theorist, he is fundamentally allergic to the inhuman characteristics of the computational. These must always be reduced – led back – to the domain of recognisably human concerns and praxis. There is never any doubt that, in the confrontation of ‘critical theory’ with ‘the digital’, it is ‘critical theory’ that will win out in the end.

Berry’s reading of the relationship between computational ontology and object-oriented/speculative realist philosophy is heavily influenced by Alexander Galloway’s work on the topic, and shares its flaws: both in characterising object-oriented philosophy as ‘reifying’, and in regarding the computational as fundamentally ‘object-oriented’, when this is in fact one of several available programming paradigms, and by no means a definitive model of what computation really entails.

On the first point, if we understand reification as the operation through which something which is not a ‘thing’ – a process, for example, or a social relation – is mistaken for a thing, then it is begging the question to describe an object-oriented ontology as reifying. Of course it is, from the point of view of a critical theory which regards more or less all putative ‘things’ as reified relations, created in the image of the commodity form. But object-oriented ontology is a rival theory, with rival claims of its own. Amongst these would be the claim that ‘objects are what there are’: that when we take things for things, we are not mistaking them for things – that is, reifying them – but apprehending a thing-like quality that they intrinsically possess. Berry is not obliged to accept object-oriented ontology as true, but he is also not entitled simply to assume without argument that it is false. Here, again, the proposed encounter between critical theory and its digital ‘others’ is reduced to a monologue in which the former tells the latter what is wrong with them.

On the second point, it is important to recognise in object-oriented programming an ideological image of what code is and does, and to recognise that what programmers do in practice, and what software systems are in actuality, is not accurately reflected in this image. The heyday of ‘object-oriented modelling’ is long past, and even the most mainstream programming languages are increasingly open to other paradigms (witness the recent introduction of ‘lambda expressions’ into the Java programming language). But even two decades ago, the authors of Design Patterns: Elements of Reusable Object-Oriented Software (1994) were using the semantics of object-oriented programming to describe protocols of system organisation that had no literal analogue in the world of ‘medium-sized dry goods’: ‘objects’ in this context are already abstractions whose purpose is to capture patterns of interaction rather than to model entities in the so-called ‘business domain’.
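The point about paradigms can be seen in miniature in Java itself. The following sketch (my illustration, not Berry’s) contrasts the older style, in which behaviour had to be wrapped in an anonymous object, with the lambda expressions added in Java 8, where behaviour circulates as a bare function value with no modelled ‘entity’ in sight:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class LambdaSketch {
    public static void main(String[] args) {
        // Pre-Java-8 style: even a single piece of behaviour must be
        // dressed up as an anonymous object implementing an interface.
        Runnable oldStyle = new Runnable() {
            @Override
            public void run() {
                System.out.println("behaviour as an object");
            }
        };
        oldStyle.run();

        // Java 8 style: behaviour as a function value, passed through a
        // stream pipeline rather than attached to a modelled entity.
        List<String> events = Arrays.asList("status", "photo", "retweet");
        List<Integer> lengths = events.stream()
                .map(String::length)
                .collect(Collectors.toList());
        System.out.println(lengths); // prints [6, 5, 7]
    }
}
```

Nothing in the second half of this program ‘models’ anything in a business domain; it simply composes transformations, which is precisely the sense in which mainstream practice has drifted away from the object-oriented image.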

I was much more convinced by Berry’s characterisation of computational ontology as a ‘glitch’ ontology, defined as much by its failure modes as by its operational efficacy. Contemporary software architectures are in fact structured around an assumption of pervasive failure, as exemplified by the ‘Simian Army’ employed by Netflix to randomly disable parts of its own network, in order to ensure that the secondary systems which guarantee availability are working as they should.
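The idea is easy to sketch. Assuming nothing about Netflix’s actual tooling, a toy ‘chaos monkey’ might randomly disable one of a set of hypothetical replicas and then verify that the service can still answer from a survivor:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

// A toy illustration (not Netflix's tooling): deliberately kill one
// replica at random and check that the service remains available.
public class ChaosSketch {
    public static void main(String[] args) {
        // Three hypothetical replicas, all holding the same answer.
        Map<String, String> replicas = new HashMap<>();
        replicas.put("replica-a", "hello");
        replicas.put("replica-b", "hello");
        replicas.put("replica-c", "hello");

        // The 'chaos monkey': disable one replica at random.
        List<String> names = new ArrayList<>(replicas.keySet());
        String victim = names.get(new Random().nextInt(names.size()));
        replicas.remove(victim);
        System.out.println("disabled " + victim);

        // Availability check: any surviving replica can still serve.
        String answer = replicas.values().stream()
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("no replicas left"));
        System.out.println("service answered: " + answer);
    }
}
```

The point of running such an exercise continuously, rather than once, is that it converts failure from an exceptional event into a design assumption – which is exactly the ‘glitch’ ontology Berry describes.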

The ‘iteracy’ needed by participants in any future digital democracy will include not only some knowledge of how computers work, but also an understanding of the centrality of the ‘inoperative’ to social-computational networks, beginning with the internet itself. Early users of Twitter were familiar with the ‘fail whale’ which appeared whenever the system was overloaded. Nowadays Twitter is almost continuously available, irrespective of sudden spikes in usage. Understanding how this can be so, and what material resources are involved in making it so, is at least as important as understanding the low-level semantics of computation. It entails an ontology in which the high-level characteristics of distributed computer systems, like those identified by the CAP theorem (which shows that no such system can simultaneously guarantee consistency, availability and partition tolerance), are as important as the low-level details of program execution, or the mediating metaphors of operation presented by the user interface.

Dominic Fox is a writer and programmer living and working in London. He is the author of Cold World and blogs at www.thelastinstance.com.