Review of Clemens Apprich, Florian Cramer, Wendy Hui Kyong Chun, Hito Steyerl, Pattern Discrimination (Meson Press, 2018), 124 pages, open access

Book website: https://meson.press/books/pattern-discrimination/

Reviewed by Nicola Bozzi

 

Abstract

Published at a significant socio-historical juncture, Pattern Discrimination is a short collection of critical essays about the renewed urgency of identity politics in the age of big data. The four authors discuss different aspects of how algorithmic cultures facilitate ideological polarisation and replicate existing inequalities in the name of technical efficiency. While highlighting the value of each contribution and suggesting the potential benefits of putting theory in dialogue with practice, this review argues that the collection as a whole makes a convincing case against the much-discussed “end of theory”.

 

 

 

Keywords

identity politics, big data, algorithmic accountability, artificial intelligence, pattern recognition, critical theory, digital humanities

 

Pattern Discrimination is a timely book. After the Cambridge Analytica data scandal unearthed the dangerously muddy interaction of patchy data policies and opportunistic political marketing that benefited the international rise of the far right, the digital and the algorithmic can no longer be seen as frivolous alternatives to physical interpersonal relationships: they have acquired social and political urgency. In recent years, the skeptical dismissal of “technological solutionism” – the belief, famously criticised by Morozov (2013), that any social problem can be fixed through disruptive technologies – has been joined by an increasingly loud call for algorithmic accountability. O’Neil (2016) has notably warned against algorithmic decision-making taking over fields as diverse as university rankings, lending models, and predictive policing, while Noble (2018) has explored how even simple Google searches participate in the upholding of inequality and racial stereotypes. On an institutional level, this reckoning has also led to the launch of several research institutes for the study of the societal impact of AI and big data – an encouraging development that confirms a potentially sinister reality: the algorithms are here to stay.

Beyond the urgency of data policies, this newfound infrastructural awareness has also been an opportunity to reiterate the importance of critical theory and the humanities, which in the last decades have often had a problematic relationship with the popular “digital” prefix. A full 20 years after Bowker and Star suggested computer scientists should read African-American poets and radical feminists (1999, 302), the fraught nature of online categorisation has achieved cultural momentum and the times seem to be ripe for a renegotiation of the terms of interaction between technology and culture. Pattern Discrimination sits at this juncture.

The premise of the book is simple: in the age of big data, identity has returned with a vengeance. In the introduction, Clemens Apprich argues that the narcissistic atomisation ushered in by social media conceals how the very architecture of these platforms enacts a more covert and reductive identity politics, through the establishment of echo chambers and filter bubbles. “Pattern discrimination” is thus a fundamental axiom of computational culture (ix) that allows us to understand what kind of identity politics these network technologies are enabling. Together with the contributions of the other three authors – Hito Steyerl, Florian Cramer, and Wendy Chun – Apprich’s premise outlines a scenario in which redesigning the relationship between computation, culture, and critique is as urgent as ever. More specifically, the essays dissect how the data deluge has impacted the way we see, interpret, and correlate information – which is, we are shown over and over again, very different from data.

Steyerl and Cramer, who pen the first two chapters of the book, use strikingly material metaphors to discuss the overflowing abundance of datafied activity that is constantly produced and captured every day. Steyerl compares it to a sea, an opaque mass of input that even the NSA has trouble coping with. Quoting Rancière, the author recalls how in ancient Greece the voices of affluent men were regarded as speech, while women, children, slaves, and foreigners were thought to produce only noise (5-6); similarly, much of today’s data is dismissed as “dirty” – a way to disqualify information that seems too sloppy or unlikely to be computed within the system’s narratives.

The narratives, however, have already been disrupted, and the social metadata and relational graphs fed into police and marketing databases alike fuel a trend that goes beyond postmodern paranoia: apophenia – the tendency to perceive connections and meaning between unrelated things. An exquisite, if sinister, example of this apophenic “over-recognition” is Google’s “deep dream” imagery, consisting of psychedelic meshes of dogs and puppy eyes that result from looping image-recognition filters on random noise. Rather than representing machine dreams, Steyerl is keen to point out, these images reveal just how much signal and noise are defined by preexisting categories and probability (9). Not all that is apophenic is bad, though: for Steyerl, whether it is just superstitious mumbo-jumbo or an updated form of divination, apophenia has a creative side as well. These contemporary forms of over-recognition might eventually cause “serendipitous misreadings”, just as seeing imaginary animals in the sky might have enabled astronomical discoveries (17).

If the data described by Steyerl is dirty, Cramer conjures up an even more unsettling world of “crap”. Riffing on Justin Pickard’s concept of “crapularity” – the messy rubbish that constitutes 90% of today’s technological redundancies and which is far away from the much anticipated “singularity” – Cramer explores “crapularity hermeneutics”. He also begins with an oneiric connection: from Freud’s interpretation of dreams, Internet companies and investment bankers have now taken us into the realm of analytics. Cramer’s is not a satirical take on the “Internet of shit” (although the author nods to this popular Twitter account), but rather a call for critique against an emerging behaviourist positivism, in part facilitated by cybernetics. Following Drucker (2011), Cramer argues that data is also qualitative (24) and that the situated nature of the viewer with respect to the objects and experiences to be interpreted is a crucial element that should not be factored out of algorithmic cultures (an issue all too evident in AI’s “white guy problem”, most notably highlighted by Crawford (2016) in relation to the racist implications of predictive policing).

As a consequence, Cramer’s essay moves from Steyerl’s discussion of “corporate animism” to zero in on the subject. Wondering if post-structuralist “antitheologies” of the subject may have contributed to creating new theologies of the system, Cramer argues for a deromanticisation of identity in the age of crapularity. It is no longer about metaphysical versus ontological thinking, but rather criticism versus positivism (44). In other words, subjectivity is relative rather than absolute (45), and needs to be defined as agency, decisions, and politics – the denial of which would be a fascistic form of posthumanism. This is a call for the humanistic in the digital humanities, and it is easy to see its relevance within the metrics- and association-driven algorithms of YouTube – where anti-postmodern, anti-political views such as those popularised by Jordan Peterson are all the rage. In the age of crapularity, Popper’s falsification principle (in order to be scientific, a claim needs to be falsifiable) is projected from science onto politics, transforming public debate into a market of competing ideas that, at its most dystopian, also creates a business model fuelled by “crap analytics” and system updates (49). In this scenario, Cramer sees today’s populism as an attempt for the people to “regain agency against posthuman crapularities”, often pitting one type of fascism (the populist) against another (that of data). Either way, he concludes, subjectivity remains hard-coded into analytics (52).

Delving deeper into the algorithmic politics of group identities, Wendy Chun’s chapter is centred on the concept of homophily – a popular axiom within network science, according to which similarity breeds connection. The main argument of the essay is that big data analytics perpetuate “neighbourhoods” of likeness, enforcing the categorisation of users according to behaviours and preferences that hide more controversial social markers like class, race, and gender. Based on the idea of “love of the same”, homophily eventually valorises consensus, erases conflicts, and naturalises discrimination in order to optimise the circulation of commodified emotion. Chun’s skepticism towards homophily is not meant to disqualify big data and network analysis as a whole; rather, it demands a different approach. Like Cramer, Chun identifies psychology as data analytics’ repressed parent, calling for the discipline to be brought into the study of networks, along with feminist ethics (62). After all, if Cambridge Analytica used intersectionality to show the oppression felt by young white men (65), the performativity of networks must be tapped into in order to explore our collective unconscious (67-69).

Stating that we abandoned identity politics just when it became most crucial, Chun then calls for new theories of connection: ways to queer homophily and tap into the generative power of discomfort described by Sara Ahmed (89). The author points towards a few examples of such queering, most notably the work of Fox Harrell, whose Advanced Identity Representation (AIR) project pioneers a socially imaginative approach to the stereotypical dynamics of online identity construction. Combining “critical computing” and cognitive science, AIR aims at developing models of social computational identity (for example avatars and personal profiles on social media) that respond dynamically to context, thus minimising the implicit stigma built into the underlying infrastructure (Harrell 2010). Looking for co-relation rather than correlation (85), Chun argues that the future lies in the new patterns we can create together (90).

Clemens Apprich’s final essay on paranoia pulls together the main themes of the book, delivering its final call. Apprich outlines big data as an ideological system that, rather than inviting the new theories of connection wished for by Chun, gestures towards an end of theory (Anderson 2008). In psychoanalytic (and more specifically Lacanian) terms, Apprich suggests that “unfiltered data represents the real, the absolute unknowable, whereas information stands for reality, rendered intelligible by our cognitive filters. Reality, in turn, can be seen as a composite of the imaginary and the symbolic” (109). The deep dream images referenced by Steyerl are thus the reflection of algorithms repeating the imaginary they were fed, while the symbolic order is destabilised by the collective delusions of conspiracy theories.

The paranoia resulting from this crisis is an alternative mode of knowledge, but not in the mass-emancipatory, postmedia fashion theorised by philosophers like Félix Guattari; instead, the paranoid thinking machine is trying to fill the symbolic void with the self-referential truths crafted by the far-right. Can such a machine, however, be repurposed to different ends? While anti-trolling campaigns and the anti-harassment tools put in place by platforms like Twitter can be good short-term solutions, what Apprich argues for is a strategy to reorganise our socio-technical world. To reassemble the paranoid thinking machine, in other words, we need collective media, artistic and cultural practices to recuperate the imaginary from individualist consumerism and reductive identity politics (117-118). Not unlike the other essays, then, the conclusion seems to gesture towards critical and creative appropriations of media, moving from the diagnosis of our collective algorithmic psychopathologies towards the design of more imaginative identities.

Pattern Discrimination is a short book, and each of its established authors has a distinctive voice. These two elements – brevity and style – allow the collection to read like a fresh reflection on an elusive yet urgent matter, which in my view makes up for its necessarily open-ended character. Untangling the paranoid correlations that inform the identity politics of big data is of course a complex endeavour, and – at the very least – Pattern Discrimination as a whole makes a consistent and convincing case against the end of theory, arguing for a more critical approach to the analysis of data and networks. This cohesiveness is, however, also a limit: the essays speak to each other in a choral way, but that means there are overlaps in focus and argument. Illustrating or complementing the theory by showcasing virtuous cultural or aesthetic experiments (as suggested by Chun’s mention of Harrell’s work, for example) in dedicated chapters could have made the project more rounded and comprehensive. In this sense, the presence of Steyerl – an excellent visual artist as well as a theorist – could have provided more of a counterpoint to the other contributors. Nonetheless, Pattern Discrimination brings much-needed intellectual impetus and will hopefully inspire more cultural and aesthetic responses to the current algorithmic zeitgeist.

 

 

Nicola Bozzi is a PhD student in Media and Cultural Studies at the University of Salford, Manchester. His research focuses on global identity stereotypes and the role of art in contemporary society. As a freelancer he has contributed to a range of publications including Frieze, Domus, Elephant, Impakt Festival, Digicult, NOT, and Wired Italia. His research blog is schizocities.com and you can follow him on Twitter and Instagram at @schizocities. Email: n.bozzi@edu.salford.ac.uk
