Given that Artificial Intelligence (AI) was until recently the stuff of science fiction, it is astonishing how quickly it has become part of the mundane fabric of our lives, whether we are curating our social media feeds or shouting at Alexa.
As machines are trained ‘to see’ without human intervention, American artist Trevor Paglen (b. 1974) uses his new commission at the Barbican Centre’s Curve to explore how artificial intelligence technologies are trained to categorise objects and people, highlighting the hidden prejudices and biases inherent in AI.
“Machine-seeing-for-machines is a ubiquitous phenomenon,” Paglen has commented, “encompassing everything from facial-recognition systems conducting automated biometric surveillance at airports to department stores intercepting customers’ mobile phone pings to create intricate maps of movements through the aisles. But all this seeing, all of these images, are essentially invisible to human eyes. These images aren’t meant for us; they’re meant to do things in the world; human eyes aren’t in the loop.”
On entering the exhibition, visitors are confronted by a vast mosaic of 35,000 images displayed floor to ceiling along the entire length of the gallery’s curved wall, creating a striking visual effect. Moving closer, we see that the installation brings together snapshot-sized photographs clustered around specific keywords.
The words and images are sourced from ImageNet, a dataset created by US researchers a decade ago that consists of over 14 million images taken from Flickr, sorted into roughly 21,000 crowdsourced categories, which are used to train the AI systems that surround us.
It starts off promisingly enough, particularly with the first few words on the wall, such as APPLE, LICHEN or SUN. The further we immerse ourselves in this avalanche of images, however, the more unsettling it becomes: images of INVESTORS mainly show white men, while CONVICT and BAD PEOPLE predominantly feature Black people.
According to ImageNet, ARTIST MODELs are mainly half-naked Asian women, and its differentiation between what a WINE LOVER and an ALCOHOLIC look like is highly questionable, since it seems to rely mainly on the depicted drink.
Surprisingly, photographs of Barack Obama turn up in a remarkable number of categories, appearing under POLITICIAN, OLIGARCH, RACIST, DRUG ADDICT and TRAITOR.
Throughout his artistic career, Paglen has developed a longstanding interest in issues of surveillance, CIA black sites, drone warfare and the apparatus of America’s security system. In From Apple to Anomaly the artist powerfully illuminates what he calls “the deep forms of bias, prejudice, and cruelty that can be built into machine learning systems that classify people”, encouraging us to question the supposedly neutral applications of AI and machine learning technologies.
At the same time, Paglen highlights the superiority the human brain still retains over any form of AI, such as the ability to understand nuance. A word such as SPAM can carry several meanings, but, as the images in the exhibition show, to ImageNet it means only canned cooked pork.
Right at the beginning of the exhibition space, prominently displayed on a free-standing wall, Paglen includes an art historical reference: René Magritte’s Ceci n’est pas une pomme (This is not an apple), 1964. Yet ImageNet files it in its APPLE category, unable to recognise it as a Surrealist artwork or a philosophical statement on semiotics.
Paglen’s reference to art history suggests that artists have always questioned ways of seeing and image making. His critical investigations into the relationship between vision, power and technology show the crucial role artists can play today in highlighting problems of perception whose political and social implications will influence our daily lives, now and into the future.
The Curve, Barbican, Silk Street, London EC2Y 8DS. Open daily 10.00-20.00. Exhibition continues until 5 January 2020. www.barbican.org.uk