Image-Net Roulette: Artistic experiment or internet fad?

Katie Fox looks at the rise and impact of the image-based machine learning software, and evaluates the ethics behind it.
Written by Katie Fox

At a glance, ImageNet Roulette seems like just another online craze, destined to spam your Twitter feed for a week or two before disappearing into the depths of the internet. However, this website, created by Kate Crawford, an academic at New York University, and the artist Trevor Paglen, is not just another hastily assembled program designed to go viral. Built as part of Crawford’s research on AI and machine learning, it carries rather more weight than similar crazes like FaceApp, an application that lets people alter their appearance (and which also became a topic of discussion in the media when people started to question its strange data terms and conditions).

ImageNet Roulette takes its data from ImageNet, an online database of images labelled according to a hierarchical system of categories. As the site itself explains, the program was created as an experiment to show how human beings are classified by machine learning systems.
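For readers curious about the mechanics, classifiers of this kind pass a photo through a deep neural network trained on ImageNet and read off the highest-scoring labels. Below is a minimal sketch of that pipeline in Python, using a standard pretrained torchvision model and its 1,000 general object categories; this is an illustrative stand-in, not the site’s own (now offline) model, which drew specifically on ImageNet’s ‘person’ categories, and the file name is a placeholder.

```python
# Minimal sketch: label a photo with ImageNet categories using a pretrained
# torchvision model. Illustrative only -- this is NOT ImageNet Roulette's own
# model; "selfie.jpg" is a placeholder file name.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.IMAGENET1K_V2   # 1,000 ImageNet object classes
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()                  # resize, crop, normalise

image = Image.open("selfie.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)             # add a batch dimension

with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]

# Print the five labels the network is most confident about
top5 = probs.topk(5)
for score, idx in zip(top5.values, top5.indices):
    print(f"{weights.meta['categories'][idx.item()]}: {score:.1%}")
```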

I was anticipating receiving some sexist tags, but in reality, I got fairly mundane descriptions. Others have received a lot worse

I must admit that I try to avoid joining in with internet trends like this, but ImageNet Roulette intrigued me enough to try it out for myself. After seeing it all over my Twitter feed, with people posting shots of green-outlined selfies labelled with bizarre things like ‘mortal’ or ‘sad’, I was curious to see how I would be categorised. I first tried my Facebook profile picture, and the results returned ‘lass, lassie, young girl, jeune fille’. While the wording is slightly odd, this is an accurate description of me, a 20-year-old female. The next one, however, is a little weirder. A picture of me at a music festival, wearing sunglasses, earned me the label of ‘masquerader’. Ah yes, because we all wear sunglasses to hide ourselves and give off that slightly mysterious vibe… Finally, I used a selfie of me and my cat, which disappointingly only labelled me as ‘person, mortal, human’.

After seeing other people online labelled as ‘stunner’ or ‘beautiful’, I was surprised to find myself somewhat disappointed with my categorisation. What is it about my photos that makes me a ‘young girl’ and not a ‘seductress’? I won’t complain, though. Before uploading my pictures, thanks to the discussion around the programme on Twitter, I was anticipating receiving some sexist tags, but in reality, I got fairly mundane descriptions. Others have received a lot worse. The site actually comes with a warning: ‘Warning: ImageNet Roulette regularly returns racist, misogynistic and cruel results.’ Racist slurs and offensive terms, along with bizarre categorisations like ‘rape victim’, pop up everywhere when you do a quick search of the site online.

They are effectively building a platform for slurs to be shared online

The question is: how moral is it to run a site that returns offensive results based on people’s personal photos, and does so consistently enough to warrant warning its own users? Well, the site was in fact created to draw attention to the faults of machine learning programs that draw on already problematic data. While the creators wanted to provoke a response, they also struggled with the fact that the labels produced are often deeply offensive. They are effectively building a platform for slurs to be shared online, in a world where there is already enough hate around. However, they decided that in order to encourage a wider discussion of this modern issue, they should not hide anything, and therefore chose to allow all results to be published.

It is easy for this to turn into a debate about freedom of speech, albeit a modernised take on the topic. Should we allow machines the freedom to output data that would be deemed unacceptable if published by a human? Personally, I think not. Allowing anyone, be it a person or a computer, a platform to stereotype and cause offence in ways like this is unacceptable. On ImageNet Roulette, a black man is regularly labelled simply as a ‘black man’, while a white man is often labelled as ‘handsome’. Giving labels like this a platform, for anyone to see, adds to stereotypes that are already prominent enough in society and normalises sexist and racist language, which in turn leads to repetition by actual humans, whether they mean to cause offence or do so without even realising. I agree that it is important to allow discussion of matters like this, but I believe there are less offensive, less wide-reaching ways of doing so that are less likely to encourage human imitation of computer thoughtlessness.

The ImageNet Roulette classification of the author

So, what next? The site, as of the 27th of September 2019, has shut down. It will remain as a physical art installation in Milan until February 2020. Whilst art installations are often bizarre, I find it strange that this can be labelled as ‘art’. An experiment, yes, one to be put in a museum of modern technological research; but art, in the same category as Picasso or Da Vinci?

In any case, I doubt that this will be the last we see of this type of internet craze, whether as a form of art, an experiment, or simple entertainment. Horoscopes, personality quizzes, and even comparisons to the people around us are all examples of how we like to find out more about ourselves and those we know, so any internet game or program that lets us do so easily is destined to be a hit.
