by ANASTASIA POWELL, ADRIAN J. SCOTT, ASHER LYNN & ASIA A. EATON
Victorian MP Georgie Purcell recently spoke out against a digitally edited image in the news media that had altered her body and partially removed some of her clothing.
Whether or not the editing was assisted by artificial intelligence (AI), her experience demonstrates the sexist, discriminatory and gender-based harms that can occur when these technologies go unchecked.
Purcell’s experience also reflects a disturbing trend in which images, particularly of women and girls, are being sexualised, “deepfaked” and “nudified” without the person’s knowledge or consent.
What’s AI got to do with it?
The term AI can include a wide range of computer software and smartphone apps that use some level of automated processing.
While science fiction might lead us to think otherwise, much of the everyday use of AI-assisted tools is relatively simple: we teach a computer program or smartphone application what we want it to do, it learns from the data we feed it, and it applies this learning to perform the task in varying ways.
A problem with AI image editing is that these tools rely on the information our human society has generated. It is no accident that instructing a tool to edit a photograph of a woman might result in it making the subject look younger, slimmer and/or curvier, and even less clothed. A simple internet search for “women” will quickly reveal that these are the qualities our society frequently endorses.
The Conversation for more