An app using AI to ‘undress’ women offers a terrifying glimpse into the future

by ARWA MAHDAWI

The world certainly isn’t ready for the DeepNude app, but we’d better start getting ready. Photograph: imageBROKER/Alamy Stock Photo

Unless we start taking online misogyny seriously, we are going to face a future where women may not be able to exist online

‘The world is not yet ready for DeepNude’

Want to see Taylor Swift naked? There’s an app for that. It’s called DeepNude and it uses AI to “undress” photos of women and produce a realistic nude image.

Or rather, there was an app for that: the creators of the horrifying program took it down on Thursday after a Vice article about DeepNude catalyzed widespread outrage. “We created this project for user’s entertainment a few months ago,” the app’s creators tweeted. “We never thought it would become viral and we would not be able to control the traffic. Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high. The world is not yet ready for DeepNude.”

The world certainly isn’t ready for DeepNude, but we’d better start getting ready. One creator may have had a crisis of conscience and taken his program offline, but apps like this are going to continue to pop up – and they will only grow more sophisticated. Not to mention, DeepNude is far from the only program that lets you create realistic fake nudes. Plenty of women are already being digitally inserted into porn photos or videos without their consent – a problem we’re not talking about nearly enough.

“The harm done to women when it comes to this kind of sexual objectification is happening now,” Mary Anne Franks, president of the Cyber Civil Rights Initiative, told the Huffington Post earlier this year. “It’s almost like people have forgotten that this is what this technology really started out as, and the conversation around women has fallen away.”

In recent months there’s been increasing concern about “deepfake” technology, which allows people to produce realistic-seeming photos or video content. But, as Franks noted, a lot of this discussion has revolved around the implications for politicians and powerful people. It ignores the fact that the motivation behind a lot of this tech is to control and humiliate women. DeepNude, for example, didn’t let you undress dudes; the app was specifically focused on women. When Vice tried to use it on a man it apparently “replaced his pants with a vulva”.

If you become the victim of a deepfake there’s very little you can do; our laws simply haven’t caught up to the technology. It’s also basically impossible to protect yourself from becoming the subject of a deepfake. As Franks told the Huffington Post: “There’s nothing you can really do to protect yourself except not exist online.”

The internet can be an incredibly toxic place for women. According to Amnesty International, 29% of American women have experienced threats of physical or sexual violence online. Amnesty International also found that an abusive or problematic tweet is sent to a woman every 30 seconds, on average, with black women being 84% more likely than white women to be mentioned in abusive tweets. Social media platforms have done very little to curb their abuse problem – leading women to censor themselves online or leave social media altogether. The rise of deepfake technology is going to make the internet exponentially more difficult for women. Unless we start taking online misogyny seriously, we are going to face a future where women may not be able to exist online.

