An app using AI to ‘undress’ women offers a terrifying glimpse into the future

by ARWA MAHDAWI


Unless we start taking online misogyny seriously, we are going to face a future where women may not be able to exist online

‘The world is not yet ready for DeepNude’

Want to see Taylor Swift naked? There’s an app for that. It’s called DeepNude and it uses AI to “undress” photos of women and produce a realistic nude image.

Or rather, there was an app for that: the creators of the horrifying program took it down on Thursday after a Vice article about DeepNude catalyzed widespread outrage. “We created this project for user’s entertainment a few months ago,” the app’s creators tweeted. “We never thought it would become viral and we would not be able to control the traffic. Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high. The world is not yet ready for DeepNude.”

The world certainly isn’t ready for DeepNude, but we’d better start getting ready. One creator may have had a crisis of conscience and taken his program offline, but apps like this are going to continue to pop up – and they will only grow more sophisticated. Not to mention, DeepNude is far from the only program that lets you create realistic fake nudes. Plenty of women are already being digitally inserted into porn photos or videos without their consent – a problem we’re not talking about nearly enough.

“The harm done to women when it comes to this kind of sexual objectification is happening now,” Mary Anne Franks, president of the Cyber Civil Rights Initiative, told the Huffington Post earlier this year. “It’s almost like people have forgotten that this is what this technology really started out as, and the conversation around women has fallen away.”

In recent months there’s been increasing concern about “deepfake” technology, which allows people to produce realistic-seeming photos or video content. But, as Franks noted, a lot of this discussion has revolved around the implications for politicians and powerful people. It ignores the fact that the motivation behind a lot of this tech is to control and humiliate women. DeepNude, for example, didn’t let you undress dudes; the app was specifically focused on women. When Vice tried to use it on a man it apparently “replaced his pants with a vulva”.

If you become the victim of a deepfake there’s very little you can do; our laws simply haven’t caught up to the technology. It’s also basically impossible to protect yourself from becoming the subject of a deepfake. As Franks told the Huffington Post: “There’s nothing you can really do to protect yourself except not exist online.”

The internet can be an incredibly toxic place for women. According to Amnesty International, 29% of American women have experienced threats of physical or sexual violence online. Amnesty International also found that an abusive or problematic tweet is sent to a woman every 30 seconds, on average, with black women being 84% more likely than white women to be mentioned in abusive tweets. Social media platforms have done very little to curb their abuse problem – leading women to censor themselves online or leave social media altogether. The rise of deepfake technology is going to make the internet exponentially more difficult for women. Unless we start taking online misogyny seriously, we are going to face a future where women may not be able to exist online.

Alabama is waging a full-on war against women

Marshae Jones, a 27-year-old Alabama woman, was shot in the stomach last year; she lost her five-month-old fetus as a result. On Wednesday, Jones was indicted on a manslaughter charge for the death of her unborn baby. “The investigation showed that the only true victim in this was the unborn baby,” a police officer said. “She had no choice in being brought unnecessarily into a fight where she was relying on her mother for protection.” Meanwhile, the shooter went free. This case is nothing short of terrifying; it treats women as nothing more than walking wombs. What next? A woman who trips and has a miscarriage gets sent to prison? It’s a very real possibility.

Florida woman arrested for turning in her husband’s guns

A 33-year-old woman from Florida was afraid her husband, who had attempted to run her over, was going to kill her. So she took his guns to the police. The police then promptly arrested her for grand theft of a firearm and armed burglary. Remember when Brian Stelter told women they were being hysterical for saying America was a few steps away from The Handmaid’s Tale? Guess what, Brian? We weren’t.

Two women go on record to corroborate Trump rape allegations

After initially downplaying E Jean Carroll’s account of Trump raping her in the 1990s, the New York Times is now taking it seriously. On Thursday it spoke to two of Carroll’s friends who corroborated her allegations. So, yeah, just a reminder that the most powerful man in the world has credibly been accused of rape and is seemingly not facing any repercussions.

Could cervical cancer be eradicated?

Some good news! Scientists believe that the success of the HPV vaccination means there’s a chance cervical cancer could be eradicated in the next few decades. That’s assuming the anti-vaxxers don’t take over the world, of course.

Praying for a hot Dalai Lama

In 2015 the Dalai Lama told the BBC that a female Dalai Lama would have to be good looking or she wouldn’t be “much use” as nobody would want to see her face. He apparently still holds these views; in an interview with the BBC this week, the 83-year-old reiterated that a female Dalai Lama would have to be attractive. Never thought I’d say this, but I guess the Dalai Lama is cancelled.

The Guardian for more
