Incels Use AI To Undress & Dress Women Against Their Will
Policing or defiling women’s bodies with AI tools has become an entertaining pastime in the incel community
No matter what kind of clothes women wear or don’t wear, some men will take issue.
If you wear too much, you’re a prude; if you wear too little, you’re a slut.
Women can’t win.
It’s not about aesthetic preferences. It’s about control. Some men love to control women’s bodies. AI has given them new tools to do so.
With AI, even complete noobs can create nude images and videos of women who prefer to remain clothed. And a while ago, they started putting clothes on women who would rather remain undressed.
When fake nude pictures of Taylor Swift appeared on X (formerly Twitter) in January 2024, the whole world finally caught on to the fact that AI can create deepfake nudes.
And it was high time people took note. Swift wasn’t the first woman to be targeted this way; the practice has been going on for a while. But she was probably its most famous victim.
404 Media reported that the images originated in a Telegram group where users routinely share explicit AI-generated images of women.
Unlike many other women, Taylor managed to have her fake nude images swiftly removed from X.
Her extensive fanbase inundated the site’s moderators with so many reports that they actually did something to rectify the situation for a change.
X, like most social media platforms, does have a policy against the sharing of nonconsensual explicit images on the platform:
You may not post or share intimate photos or videos of someone that were produced or distributed without their consent.
Sharing explicit sexual images or videos of someone online without their consent is a severe violation of their privacy and the X Rules. Sometimes referred to as revenge porn, this content poses serious safety and security risks for people affected and can lead to physical, emotional, and financial hardship.
Still, many women have problems getting their pictures removed once they're on the internet.
Unlike Taylor Swift, they don’t have an army of people helping them play whack-a-mole with the men reposting their images again and again.
There are now numerous services, like Take It Down, offering to help women get their pictures removed.
The outrage around Taylor’s fake nudes triggered a public discussion, even among people who had not been interested in the subject before, or hadn’t known about it at all.
Women around the world have been calling for legislation to stop people from sharing this kind of content for a while.
With little success.
Mia Janin, a 14-year-old girl from London, killed herself in 2021.
Why?
She couldn’t cope with the humiliation of being bullied by male classmates who shared fake nudes of her in Snapchat groups.
Girls as young as 11 or 12 are becoming victims of this trend, while parents and teachers are at a loss as to how to deal with the fallout.
Last October, police in Washington state were alerted by parents because a high school had failed to report underage students using AI to “undress” underage classmates. Police labeled the images a possible child sex crime.
Still, nothing happened to the boys who created these images. The case was ‘referred for diversion,’ which means no charges were filed.
No matter how many cases are reported, nothing seems to change at all.
Every wave of outrage fizzles out after a few weeks as lawmakers seem unable to develop a strategy to tackle the problem.
It doesn’t even seem to matter that more and more victims are children.
It remains to be seen whether the uproar the Swifties caused when they rode to the defense of their idol will extend beyond having the pictures removed from X.
I’m not holding my breath.
In February, we saw a new iteration of the “AI used to control women’s bodies” nonsense.
4chan, one of the cesspools that spread Taylor Swift’s fake nudes, found a new way to target women.
No longer content to remove the clothes of women who do not want to be naked, they artificially put clothes on women who prefer to be unclothed.
Pictures appeared on 4chan showing how images of women can be altered with a service called DignifAI to make them look more “modest.”
Men took pictures of women they considered indecent, added clothes, made them appear pregnant and/or surrounded them with children.
If you’re not familiar with 4chan, it has long been the breeding ground for the next generation of incels.
It is the place men go when their views are deemed too extreme, even for hellholes like X or the darker corners of Reddit.
There, they congregate, complain about women, objectify them, and feel sorry for themselves. Or flood the internet with nudes or racist pictures.
Now, AI has given them the tools to do what they never seem to achieve in real life — manipulate women’s bodies.
They take off women's clothes or put clothes on them.
The irony eludes them.
Conservative influencers like Ian Miles Cheong were quick to help them spread their vitriol outside of their echo chamber.
Cheong posted an adulterated (yes) image of adult content creator Isla David to X.
In the picture, a heavily deformed Isla David is dressed in a white potato sack, surrounded by three small children. Her thighs and legs are shrunken, her feet mangled, and she has a third arm.
All the weirdness of this image didn’t stop incels and misogynists from cheering about how this version was somehow superior to the original.
My only explanation is that they either aren’t familiar with women’s anatomy or are too focused on the little children in the image. I find both options highly disturbing.
However, the majority of people did notice and were quite vocal about it.
There were over a thousand comments in a short time. And from there, the trend went viral.
More and more pictures of women retouched to please a conservative aesthetic surfaced on X.
You could argue that putting clothes on women is less offensive than nudifying them and making these images public.
And I’d agree. It’s probably less embarrassing, and also slightly hilarious, to see an image of yourself in a modest dress surrounded by kids, whereas being confronted with a fake nude of yourself would be traumatic for most of us.
But I have a problem with the mindset behind both actions. It’s the same. These men are trying to control or police women’s bodies.
I wish I could say that this only happens in the dark corners of the internet, and that it’s uncommon behavior. But unfortunately, it happens publicly and everywhere.
There are the obvious big issues like right-wing politicians wanting to decide what women are allowed to do with their bodies.
It starts with the little things.
With men telling their girlfriends or wives what they’re allowed to wear on a girls’ night out. Or debating whether they should let them go out at all.
With my ex telling me I wasn’t “allowed” to pierce my nose because only sluts do that.
With men asking what your “body count” is.
All of these things are about limiting women’s bodily autonomy.
What a woman wears or doesn’t wear and what she does with her body is not for men to decide.
But it seems some men refuse to accept that.
Yes, AI tools give men the ability to control women’s online images, but we need an ongoing public discussion not only about the dangers of AI but also about women’s right to self-determination.
These new trends are rooted in a deeper issue. They are expressions of the fact that some men think they own women’s bodies.