Why You Need To Stop Sharing Images of Your Kids Online Now
AI has created a risk you wouldn't imagine in your wildest dreams
By now, most parents know they can’t let their children surf the internet without guardrails and close supervision. It’s a dangerous place. They worry about grooming, stolen data and predators.
But most people don’t realize that they themselves are putting their children at risk. Thanks to the rise of AI, sharing those cute baby and toddler pictures, even with friends and family, has become a far bigger problem than it used to be.
Pedophiles are creating CSAM (child sexual abuse material) on demand using pictures of real children. But it’s not just pictures anymore. We’re now at the point where they can turn them into depraved videos.
Videos that could haunt your children for the rest of their lives.
Your data is not safe to share
People like me who work in cybersecurity have long warned about the sharing of private data. The likelihood of anything you post being stolen and used for nefarious purposes has skyrocketed in the last decade. Over the past few years, we’ve emphasized the risk of sharing images and information about children online.
Identity theft, manipulated images that incriminate your child, or even a recreation of your child’s voice used to scam or scare you are all within the realm of possibility now. One of the biggest risks, and one we don’t like to talk about, is the sexualization of children’s images and videos.
The case of three-year-old Wren Eleanor, whose images were saved by thousands of grown men for their viewing pleasure, showed that predators are sexualizing images of small children on platforms like TikTok.
All videos of Wren have since been removed, but at its peak, the account had 17 million followers, and a large portion of them were grown men. Men who left revolting, lewd comments on the videos.
A year ago, the German telecommunications provider Deutsche Telekom launched its “ShareWithCare” campaign. The company saw it as its duty to alert parents to the damage they can cause by sharing images of their children online. Artificial intelligence has since added a frightening new layer of possibilities.
The new AI models make creating sexual abuse material easy
Since large language models burst onto the scene as tech’s new big thing last year, I’ve followed their development closely. The good and the bad. I’ve written about how artificial intelligence is emboldening certain men to create non-consensual sexual material of acquaintances, and how even schoolboys swap AI-generated nude images of their female classmates.
The implications this new technology has for women, our society, and the data we share online are severe. They warrant a much louder and broader societal discussion.
Unfortunately, this discussion is still not taking place.
So far, only a few cases have been publicly discussed, like the Taylor Swift AI nudes shared on Twitter or Mia Janin’s suicide after classmates shared deepfake nudes of her. When a case surfaces, there’s outrage, media attention and cries for legislation. But the outrage dies quickly, and people move on to the next scandal.
Meanwhile, AI keeps evolving. As it becomes easier to use, the harm it can cause grows. Now, all a predator needs to create the CSAM of their fantasies is an image of your child’s face.
Maybe you think you’re already keeping your children safe because your social media profiles are private, and you only add people you know in real life as friends.
Until a couple of months ago, I would have agreed that this was all you needed to do.
Based on what I learned this week, I wonder if we should go back to the good old days of showing people paper prints only. It turns out it isn’t only “stranger danger” we have to fear. As is so often the case with the sexual abuse of children, the danger lurks close to home as well.
Predators are now ordering AI-created CSAM of children they know in real life in online forums.
The first perpetrator was sentenced in the UK
This week, in a landmark case in the UK, Hugh Nelson was sentenced to 18 years in prison for creating sexual abuse material for strangers on the internet who sent him pictures of real children. Some of these pedophiles sent pictures of children they had contact with in real life.
According to the BBC, he took requests from the UK, France, Italy and the United States. People wanted explicit images of children being harmed both sexually and physically. Images he created for these deviants using an AI-enhanced computer program.
By the time he was caught offering such images to an undercover officer, he had already created over 60 characters, ranging in age from six months to middle age.
He had no limits to what fantasies he was willing to fulfill. According to the Guardian, the prosecutor said that Nelson stated:
“‘I’ve done beatings, smotherings, hangings, drownings, beheadings, necro, beast, the list goes on’ with a laughing emoji,”
He sold his disgusting creations in chat rooms for the paltry sum of £80 ($104). He wasn’t doing it for the money. During the 18 months he was active, he made only about £5,000. He was there for the kick it gave him and to be part of the community. And he was not content with merely creating images but encouraged at least three of the people who approached him to abuse these children in real life.
Hugh Nelson is only the tip of the iceberg
Hugh Nelson is the first person in Europe to be convicted of creating AI-generated CSAM. His sentencing fills me with hope that this ongoing issue will finally be addressed in legislation.
But Nelson isn’t the only one; he’s the tip of an iceberg. While we’re still waiting for AI to do something useful that improves our lives, AI photo, video and audio capabilities are progressing at lightning speed.
A few months ago, we were talking primarily about deepfake nude images. Now, AI sexual abuse videos are being created at alarmingly high quality.
In July, the Internet Watch Foundation (IWF) released an update to their 2023 report on AI-generated child abuse imagery. They concluded that the problem is accelerating:
AI-generated imagery of child sexual abuse has progressed at such an accelerated rate that the IWF is now seeing the first realistic examples of AI videos depicting the sexual abuse of children…These incredibly realistic deepfake, or partially synthetic, videos of child rape and torture are made by offenders using AI tools that add the face or likeness of a real person or victim.
They also found that the images uploaded to these online communities are becoming more severe. They are seeing more and more AI-generated Category A material, the most serious classification of abuse. This means perpetrators can now generate complex ‘hardcore’ scenarios in their videos.
The internet never forgets. These videos will be around forever. And even though we know they’re fake, they’re traumatizing. The idea of your child one day coming across a video that shows them being sexually abused turns my stomach. Or even worse, of their classmates confronting them with such a video.
Someone you don’t know, like Nelson, might create these videos, yes, but in several cases, the images were sent by people close to the children.
It is difficult to protect yourself and your children from this threat; I realize that. But be even more vigilant than you have been so far. Don’t share pictures on WhatsApp or in other chat groups. And don’t allow schools and similar institutions to share pictures of your children on their websites or in parent chat groups.
Make sure that as few people as possible have access to images of your children. And please speak to other parents about what is happening. Write to your representatives. Push for tighter legislation around non-consensual AI deepfake images.
The knowledge that their actions can lead to years in prison and a spot on the sex offender registry might not deter every perpetrator, but many will think twice and go back to videos that don’t use images of a real child. Of your child.
Be safe.
If you’ve enjoyed my writing and want to support me, please share this story on social media or buy me a cup of coffee!
Please subscribe; it really helps with visibility on this platform.