How Our Exciting New AI Tools Are Accelerating the Enshittification of Our World
Why AI tools are making our world more boring and more biased
When I come across a text that begins with the words “in these complex times” or “in an ever-evolving world,” I immediately move on. 99% of the time, this introduction is followed by AI-generated crap.
Have you noticed?
AI loves to use more or less vomit-inducing versions of these intros to delve into important-sounding but interchangeable “information.”
Aside from the fact that this text probably doesn’t contain a single interesting word, I cringe at the thought that machines have learned to speak like this from humans. How many mediocre articles served as fodder for training ChatGPT and Bard? And why do these tools use the worst of our intellectual output and not the best?
Just kidding. I know why. It’s statistics.
AI gives us the most likely output for any request. And the probability that a text is mediocre is much higher than the probability that it is great. And that is precisely the problem.
AI amplifies and reproduces the most mediocre, interchangeable results that humanity has produced over the last few centuries.
It’s Idiocracy on steroids.
In case you haven’t seen the movie Idiocracy, it’s the story of what would happen to the world if we only allowed the intellectually challenged among us to procreate. When it was released in 2006, it was a hilarious comedy. Today, it feels like a reality TV production.
LLMs (Large Language Models) are behind all the exciting new tools we see on the market — email response generators, image generators, AI headshot creators, AI training companions, you name it, there’s an AI tool for it now. They predict what the next most likely word or move will be, based on what they have learned from data trawled off the internet.
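To see what “most likely next word” means in practice, here’s a toy sketch of the statistical principle. This is an assumed simplification, not how real LLMs work (they use neural networks over vast datasets, not simple word-pair counts), and the tiny corpus is invented for illustration:

```python
from collections import Counter, defaultdict

# A tiny made-up corpus of exactly the kind of filler prose the article mocks.
corpus = (
    "in these complex times we must adapt . "
    "in these complex times we must innovate . "
    "in an ever-evolving world we must adapt ."
).split()

# Count which word follows which (a bigram model).
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def most_likely_next(word):
    """Return the statistically most frequent continuation of `word`."""
    return follows[word].most_common(1)[0][0]

print(most_likely_next("these"))  # -> "complex"
print(most_likely_next("must"))   # -> "adapt" (seen twice vs. "innovate" once)
```

If the training text is mostly filler, the most probable continuation is more filler. Scale that up by a few billion parameters and you get “in these complex times” on demand.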
If you’ve been on the internet for any length of time, you’ll know that it’s not the place where the most intelligent or enlightened voices are the most prominent. Or, to put it another way: The internet is a very stupid place. It’s a place where we see the dumbest and basest instincts of humanity on full display.
And we have used this cesspool of mediocrity and viciousness to train our most advanced tools.
I guess we can be glad that the sentences AI spits out start with “in these complex times” or “in an ever-evolving world.” It could be much worse. And it would be a lot worse if we hadn’t underpaid hordes of now-traumatized Kenyans to remove the worst of the swamp from the datasets.
Vice reported: “The workers were tasked to label and filter out toxic data from ChatGPT’s training dataset and were forced to read graphic details of NSFW content such as child sexual abuse, bestiality, murder, suicide, torture, self-harm, and incest.”
In the movie, humanity becomes rapidly dumber because intelligent people forget to reproduce. It’s human evolution in reverse. In our reality, we’re becoming dumber because we’re amplifying the mediocre voices among us instead of the exceptional ones.
There’s mediocrity, but on top of that, there’s corporate greed. Once the money-grubbers realized that AI tools can deliver output that sounds and looks a bit better than that of a well-informed toddler, they saw an opportunity to save money.
Many companies immediately fired their copywriters and began to fill their websites and marketing emails with AI-written texts. Movie studios began to devise strategies to replace writers and actors with AI-generated replicas.
Everything to serve the public with an avalanche of mediocre content. For cheap. A capitalist’s wet dream.
As I wrote in my article on the “Doorman Fallacy,” the urge to make more money out of every little thing has been making our lives worse and worse for a long time.
It has become the norm to view people solely through the lens of cost without understanding what they contribute to the ongoing success of the business. People are treated as if they’re expendable.
What these people don’t see is that machines can’t — yet — make content that humans find interesting or relatable. So if they continue serving us mediocre garbage, we stop engaging because we might as well be talking to our microwaves. And they’ll stop making money.
One of the funnier aspects of this race to put even more rubbish on the internet is that this cheap output is making AI tools dumber.
Training AI models on the results of their predecessors leads to a form of “dementia.” The models forget the original data on which they’re based, and eventually, the model breaks down. The answers they provide become even dumber.
At this point, you’re no longer talking to your boring neighbor but to his pet chimpanzee. It looks and sounds a bit like a human, but you can tell immediately that it isn’t, without even trying.
Model collapse refers to a degenerative process where, over time, AI models lose information about the original content (data) distribution. As AI models are trained on data generated by their predecessors, they begin to “forget” the true underlying data distribution, leading to a narrowing of their generative capabilities.
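The narrowing described above can be simulated in a few lines. This is a toy illustration of the principle, not the actual training dynamics: I assume each “generation” samples from its predecessor with a nucleus-style cutoff (generation favors likely tokens), and the vocabulary and probabilities are made up:

```python
def next_generation(dist, top_p=0.6):
    """Keep the most likely words until their cumulative probability
    reaches top_p, drop the rest, and renormalize — the tail is gone
    for good once the next model trains on this output."""
    kept, cum = {}, 0.0
    for word, p in sorted(dist.items(), key=lambda kv: -kv[1]):
        kept[word] = p
        cum += p
        if cum >= top_p:
            break
    total = sum(kept.values())
    return {w: p / total for w, p in kept.items()}

# The "human" distribution: a rich vocabulary with a long tail of rare words.
dist = {"the": 0.50, "said": 0.30, "cat": 0.15,
        "ziggurat": 0.04, "sesquipedalian": 0.01}

for generation in range(4):  # each model trains on the previous model's output
    dist = next_generation(dist)
    print(generation, sorted(dist))
```

The rare words vanish first, and within a few generations only the single most common word is left. That’s the “dementia” in miniature: each round of training on synthetic output forgets a little more of the original distribution.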
In addition to the greed, mediocrity and boredom that these tools generate, they also reinforce prejudice and bias. This is one of the biggest problems we see at the moment.
If you ask Midjourney or similar image-generating tools for an image of someone, it’s sure to surprise you with the most biased, stereotypical image you can imagine. Remember when Buzzfeed asked AI to create Barbies for every country?
Yeah, that went as expected.
But rifle-carrying Sudanese and Russian Barbies or light-skinned South American Barbies aside, what remains is the representational harm that AI generates. Biases that exist in the real world are exaggerated in AI, distorting the way we see people and reinforcing stereotypes.
Jon Cheung from the London Interdisciplinary School explains: “Given that generative A.I. will have a huge impact in shaping our visual landscape, it’s important to understand that the reality it presents can often be distorted, where harmful biases relating to gender, race, age and skin color can be more exaggerated and more extreme than in the real world.”
AI acts like your drunk right-wing uncle on Thanksgiving, repeating every stereotype about women, foreigners, races and religions he’s ever heard, and making up some more for fun.
Take the story of Rona Wang, an Asian MIT student who tried to use the Playground AI image generator to get a professional headshot. AI decided that to look professional, you have to be white.
The AI-generated image showed Wang as if she were “Caucasian,” with blue eyes, lighter skin, and freckles.
At least the AI let her keep her gender. From what I saw in my many experiments, Midjourney is very partial to the idea that being a professional means you’re a man.
So, should we get rid of AI tools? After all, they’re playing a big part in the enshittification of our day-to-day experience.
You’ll be surprised to hear that I’m actually a big fan of AI tools. I use them a lot every day. But I use them for simple tasks that can easily be delegated to a machine. I use them to summarize texts. I ask for simple explanations, or I brainstorm topics for LinkedIn posts.
With all tools, you have to know what to use them for. Not everything is a nail, and not every problem should be solved with AI.
The source of the issues we see isn’t really AI or LLMs. It’s this insane drive to make as much money as possible out of everything we invent. Immediately. Instead of taking slower steps and developing ideas on how we can use AI tools to make our lives more comfortable — to outsource monkey work to machines — there’s been a ridiculous gold rush.
Those who were prepared to ignore all the ethical and technical problems that this new technology came with immediately began to milk the AI space for every last penny.
They’ve created the impression that this is a mature technology with magical powers. But it’s not; it’s barely out of diapers.
Its understanding of the world is that of a toddler. If we want to prevent it from turning us all into petulant two-year-olds, we must stop following its lead and ensure that it grows up into a responsible adult.
If you’ve enjoyed this article and want to support my writing, buy me a cup of coffee! For more of my writing, subscribe to my newsletter or follow me on LinkedIn.