Last Week Tonight with John Oliver
Season 12 Episode 16
Aired on June 22, 2025
Main segment: AI slop
Other segments: Iran–Israel war
Guest: Michael Jones, British woodcarver and content creator
John Oliver explains why you’ve been seeing more AI-generated content online, the harm it can do, and – sadly – why it is threatening his marriage. Do you hear us, cabbage Hulk? Stay the hell away from John’s cabbage wife.
* * *
JOHN: Our main story tonight concerns the internet. It gave us unprecedented access to the entirety of human knowledge — and more recently, this heartwarming video of a single cat raising his son alone… that takes, I’m going to warn you, a very dark turn.
Meow meow meow meow meow meow meow meow meow meow meow meow meow meow meow meow
[Applause]
[Music]
JOHN: Yeah, I know. The CGI Garfield movie goes so much harder than you remember.
The only thing more upsetting than that video is that one of the top comments on it is someone asking, “Is this real?” And if you wrote that, lean in close. Everyone else — just please cover your ears real quick, ’cause the next part is just between me and them.
Okay. Yes. Those — those are real cats. They’re — they’re my cats, in fact. And I know you’re thinking, “Why are they wearing clothes?” It’s because I taught them shame. Now, I’m about to lie to everyone else and say that they’re not real. But you and I will always know the truth. Okay? Okay.
So clearly, they’re not real. But another comment under that video simply says: “Why did I watch this and why does this have almost 16 million likes?”
And that’s actually what we’re going to be talking about tonight. Because that video was made with AI. It’s one of dozens of similar videos by the same creator. And if you’ve been seeing more stuff like that lately, it’s because the spread of AI generation tools has made it very easy to mass-produce — and flood social media sites with — cheap, professional-looking, often deeply weird content.
There’s even a name for it: AI slop. And it can come in lots of forms. It can be news articles, or music on Spotify, but you’ve probably encountered it the most in weird images and videos that have begun to dominate everyone’s feeds — whether it’s images of a Jesus made out of shrimp, or videos like Baron Trump wowing the judges on America’s Got Talent while his dad plays backup piano, a pug raising a baby on a desert island, or Pope Francis taking a selfie with Jesus while flying through heaven.
And my favorite part there is that AI somehow chose to give Jesus a watch. And I love the idea that every once in a while, Jesus in heaven looks down at his watch and goes: “Shit. It’s already 4:00 p.m. I’m late to go flying with one of the three dead popes that actually made it up here.”
Slop can be incredibly popular. This AI soup recipe was the seventh most viewed post on Facebook at the end of last year. And at another point, three of the top 20 most seen posts there were AI-generated — including this one of a giant fan bed, which had 35 million views.
And at the start of this year, this image of a horse made out of bread got nearly 50 million views on Facebook — and was even loved by Mark Zuckerberg.
AI slop is basically the newest iteration of spam. As the CEO of one AI content detection platform put it: “Not all AI content is spam, but right now all spam is AI content.”
And it’s becoming a problem on multiple levels. For a start, websites that were previously useful are now becoming much less so — as this woman makes clear:
Pinterest is becoming unusable and I just need to talk about it. This is me just searching ‘garden’ and almost everything is AI generated. Like, what is that cat?
I keep scrolling and like, you literally click on anything and it’s like — this is so obviously AI. And it’s so frustrating.
I want actual images!
Pinterest used to be my favorite app and a place for people to share pictures they’ve taken and for other people to be inspired by those pictures. But now, it is just exhausting and gives me a headache every time I try to use it.
JOHN: I know she might seem calm there, but for a Pinterest user, that is white-hot rage. She is seconds away from knitting a full-blown manifesto.
But it’s not just Pinterest. There’s now a booming market of AI-generated videos about completely fictitious news stories — like this one titled:
Judge Fines Caroline Leavitt for Wearing a Cross (misspelled in the original title), only to discover she’s a legal genius. It is 37 minutes long, has over a million views, and here’s just a taste of the court proceedings:
“That cross,” Hargrove pointed at her neck, his tone rigid. “It doesn’t belong here. The courtroom is a neutral place, not somewhere to display faith.”
She realized instantly Hargrove wasn’t just targeting her — but the cross itself, a symbol he seemed to despise.
JOHN: Yeah, the news. You know how the news is — HAL 9000’s voice, comic book font, and of course, third-person omniscient narration.
That channel has a lot of fake confrontations that follow a similar formula, like:
Caroline Levit Bankrupts The View
Brad Pitt Storms Off After Heated Clash with Caroline Levit
Caroline Levit Mocked by Famous Pianist, Then She Played and Silenced the World
And my personal favorite: Pope Leo 14th Orders Caroline Levit to Take Off the Cross
Which makes sense, doesn’t it?
“You can’t wear that in here, Caroline. This is the fucking Vatican. Have some respect!”
The problem is, the comment section under that courtroom video is filled with people who clearly have absolutely no idea that it isn’t real — saying things like:
“What a tremendous victory for Christians everywhere. Proud of you, Caroline, for putting that judge in his place with grace and intelligence for once. She is a force to be reckoned with.”
And when one commenter points out that the video is fake, a reply says:
“No, it happened. But it was Pam Bondi. I just watched a YouTube video.”
Because of course, there are also multiple slop videos out there where the exact same thing happens to Pam Bondi. And if that’s starting to give you an uneasy feeling in your stomach right now, get used to it — because it seems extremely likely that we’re going to be drowning in this shit for the foreseeable future.
So given that, tonight let’s talk about AI slop — what it is, where it comes from, and the harm it can do.
Before we take all of the fun out of it, I’ll acknowledge: some AI-generated content can be enjoyable. Exhibit A: Catchester by the Sea. Which makes sense. The whole point is to grab your attention. That is why you’ll see visually arresting stuff like videos of:
Incredibly buff babies
Cute dogs working human jobs
People transforming into fruits and vegetables
That is actually a weirdly common trend. And I think my favorite version of it might be this:
[Music]
JOHN: Okay. I have so many questions there — from why Cabbage Hulk kept his human hands, to why he suddenly tore his own head off to reveal a different kind of cabbage. But the one thing I know for sure is: he better stay the fuck away from my wife. Stay away from her.
Now, as for AI music — at first, it can sound pretty close to the real thing. Like this country song, whose lyrics are only about 15 degrees off:
[Music]
JOHN: Cool. That is from a band called The Devil Inside, and one good clue their music is AI-generated is that they’ve put out over 10 albums in the past two years. Other clues include the fact that they complain about “wicked dust” twice in the same chorus, and two of their top five songs on Spotify are also dust-related — which makes this feel less like a dark country band and more like a secret ad for Swiffer.
It is now incredibly easy to create slop like that — which, as this CEO of an AI music generator points out, is one of its key selling points:
It’s not really enjoyable to make music now. People enjoy it, sure, but it takes a lot of time. It takes a lot of practice. You need to get really good at an instrument, or really good at a piece of production software.
I think the majority of people don’t enjoy the majority of the time they spend making music.
JOHN: Okay first — and this is a hot take — I think a lot of people might actually enjoy making music. It’s probably why they were doing it for the 40,000 years of human history prior to AI. But if that is not you, and you would rather make music by simply pressing a button, then good news: you don’t need AI. There are already a wide variety of toys made for babies just like you.
But he is right that AI tools have now reduced the barrier of entry to people who want to produce writing, images, or music that can seem plausibly professional. And even big platforms are now getting in on this.
Here’s Mark Zuckerberg proudly unveiling a new suite of AI tools for Meta users:
We call it Emu — for Expressive Media Universe, continuing with our animal theme.
Um, and just take a look at these images, because today we’re starting to roll out a bunch of products with this in it.
And it’s, uh… you know, they’re high quality, photorealistic — but you know, one of the coolest things is that Emu generates them fast.
JOHN: Okay, first — that is just the least charismatic product demo I’ve ever seen. You think Steve Jobs would be caught dead uttering a dud phrase like “continuing with our animal theme”? He’d sooner acknowledge his own children.
But Zuckerberg’s obviously trying to make sure Meta capitalizes on the popularity of AI by keeping it within their ecosystem. He said that adding AI images to user feeds is the next logical jump for Facebook and Instagram. They’ve even tweaked their algorithm so that more than a third of what people see on their Facebook feed now comes from accounts they don’t follow. That’s how slop sneaks in — without your permission.
So, it seems to be good business for them. But what’s in it for the people that produce it?
Well, platforms like Meta, Twitter, YouTube, and TikTok now have monetization programs where they make direct payments to people who successfully go viral. And there’s now a whole industry of AI slop gurus offering to sell you their secrets of how to make profitable slop — for a small fee.
Remember that cat video from before? Here’s its creator trying to sell you a training course:
“Hey, do you want to go viral and make the same videos as I do? It’s really simple. Click the link in my bio, and I’ll guide you step by step on how to do it.”
JOHN: That class costs $17.99 and will supposedly unlock the secrets to crafting viral cat AI videos that capture hearts on TikTok. And you know who should be pissed off about that more than anyone else? Cats.
They’ve been going viral since the dawn of the internet. And now, along comes this sombrero-wearing motherfucker, who isn’t even a real cat by the way, trying to cut them out of the profits. Cats would be so mad about this… if they ever gave a single shit about anything humans do.
Now, the good news is: you don’t need to give your credit card number to an AI cat, because I’ll give you the three basic steps for free.
It’s pretty simple.
Step One: You make a page on a social media app and build a following. While most platforms require a certain threshold of followers or engagement before they’ll pay you, you can easily get around that by buying a pre-existing account — like this one that already has thousands of followers.
Step Two: Create and post as much engagement-bait slop as possible. Here’s a video by another slop guru showing you just how quickly that can be done:
“So what we’re going to do — we’re going to type in home decor inspiration Pinterest. Why? Why Pinterest? Because Pinterest is a heavy image-based platform. And what you’re going to find as we scroll through these: probably 80% of these images are AI generated. Now what’s so cool is that we can come to this free tool right here — Ideogram — and ask it: ‘Create a beautiful bedroom design with black curtains and make it hyperrealistic.’ You’re going to find that the images — boom — just rendered. And right off the bat, for free, we have four images that could do really, really well.”
JOHN: Okay, putting aside that those photos look like a suicidal West Elm — that is why Pinterest is drowning in AI now, and why users like this woman are swearing vengeance with their indoor voice.
And yet, the maker of that video is willing to take $379 from you to teach you more.
But it’s not just Pinterest photos. Here’s another guru explaining how you can buy his proprietary AI tool to make literally every part of a viral video:
“Let’s, for example, do a funny texting story.
And then let’s go down here and select gameplay video.
And let’s do Subway Surfers for the background of this one.
Click generate — just like that, our next video is done.
Let’s take a look at what it looks like:
‘Ever send a text that completely backfired? Let me tell you about my friend Jake.
One day, he meant to text his girlfriend ‘Can’t wait to see you tonight,’ but instead, he accidentally sent it to his boss.
His boss replied: “Uh, I hope this isn’t about work.”
Panicking, Jake quickly typed back…’
As you can see, it’s got a great original story that could go viral on social media on any of those short-form platforms.
And you can make hundreds of these every single month.
And all you need is a couple to take off to potentially make a real business out of even doing this.”
JOHN: Yeah. All he did was type funny texting story, and AI vomited out the rest.
Although, to be fair, I don’t know if you can call “accidentally sending a text” a “great original story” — as it is one of the four basic plots of any sitcom. Along with:
Accidentally scheduled two dates at the same time
Roommates divide the apartment in half with tape
Blues Traveler’s bus broke down outside and now they’re coming to dinner
Still, as that leather-clad child explained, AI slop is ultimately a volume game. Like any form of spam, slop creators will often jump on a trend that feels like it’s got any kind of traction. So if one type of image or video goes viral, they’ll churn out hundreds of copycats.
That’s why, suddenly, you might start seeing a ton of old people posing with birthday cakes. In that one, it’s apparently the “hundth” birthday of a woman named Dutton Dwi.
Similarly, you may have seen images of:
Soldiers holding up signs about how it’s their birthday
People posing next to wood carvings of animals with captions like “Made it with my own hands”
But interestingly, when internet sleuths traced this AI image — which has nearly a million likes — back to what seems to be its original source, it turned out to be this real image of an actual wood carving created by an artist named Michael Jones.
He’s apparently based in England and does some amazing work — look at this carving he did of a horse. Or this one of… whatever the fuck that is. Those are real sculptures he made — with a fucking chainsaw.
And yet, his work has been stolen and turned into endless variations. And he himself has said that getting ripped off by slop is a huge issue for him and other carvers all over the world, who are sadly missing out on the rightful credit and exposure for their work.
Which makes sense. And it is a good reminder that AI generators rip off the work of actual artists without compensating them.
But assuming that you, as a slop creator, don’t give a shit about that — you can simply move on to:
Step Three: Getting Paid.
And there are a couple of ways to do this — from being paid directly by the platform (like I mentioned earlier), to other revenue streams based around affiliate marketing or linking to items for sale online.
For instance, this slop account on Instagram features a variety of AI videos — from world leaders walking the catwalk, to babies, to videos of animals attacking each other. Like this elephant absolutely going to town on an alligator.
Now, if you go to the bio on that page, you’ll see it links to this site selling all sorts of random products that they’ll get a commission from if you buy them — from:
Snap-on teeth veneers
To this weird duck-shaped pillow for infants, which brags that it’s somehow based in research and development for stunned babies
To this “Naughty Boy Creative Table Lamp” whose switch is… his tiny dick.
And look — it is wedding season, and if you want to go off-registry… that is all the way off.
Now, I’m guessing you can’t make a ton of money off that. And honestly, you’re unlikely to make a ton from the platforms either — because for all the talk of riches on those slop gurus’ videos, the money involved here can be relatively small.
On Meta, for instance, one reporter found that payments for single images could vary from just a few cents per photo to hundreds of dollars — if it goes mega viral. And while that clearly isn’t enough to live on, it can be more appealing if you live in a country where that kind of money goes further.
That is why many slop pages are now operated from places like India, Thailand, Vietnam, Indonesia, and Pakistan.
But there are some real downsides to the unchecked growth of all of this — some of which are admittedly pretty minor. Like the fact that lots of us are now having to explain to our parents that the unbelievable thing they saw on Facebook… is not real.
“Look at this owl. He’s beautiful. It’s a Norwegian Giant Owl from Norway.”
“Mother, dude. That’s not a real cat. Can you see that? Do you see how like soft it is, and it’s like—”
“What do you mean it’s not a real cat? That is ridiculously cute.”
“Mom. How is that real?”
“Mom, that’s AI.”
“Is it?”
“That’s fake.”
“No fucking way.”
“That’s not real.”
“They’re motherfuckers.”
JOHN: Yeah. They are motherfuckers indeed. And real quick — for anyone watching this, here’s a pretty good rule of thumb: if you see an animal that’s so cute it defies reality, and it is not Moo Deng, odds are it’s AI.
But we’ve clearly got bigger problems than people being duped by non-existent animals. There’s an environmental impact — from the energy and resources consumed in producing all of this shit.
And then there’s the fact that some slop makers specialize in videos that claim to depict real-world calamities, which can lead to the spread of worrying misinformation. Like this:
These explosions are fake, but they’ve racked up millions of views. They’ve already been shared across social media, including this claim that the explosions are in Ukraine. Before this account disappeared, a reverse image search could take you back to the creator — with more videos. And they all have similar issues: See how huge this white car looks? And also, how are these homes still standing with a gigantic fireball behind them? The audio track is the same in every video. So, why make content like this? This creator used to make videos about tornadoes. And before that, about planes on fire — possibly experimenting in the hopes of going viral.
JOHN: Oh, come on. Air travel is scary enough now without people making up new disasters. This is why you have to be careful and look for clues — like how the side of that plane says nonsense like Fracken Fire, and not something much more believable, like Boeing.
The thing is, sometimes fake videos about news stories can really catch on.
During the L.A. fires, a lot of people were initially fooled by photos and videos — like this one of the Hollywood sign engulfed in flames. That was pretty unnerving to L.A. residents, like these two men who went to check it out in person:
“When we saw the posts on Instagram, we were devastated. We’ve lived here all our lives, and uh, when we saw that on Instagram, it broke our heart. So we wanted to come and see firsthand, and uh, we’re really shocked that the— there’s no fires. There’s no fires. And they’re faking it, basically. You know, when people are actually losing their houses, you guys should, uh, help each other out instead of, uh, trying to create false narratives and, uh, trying to promote BS.”
JOHN: He’s right. You shouldn’t be faking calamities — if for no other reason than that you could make those two so worried they trek up to the Hollywood sign. A hike that’s a mixed bag even when it isn’t on fire, since it has online reviews like:
“Watch out for all the horse poop. There was a lot of horse poop on the floor.”
“This trail stinks because of all the horse poo.”
And my personal favorite:
“You should never hike this trail during a windstorm. I was being hit by flying bits of dried horse poop for about 3/4 of a mile before deciding to turn back. It was something. At this point, I decided to abort my hike and ran back to my car. I had bits of dried horse poop stuck in my hair and my sweater. I had a good laugh about it after — but listen to me, children: never hike this trail if it gets too windy. You have been warned.”
And it’s not just the fires. Right now, the Israel-Iran conflict has unleashed a wave of AI disinformation, with one expert saying it’s the first time we’ve seen generative AI be used at scale during a conflict.
And during last year’s flooding in North Carolina, a bunch of fake images started circulating online, which quickly became a problem for first responders:
“We saw a flood of images on social media depicting — and here they are on your screen right now — depicting what appeared to be victims in the flood.
What had happened was, the relief workers — also, who use social media to, for example, find areas where people might need rescue, like this person rescuing a dog or what looked to be, you know, a toddler being rescued —
they use social media just like us to look for areas that need help.
And so this was creating a lot of noise, and it was making it more difficult for them to act quickly.”
JOHN: Yeah, you don’t want emergency personnel allocating resources based on fake information.
Although, I will say, if medics are going to rush to the scene of an AI-generated video — for God’s sake, make it this one. Somebody send an ambulance to that poor cabbage man. He just tore the top half of his purple head off. He cannot be okay.
You may remember that those images were also used by Republicans as evidence that President Biden was failing to deal with the emergency. One member of the Republican National Committee actually posted that photo of the girl with the puppy, captioned:
“This picture has been seared into my mind. My heart hurts.”
But even after being told it was fake, she refused to take it down — writing:
“I don’t know where this photo came from, and honestly, it doesn’t matter. It is emblematic of the trauma and pain people are living through.”
And it’s pretty fucking galling for the same people who spent the past decade screaming “fake news” at any headline they didn’t like… to be confronted with actual fake news — and suddenly be extremely open to it.
You can’t just believe something because you saw it in a picture. Otherwise, our graphics team could make you believe any sad situation. Like:
Oscar the Grouch is in the Illuminati
Or Bigfoot is having trouble selling his feet pics online
Actually… that is compelling. I don’t care where it came from. But it’s seared into my heart.
Now, I will say — at least when it came to last year’s elections, many experts agree that AI’s negative impacts were far less extreme than originally feared. And it is good that large groups of people weren’t fooled into thinking that:
Kamala Harris wore a communist uniform
Or that Trump was violently arrested
Or that Biden put on a Trump hat
Except — wait — that last one was actually real. He really did that. Two months before election day.
How on earth did Democrats lose.
But even if people treated fake images last year with skepticism, there are two catches:
First: AI is already significantly better now than it was then. So it may be easier to fool them going forward.
Second: The very fact people doubted what they saw on the internet actually comes with a bit of a downside — as this disinformation researcher explains:
“Because people know that something might be a deepfake, they actually then stop believing in real things that actually did happen.
And they kind of… discount a true story because they’re able to tell themselves:
‘Oh, I think that might be a deepfake, actually.’”
JOHN: Right. It’s not just that we can get fooled by fake stuff — it’s that the very existence of it then empowers bad actors to dismiss real videos and images as fake.
It’s an idea called the liar’s dividend — which I know sounds like an airport spy thriller James Patterson banged out in a weekend — but it is a real problem.
And you don’t have to look far to see people doing this.
A lawyer for one of the January 6th defendants argued that the government’s evidence was deepfaked — which is an extremely funny thing to say about a coup attempt that was livestreamed.
And just two weeks ago, when Gavin Newsom posted this real photo of soldiers deployed to L.A. sleeping on the floor, the conspiracy theorist Laura Loomer retweeted a claim that it was fake, saying: “Looks like Gavin Newsom used an AI photo to smear President Trump.”
And of course, Trump’s done this too. He falsely claimed photos of Kamala Harris greeting a large crowd were AI-generated. And when Democrats cut together videos of him having old man moments, he blamed that on AI as well — saying: “Artificial intelligence was used by them against me in their videos of me.”
Which actually might be the most AI-generated sounding sentence in this entire piece.
So, to summarize: AI slop can be somewhat lucrative for its creators, massively lucrative for the platforms that use it to drive engagement, and worryingly corrosive to the general concept of objective reality.
And now I’ve shown you a simple three-step process on how to make it — but, you know, please don’t. Especially given I just saved you from paying $17.99 to a cat in a sombrero.
So, what can we do?
Well… the truth is: not a lot.
Some platforms have started labeling AI content, but even those attempts have been lackluster. Meta, for instance, recently started requiring an AI label when content is realistic — but notably, that only applies to audio and video, not to images. And if creators don’t actively add those labels, it can be hard for a platform to detect AI — as it’s getting increasingly hard to spot.
On an individual level, if you’re sick of all the slop in your feed, you can block accounts that post it or click “not interested,” which might reduce the amount of it that you see. But that’s not going to stop it all.
And look, I’m not saying some of this stuff isn’t fun to watch. What I am saying is: some of it’s potentially very dangerous. And even when it isn’t, the technology that makes it possible only works because it trains on the work of actual artists. So any enjoyment you might get from weird, funny AI slop tends to be undercut when you know that someone’s hard work was stolen in order to create it.
So I don’t have a big fix for all of this — or indeed any of it. What I do have, though, is a petty way to respond. Because perhaps one small way to get back at all the AI slop ripping off artists… would be to create real art by ripping off AI slop.
So please — come with me. Come with me.
Because we managed to track down Michael Jones, the chainsaw artist, and we commissioned a special piece from him — ripping off what I consider to be the finest, most inexplicable piece of slop produced to date.
So ladies and gentlemen, I proudly present — from the chainsaw of Michael Jones — a carved manifestation of the Cabbage Hunk.
Look at him. In all his muscular cabbage glory. Look at the details there. This thing is an absolute masterpiece.
And seeing this — doesn’t it make you want to thank the real artist who made it?
Well, I’ve got some great news for you. You can do that. ‘Cause we flew him here too.
Michael is here. Michael Jones, everyone!
Michael Jones, thanks so much, mate. Fantastic.
What a fun way to celebrate the destruction of our shared objective reality.
We are fucked — that is our show. Thank you so much for watching. We’ll see you next week. Good night.
Michael Jones! Michael Jones!
[Music]