Cameron Peters here — it's Wednesday morning.
After Meta and OpenAI launched dueling AI video apps last month — Vibes and Sora 2, respectively — AI slop is even more inescapable than it already was. To understand what tech companies see in the prospect of unending short-form AI video, and how to tell if you’re being subjected to it, Vox’s Today, Explained podcast spoke with Hayden Field, a senior AI reporter at Vox’s sister publication, The Verge.
Read on for an excerpt of Field’s conversation with my colleague Sean Rameswaram. (You can also listen to the full podcast here.)
Sean Rameswaram
What is Mark Zuckerberg trying to do with Vibes?

Hayden Field
That is the million-dollar question. These companies, especially Meta right now, really want to keep us consuming AI-generated content and they really want to keep us on the platform.
I think it's really just about Zuckerberg trying to make AI a bigger piece of the everyday person's life and routine, getting people more used to it and also putting a signpost in the ground saying, “Hey, look, this is where the technology is at right now. It's a lot better than it was when we saw Will Smith eating spaghetti.”
Sean Rameswaram
How did it get so much better so fast? Because yes, this is not Will Smith eating spaghetti.

Hayden Field
AI now trains itself a lot of the time. It can get better and train itself at getting better. One of the big things standing in their way is really just compute. And all these companies are building data centers, making new deals every day. They're really working on getting more compute, so that they can push the tech even more.
Sean Rameswaram
Let's talk about what OpenAI is doing. They just released something called Sora 2. What is Sora?

Hayden Field
Sora is their new app and it's basically an endless scroll AI-generated video social media app. So you can think of it as an AI-generated TikTok in a way. But the craziest part, honestly, is that you can make videos of yourself and your friends too, if they give you permission. It's called a Cameo and you record your own face moving side to side. You record your voice speaking a sequence of numbers and then the technology can parody you doing any number of things that you want. So that's kind of why it's so different than Meta's Vibes and why it feels different when you're scrolling through it. You’re seeing videos of real people and they look real. I was scrolling through and seeing Sam Altman drinking a giant juice box or any number of other things. It looks like it's really Sam Altman or it looks like it's really Jake Paul.
Sean Rameswaram
How does one know whether what they're seeing is real or not in this era where it's getting harder to discern?

Hayden Field
These tips I'm about to give you aren't foolproof, but they will help a bit. If you watch something long enough, you'll probably find one of the telltale signs that something's AI-generated.
One of them is inconsistent lighting. It's hard sometimes for AI to get the vibes of a place right. If there's a bunch of lamps — maybe it's really dark in one corner, maybe it doesn't have the realistic quality of sunlight — that could be something you could pick up on. Another thing is unnatural facial expressions that just don't seem quite right. Maybe someone's smiling too big or they're crying with their eyes too open. Another one is airbrushed skin, skin that looks too perfect. And then finally, background details that might disappear or morph as the video goes on. This is a big one.
Taylor Swift, actually — some of her promo for her new album apparently had a Ferris wheel in the background and the spokes kind of blurred as it moved.
Sean Rameswaram
Anything else out there that we should be looking for?

Hayden Field
I just wish we had more rules about this stuff and how it could be disclosed. For example, OpenAI does have a safeguard: Every video that you download from Sora has a watermark, or at least most videos do. Some pro users can download one without a watermark.
Sean Rameswaram
Oh, cool, so if you pay them money, you could lose the watermark. Very nice.

Hayden Field
But the other thing is I've seen a bunch of YouTube tutorials saying, “Here's how to remove the Sora watermark.”
Sean Rameswaram
Do companies like OpenAI or Meta care if we can tell if this is real or not? Or is that exactly what they want?

Hayden Field
They say they care. So I guess that's all we can say right now. But it's hard because by the very nature of technology like this, it's going to be misused. So you just have to see if you can stem that misuse as much as possible, which is what they're trying to do. But we're going to have to wait and see how successful they are at that. And right now, if history is any guide, I'm a little concerned.
⮕ Keep tabs
Trumpian diplomacy: My colleague Josh Keating explains the hard questions that the Gaza ceasefire put off — and what we know about whether it will stick.
Accessing a miracle: An innovative new drug has the potential to revolutionize HIV treatment, writes Vox’s Pratik Pawar. But breakthrough science doesn’t guarantee a breakthrough in public health.
“I belong to this country”: The US welcomed Afghan refugees after withdrawing from the country in 2021. Now, the Trump administration wants to send one man back to his death. [Washington Post]
Group chat leak: Politico obtained months of racist, violent, antisemitic Telegram messages between leaders of state Young Republican groups. [Politico]
What the Gaza ceasefire really means
What happens next in Gaza now that a ceasefire has been reached and how the last two years might have taught the world the wrong lessons about war.
What’s it like to storm a Revolutionary War fort? The Atlantic’s Caity Weaver set out to accompany a band of reenactors and find out. You can read her deep dive into their world here.
Today’s edition was produced and edited by me, staff editor Cameron Peters. Thanks for reading!
Are you enjoying the Today, Explained newsletter? Forward it to a friend; they can sign up here. And as always, we want to know what you think. Let us know by filling out this form or just replying to this email.