Fact File: How Sora’s AI videos made going viral easy, and tips on how to spot them
Entertainment, emotion, surprise — these are some of the ingredients that make videos go viral. But seeing is no longer believing, and many of the gripping videos now filling social media feeds bear the watermark of an AI app that lets users create seemingly real videos that are anything but.
Videos generated with the Sora 2 text-to-video app, developed by OpenAI, have fooled many internet users since the app launched in September. From fake ICE arrests to the Louvre heist, videos from Sora and other artificial intelligence models claiming to show real events are racking up millions of views online.
And while digital hoaxes once took a lot of time to create and a bit of luck to go viral, the ease with which they can now be generated and shared has raised alarm about the lack of ethical guardrails and how these videos fuel distrust online. But, as one expert says, some simple digital literacy skills could help protect people from getting fooled.
FROM HUNDREDS OF HOURS TO THE CLICK OF A BUTTON


