AI takes us deeper into the morass

Videos made on OpenAI's Sora are watermarked to indicate they are produced by artificial intelligence. Credit: NurPhoto / Jonathan Raa via Getty Images
The relationship of humans to truth was tenuous enough before the rise of artificial intelligence. We have long been a baffling species, by turns skeptical of what's genuine and accepting of what's not. By making fakes look and sound more realistic and flooding social media feeds with them, AI amplifies those tendencies.
Some of what's generated is silly — Uncle John with a woolly mammoth, Muffin the cat perched precariously on the snout of an alligator.
Some of it is unsettling, like deepfake videos of dead celebrities created on Sora, OpenAI's AI-powered video-generation platform. Earlier this month, family members of the late comedian George Carlin and actor Robin Williams asked OpenAI to put restrictions on these deepfakes. OpenAI responded with a free speech argument, and it's inevitable that some of this will end up in court.
While many of these videos are obviously ludicrous — physicist Stephen Hawking performing tricks in his wheelchair, Michael Jackson doing stand-up comedy — some are more concerning, like a video of the Rev. Martin Luther King Jr. floundering through a speech, and financial scams that used AI to mimic the voices of Taylor Swift, comedian Steve Harvey, and podcaster Joe Rogan.
Videos created on Sora are watermarked to clearly indicate they are produced by AI, but videos created on other platforms, or by other means, inevitably will lack a similar identifier.
The cumulative damage is that the ubiquity and quality of this AI content make it just another part of everyday life, eroding the skepticism we should have and the objections we should raise. Extensive use of AI videos by no less than the president of the United States to criticize opponents and praise himself further normalizes this new technology.
Being unable or unwilling to discern what's true, or simply not caring, is bad enough. But this past week introduced another AI-related fault line in which artificial intelligence is not only a creator of untruth but also a defense against truth.
This came into focus when President Donald Trump's nominee to head the Office of Special Counsel, Paul Ingrassia, withdrew his name from consideration after texts were uncovered in which Ingrassia said he had a "Nazi streak" and urged that the Martin Luther King Jr. holiday be "tossed into the seventh circle of hell," cratering his support among the Republican senators who would vote on his confirmation.
His lawyer fingered a different culprit: AI might have done it.
"In this age of AI, authentication of allegedly leaked messages, which could be outright falsehoods, doctored, or manipulated, or lacking critical context, is extremely difficult," attorney Edward Andrew Paltzik told Politico. Paltzik had previously suggested the texts could have been "manipulated."
Similarly, after Politico exposed racist and antisemitic comments in a group chat among members of the Young Republicans organization, one participant told the news organization he had "no way of verifying accuracy" and was "deeply concerned" that "the message logs in question may have been deceptively doctored."
In one respect, this was business as usual. Politicians and public relations people have always questioned the accuracy of damning reports and leaks. But having an already controversial boogeyman like AI to blame might make these defenses more potent. Surely, we'll see more such denials.
It's quite a world we've made for ourselves, where AI can be used to create outright fiction that passes as truth, and to dismiss as fake things that are actually true.
In their 1969 dystopian chart-topper "In the Year 2525," the pop-rock duo Zager and Evans sang, "Ain't gonna need to tell the truth, tell no lie ..."
Here we are in 2025, increasingly unable to tell the difference.
Columnist Michael Dobie's opinions are his own.
