There is rising concern that U.S. adversaries will use new technology to make authentic-looking videos to influence political campaigns or jeopardize national security. Credit: AP

As the war for truth rages on, most prominently in our political discourse, we need to face an onrushing new reality:

We ain’t seen nothing yet.

For all the evidence of all the lies that infected the 2016 presidential campaign and beyond — from the fake ads and fake news and fake accounts on social media, to the fake political websites spewing fake stories from places like Russia and Macedonia, to the bots steering those stories to people identified as susceptible to them, to Russia's denials in the face of indictments detailing its involvement — it's about to get a whole lot worse.

That’s the sobering conclusion of a report on artificial intelligence and international security released last week by the Center for a New American Security, an independent and bipartisan think tank based in Washington.

The report included a section titled “deep fakes.” For anyone who cares about the truth, it’s chilling.

It talks about artificial intelligence systems capable of taking any person of whom extensive audio recordings exist — pretty much any political figure, in other words — and creating synthetic voice recordings of that person that sound real. Even more alarming, that's increasingly becoming true for video as well. At the moment, these "deep fake" audio and video recordings can easily be detected by ordinary people.

But, the report warns, given the progress of these technologies, “Society may be only a few years away from such forgeries being able to fool not just the untrained eye and ear, but sophisticated forgery detection experts and systems.”

And then what?

We have a president in Donald Trump who creates his own reality out of whole cloth, with facts that aren’t facts, numbers that aren’t real, statements that don’t square with the truth, accounts of meetings contradicted by others, over and over and over. And there is a segment of the population that either doesn’t know about the armies of fact-checkers disproving these contentions, doesn’t do its own cold-eyed analysis, or doesn’t care. They’re already buying.

Now add the possibility of a "deep fake" audio or video recording to back him up. For example, the president has said that Democrats want to let MS-13 run wild in our communities, let drugs pour into our cities, and let immigrants take jobs from Americans. In the future, a real-seeming audio recording or video produced by a Trump sympathizer or Russian troll could surface online showing Sen. Chuck Schumer saying exactly that.

If some American voters could be convinced that Pope Francis had endorsed Trump . . .

Obviously, Democratic sympathizers could do the same thing to Republicans. Forgery knows no ideology (though numerous studies showed that fake news stories in 2016 favored Trump over Hillary Clinton by a 5-to-1 ratio).

Artificial intelligence will be part of the solution. AI tools can detect disinformation, weed it out and block bots. Machine-learning tools, like the Botometer API for Twitter, are getting better at ferreting out bots for removal. Twitter says that since May it has been "locking" nearly 10 million suspicious accounts weekly; published reports say Twitter suspended upward of 70 million accounts in May and June.

But if you view this cat-and-mouse game as a digital arms race, the Center for a New American Security makes clear who’s winning: “Yet too often the ability to create and spread disinformation outpaces AI-driven tools to detect it.”

In simpler times, the report notes, President Richard Nixon hung on to support in the Senate during the Watergate scandal — until the release of the infamous Oval Office tapes. They were concrete proof of his guilt.

What if these “advances” in technology not only increase our ability to disseminate lies, but also weaken our ability to accept the truth?

Michael Dobie is a member of Newsday’s editorial board.