Gabon’s president had not been seen for months. Out of the African country to get treatment for a stroke, Ali Bongo was rumored to be dead.
Then an odd video of him emerged on New Year’s Eve, denounced by political foes as fake. That prompted the Gabonese military to stage an unsuccessful coup.
Bongo eventually returned to Gabon, and gave his first televised speech last week.
Still unknown: whether the video was fake or real.
Humans have long understood the power of images. They can convince and inspire — and sow political chaos.
We’ve also understood the power of untruth, especially in politics. Hence the occupation “dirty trickster.”
Now we’re about to understand the power of fusing the visual with the lie. The era of deepfakes is upon us.
Computer-generated deepfake videos use actual footage to create a false reality. They’re ever easier to create and ever harder to detect. Cyber experts working to identify deepfakes say they’re outnumbered by those working to make them.
Disinformation — fake news stories, online personas and Facebook groups — roiled the 2016 presidential election.
That’s amateur hour next to the potential of deepfakes.
A video shows a candidate using a racial slur. Or uttering obscenities. Or presenting an unpopular viewpoint.
What we’ve seen so far has been easily detectable. The fake footage of Facebook founder Mark Zuckerberg boasting of his “total control of billions of people’s stolen data.” The doctored video of House Speaker Nancy Pelosi to make her appear intoxicated.
The good stuff is coming. And we’re utterly unprepared, judging by expert testimony at Thursday’s House Intelligence Committee deepfake hearing.
Big Tech companies like Facebook and YouTube don’t want to play referee on what’s fake. They have different policies; Facebook says posts don’t have to be “true.” And algorithms can’t catch everything.
There is little reason to expect Congress to act. Senate Majority Leader Mitch McConnell has been blocking any election security bill, a gross dereliction of duty given Russia’s meddling in 2016. On Thursday, Republicans even blocked a bill that essentially would force campaigns to report to the FBI foreign offers of assistance — akin to asking them to report a crime.
Candidates aren’t developing the rapid-response strategies they’ll need to deal with deepfake videos. Leading Democratic contender Joe Biden, though, vowed Friday not to use or share deepfakes, a pledge many of his opponents have also made.
But not President Donald Trump, who shared the Pelosi video.
The president’s role is troubling. He blurs the line between fact and fiction, denies saying things he said, and claims he said things he never did. Now a deepfake video could “prove” him correct. And he likely would tout that.
Trump, who never misses a chance to twist something to his advantage, reportedly is saying that the “Access Hollywood” video of him boasting about assaulting women was fake. That bears out experts’ fears that deepfake videos will let people deny the legitimacy of real ones.
And so humans will come to understand the power of images to destroy our faith in images.
Michael Dobie is a member of Newsday’s editorial board.