By Michael Dobie

If the smart guys worry, you should, too.

Renowned physicist Stephen Hawking attends a press conference in London, Monday, July 20, 2015. Credit: AP

Stephen Hawking, Elon Musk, Steve Wozniak and a bunch of their friends wrote a letter to the world last week. It called for a ban on autonomous weapons. Huh, you say? Think: robot armies. Machines built and programmed to kill without being tethered to or controlled by humans. Terminators, with or without Schwarzeneggerian inflections.

Hawking is the renowned physicist. Musk heads up Tesla and rocketmaker SpaceX. Wozniak co-founded Apple. Their friends included hundreds of artificial intelligence researchers. They wrote that AI has advanced so much that deployment of these killing machines is "feasible within years, not decades . . . " and raised the specter of an inevitable global AI arms race.

When people who push the envelope of science and technology are worried about pushing the envelope of science and technology, it's something to worry about.

The United States is among several governments researching and developing increasingly automated weapons. The Predator drone is the best known. South Korea uses a kind of automated gun turret on its border with North Korea that locks onto a target and fires a machine gun or grenade launcher from more than a mile away -- with no human involvement.

How large a step is it to a mobile machine with a consciousness that allows it to decide whom to kill? Its creator, perhaps? Us humans?

That's the doomsday worry of sci-fi writers, that robots will run amok, turn on us, and obliterate us, thereby transgressing the first part of Isaac Asimov's first law of robotics: A robot may not injure a human.

Relax, some AI experts say, we're nowhere near that moment. But in an arena where technology breakthroughs arrive at breathtaking speed, I'm not feeling warm and cozy about that reassurance. I mean, where are we when a robot in a Volkswagen plant in Germany kills a contractor -- accidentally, of course -- and the prosecutors' list of those who might be charged includes the robot?

But, frankly, I'm less concerned about being terminated than about how these developments corrode us from within.

Robot warriors would be the latest in a long line of "advancements" that increase the distance between humans and war. Step by step, it gets more remote. Once, two people hit and stuck each other with weapons, spilled blood, and watched it spill. Then came guns, and better guns, that let us kill from farther away. Then bombs, and better bombs, and drones. And now this.

The decision to go to war should be difficult. It should be the toughest call any leader has to make. But if you're not sending people into combat, if no human blood will be spilled, if no parent's doorbell must be rung, if no casket must be greeted, if no trumpet must play "Taps," if no child must watch a father lowered into the ground, the calculus gets easier.

And in a world that seems to grow ever more violent, that seems ever more riven by its differences, that displaces more and more of its people and flings them across its borders and shores -- that's scary.

Hooray for technology. We must keep moving forward. But let's be careful about how it's adapted. Anything can be used in a different way, and anything that can go wrong, will. You can program a robot to clean your house, and grip and squeeze the switch on a vacuum . . . or the trigger of a gun.
