Why tech execs want to ban robot weapons

Artificial intelligence experts shook up the tech world this month when they called for the United Nations to regulate and even consider banning autonomous weapons.

Attention quickly gravitated to the biggest celebrity in the group, Elon Musk, who set the Internet ablaze when he tweeted: “If you’re not concerned about AI safety, you should be. Vastly more risk than North Korea.”

The group of 116 AI experts warned in an open letter to the UN Convention on Certain Conventional Weapons that “lethal autonomous weapons threaten to become the third revolution in warfare.” Speaking on behalf of companies that make artificial intelligence and robotic systems that may be repurposed to develop autonomous weapons, they wrote, “We feel especially responsible in raising this alarm.”

The blunt talk by leaders of the AI world has raised eyebrows. Musk has put AI in the category of existential threat and is demanding decisive and immediate regulation. But even some of the letter's signatories now say Musk took the fearmongering too far.

What this means for the Pentagon and its massive efforts to merge intelligent machines into weapon systems is still unclear. The military sees a future of high-tech weapon systems powered by artificial intelligence and ubiquitous autonomous weapons in the air, at sea, on the ground, as well as in cyberspace.

The United Nations has scheduled a November meeting to discuss the implications of autonomous weapons. It has created a group of governmental experts on “lethal autonomous weapon systems.” The letter asked the group to “work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies.”

United Nations General Assembly hall in New York, NY. Wikimedia Commons photo by user Avala.

Amir Husain, founder and CEO of the artificial intelligence company SparkCognition, signed the letter but insists that he is against any ban or restrictions that would stifle progress and innovation. He pointed out that the campaign was organized by professor Toby Walsh of the University of New South Wales in Australia, and was meant to highlight the “potential dangers of autonomous weapons absent an international debate on these issues.”

The industry wants a healthy debate on the benefits and risks of AI and autonomy, Husain told RealClearDefense in a statement. But a blanket ban is “unworkable and unenforceable.” Scientific progress is inevitable, “and for me that is not frightening,” he added. “I believe the solution — as much as one exists at this stage — is to redouble our investment in the development of safe, explainable, and transparent AI technologies.”

Wendy Anderson, general manager of SparkCognition’s defense business, said that to suggest a ban or even tight restrictions on the development of any technology is a “slippery slope” and would put the United States at a competitive disadvantage, as other countries will continue to pursue the technology. “We cannot afford to fall behind,” said Anderson. “Banning or restricting its development is not the answer. Having honest, in-depth discussions about how we create, develop, and deploy the technology is.”

USAF photo by Staff Sgt. Brian Ferguson

August Cole, a senior fellow at the Atlantic Council and writer at the consulting firm Avascent, said the concerns raised by tech leaders on autonomous weapons are valid, but a ban is unrealistic. “Given the proliferation of civilian machine learning and autonomy advances in everything from cars to finance to social media, a prohibition won’t work,” he said.

Setting limits on technology ultimately would hurt the military, which depends on commercial innovations, said Cole. “What needs to develop is an international legal, moral, and ethical framework. … But given the unrelenting speed of commercial breakthroughs in AI, robotics, and machine learning, this may be a taller order than asking for an outright ban on autonomous weapons.”

But while advances in commercial technology have benefited the military, analysts fear that the Pentagon has not fully grasped the risks of unfettered AI and the possibility that machines could become uncontrollable.

“AI is not just another technology,” said Andy Ilachinski, principal research scientist at the Center for Naval Analyses. He authored a recent CNA study, “AI, Robots, and Swarms: Issues, Questions, and Recommended Studies.”

USMC photo by Sgt. Lucas Hopkins

The Defense Department has to be concerned about the implications of this debate, he said in an interview. AI is transforming the world “to the level of a Gutenberg press, to the level of the Internet,” he said. “This is a culture-shifting technology. And DoD is just a small part of that.”

Another troubling reality is that the Pentagon has yet to settle on the definition of autonomous weapons. In 2012, the Department of Defense published an instruction manual on the use of autonomous weapons. That 5-year-old document is the only existing policy on the books on how the US military uses these systems, Ilachinski said. According to that manual, a weapon is autonomous if “once activated, it can select and engage targets without further intervention by a human.”

Policies and directives are long overdue for an update, he said. “We need to know what AI is capable of, how to test it, evaluate it.”

He noted that the Defense Science Board, a Pentagon advisory panel, published two studies on the subject in 2012 and 2016 but provided “no good definition of autonomy or AI in either of them.” These are the Pentagon’s top experts and “they can’t even get it straight.”

USAF photo by Master Sgt. Dennis J. Henry Jr.

Something about Musk’s warning strikes a chord with scientists who truly understand AI, Ilachinski observed. When Google’s DeepMind built AlphaGo, the computer program that went on to beat the world’s top Go players, it was a landmark achievement for AI but also brought the realization that these algorithms truly have minds of their own. “This is an issue of great concern for DoD.”

There are areas within AI that scientists are still trying to wrap their heads around. In advanced systems like DeepMind’s AlphaGo, “you can’t reverse engineer why a certain behavior occurred,” Ilachinski said. “It is important for DoD to recognize that they may not be able to understand completely why the system is doing what it’s doing.”

One reason to take Musk’s warning seriously is that much is still unknown about what happens within the brains of these AI systems once they are trained, said Ilachinski. “You may not be able to predict the overall behavior of the system,” he said. “So in that sense I share the angst that people like Elon Musk feel.”

On the other hand, it is too late to put the genie back in the bottle, Ilachinski added. The United States can’t let up because countries like China already are working to become the dominant power in AI. Further, the Pentagon has to worry that enemies will exploit AI in ways that can’t yet be imagined. Anyone can buy a couple of drones for less than a thousand dollars, go to the MIT or Harvard website, learn about AI, download snippets of code and implant them in the drones, he said. A swarm of smart drones is something the military “would have a hard time countering because we are not expecting it. It’s very cheap and easy to do.”