Any Ban on Killer Robots Faces a Tough Sell

By Jeremy Hsu | April 29, 2017 5:11 pm
The Harpy drone made by Israel Aerospace Industries can autonomously loiter in the air until it detects a radar target below. Credit: Israel Aerospace Industries

Fears of a Terminator-style arms race have already prompted leading AI researchers and Silicon Valley leaders to call for a ban on killer robots. The United Nations plans to convene its first formal meeting of experts on lethal autonomous weapons later this summer. But a simulation based on the hypothetical first battlefield use of autonomous weapons showed the challenges of convincing major governments and their defense industries to sign any ban on killer robots.

In October 2016, the Chatham House think tank in London convened 25 experts to consider how the United States and Europe might react to a scenario in which China uses autonomous drone aircraft to strike a naval base in Vietnam during a territorial dispute. The point of the roleplaying exercise was not to predict which country would first deploy killer robots, but to explore the differences of opinion that might arise on the U.S. and European side. Members of the expert group took on roles representing European countries, the United States and Israel, as well as institutions such as the defense industry, non-governmental organizations (NGOs), the European Union, the United Nations and NATO.

The results were not encouraging for anyone hoping to achieve a ban on killer robots.

The Atlantic Rift on Killer Robots

None of the governments represented, including the U.S., Israel and the European countries, was willing to sign onto a temporary ban on the development or use of killer robot systems. Perhaps the representatives felt the genie was already out of the bottle once China had used such weapons in the scenario, but in any case they seemed reluctant to restrict their own ability to deploy similar weapons.

The national governments seemed more willing to consider an international code of conduct governing how such autonomous weapons might be used. But they opposed any code that relied on certain “metrics” to evaluate the performance of autonomous weapons, arguing that independent evaluation of their weapons’ performance would threaten their industrial and military security.

Some possible differences between the U.S. and Europe did emerge in terms of their broader views on arms-control agreements. One participant pointed out that the U.S. tends to see arms-control agreements as tools for managing strategic order, such as the treaties to restrict certain nuclear missile technologies. The U.S. also tends to resist outside pressure to limit its sovereignty in deciding either human rights or military issues.

By comparison, European countries have been more open to arms-control treaties based on the humanitarian goals of limiting death and injury. For example, European countries have generally embraced the 1997 Ottawa Treaty banning anti-personnel landmines and the 2008 Convention on Cluster Munitions. The U.S. declined to sign both of those treaties.

If the Europeans push for more of a human rights perspective to guide development of future agreements on autonomous weapons, they may find themselves in conflict with the U.S.

Who Stands Against the Rise of Killer Robots?

Overall, the national governments seemed fairly unenthusiastic about outright banning killer robot technology in this particular simulation. But a bigger difference of opinion emerged between the NGO groups and the defense and tech industries.

The NGO groups stood almost alone in pushing for a ban on lethal autonomous weapons. During the simulation, the NGO representatives declared that, in response to the first apparent battlefield use of such weapons, they had enlisted 20 new countries willing to sign up for a killer robot ban, including South Korea, Japan, Canada and Norway.

But the NGOs failed to get the U.S., Israel or European countries represented in the simulation to sign onto the idea of a killer robot ban. That may reflect the similarly unenthusiastic response in real life. To date, just 14 countries have signed up to a call for a full ban on the development or use of killer robot systems. And none of those is a member of the European Union or a permanent member of the UN Security Council.

The defense industry pushed back hard against any ban on the development and deployment of autonomous weapon systems. It also resisted attempts by some national governments to impose a code of conduct on industry activities rather than restricting their own militaries’ use of such technologies.

The Harsh Reality Facing a Killer Robot Ban

Unlike the defense industry, the lone tech industry representative did quietly lend some support and funding to the NGO effort to restrict or ban autonomous weapons during the simulation. But the tech industry representative did so without openly opposing killer robot technologies. That may reflect real-life efforts by the tech industry to cooperate on guidelines for the ethical use of artificial intelligence technologies.

In reality, some countries already have the capability to build and deploy autonomous weapons. Some have built “man-in-the-loop” restrictions into their weapons, but those could just as easily be removed. For example, the Israel Aerospace Industries kamikaze drone known as the Harpy can already operate autonomously once launched: it loiters in the air until it detects anti-aircraft radar on the ground, then automatically dives down and crashes into the radar installation to destroy it.

Perhaps the upcoming summer meeting of the United Nations Convention on Certain Conventional Weapons Group of Governmental Experts on Lethal Autonomous Weapons Systems will bear some fruit for advocates of a killer robot ban. But various militaries already seem to be on the slippery slope, having developed and deployed semi-autonomous weapons. If the first fully autonomous weapon makes its battlefield debut in the near future, the Chatham House exercise suggests that most countries are unlikely to abandon their own killer robot programs.

CATEGORIZED UNDER: technology, top posts

Lovesick Cyborg

Lovesick Cyborg examines how technology shapes our human experience of the world on both an emotional and physical level. I’ll focus on stories such as why audiences loved or hated Hollywood’s digital resurrection of fallen actors, how soldiers interact with battlefield robots and the capability of music fans to idolize virtual pop stars. Other stories might include the experience of using an advanced prosthetic limb, whether or not people trust driverless cars with their lives, and how virtual reality headsets or 3-D film technology can make some people physically ill.

About Jeremy Hsu

Jeremy Hsu is a journalist who writes about science and technology for Scientific American, Popular Science, IEEE Spectrum and other publications. He received a master’s degree in journalism through the Science, Health and Environmental Reporting Program at NYU and currently lives in Brooklyn. His side interests include an ongoing fascination with the history of science and technology and military history.
