‘No to Killer Robots!’ Tech Experts Oppose the Autonomous Weapons Race

Image courtesy of Victor Habbick / freedigitalphotos.net

Over a thousand scientists, researchers, and technology experts have signed an open letter warning about the risks of an autonomous weapons race. The letter was announced on July 28 at the opening of the 2015 International Joint Conference on Artificial Intelligence (IJCAI) in Buenos Aires and published online by the Future of Life Institute (FLI).

Several prominent figures from science and technology have signed the letter, including renowned physicist Stephen Hawking, Apple co-founder Steve Wozniak, Harvard professor Lisa Randall, the University of California's Daniel Cox, Google DeepMind co-founder Mustafa Suleyman, and entrepreneur Elon Musk.

The letter was also endorsed by Google DeepMind CEO Demis Hassabis, philosopher and consciousness researcher Daniel Dennett, and Massachusetts Institute of Technology professor Noam Chomsky.

Autonomous Weapons Defined

The open letter defines autonomous weapons as weapons that select and engage targets without human intervention, combining embedded computing with artificial intelligence. One example is an armed quadcopter capable of searching for and eliminating human targets that meet a set of pre-defined criteria. The definition excludes remotely piloted drones and cruise missiles, since humans still make the targeting decisions for those.

Autonomous weapons have been described as the third revolution in warfare, the successor to nuclear arms just as nuclear arms succeeded gunpowder. The signatories of the letter are trying to prevent what they see as an unnecessary revolution in warfare from materializing.

By Flying Eye (Own work) [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons

Arguments of the Open Letter

The open letter expresses clear and strong opposition to an autonomous weapons race. Its main arguments are as follows:

No country or military power should start AI weapons development. – If any major military power begins building weapons that rely on artificial intelligence, an AI arms race is likely to follow: other countries would be very unlikely to stand by without matching that capability in militarized artificial intelligence.

Autonomous weapons lower the threshold for going to battle or declaring war. – Because AI weapons sharply reduce the attacker's own casualties, their existence is likely to produce more aggressive militaries that will not hesitate to engage opponents for the slightest of reasons. While autonomous weapons offer the benefit of lower risk, especially on the offensive, they can also lead to less prudent militaries, since the loss of human lives is no longer a critical restraining factor.

Autonomous weapons can easily become ubiquitous. – With continuously improving technology, and because the materials needed to build autonomous weapons are neither as rare nor as expensive as those required for nuclear weapons, they can easily become ubiquitous. There is no guarantee they will stay out of the hands of criminals and terrorists; it is only a matter of time before they are traded on black markets.

Autonomous weapons can be a tool for destabilizing governments or nations. – Ill-intentioned individuals or groups can use autonomous weapons to commit acts that lead to destabilization and chaos. Because they are mere machines that cannot be tortured or otherwise pressured into revealing their perpetrators, they are well suited to sowing chaos and discord.

Autonomous weapons can serve as effective tools for assassinations and subduing populations. – AI drones able to locate and attack targets as programmed or instructed are effective instruments for assassination. They could also be used in mass violence and crimes such as ethnic cleansing.

Using AI in warfare besmirches the reputation of AI scientists and experts. – Just as chemists and biologists worldwide have shown no interest in developing chemical and biological weapons and have supported banning them, and just as most physicists support the prohibition of space-based nuclear and laser weapons, nearly every artificial intelligence scientist supports a ban on AI weapons. AI experts believe it is well within their moral right, and indeed their responsibility, to refuse to let the technology they develop be used to kill people.

By Copyleft (Own work) [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons

The open letter expresses the belief that humanity stands to gain far more if an AI arms race is never pursued. AI technology can make battlefields safer for people, civilians in particular, and it should be directed toward that goal rather than turned into a tool for killing. As the letter states, the signatories are calling for a “ban on offensive autonomous weapons beyond meaningful human control.”