'We Do Not Have Long to Act': Elon Musk and Others Warn the U.N. About Autonomous Weapons
The prospect of tanks, machine guns and drones that think for themselves becoming widespread is giving tech luminaries the heebie-jeebies.
In an open letter Aug. 21, a group of specialists from 26 nations called for the United Nations to ban the development and use of autonomous weapons. The signatories include Tesla CEO Elon Musk and DeepMind co-founder Mustafa Suleyman, along with other leaders of robotics and artificial-intelligence companies. (Google acquired DeepMind in 2014.)
The letter says of autonomous weapons:
“Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”
Such open letters have been written before by AI luminaries, including one in 2015 that warned:
“Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is—practically if not legally—feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.”
In recent years Musk has been particularly prominent in warning against the dangers of AI. He has donated millions to fund research aimed at ensuring artificial intelligence is used for good rather than evil, and joined other tech luminaries in establishing OpenAI, a nonprofit with the same goal in mind.
Part of Musk’s donations went to the Future of Life Institute. Today’s letter was posted on the website of that organization, which is focused on ensuring “tomorrow’s most powerful technologies are beneficial for humanity.”
The letter was released on the day the U.N.’s recently established Group of Governmental Experts on Lethal Autonomous Weapon Systems had been slated to hold its first meeting. That meeting was canceled, the letter notes, “due to a small number of states failing to pay their financial contributions to the U.N.” Critics have argued for years that U.N. action on autonomous weapons is taking too long.
The group’s first meeting is now planned for November.