Elon Musk, Stephen Hawking Want to Save the World From Killer Robots

Elon Musk, Steve Wozniak and Stephen Hawking joined artificial intelligence researchers to call for a ban on offensive autonomous weapons.
By Alyssa Newcomb
July 27, 2015, 3:57 PM

Elon Musk and Stephen Hawking are among the leaders from the science and technology worlds calling for a ban on autonomous weapons, warning that weapons with a mind of their own "would not be beneficial for humanity."

Along with 1,000 other signatories, Musk and Hawking signed their names to an open letter that will be presented this week at the International Joint Conference on Artificial Intelligence in Buenos Aires, Argentina.

The group defines autonomous weapons as those that can "search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions."

"Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is -- practically if not legally -- feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms," the letter, posted on the Future of Life Institute's website says.

If one country pushes ahead with the creation of robotic killers, the group fears, it will spur a global arms race that could spell disaster for humanity.

"Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group," the letter says. "We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people."

While the group warns of the potential carnage killer robots could inflict, it also stresses that it isn't against certain advances in artificial intelligence.

"We believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so," the letter says. "Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control."


© 2026 ABC News