Tech Titans From Google & Tesla Take A Stand Against Artificial Intelligence Weapons


As part of the annual International Joint Conference on Artificial Intelligence in Stockholm last week, leaders from the tech world, including Elon Musk and Google DeepMind co-founders Demis Hassabis, Shane Legg, and Mustafa Suleyman, signed a policy statement crafted by the Future of Life Institute calling for “laws against lethal autonomous weapons.” The pledge represents the thinking of 170 organizations and more than 2,400 people.

Artificial intelligence weapon. Image credit: YouTube

“[W]e the undersigned agree that the decision to take a human life should never be delegated to a machine,” the pledge says. “[W]e will neither participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons. There is a moral component to this position, that we should not allow machines to make life-taking decisions for which others — or nobody — will be culpable. There is also a powerful pragmatic argument: lethal autonomous weapons, selecting and engaging targets without human intervention, would be dangerously destabilizing for every country and individual. Stigmatizing and preventing such an arms race should be a high priority for national and global security.”

As NPR points out, the latest pledge is similar to an open letter sent to the United Nations last year imploring that august body to act swiftly to ban the use of artificially intelligent weapons. Elon Musk was one of numerous luminaries who signed that document. Its contents are short and to the point. It reads in relevant part:

As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm. We warmly welcome the decision of the UN’s Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems. Many of our researchers and engineers are eager to offer technical advice to your deliberations.

Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close. We therefore implore the High Contracting Parties to find a way to protect us all from these dangers. 

Anyone who has gotten beyond middle school history knows such protestations are pointless. The history of humanity is inextricably bound up with tales about how advances in weaponry allowed one group of humans to slaughter another group of humans more efficiently. It started with the phalanx, then progressed to pikes and broadswords. Naval vessels soon learned how to lob cannonballs at each other and it was not long after the Wright brothers flew at Kitty Hawk that people figured out how to drop things from airplanes and shoot other aircraft out of the sky.

Land mines have been banned by most of the world’s nations, but not by the US, which happens to be home to several corporations that manufacture them. Chemical weapons have been banned for generations, ever since the horrors of mustard gas were first visited on troops in World War I, yet they are in use in Syria today. Hiroshima and Nagasaki put an end to the use of nuclear weapons in war, and they have since been banned for all nations that don’t already possess them. And yet the US military has secret labs where biological and chemical weapons are stored and studied — just in case.

What are the odds that a ban on AI weapons will be any more effective than bans on nuclear, biological, and chemical weapons have proven to be? If you said “slim to none,” you win a prize.

Science fiction has a way of presaging reality. Jules Verne wrote about nuclear-powered submarines long before any existed. Movies like Terminator II glorify the coming age of robotic warriors. How long before that vision becomes real? “Once there is awareness, people will be extremely afraid. As they should be,” Musk said a year ago this month.

Well, not all people, as it turns out. Congressman Pete Olsen, a Republican from Texas (naturally), told NPR, “These are machines that are learning over time from activities they’ve done. They become sort of intelligent through that learning. This is the great value, great tremendous benefit for our country.” If only Republican congressmen from Texas (or any other state) became “sort of intelligent” over time, the world might have a chance to avoid yet another Doomsday weapon system capable of terrorizing everyone on Earth.

As it is, with ignorant jackasses like Olsen on the loose, we can be certain weaponized AI will become a future scourge that takes science fiction out of the realm of fantasy and turns it into a terrible reality. Can’t you just feel the Rapture as semi-sentient machines methodically eradicate all human life in order to bring “great value, great tremendous benefit for our country”?



Steve Hanley

Steve writes about the interface between technology and sustainability from his home in Florida or anywhere else The Force may lead him. He is proud to be "woke" and doesn't really give a damn why the glass broke. He believes passionately in what Socrates said 3000 years ago: "The secret to change is to focus all of your energy not on fighting the old but on building the new."
