US lawmakers introduce bill to prevent AI-controlled nuclear launches

The bipartisan legislation would codify the requirement of ‘meaningful human control’ over any decision to launch a nuclear weapon.

‘WarGames’ (1983) (MGM/UA Entertainment Company)

A bipartisan group of US lawmakers from both chambers of Congress introduced legislation this week that would formally prohibit AI from launching nuclear weapons. Although Department of Defense policy already states that a human must be “in the loop” for such critical decisions, the new bill, the Block Nuclear Launch by Autonomous Artificial Intelligence Act, would codify that policy by barring the use of federal funds for an automated nuclear launch made without “meaningful human control.”

Aiming to protect “future generations from potentially devastating consequences,” the bill was introduced by Senator Ed Markey (D-MA) and Representatives Ted Lieu (D-CA), Don Beyer (D-VA), and Ken Buck (R-CO). Senate co-sponsors include Jeff Merkley (D-OR), Bernie Sanders (I-VT), and Elizabeth Warren (D-MA). “As we live in an increasingly digital age, we need to ensure that humans hold the power alone to command, control, and launch nuclear weapons – not robots,” said Markey. “That is why I am proud to introduce the Block Nuclear Launch by Autonomous Artificial Intelligence Act. We need to keep humans in the loop on making life or death decisions to use deadly force, especially for our most dangerous weapons.”

Artificial intelligence chatbots (like the ever-popular ChatGPT, the more advanced GPT-4 and Google Bard), image generators and voice cloners have taken the world by storm in recent months. (Republicans are already using AI-generated images in political attack ads.) Various experts have warned that, if the technology is left unregulated, humanity could face grave consequences. “Lawmakers are often too slow to adapt to the rapidly changing technological environment,” Cason Schmit, Assistant Professor of Public Health at Texas A&M University, told The Conversation earlier this month. The federal government hasn’t passed any AI-focused legislation since chatbots began proliferating. However, a group of tech leaders and AI experts signed a letter in March requesting an “immediate” six-month pause on developing AI systems more powerful than GPT-4, and the Biden administration recently opened a public comment period seeking feedback on possible AI regulations.

“While we all try to grapple with the pace at which AI is accelerating, the future of AI and its role in society remains unclear,” said Rep. Lieu. “It is our job as Members of Congress to have responsible foresight when it comes to protecting future generations from potentially devastating consequences. That’s why I’m pleased to introduce the bipartisan, bicameral Block Nuclear Launch by Autonomous AI Act, which will ensure that no matter what happens in the future, a human being has control over the employment of a nuclear weapon – not a robot. AI can never be a substitute for human judgment when it comes to launching nuclear weapons.”

Given the current political climate in Washington, passage of even the most commonsense bills isn’t guaranteed. Nevertheless, perhaps a proposal as fundamental as “don’t let computers decide to obliterate humanity” will serve as a litmus test for how prepared the US government is to deal with this quickly evolving technology.