Nuke-launching AI would be illegal under proposed US law

An AI-generated image of a nuclear mushroom cloud. (credit: Midjourney)

On Wednesday, US Senator Edward Markey (D-Mass.) and Representatives Ted Lieu (D-Calif.), Don Beyer (D-Va.), and Ken Buck (R-Colo.) announced bipartisan legislation that seeks to prevent an artificial intelligence system from making nuclear launch decisions. The Block Nuclear Launch by Autonomous Artificial Intelligence Act would prohibit the use of federal funds for launching any nuclear weapon by an automated system without “meaningful human control.”

“As we live in an increasingly digital age, we need to ensure that humans hold the power alone to command, control, and launch nuclear weapons—not robots,” Markey said in a news release. “That is why I am proud to introduce the Block Nuclear Launch by Autonomous Artificial Intelligence Act. We need to keep humans in the loop on making life or death decisions to use deadly force, especially for our most dangerous weapons.”

The new bill builds on existing US Department of Defense policy, which states that in all cases, “the United States will maintain a human ‘in the loop’ for all actions critical to informing and executing decisions by the President to initiate and terminate nuclear weapon employment.”
