New Bill Authorizes Deadly AI

The lights had barely dimmed in the Senate chamber when Elissa Slotkin slipped a legislative Trojan horse past the gate — dressed up as safety, loaded with something far more dangerous.

Senate Bill S.4113, the charmingly titled “AI Guardrails Act of 2026,” landed on March 17th with all the subtlety of a wolf wearing a crossing guard vest. The pitch? Limit how the Pentagon uses artificial intelligence. The reality? Hand one person — a single Pentagon official — the keys to autonomous killer robots and tell Congress about it later. Maybe over brunch.

The Restriction That Isn’t

Here’s how the sleight of hand works. The bill lays out three big prohibitions: no AI launching nukes, no domestic surveillance without legal basis, and no lethal autonomous weapons without human oversight. Sounds great. Sounds responsible. Sounds like something you’d frame on a wall.

Then, buried right underneath those shiny guardrails, sits the escape hatch.

The Secretary of Defense — currently Pete Hegseth — “may waive the prohibitions… for up to one year” and renew that waiver if “extraordinary circumstances affecting the national security of the United States require the waiver.”

Read that again. The same bill that says “you can’t do this” immediately says “unless this one guy says you can.” That’s not a guardrail. That’s a speed bump with a detour sign.

The waiver covers development, field deployment, system modifications, changes to mission sets, target sets, operational environments, and algorithmic behavior. That’s not a narrow exception — that’s the whole buffet. And Congress? They get a notification after the fact. Some of it might even be classified. So the American people’s elected representatives get a redacted memo while AI-driven systems make life-and-death calls in the field.

No Boundaries. Literally.

And here’s where it gets genuinely unsettling. There is no language in the waiver clause limiting where these systems can be used. Foreign battlefield? Sure. American soil? The bill doesn’t say no. Domestic targets? Silence. The kind of silence that keeps constitutional lawyers up at night.

The only technical safeguard? The system’s error rate can’t exceed that of human operators performing the same job. Cold comfort when you realize we’re comparing a machine’s kill accuracy to a human’s — and calling it good enough.

Follow the Money, Find the Motive

Now let’s talk about the bill’s author. Senator Slotkin isn’t some wide-eyed freshman who stumbled into defense policy. She’s a former CIA analyst, former Department of Defense official, and former Acting Assistant Secretary of Defense for International Security Affairs. She didn’t wander into this space — she built a career in it.

Her donor list reads like a Silicon Valley-Pentagon speed dating roster. According to OpenSecrets data, top contributors include Alphabet Inc at $96,669 and Amazon at $53,771 — both major AI developers and federal cloud contractors. General Motors kicked in $57,081 and Ford added $54,020 — companies deep into autonomous systems with military applications. Kirkland & Ellis ($52,360) and WilmerHale ($81,463) round out the picture — law firms that structure the very defense contracts this bill enables.

The bill opens a door, and every company funding Slotkin’s campaign is already standing in the hallway with a foot in the frame.

Where’s Trump in All This?

Trump has been vocal about AI — pushing to centralize federal oversight and strip states of patchwork regulations. That instinct to consolidate makes sense when you’re trying to prevent fifty different bureaucracies from strangling innovation. But consolidation cuts both ways. When a bill like S.4113 lands on the desk, the question stops being “who controls AI” and becomes “who controls the kill switch.”

Trump’s Pentagon pick, Hegseth, would be the man holding the waiver pen. That’s either reassuring or terrifying depending on who sits in that chair next. And that’s the whole problem — this bill doesn’t expire with an administration. It builds permanent architecture.

The bill has been read twice in the Senate and referred to the Armed Services Committee. It’s moving. Quietly. The way these things always do.

They called it a guardrail. It’s a launch pad. And unless someone in Congress grows a spine and reads past the title, autonomous lethal AI won’t be a sci-fi nightmare — it’ll be Pentagon policy, signed, sealed, and classified before you ever hear about it.
