News: 0179634760

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Microsoft Says AI Can Create 'Zero Day' Threats In Biology (technologyreview.com)

(Friday October 03, 2025 @03:00AM (BeauHD) from the PSA dept.)


An anonymous reader quotes a report from MIT Technology Review:

> A team at Microsoft says it used artificial intelligence to [1]discover a "zero day" vulnerability in the biosecurity systems used to prevent the misuse of DNA. These screening systems are designed to stop people from purchasing genetic sequences that could be used to create deadly toxins or pathogens. But now researchers led by Microsoft's chief scientist, Eric Horvitz, say they have figured out how to bypass the protections in a way previously unknown to defenders. The team [2]described its work today in the journal Science.

>

> Horvitz and his team focused on generative AI algorithms that propose new protein shapes. These types of programs are already fueling the hunt for new drugs at well-funded startups like Generate Biomedicines and Isomorphic Labs, a spinout of Google. The problem is that such systems are potentially "dual use." They can use their training sets to generate both beneficial molecules and harmful ones. Microsoft says it began a "red-teaming" test of AI's dual-use potential in 2023 in order to determine whether "adversarial AI protein design" could help bioterrorists manufacture harmful proteins.

>

> The safeguard that Microsoft attacked is what's known as biosecurity screening software. To manufacture a protein, researchers typically need to order a corresponding DNA sequence from a commercial vendor, which they can then install in a cell. Those vendors use screening software to compare incoming orders with known toxins or pathogens. A close match will set off an alert. To design its attack, Microsoft used several generative protein models (including its own, called EvoDiff) to redesign toxins -- changing their structure in a way that let them slip past screening software but was predicted to keep their deadly function intact.
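The screening mechanism described above can be sketched as a similarity check of incoming orders against a database of known toxin sequences. This is a toy illustration only: the sequences, the threshold, and the k-mer Jaccard metric are all invented here for clarity, and real vendor screening relies on curated databases and alignment tools rather than this simple comparison. It does show why the redesign attack works, though: a functionally similar protein encoded by a sufficiently different sequence shares little with the database entry and is never flagged.

```python
# Toy sketch of sequence-similarity screening (not any vendor's real software).
# Sequences and threshold are hypothetical.

def kmer_set(seq, k=6):
    """Return the set of overlapping k-mers in a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def similarity(a, b, k=6):
    """Jaccard similarity of two sequences' k-mer sets (0.0 to 1.0)."""
    ka, kb = kmer_set(a, k), kmer_set(b, k)
    return len(ka & kb) / len(ka | kb) if ka | kb else 0.0

def screen_order(order, known_toxins, threshold=0.5):
    """Return the names of database entries the order closely matches."""
    return [name for name, seq in known_toxins.items()
            if similarity(order, seq) >= threshold]

# Hypothetical database entry and incoming orders.
TOXINS = {"toxin_A": "ATGGCGTACGTTAGCCTGAACGGTACCAAAGGT"}

# An exact copy of the known sequence is flagged:
exact = screen_order("ATGGCGTACGTTAGCCTGAACGGTACCAAAGGT", TOXINS)
# A heavily redesigned sequence shares almost no k-mers and slips through:
variant = screen_order("ATGAAACCCGGGTTTACGTAGCATGCATGCATG", TOXINS)
```

In this sketch `exact` is flagged as `["toxin_A"]` while `variant` passes clean, which is the gap the Microsoft team exploited: the screening compares sequences, not predicted function.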

"This finding, combined with rapid advances in AI-enabled biological modeling, demonstrates the clear and urgent need for enhanced nucleic acid synthesis screening procedures coupled with a reliable enforcement and verification mechanism," says Dean Ball, a fellow at the Foundation for American Innovation, a think tank in San Francisco.



[1] https://www.technologyreview.com/2025/10/02/1124767/microsoft-says-ai-can-create-zero-day-threats-in-biology/

[2] https://www.science.org/content/article/made-order-bioweapon-ai-designed-toxins-slip-through-safety-checks-used-companies



One word. (Score:3)

by Kelxin ( 3417093 )

Duh.

Bad idea (Score:2)

by Meneth ( 872868 )

It was a bad idea to do this research, and a worse one to publish it. Now that omnicidal evildoers know it's possible, they'll try to replicate it.

Re: (Score:1)

by Anonymous Coward

...or perhaps the manufacturers will start to check the proteins they are building a little more closely, and maybe push them through their own LLM and ask 'is this a toxin?' and build a database of similar sequences/analogs/alternatives to STOP that.

Obviously someone at Microsoft thought this up, you're living in a bubble if you think 'evildoers' are somehow less intelligent than white hats.

AI Mengele (Score:2)

by geekmux ( 1040042 )

If we were ever wondering whether the pursuit of security can actually create insecurities, red-teaming a fucking DNA bank to see if AI Mengele is better than Josef ever was certainly qualifies. Microsoft should take a lesson on keeping shit classified. This is the kind of story we should only read about in a CDC history book years after the fact. And this is why:

> even now, it’s not foolproof, they warn.

Way to put a spotlight on AI Mengele future goals. If we’re so hell-bent on converting AI to AGI, perhaps we should try and r

"Existence of programs that do the impossible is
not a proof that that "impossible" is now possible."

- Tigran Aivazian