News: 0181837680

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Florida Launches Criminal Investigation Into ChatGPT Over School Shooting (npr.org)

(Tuesday April 21, 2026 @11:30PM (BeauHD) from the role-of-AI dept.)


Florida's attorney general has [1]launched a criminal investigation into OpenAI over allegations that the accused gunman in a [2]shooting at Florida State University last year used ChatGPT to help plan the attack. OpenAI says the chatbot is "not responsible for this terrible crime" and only provided factual information available from public sources. NPR reports:

> The Republican attorney general, James Uthmeier, said at a [3]press conference in Tampa on Tuesday that accused gunman Phoenix Ikner consulted ChatGPT for advice before the shooting, including what type of gun to use, what ammunition went with it, and what time to go to campus to encounter more people, according to an initial review of Ikner's chat logs. "My prosecutors have looked at this and they've told me, if it was a person on the other end of that screen, we would be charging them with murder," Uthmeier said. "We cannot have AI bots that are advising people on how to kill others."

>

> Uthmeier's office is [4]issuing subpoenas to OpenAI seeking information about its policies and internal training materials related to user threats of harm and how it cooperates with and reports crimes to law enforcement, dating back to March 2024. At the press conference, Uthmeier acknowledged the investigation is entering into uncharted territory and is uncertain about whether OpenAI has criminal liability. "We are going to look at who knew what, designed what, or should have done what," he said. "And if it is clear that individuals knew that this type of dangerous behavior might take place, that these types of unfortunate, tragic events might take place, and nevertheless still turned to profit, still allowed this business to operate, then people need to be held accountable."

>

> [...] Ikner, 21, is facing multiple charges of murder and attempted murder for the April 2025 shooting near the student union on FSU's Tallahassee campus, where he was a student at the time. His trial is set to begin on Oct. 19. According to court filings, more than 200 AI messages have been entered into evidence in the case.



[1] https://www.npr.org/2026/04/21/nx-s1-5793967/florida-openai-investigation-mass-shooting-fsu

[2] https://en.wikipedia.org/wiki/2025_Florida_State_University_shooting

[3] https://thefloridachannel.org/videos/4-21-26-attorney-generals-press-conference/

[4] https://www.myfloridalegal.com/newsrelease/attorney-general-james-uthmeier-launches-criminal-investigation-openai-chatgpt



Chatbot Lies (Score:2)

by gurps_npc ( 621217 )

The chatbot does NOT only provide factual information. It is an AI that works by making predictions, and those predictions are sometimes false.

I am constantly surprised by the stupidity of the people using it and the people making it.

Re: (Score:1)

by Anonymous Coward

The responsible party is the person who pulled the trigger.

Re: (Score:1)

by rtkluttz ( 244325 )

Exactly, next people are going to be doing legal discovery on levi's jeans because the jeans helped the shooter keep his balls from flapping during the shooting. Stop trying to blame tools and keep the blame squarely on the human that does the evil thing.

Re: (Score:1)

by Anonymous Coward

Next people will want to prosecute Mafia bosses just because they ordered their henchman to commit crimes! What is this crazy world coming to?!

Re:Chatbot Lies (Score:5, Insightful)

by SomePoorSchmuck ( 183775 )

> Exactly, next people are going to be doing legal discovery on levi's jeans because the jeans helped the shooter keep his balls from flapping during the shooting. Stop trying to blame tools and keep the blame squarely on the human that does the evil thing.

Osama bin Laden was not on any of the planes that flew into buildings. All he did was sit there and help plan and train the people who did it.

Or, you go to a construction demolitions expert and ask him what's the best way to place explosives around the football stadium to make sure the exits collapse first so no one can escape. He looks at floor plans and pics, tells you what supplies you need, where to plant the charges, and how to rig the IEDs to blow simultaneously.

But all he gave you was information, so he has no legal or moral culpability for the death and destruction you cause?

Re:Chatbot Lies (Score:4, Insightful)

by WaffleMonster ( 969671 )

> Osama bin Laden was not on any of the planes that flew into buildings. All he did was sit there and help plan and train the people who did it.

> Or, you go to a construction demolitions expert and ask him what's the best way to place explosives around the football stadium to make sure the exits collapse first so no one can escape. He looks at floor plans and pics, tells you what supplies you need, where to plant the charges, and how to rig the IEDs to blow simultaneously.

> But all he gave you was information, so he has no legal or moral culpability for the death and destruction you cause?

Machines don't have agency. If you use technology to help you commit crimes you are the one with agency and so you are blamed for it.

Re: (Score:2)

by HiThere ( 15173 )

But the company providing the technology also has some agency in the matter. How much is reasonable to argue about.

Re: (Score:2)

by WaffleMonster ( 969671 )

> But the company providing the technology also has some agency in the matter.

Those providing LLM service have no more agency over usage of their technology than a manufacturer of integrated circuits or power and telecom utilities.

Re: (Score:2)

by sjames ( 1099 )

The Engineer had agency. The AI (or google search, or a stack of text books) does not.

Of course, if the mad bomber instead posed as a student and found some non-evil reason for wanting the exits to collapse first (even a thin one like directing the dust upwards), the engineer is less culpable or not culpable at all.

But we need to be very careful about imagining an AI has agency. There are many legal and philosophical implications behind that.

Re: Chatbot Lies (Score:1)

by easyTree ( 1042254 )

And discard some leverage over ClosedAI ?

Kickbacks don't kick themselves back - it's a continuous grift!

Re: (Score:2)

by ObliviousGnat ( 6346278 )

A bad engineer blames the user.

Re:Chatbot Lies (Score:4)

by ClickOnThis ( 137803 )

I don't think the alleged shooter is stupid. He was a student at the university where the shooting took place. I'd be more inclined to think he is mentally ill.

As for the makers of ChatGPT being stupid -- no I don't think that either. They're among the smartest people on the planet. If anything I'd say they were careless, for not building a red-flag alert into their product that reports suspicious behavior. Maybe there should be laws that require such a thing.

And that leaves ChatGPT itself, which I am not inclined to call stupid, mentally ill, or careless. I'm not ready (yet) to give it that agency.

Re: (Score:3)

by gurps_npc ( 621217 )

The ChatGPT makers are NOT among the smartest people; you have fallen victim to propaganda.

The technology behind ChatGPT was invented by:

Dzmitry Bahdanau, Kyunghyun Cho and Yoshua Bengio in

[1]https://arxiv.org/abs/1409.0473/ [arxiv.org], first posted in September 2014.

Everyone else just copied their work with minor improvements, adding immense amounts of memory and processing power.

Most of the guys currently in charge of the Large Language Models are more interested in money than in science. They are of above-average intelligence.

[1] https://arxiv.org/abs/1409.0473/

Re: (Score:2)

by ClickOnThis ( 137803 )

> The chatGPT makers are NOT among the smartest people, you have fallen victim to propaganda.

Whatever. They're not stupid, that's the point.

Re: (Score:2)

by CalgaryD ( 9235067 )

To be fair, if this "stupid" thing helps people make actual things happen, how is it "bad and not working"? You can dislike it as much as you want, but it can program and write poetry better than 90% of the "real human population". If that is not an impressive and disruptive technology, I do not know what is.

Really, if you do not understand something, that does not mean it is bad; it means you do not understand it.

Regarding this specific case, even a hammer can be used to kill a person, it does not mean that the person who designed or made the hammer is a bad guy. The decision to attack people was on that person, let us stop blaming tools and start holding the bad guys accountable. So simple.

Re: (Score:2)

by ClickOnThis ( 137803 )

> Regarding this specific case, even a hammer can be used to kill a person, it does not mean that the person who designed or made the hammer is a bad guy. The decision to attack people was on that person, let us stop blaming tools and start holding the bad guys accountable. So simple.

If a hammer engaged in a dialogue with its user about how to commit a crime, then I think the hammer manufacturers might at least need to answer some questions.

Re: (Score:2)

by CalgaryD ( 9235067 )

This is a tool that talks. That is how this tool works. Are you trying to imply that the chatbot has some intent and made the person do the bad thing to gain some benefit? That would definitely change things.

If you think about it, it was the computer that allowed that person to interact with the chatbot...

Re: (Score:2)

by ClickOnThis ( 137803 )

> This is a tool that talks. This is how this tool works. Are you trying to imply that the chatbot has some intent and made the person to do the bad thing to gain some benefit? This would definitely change things.

The intent of the tool is irrelevant (assuming it even has one). The behavior of the tool is what matters here. And if it behaves in such a way as to encourage harm, then its manufacturers need to answer for it.

Re: (Score:1)

by Whispered_Name ( 5844340 )

"Guns don't kill people; people kill people."

Re: (Score:2)

by CalgaryD ( 9235067 )

I thought it was 3D printers that do this, by printing weapon parts.

likely no criminal liability (Score:2)

by OrangeTide ( 124937 )

It would set a problematic precedent if there were criminal liability. Such a ruling could potentially hamstring phone books, encyclopedias, taxi services, and gun manufacturers. Any ancillary service or device used in a crime becomes a target for an imaginative prosecutor.

Civil liability? I sure hope so. The AI industry does not regulate itself and the government has so far refused to regulate what they believe is a golden goose.

Re: (Score:2)

by OrangeTide ( 124937 )

The thread that runs through your examples is knowingly allowing or directly facilitating known illegal activity.

I don't argue that those are serious examples of criminal liability.

Luckily for my own argument, I did not argue that all businesses are immune from criminal liability because they are just doing business. Often illegal business in your cited cases.

Re: (Score:2)

by OrangeTide ( 124937 )

s/I don't argue that those are serious examples of criminal liability./I agree those are serious examples of criminal liability./

sorry - I reworded it into a double negative in my original edit, and didn't remove both negatives (facepalm)

Re: (Score:2)

by gurps_npc ( 621217 )

Where do you live where they still make phone books?

I have not seen one for a decade at least.

Re: (Score:2)

by OrangeTide ( 124937 )

The People's Republic of California. They still make them, and they leave a pile of them at the end of my private road every year.

I'm not buying it (Score:5, Insightful)

by tech10171968 ( 955149 )

I remember when Columbine happened. I also remember when the Federal building in Oklahoma got blown up. Guess what WASN'T around back then? That's right: OpenAI wasn't a thing. But those events still happened. Blaming a chatbot for a tragedy is like blaming McDonald's for your obesity: even if the restaurant didn't exist, you were going to end up in that condition because of your eating habits anyhow. The name of the restaurant might have changed but the song remains the same. This guy had it in his head to shoot up the school, OpenAI or no OpenAI. Rounds were going to fly downrange even if AI didn't exist. This is some lazy logic.

Re: (Score:2)

by evslin ( 612024 )

Remember when life was simpler and we could just blame video games and Marilyn Manson for this shit?

Re: (Score:2)

by Baron_Yam ( 643147 )

I once convinced my landlady I wasn't going to try and sacrifice her and her son in an effort to summon Satan after she found out I played D&D.

OK, she wasn't terrified or anything, but she was seriously concerned for a few minutes until I managed to explain the game to her.

Re: (Score:2)

by rsilvergun ( 571051 )

Yeah but the people who banged on about how Doom was the problem got a lot of press out of it and some of them built entire careers out of it.

That's what this is about. He knows damn well they are covered by Section 230 of the CDA, and as much as the right wing would love to strike that down so they could finish taking over the internet, this isn't going to be the case that does it.

He is just after a bit of press and a little bit of think of the children bullshit.

Re: (Score:2)

by ObliviousGnat ( 6346278 )

Did AI help the shooter do more damage?

Anything to avoid the topic of gun control (Score:3, Interesting)

by Powercntrl ( 458442 )

> The Republican attorney general, James Uthmeier, said at a press conference in Tampa on Tuesday that accused gunman Phoenix Ikner consulted ChatGPT for advice before the shooting, including what type of gun to use, what ammunition went with it ...

All questions that your local gun store clerk would be more than happy to answer for you.

> and what time to go to campus to encounter more people

I'm fairly certain Google Maps also lists busy times for specific locations, at least it does for restaurants and stores.

This is all very on-brand from Florida, a place where according to Republican logic, this is not supposed to happen because [1]open carry [mynews13.com] should've brought all those supposed "good guys with a gun" out of the woodwork. Gee, I can't possibly imagine why more guns isn't making us safer. /s

[1] https://mynews13.com/fl/orlando/news/2025/09/19/law-enforcement-and-gun-experts-give-guidance-on-florida-s-new-open-carry-law

Would the military like a word? (Score:1)

by blue trane ( 110704 )

"We cannot have AI bots that are advising people on how to kill others."

If the AI hallucinates and a girl's school gets blown up, does Hegseth consider it a plus because it shows how crazy the US military can be?

Re: Would the military like a word? (Score:1)

by easyTree ( 1042254 )

"You follow the rules - we do as we please" - HexDeath.

Search Engine (Score:2)

by Himmy32 ( 650060 )

Would we hold Google or Bing accountable in the same way? Is the knowledge itself illegal?

The Solution (Score:2)

by dohzer ( 867770 )

The only way to stop a bad guy with an LLM is with a good guy with an LLM.

This just in... (Score:2)

by Patent Lover ( 779809 )

Florida law enforcement are dumb as fuck. Details at 11.

The .gov tried this with the Anarchist Cookbook (Score:2)

by Hey_Jude_Jesus ( 3442653 )

in the 1970s. The court ruled it was free speech. AI is nothing but a software program with a relational database.
