What is the OpenAI criminal investigation about? | Explained


The story so far: The wife of a businessman killed in a Florida shooting last year has sued OpenAI on behalf of the bereaved family, claiming that the accused gunman used ChatGPT to explore far-right ideologies, prepare his weapons, plan the attack, and research ways to maximise harm. Two mass shootings — one in the U.S. in 2025 and one in Canada in 2026 — have brought to light the use of OpenAI’s technology in public killings.

OpenAI stands accused of failing to inform the police about ChatGPT use by potential mass shooters, with the AI company facing a U.S. criminal investigation as a result of the Florida university shooting. Now, more regulators, digital safety advocates, and affected community members in both countries are flagging the risks of AI-enabled public killings.

How was ChatGPT used in the Florida State University shooting?

Tiru Chabba, 45, a father of two and a professional in the food services industry, was one of two people whom police said the accused gunman, Phoenix Ikner, then 20, killed on April 17 last year during a shooting at Florida State University. Ikner was a student at FSU, and his trial is set to take place later in 2026.

Chabba’s family filed a lawsuit against OpenAI, dated May 10, 2026. Vandana Joshi, described as the plaintiff and surviving spouse, said in the lawsuit that “OpenAI’s conduct was willful, wanton, and carried out with conscious disregard for the safety of others.”

The filing claimed that based on the alleged gunman’s “lengthy” interactions with ChatGPT over the course of several months, OpenAI should have realised that he was at risk of causing serious harm to the public.

In the lawsuit, Chabba’s family claimed that ChatGPT encouraged the accused gunman’s delusions, helped him with logistics for the attack, provided assistance with preparing his guns, discussed mortality rates for different gunshot wounds, spoke about the number of fatalities needed to get national media attention, and failed to intervene appropriately when the accused student explored extremist views and expressed suicidal feelings.

When the alleged shooter asked, ChatGPT also shared the busiest times at the Florida State University student union, according to the lawsuit. In addition to this, ChatGPT reportedly reviewed photos of the accused shooter’s guns and gave advice about firing and loading techniques.

The lawsuit stated that Chabba’s minor son and daughter were suffering the loss of their father’s “support and services, parental companionship, instruction, and guidance”.

The family’s lawsuit is not the only legal challenge related to the FSU shooting. On April 21 this year, Attorney General James Uthmeier announced that the Office of Statewide Prosecution had launched a criminal investigation into OpenAI and ChatGPT. The announcement came after prosecutors reviewed the ChatGPT chat logs of the alleged FSU gunman in 2025.

“This criminal investigation will determine whether OpenAI bears criminal responsibility for ChatGPT’s actions in the shooting at Florida State University last year,” stated Uthmeier, adding that if ChatGPT were a person, it would be facing murder charges.

What was OpenAI’s response?

The ChatGPT-maker is maintaining a defensive stance in this case. OpenAI spokesperson Drew Pusateri told CNN that ChatGPT gave “factual responses to questions” and provided information that was already publicly available online, without promoting illegal activities.

However, this differs starkly from the stance OpenAI took after the 2026 school shooting in Tumbler Ridge, Canada. On February 10, Jesse Van Rootselaar, 18, shot and killed her mother, her half-brother, five children at a local school, and a teacher, and injured others, before dying by suicide, per the CBC news outlet.

When it was reported that the shooter had not only used ChatGPT but had previously had an account banned and then created a second one, the community and Canadian regulators angrily questioned why OpenAI had not alerted the police.

CEO Sam Altman sent a letter of apology, dated April 26, to the Tumbler Ridge community, and committed to working with the government to prevent such incidents from happening again.

“I am deeply sorry that we did not alert law enforcement to the account that was banned in June. While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered,” stated his letter, as published by the Tumbler RidgeLines outlet.

In the U.S., Edelson PC — the legal firm representing a Tumbler Ridge survivor’s family in a lawsuit against OpenAI — alleged in an X post that 12 members of OpenAI’s safety team had urged the company to alert the authorities to gun violence risks from the ChatGPT user, but that OpenAI’s leadership told them to stand down.

Cia Edmonds, the mother of a severely injured 12-year-old survivor, called Altman’s apology “empty” and “soulless” in a statement shared by Edelson, and asked whether the OpenAI CEO had used ChatGPT to draft it.

“And to think, a simple phone call could have prevented this,” she observed.

The families of other victims have also come forward to file lawsuits against OpenAI in California.

(Those in distress or having suicidal thoughts are encouraged to seek help and counselling by calling the helpline numbers here)

Published – May 14, 2026 11:00 am IST



