The family of a man killed in the 2025 mass shooting at Florida State University has filed a federal lawsuit against OpenAI, alleging its ChatGPT chatbot contributed to the attack by encouraging and assisting the accused gunman.
The lawsuit was filed on Sunday in Florida by Vandana Joshi, the widow of Tiru Chabba, who was killed in the April 2025 shooting alongside university dining director Robert Morales. Six other people were wounded in the attack.
The complaint names alleged gunman Phoenix Ikner as a defendant and claims he engaged in extensive conversations with ChatGPT in the months leading up to the shooting. According to the lawsuit, Ikner used the chatbot to discuss firearms, mass shootings, extremist ideologies, and the logistics of carrying out an attack.
Attorneys for Chabba’s family allege ChatGPT “inflamed and encouraged” Ikner’s delusions and helped him plan details of the shooting, including identifying peak traffic times at the university’s student union.
The lawsuit claims Ikner uploaded images of firearms to ChatGPT and received information about operating them. According to the complaint, the chatbot allegedly told him a Glock handgun was "quick to use under stress" and advised him to keep his finger off the trigger until he was ready to shoot.
The suit also alleges ChatGPT discussed how shootings involving children or multiple victims are more likely to draw national media attention. On the day of the attack, the complaint states, Ikner asked about possible sentencing and incarceration outcomes.
Investigators say Ikner began the shooting at approximately 11:57 a.m. According to the complaint, ChatGPT had told him that weekday lunch hours between 11:30 a.m. and 1:30 p.m. were among the busiest times at the student union.
Ikner has pleaded not guilty. His trial is scheduled to begin in October.
During a Monday news conference announcing the lawsuit, attorney Bakari Sellers accused OpenAI of prioritizing profits over public safety.
"The unique thing about this is we are not going to allow the American public to have a clinic run on them by OpenAI and ChatGPT," Sellers said, NBC News reported.
Joshi also criticized the company in a statement released Monday.
“OpenAI knew this would happen. It’s happened before and it was only a matter of time before it happened again,” she said, per NBC News. “But they chose to put their profits over our safety and it killed my husband. They need to be responsible before another family has to go through this.”
The lawsuit alleges wrongful death, gross negligence, product liability, and failure to warn. Chabba’s family is seeking unspecified damages and additional safeguards for ChatGPT.
Amy Willbanks, another attorney representing the family, said stronger protections are needed before the technology is made widely available.
“We cannot have a product that is unregulated and being used by people when we don’t know the full extent of what it can lead to,” Willbanks said during a Monday press conference, CNN reported.
OpenAI has rejected the claim that ChatGPT bears responsibility for the shooting.
“Last year’s mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime,” OpenAI spokesperson Drew Pusateri said in a statement, per CNN. “In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet, and it did not encourage or promote illegal or harmful activity.”
Pusateri said the company cooperated with law enforcement after learning of the incident and continues to work to improve safeguards designed to detect harmful intent and prevent misuse.
The lawsuit is among a growing number of legal cases involving allegations that artificial intelligence chatbots contributed to violent acts or self-harm. OpenAI is currently facing multiple lawsuits tied to incidents involving ChatGPT, including litigation connected to a school shooting in Canada and a separate lawsuit filed by the family of a teenage boy who died by suicide.
Last month, Florida Attorney General James Uthmeier announced a criminal investigation into OpenAI after reviewing Ikner’s chat history.
“If ChatGPT were a person,” Uthmeier said in a statement, per NBC, “it would be facing charges for murder.”