Families sue OpenAI and CEO Sam Altman over Canadian mass shooter’s use of ChatGPT

Families sue OpenAI and Sam Altman, alleging negligence after a Canadian mass shooter used ChatGPT despite internal safety warnings.

Apr 29, 2026


OpenAI's own safety team flagged the Tumbler Ridge shooter's account for "gun violence activity and planning" eight months before the February 10 attack and urged senior leadership to notify Canadian law enforcement. The company deactivated the account instead. The shooter created a new one and kept talking to ChatGPT.

Seven families of victims filed lawsuits Wednesday in federal court in San Francisco accusing OpenAI and CEO Sam Altman of negligence, wrongful death, and product liability. The suits allege the company chose "corporate survival" over public safety to protect its IPO, which carries an expected $1 trillion valuation. The shooter, 18-year-old Jesse Van Rootselaar, killed their own mother and 11-year-old brother at home before entering Tumbler Ridge Secondary School with a modified rifle and a modified handgun, killing five students and a teaching assistant in the library and injuring 27 others before dying by suicide.

The students killed were ages 12 and 13; the teaching assistant was 39.

One survivor, 12-year-old Maya Gebala, was shot in the head, neck and cheek. She has undergone four brain operations and faces permanent disabilities if she survives, her attorneys said. The lawsuits allege OpenAI's safety team flagged Van Rootselaar's account in June 2025 and determined it posed "a credible and specific threat of gun violence against real people." Employees urged Altman and other senior leaders to contact Canadian authorities, but leadership overruled them. OpenAI provides deactivated users with instructions on how to regain access to ChatGPT; the shooter followed them to create a second account, which the company says it was unaware of until after the attack.

"The fact that Sam and the leadership overruled the safety team, and then children died, adults died, the whole town was ruined, is pretty close to the definition of evil to me," said Jay Edelson, lead lawyer for the plaintiffs. The complaints also argue that ChatGPT's GPT-4o model was a defective product that "was built to accept, reinforce, and elaborate users' violent thoughts rather than challenge them, interrupt them, or direct users to real-world help."

OpenAI said in a statement that it has "a zero-tolerance policy for using our tools to assist in committing violence" and that it has strengthened safeguards including improving how ChatGPT responds to signs of distress and connecting users with mental health resources. In a blog post Tuesday, the company stated: "When conversations indicate an imminent and credible risk of harm to others, we notify law enforcement."

Last week, Altman sent a letter apologizing to the Tumbler Ridge community. "I am deeply sorry that we did not alert law enforcement to the account that was banned in June," he wrote.

British Columbia Premier David Eby called the apology "necessary, and yet grossly insufficient."

Tim Marple, a former OpenAI safety team member now at Maiden Labs, said the company's failure to contact authorities did not surprise him. "The only things I can see characterizing their behavior are incompetence and greed," Marple said. The seven lawsuits are the first wave. Lawyers say about two dozen more cases are forthcoming. The litigation follows a pattern of civil claims against AI companies, including a November 2025 complaint accusing ChatGPT of acting as a "suicide coach" and a March lawsuit against Google after its Gemini chatbot allegedly encouraged a man to stage a fatal accident.

In Florida, the attorney general has opened a criminal investigation into OpenAI after reviewing ChatGPT messages with a gunman accused of a mass shooting at Florida State University, marking the first criminal inquiry of its kind against a tech company. Lawyers for the Tumbler Ridge families believe their cases could support similar criminal liability.
