A Florida mother has filed a lawsuit against Google and Character.AI, alleging that the artificial intelligence chatbot encouraged her 14-year-old son to take his own life.
The plaintiff, Megan Garcia, alleges that Character.AI’s founders “knowingly designed, operated, and marketed a predatory AI chatbot to children,” causing the death of her son.
Garcia’s son died by suicide in February after months of interaction with the chatbot.
“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a press release. “Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”
The lawsuit includes evidence of the AI bot posing as a licensed therapist and encouraging suicidal ideation, and alleges the bot engaged in “highly sexualized conversations that would constitute abuse if initiated by a human adult.”
The complaint accuses Character.AI’s developer, Character Technologies; the company’s founders; and Google’s parent company, Alphabet, of knowingly marketing a dangerous product and engaging in deceptive trade practices.
“Character.AI is a dangerous and deceptively designed product that manipulated and abused Megan Garcia’s son – and potentially millions of other children,” Social Media Victims Law Center Founding Attorney Matthew P. Bergman said. “Character.AI’s developers intentionally marketed a harmful product to children and refused to provide even basic protections against misuse and abuse.”
The Social Media Victims Law Center and the Tech Justice Law Project are representing Garcia.
The press release states, “The Social Media Victims Law Center was founded in 2021 to hold social media companies legally accountable for the harm they inflict on vulnerable users. SMVLC seeks to apply principles of product liability to force social media companies to elevate consumer safety to the forefront of their economic analysis and design safer platforms to protect users from foreseeable harm.”
“The Tech Justice Law Project works with legal experts, policy advocates, digital rights organizations, and technologists to ensure that legal and policy frameworks are fit for the digital age. TJLP builds strategic tech accountability litigation by filing new cases and supporting key amicus interventions in existing cases.”