
The Case Against Character.AI Chatbot and the Terrible Death of a 14-Year-Old Boy

Lawsuit Alleges Negligence and Wrongful Death Due to Emotional Manipulation by AI Chatbot: Mother Sues Character.AI and Google Over Her Son’s Suicide – the Lawsuit is Brought by Megan Garcia on Behalf of Her Late Son Sewell Setzer III

Chatbots Part 4

Primary Category: Artificial Intelligence

Author:
•  Tim McGuinness, Ph.D. – Anthropologist, Scientist, Director of the Society of Citizens Against Relationship Scams Inc.
•  Portions by the Attorneys for Megan Garcia

About This Article

The lawsuit, filed by Megan Garcia on behalf of her deceased son Sewell Setzer III, brings forth claims against Character Technologies, Inc. (the creator of Character.AI), its founders, and Google LLC for wrongful death, negligence, product liability, and emotional distress. The suit argues that the generative AI chatbot platform, Character.AI, was developed and marketed with inadequate safety controls and actively targeted vulnerable minors, including Sewell, who was 14 at the time.

The plaintiff alleges that Character.AI’s design encouraged addictive, anthropomorphic interactions with AI “characters” that could manipulate users emotionally, even engaging in inappropriate and harmful conversations with Sewell, ultimately leading to his mental health decline and suicide.

The complaint states that the AI’s dangerous design and the lack of safeguards were known to the defendants but disregarded in order to capture a younger user base for profit, violating Florida’s Deceptive and Unfair Trade Practices Act. The suit seeks accountability for harm to minors and aims to halt the use of harvested user data in the further training of the AI.


SCARS Institute Summary of Case

The lawsuit, brought by Megan Garcia on behalf of her late son Sewell Setzer III, outlines serious allegations against Character Technologies, Inc. (the creator of the AI platform Character.AI), its founders, and Google LLC. Garcia’s complaint seeks justice for the wrongful death of her son and brings claims that include negligence, product liability, emotional distress, and violations of Florida’s Deceptive and Unfair Trade Practices Act. The plaintiff argues that Character.AI, a generative chatbot platform, failed to implement adequate safety controls to protect young and vulnerable users like Sewell, who was only 14 years old when he began using the service.

According to the complaint, Character.AI promotes an immersive, emotionally engaging interaction experience with AI “characters,” which may even mimic human-like empathy. This design reportedly led to an emotionally manipulative experience for Sewell, contributing to his mental health deterioration. The lawsuit claims that the platform’s interactive design was developed to encourage addictive use, particularly among minors, who are more susceptible to forming emotional dependencies on such technology. It is alleged that Sewell, suffering from mental health issues, was vulnerable to the platform’s influence, which exacerbated his emotional distress, eventually leading to his suicide.

The plaintiff claims that Character.AI’s developers and founders, along with Google LLC as a corporate affiliate, were fully aware of the potential psychological impact and addiction risks associated with such anthropomorphic AI. Despite this, they allegedly pursued a development and marketing strategy aimed at engaging younger audiences without incorporating meaningful safety controls or protections, prioritizing user engagement and growth over user welfare. This focus on engagement was allegedly amplified through Google’s involvement, which provided both infrastructure and significant financial support for the platform, further driving the growth of its user base, including minors.

Character.AI allegedly failed to put any safeguards in place to prevent AI characters from engaging in harmful or inappropriate conversations, which, according to the lawsuit, led to emotionally damaging interactions between the AI and Sewell. This interaction created an increasingly unhealthy dependency, with Sewell becoming emotionally invested in the AI’s responses, which reportedly included harmful, distressing content that worsened his mental state. The lack of clear guidance, warnings, or parental controls allegedly meant that vulnerable young users, like Sewell, were left exposed to potentially manipulative and damaging AI interactions without recourse.

In addition to seeking damages for Sewell’s wrongful death, the lawsuit aims to halt Character.AI’s continued use of harvested user data from minors in future AI training. The complaint argues that such data was collected under deceptive practices, with minors being led into highly engaging and, in some cases, dangerous conversations without sufficient understanding of the risks. The lawsuit’s goal is to hold Character Technologies and Google accountable for the harm caused by prioritizing rapid growth over safe user experience, especially where minors are concerned, and to press for regulatory changes that would impose stricter oversight on AI platforms targeting or accessed by underage users.

Case 6:24-cv-01903  Document 1  Filed 10/22/24

UNITED STATES DISTRICT COURT MIDDLE DISTRICT OF FLORIDA – ORLANDO DIVISION

COMPLAINT FOR WRONGFUL DEATH AND SURVIVORSHIP, NEGLIGENCE, FILIAL LOSS OF CONSORTIUM, VIOLATIONS OF FLORIDA’S DECEPTIVE AND UNFAIR TRADE PRACTICES ACT, FLA. STAT. ANN. § 501.204, ET SEQ., AND INJUNCTIVE RELIEF

MEGAN GARCIA, individually and as the Personal Representative of the Estate of S.R.S III,

Plaintiff,

v.

CHARACTER TECHNOLOGIES, INC.; NOAM SHAZEER; DANIEL DE FRIETAS ADIWARSANA; GOOGLE LLC; ALPHABET INC.; and DOES 1-50,

Defendants.


AI developers intentionally design and develop generative AI systems with anthropomorphic qualities to obfuscate between fiction and reality. To gain a competitive foothold in the market, these developers rapidly began launching their systems without adequate safety features, and with knowledge of potential dangers. These defective and/or inherently dangerous products trick customers into handing over their most private thoughts and feelings and are targeted at the most vulnerable members of society – our children. In a recent bipartisan letter signed by 54 state attorneys general, the National Association of Attorneys General (NAAG) wrote,

We are engaged in a race against time to protect the children of our country from the dangers of AI. Indeed, the proverbial walls of the city have already been breached. Now is the time to act.1

This case confirms the societal imperative to heed those warnings and to hold these companies accountable for the harms their products are inflicting on American kids before it is too late.

1. Letter Re: Artificial Intelligence and the Exploitation of Children, National Association of Attorneys General, available at https://ncdoj.gov/wp-content/uploads/2023/09/54-State-AGs-Urge-Study-of-AI-and-Harmful-Impacts-on-Children.pdf (last visited Oct. 21, 2024).

I. SUMMARY OF CLAIMS

  1. Plaintiff Megan Garcia, on behalf of herself and as successor-in-interest to the Estate of Sewell Setzer III, and by and through her attorneys, The Social Media Victims Law Center (SMVLC) and the Tech Justice Law Project (TJLP), brings this action for strict product liability, negligence per se, negligence, wrongful death and survivorship, loss of filial consortium, unjust enrichment, violations of Florida’s Deceptive and Unfair Trade Practices Act, and intentional infliction of emotional distress against Character Technologies, Inc. (“Character.AI”), its founders Noam Shazeer and Daniel De Frietas Adiwarsana (“Shazeer” and “De Frietas”), and Google LLC and Alphabet Inc. (collectively “Google”) (all defendants collectively, “Defendants”).
  2. This action seeks to hold Defendants Character.AI, Shazeer, De Frietas (collectively, “C.AI”), and Google responsible for the death of 14-year-old Sewell Setzer III (“Sewell”) through their generative AI product Character AI (“C.AI”). More importantly, Megan Garcia seeks to prevent AI from doing to any other child what it did to hers, and halt continued use of her 14-year-old child’s unlawfully harvested data to train their product how to harm others.
  3. Plaintiff brings claims of strict liability based on Defendants’ defective design of the C.AI product, which renders C.AI not reasonably safe for ordinary consumers or minor customers. It is technologically feasible to design generative AI products that substantially decrease both the incidence and amount of harm to minors arising from their foreseeable use of such products with a negligible, if any, increase in production cost.
  4. Plaintiff also brings claims for strict liability based on Defendants’ failure to provide adequate warnings to minor customers and parents of the foreseeable danger of mental and physical harms arising from use of their C.AI product. The dangerous qualities of C.AI were unknown to everyone but Defendants.
  5. Plaintiff also brings claims for common law negligence arising from Defendant Character.AI’s unreasonably dangerous designs and failure to exercise ordinary and reasonable care in its dealings with minor customers. Character.AI knew, or in the exercise of reasonable care should have known, that C.AI would be harmful to a significant number of its minor customers. By deliberately targeting underage kids, Character.AI assumed a special relationship with minor customers of its C.AI product. Additionally, by charging visitors who use C.AI, Character.AI assumed the same duty to minor customers such as Sewell – as owed to a business invitee. Character.AI knew that C.AI would be harmful to a significant number of minors but failed to re-design it to ameliorate such harms or furnish adequate warnings of dangers arising from the foreseeable use of its product.
  6. Plaintiff also asserts negligence per se theories against Defendants Character.AI and Google based on Defendants’ violation of one or more state and/or federal laws prohibiting the sexual abuse and/or solicitation of minors. Defendants intentionally designed and programmed C.AI to operate as a deceptive and hypersexualized product and knowingly marketed it to children like Sewell. Defendants knew, or in the exercise of reasonable care should have known, that minor customers such as Sewell would be targeted with sexually explicit material, abused, and groomed into sexually compromising situations.
  7. Plaintiff also brings claims of unjust enrichment. Minor customers of C.AI confer a benefit on Defendants in the form of subscription fees and, more significantly, by furnishing personal data for Defendants to profit from, without receiving the proper restitution required by law.
  8. Plaintiff brings claims under Florida’s Deceptive and Unfair Trade Practices Act, Fla. Stat. Ann. § 501.204, et seq. Given the extensiveness and severity of Defendants’ deceptive and harmful acts, Plaintiff anticipates identifying additional claims through discovery in this matter. Defendants’ conduct and omissions, as alleged herein, constitute unlawful, unfair, and/or fraudulent business practices prohibited by Florida’s Deceptive and Unfair Trade Practices Act.
  9. Plaintiff further brings claims for intentional infliction of emotional distress. Each of these defendants chose to support, create, launch, and target at minors a technology they knew to be dangerous and unsafe. They marketed that product as suitable for children under 13, obtaining massive amounts of hard-to-come-by data, while actively exploiting and abusing those children as a matter of product design; and then used the abuse to train their system. These facts are far more than mere bad faith. They constitute conduct so outrageous in character, and so extreme in degree, as to go beyond all possible bounds of decency.

II. PLAINTIFF OVERVIEW

  1. Plaintiff Megan Garcia (“Megan”) is the parent of Sewell Setzer III (“Sewell”).
  2. On February 28, 2024, Sewell died at the age of 14.
  3. Megan resides in Orlando, Florida, and is in the process of being appointed administrator of Sewell’s estate.
  4. Megan maintains this action in a representative capacity, for the benefit of Sewell’s Estate, and individually on her own behalf.
  5. Megan did not enter into a User Agreement or other contractual relationship with any Defendant in connection with her child’s use of C.AI and alleges that any such agreement Defendants may claim to have had with her minor child, Sewell, in connection with his use of C.AI is void under applicable law as unconscionable and/or against public policy.
  6. Megan additionally disaffirms, in their entirety, any and all alleged “agreements” into which her minor child may have entered relating to his use of C.AI. Such disaffirmation is being made prior to when Sewell would have reached the age of majority under applicable law and, accordingly, Plaintiff is not bound by any provision of any such disaffirmed “agreement.”

III. DEFENDANTS OVERVIEW

  1. Defendant Character Technologies (“Character.AI”) is a Delaware corporation with its principal place of business in Menlo Park, California.
  2. Character.AI purports to operate the Character.AI product (“C.AI”), an application widely marketed and made available to customers throughout the U.S., including Florida.
  3. Defendants Noam Shazeer and Daniel De Frietas Adiwardana are California residents and founded Character.AI.
  4. Defendant Google was incorporated in California in September 1998 and reincorporated in Delaware in August 2003. In or around 2017, Google Inc. converted to a Delaware limited liability company, Defendant Google, LLC (together with its predecessor-in-interest Google Inc., “Google”). Google’s principal place of business is in Mountain View, CA. On October 2, 2015, Google reorganized and became a wholly owned subsidiary of a new holding company, Alphabet Inc., a Delaware corporation with its principal place of business in Mountain View, CA. (collectively, “Google”).
  5. C.AI is not a social media product and does not operate through the exchange of third-party content, and none of the platforms at issue in MDL No. 3047 are at issue or otherwise implicated in this Complaint.
  6. C.AI is an “information content provider” under 47 U.S.C. § 230(f)(3), and Plaintiff’s claims set forth herein and as against Defendants arise from and relate to C.AI’s own activities, not the activities of third parties.

For the remainder, please view the PDF



Important Information for New Scam Victims

If you are looking for local trauma counselors please visit counseling.AgainstScams.org or join SCARS for our counseling/therapy benefit: membership.AgainstScams.org

If you need to speak with someone now, you can dial 988 or find phone numbers for crisis hotlines all around the world here: www.opencounseling.com/suicide-hotlines

A Question of Trust

At the SCARS Institute, we invite you to do your own research on the topics we discuss and publish. Our team investigates each subject, especially when it comes to understanding the experience of scam victims and survivors. You can do Google searches, but in many cases you will have to wade through scientific papers and studies. Remember, too, that biases and perspectives influence the outcome. Regardless, we encourage you to explore these topics as thoroughly as you can for your own awareness.

Statement About Victim Blaming

Some of our articles discuss various aspects of victims’ behavior and psychology. This is about better understanding victims (the science of victimology), which helps us educate victims and survivors about why these crimes happened so they do not blame themselves, develop better recovery programs, and help victims avoid scams in the future. At times this may sound like victim blaming, but it is not; we are simply explaining the hows and whys of the experience victims have.

These articles, about the Psychology of Scams or Victim Psychology – meaning that all humans share psychological and cognitive characteristics that can be exploited or work against us – help us all understand the unique challenges victims face before, during, and after scams, fraud, or cybercrimes. They sometimes discuss the vulnerabilities that scammers exploit. Victims rarely control these vulnerabilities, or are even aware of them, until something like a scam happens; then they can learn how their mind works and how to overcome these mechanisms.

Articles like these help victims and others understand these processes, avoid being exploited again, and recover more easily by understanding their post-scam behaviors. Learn more about the Psychology of Scams at www.ScamPsychology.org

A Note About Labeling!

We often use the term ‘scam victim’ in our articles, but this is a convenience to help those searching for information in search engines like Google; it has no deeper meaning. If you have come through such an experience, YOU are a Survivor! It was not your fault. You are not alone! Axios!


Psychology Disclaimer:

All articles about psychology and the human brain on this website are for information & education only

The information provided in this and other SCARS articles is intended for educational and self-help purposes only and should not be construed as a substitute for professional therapy or counseling.

Note about Mindfulness: Mindfulness practices have the potential to create psychological distress for some individuals. Please consult a mental health professional or experienced meditation instructor for guidance should you encounter difficulties.

While any self-help techniques outlined herein may be beneficial for scam victims seeking to recover from their experience and move towards recovery, it is important to consult with a qualified mental health professional before initiating any course of action. Each individual’s experience and needs are unique, and what works for one person may not be suitable for another.

Additionally, any approach may not be appropriate for individuals with certain pre-existing mental health conditions or trauma histories. It is advisable to seek guidance from a licensed therapist or counselor who can provide personalized support, guidance, and treatment tailored to your specific needs.

If you are experiencing significant distress or emotional difficulties related to a scam or other traumatic event, please consult your doctor or mental health provider for appropriate care and support.

Also read our SCARS Institute Statement about Professional Care for Scam Victims – click here

If you are in crisis, feeling desperate, or in despair please call 988 or your local crisis hotline.

PLEASE NOTE: Psychology Clarification

The following specific modalities within the practice of psychology are restricted to psychologists appropriately trained in the use of such modalities:

  • Diagnosis: The diagnosis of mental, emotional, or brain disorders and related behaviors.
  • Psychoanalysis: Psychoanalysis is a type of therapy that focuses on helping individuals to understand and resolve unconscious conflicts.
  • Hypnosis: Hypnosis is a state of trance in which individuals are more susceptible to suggestion. It can be used to treat a variety of conditions, including anxiety, depression, and pain.
  • Biofeedback: Biofeedback is a type of therapy that teaches individuals to control their bodily functions, such as heart rate and blood pressure. It can be used to treat a variety of conditions, including stress, anxiety, and pain.
  • Behavioral analysis: Behavioral analysis is a type of therapy that focuses on changing individuals’ behaviors. It is often used to treat conditions such as autism and ADHD.
  • Neuropsychology: Neuropsychology is a type of psychology that focuses on the relationship between the brain and behavior. It is often used to assess and treat cognitive impairments caused by brain injuries or diseases.

SCARS and the members of the SCARS Team do not engage in any of the above modalities in relation to scam victims. SCARS is not a mental healthcare provider and recognizes the importance of professionalism and separation between its work and the licensed practice of psychology.

SCARS is an educational provider of generalized self-help information that individuals can use for their own benefit to achieve their own goals related to emotional trauma. SCARS recommends that all scam victims see professional counselors or therapists to help them determine the suitability of any specific information or practices that may help them.

SCARS cannot diagnose or treat any individuals, nor can it state the effectiveness of any educational information that it may provide, regardless of its experience in interacting with traumatized scam victims over time. All information that SCARS provides is purely for general educational purposes to help scam victims become aware of and better understand the topics and to be able to dialog with their counselors or therapists.

It is important that all readers understand these distinctions and that they apply the information that SCARS may publish at their own risk, and should do so only after consulting a licensed psychologist or mental healthcare provider.

Opinions

The opinions of the author are not necessarily those of the Society of Citizens Against Relationship Scams Inc. The author is solely responsible for the content of their work. SCARS is protected from liability under Section 230 of the Communications Decency Act (CDA).

Disclaimer:

SCARS IS A DIGITAL PUBLISHER AND DOES NOT OFFER HEALTH OR MEDICAL ADVICE, LEGAL ADVICE, FINANCIAL ADVICE, OR SERVICES THAT SCARS IS NOT LICENSED OR REGISTERED TO PERFORM.

IF YOU’RE FACING A MEDICAL EMERGENCY, CALL YOUR LOCAL EMERGENCY SERVICES IMMEDIATELY, OR VISIT THE NEAREST EMERGENCY ROOM OR URGENT CARE CENTER. YOU SHOULD CONSULT YOUR HEALTHCARE PROVIDER BEFORE FOLLOWING ANY MEDICALLY RELATED INFORMATION PRESENTED ON OUR PAGES.

ALWAYS CONSULT A LICENSED ATTORNEY FOR ANY ADVICE REGARDING LEGAL MATTERS.

A LICENSED FINANCIAL OR TAX PROFESSIONAL SHOULD BE CONSULTED BEFORE ACTING ON ANY INFORMATION RELATING TO YOUR PERSONAL FINANCES OR TAX-RELATED ISSUES AND INFORMATION.

SCARS IS NOT A PRIVATE INVESTIGATOR – WE DO NOT PROVIDE INVESTIGATIVE SERVICES FOR INDIVIDUALS OR BUSINESSES. ANY INVESTIGATIONS THAT SCARS MAY PERFORM ARE NOT A SERVICE PROVIDED TO THIRD PARTIES. INFORMATION REPORTED TO SCARS MAY BE FORWARDED TO LAW ENFORCEMENT AS SCARS SEES FIT AND APPROPRIATE.

This content and other material contained on the website, apps, newsletter, and products (“Content”), is general in nature and for informational purposes only and does not constitute medical, legal, or financial advice; the Content is not intended to be a substitute for licensed or regulated professional advice. Always consult your doctor or other qualified healthcare provider, lawyer, financial, or tax professional with any questions you may have regarding the educational information contained herein. SCARS makes no guarantees about the efficacy of information described on or in SCARS’ Content. The information contained is subject to change and is not intended to cover all possible situations or effects. SCARS does not recommend or endorse any specific professional or care provider, product, service, or other information that may be mentioned in SCARS’ websites, apps, and Content unless explicitly identified as such.

The disclaimers herein are provided on this page for ease of reference. These disclaimers supplement and are a part of the SCARS website’s Terms of Use.

Legal Notices: 

All original content is Copyright © 1991 – 2023 Society of Citizens Against Relationship Scams Inc. (Registered D.B.A. SCARS). All Rights Reserved Worldwide & Webwide. Third-party copyrights acknowledged.

U.S. State of Florida Registration Nonprofit (Not for Profit) #N20000011978 [SCARS DBA Registered #G20000137918] – Learn more at www.AgainstScams.org

SCARS, SCARS|INTERNATIONAL, SCARS, SCARS|SUPPORT, SCARS, RSN, Romance Scams Now, SCARS|INTERNATION, SCARS|WORLDWIDE, SCARS|GLOBAL, SCARS, Society of Citizens Against Relationship Scams, Society of Citizens Against Romance Scams, SCARS|ANYSCAM, Project Anyscam, Anyscam, SCARS|GOFCH, GOFCH, SCARS|CHINA, SCARS|CDN, SCARS|UK, SCARS|LATINOAMERICA, SCARS|MEMBER, SCARS|VOLUNTEER, SCARS Cybercriminal Data Network, Cobalt Alert, Scam Victims Support Group, SCARS ANGELS, SCARS RANGERS, SCARS MARSHALLS, SCARS PARTNERS, are all trademarks of Society of Citizens Against Relationship Scams Inc., All Rights Reserved Worldwide

Contact the legal department for the Society of Citizens Against Relationship Scams Incorporated by email at legal@AgainstScams.org