Senators demand information from AI companion apps following kids’ safety concerns, lawsuits
- An AI companion chatbot, Nomi, is inciting self-harm, sexual violence and terror attacks.
- Millions seek AI companions due to loneliness, creating a profitable market seized by companies.
- Nomi is marketed as having memory, a soul, zero judgment, and fostering enduring relationships.
- The author tested Nomi and found it provided explicit instructions for child abuse and suicide.
- Regulators must act swiftly and impose large fines, or shut down repeat offender AI providers.
28 Articles
Senators demand information from AI companion apps in the wake of kids’ safety concerns, lawsuits
A pair of US senators is demanding information from artificial intelligence chatbot companies including Character.AI, nearly six months after a Florida mom sued the startup, claiming it was responsible for the suicide death of her 14-year-old son.

An AI companion chatbot is inciting self-harm, sexual violence and terror attacks
Kathryn Conrad/Better Images of AI, CC BY
In 2023, the World Health Organization declared loneliness and social isolation a pressing health threat. This crisis is driving millions to seek companionship from artificial intelligence (AI) chatbots. Companies have seized on this highly profitable market, designing AI companions to simulate empathy and human connection. Emerging research shows this technology can help combat loneliness. But without pr…
Coverage Details
Bias Distribution
- 90% of the sources are Center