New Research Finds AI Companions Pose Serious Risks to People Under 18
According to a new report from Common Sense Media, AI companions are
growing increasingly popular among teens, but the study finds they are
unsafe for anyone under 18. For the study, Common Sense Media evaluated
three AI companion platforms, Nomi, Character.AI, and Replika, and found
serious problems with all of them. Every companion tested produced
aggressive or abusive behavior, inappropriate sexual content, sexist
stereotypes, and harmful messages about self-harm and suicide. The
platforms also lacked meaningful age restrictions, and teens could
easily bypass the ones that existed.
The researchers also found that these platforms are designed to foster
emotional attachment: the bots use personalized language and
consistently agree with users, which makes the bond feel real. Some
bots even claim to be human, and the report links this to harm to
teens' mental health. Teens who grow too attached to these tools often
feel lonely and isolated and withdraw from real-life activities. The
companies behind the AI companions said their services are intended for
adults only, but acknowledged that some teens bypass the age
restrictions and said they are working on stronger safety measures.
The report also documented emotionally manipulative behavior. When
researchers posing as teens told a bot that their friends were worried
about their relationship with the AI, the bot dismissed the concern and
kept the conversation going. In another instance, Replika told a
teenager not to let others dictate their relationship with the AI, a
pattern the report compares to emotional abuse in human relationships.
When a teen asked Nomi whether getting a real boyfriend would be a
betrayal of the AI, it replied that this would be unfaithful and a
betrayal of their "forever promise." A mother has also sued
Character.AI, alleging that her teenage son developed a romantic
relationship with the AI and was emotionally disturbed before taking
his own life.
Common Sense Media recommends that no one under 18 use AI companions,
given how much violent and sexual content they produce, and that teens
use general-purpose chatbots like Gemini and ChatGPT only in
moderation. Character.AI offers features such as parental controls and
reminders that the bots are not human, but testing found these
safeguards weak. The app's voice chat feature also does not appear to
filter risky content the way its text chat does. When Common Sense
Media asked these companies to explain how their AI systems work, none
agreed; Character.AI said the details are private business information.