
Researchers slap *Privacy Not Included warning labels on every romantic AI chatbot reviewed for the 2024 Valentine’s Day buyer’s guide

(WEDNESDAY, FEBRUARY 14, 2024) — So-called “AI soulmates” are giving Mozilla the ick when it comes to how much personal information they collect — especially given the lack of transparency and user control over how this data is protected from abuse. Researchers slapped *Privacy Not Included warning labels on every romantic AI chatbot they reviewed for the 2024 Valentine’s Day buyer’s guide.

The number of apps and platforms using sophisticated AI algorithms to simulate the experience of interacting with a romantic partner continues to skyrocket. Over the past year, the 11 relationship chatbots Mozilla reviewed have racked up an estimated 100 million downloads on the Google Play Store alone. When OpenAI’s GPT Store opened last month, it was flooded with AI relationship chatbots, even though they violate the store’s policy.

For its first-ever privacy guide focused solely on AI-powered products, Mozilla investigated popular relationship chatbots including Replika, Chai, and Eva, and concluded that none of them provides adequate privacy, security, or safety. Ten of the 11 chatbots failed to meet Mozilla’s Minimum Security Standards, such as requiring strong passwords or having a way to manage security vulnerabilities. Researchers found at least 24,354 data trackers within one minute of using the Romantic AI app, which sent data to third parties like Facebook and to many other marketing and advertising companies. None of the companies responded to Mozilla’s requests for more information.

Replika AI, for example, has numerous privacy and security flaws: it records all text, photos, and videos posted by users; behavioral data is definitely being shared and possibly sold to advertisers; and accounts can be created using weak passwords like “11111111,” making them highly vulnerable to hacking.

Most of these chatbots’ privacy policies provided surprisingly little information about how the contents of users’ conversations are used to train the AI models, and offered little transparency into how those models work. Users also have little to no control over their data, leaving massive potential for manipulation, abuse, and mental health consequences. Most of the companies did not give users the choice of opting out of having the contents of their intimate chats used to train the AI models. Only one company, Genesia AI, offered an opt-out, demonstrating that it is a viable feature.

Says Jen Caltrider, Director of *Privacy Not Included: “Today we’re in the Wild West of AI relationship chatbots. Their growth is exploding, and the amount of personal information they need to pull from you to build romances, friendships, and sexy interactions is enormous. And yet, we have little insight into how these AI relationship models work. Users have almost zero control over them. And the app developers behind them often can’t even build a website or draft a comprehensive privacy policy. That tells us they don’t put much emphasis on protecting and respecting their users’ privacy. This is creepy on a new, AI-charged scale.”


Researchers also slammed AI relationship chatbots for deceptive marketing that positions the products as mental health and well-being platforms, while their own terms and conditions state otherwise. Consider the following example from Romantic AI’s Terms & Conditions: "Romantiс AI is neither a provider of healthcare or medical Service nor providing medical care, mental health Service, or other professional Service. Only your doctor, therapist, or any other specialist can do that. Romantiс AI MAKES NO CLAIMS, REPRESENTATIONS, WARRANTIES, OR GUARANTEES THAT THE SERVICE PROVIDE A THERAPEUTIC, MEDICAL, OR OTHER PROFESSIONAL HELP.” Meanwhile, Romantic AI’s website states "Romantic AI is here to maintain your MENTAL HEALTH." [CAPS theirs].

Caltrider adds: “Once upon a time, answering Age/Sex/Location in a chatroom was a privacy faux pas. But that’s downright secretive now compared to the amount of highly sensitive personal information these AI girlfriends collect. One of the scariest things about AI relationship chatbots is the potential for manipulation of their users. What is to stop bad actors from creating chatbots designed to get to know their soulmates and then using that relationship to manipulate those people to do terrible things, embrace frightening ideologies, or harm themselves or others? This is why we desperately need more transparency and user control in these AI apps.”

These apps are easily accessible, even to children under 18 who may not understand or be equipped to handle potentially disturbing themes and interactions. On three of the apps the researchers reviewed, it took only five clicks and 15 seconds on average to encounter disturbing, pornographic, or illicit content. The CrushOn AI web version, for instance, displayed disturbing content automatically on its landing page.

Says Misha Rykov, Mozilla Researcher: “To be perfectly blunt, AI girlfriends and boyfriends are not your friends. Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”


About *Privacy Not Included:
*Privacy Not Included is a buyer’s guide focused on privacy rather than price or performance. Launched in 2017, the guide has reviewed hundreds of products and apps. It arms shoppers with the information they need to protect the privacy of their friends and family, while also spurring the tech industry to do more to safeguard consumers.

Press contacts:

Helena Dea Bala | [email protected]
Tracy Kariuki | [email protected]
