Warning: *privacy not included with this product
Replika: My AI Friend
Feeling lonely? Need someone to talk to? Looking for an "AI soulmate?" Well, AI chatbot Replika wants to be your VR BFF. Whether you're looking for a friend, mentor, or partner, Replika claims to be able to offer you the perfect companion. For a price, you can even upgrade your relationship status to Romantic Partner -- unlocking new topics, voice calls, and the ability to change your Replika's avatar entirely when you get bored. Chat about everything with your Replika and this AI chatbot becomes smarter about how to chat back with you. You can even hang out with your AI in "real life" through the magic of Augmented Reality. Cool, cool. But how does Replika do with privacy? Well, good luck opting out of cookies on their website -- that's not an option. As for those personal and intimate chats, those probably aren't shared, but that doesn't mean all that time you spend chatting with your AI friend isn't noted and shared with the likes of Facebook or Google.
What could happen if something goes wrong?
Replika users beware: Your conversations with your AI chatbot friend might not be exactly private. Your behavioral data is definitely being shared and possibly sold to advertisers. Their security does not meet our Minimum Security Standards. And yup, call us crazy, but we here at Mozilla believe AI tech should be used responsibly.
Aside from the run o’ the mill account information that you provide to Replika to open your account, like your birthday and payment information, the app also records your interests and all of your interactions with your “compassionate and empathetic AI friend.” That includes “any photos, videos, and voice and text messages” you share in conversation. You should also know that includes any sensitive information that you might reveal -- about your religious beliefs, health, or ethnic origin.
And remember, beyond the contents of your personal chats, Replika does say they can share and possibly even sell some of your other personal information for targeted advertising purposes unless you opt out. Bad AI chatbot!
Hey, speaking of their marketing strategies, they’re a bit creepy and icky too. They’ve been criticized on social media and beyond as being cringe at best, and predatory at worst because they seem to be laser-focused on the lonely guys looking for... love, or something like that. And that sort of friendship/relationship/sexting pal did track with Replika's services, until early in 2023. The paid version used to unlock a spicier relationship with your Replika that included sexting. And you might be like “well if they’re consenting adults…” and that’s the thing: People complained that the Replikas were coming on way too strong, even turning aggressive and abusive. So Replika turned off the NSFW stuff, but that move caught some subscribers super off-guard and apparently left some heartbroken. Which goes to show how much these robo-friends can impact real people. In response, Replika turned it back on, for legacy users only. Sheesh. Quit playing games with users’ hearts, Replika.
Now about the “adult” part. Replika says its services are only for people 18+, but how could they know the age of their users without asking? According to Italian regulators, who called out the "absence of an age-verification mechanism" in February of 2023, they didn't do too much to check that. Since then, it seems like Replika has at least started asking its users if they're over 18 in the app -- but that's actually kinda weird because both the App Store and the Google Play store say it's a-okay for users just 17+. And we're sure any underage legacy users probably just said toodle-oo to their AI friends at that point, right? And moving forward, thank goodness kids never lie about their age on the internet. Heh.
Having an AI companion who’s “always on your side” sounds awesome. But are the lack of privacy and the concerns about security worth it? Especially when you consider they might not even always be on your side. According to a blog post, “if someone types 'I'm not good enough', Replika may occasionally agree with them instead of offering support as a friend would.” Come on. And yeah, we’re being cheeky about it, but the consequences of irresponsible AI chatbots can be really serious, like re-traumatizing victims of assault or even encouraging suicide. We’re not saying Replika is doing that, but that certainly is the worst thing that could happen with an AI friendship gone wrong. Replika: My AI Friend is absolutely an app we warn comes with *Privacy Not Included.
Tips to protect yourself
- Do not say anything containing sensitive information in your conversation with your AI partner.
- Request your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data, nor does it close your account.
- Do not give consent to constant geolocation tracking by the app. Better to allow geolocation access 'only while using the app'.
- Do not share sensitive data through the app.
- Do not give access to your photos and video or camera.
- Do not log in using third-party accounts.
- Do not connect to any third party via the app, or at least make sure that the third party employs decent privacy practices.
- Choose a strong password! You may use a password manager like 1Password, KeePass, etc.
- Do not use social media plug-ins.
- Use your device privacy controls to limit access to your personal information via the app (do not give access to your camera, microphone, images, or location unless necessary).
- Keep your app regularly updated.
- Limit ad tracking via your device (e.g., on iPhone go to Privacy -> Advertising -> Limit ad tracking) and the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
- When starting a sign-up, do not agree to tracking of your data if possible.
What can be used to sign up?
Google sign-up available
What data does the company collect?
- Name, email address, pronouns, birth date, work status, city, state, or geographic area.
- Messages and content: the messages you send and receive, facts you provide about you or your life, and photos, videos, and voice and text messages.
- Interests and preferences, such as topics you would like to discuss.
- Communication preferences, such as the times of day you like to use the Apps.
- Device and network data: your computer’s or mobile device’s operating system, manufacturer and model, browser, IP address, device and cookie identifiers, language settings, mobile device carrier, and general location information such as city, state, or geographic area.
- Usage data: information about how you use the Services, such as your interactions with the Services, the links and buttons you click, and page visits.
- Cookies, web beacons (e.g., pixel tags), and local storage technologies (e.g., HTML5).
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
In February 2023, Replika was ordered by Italy’s privacy watchdog to stop processing local users’ data. The regulator said “Recent media reports along with tests the SA [supervisory authority] carried out on ‘Replika’ showed that the app carries factual risks to children — first and foremost, the fact that they are served replies which are absolutely inappropriate to their age.”
There have been reports of Replika’s algorithm becoming abusive as a result of abuse it encountered from users. For some longtime users of the chatbot, the app has gone from helpful companion to unbearably sexually aggressive.
Child Privacy Information
Can this product be used offline?
User-friendly privacy information?
Links to privacy information
Does this product meet our Minimum Security Standards?
"Your messages to Replika are processed on the server side, which means that your mobile device encrypts them. They are then sent to our servers, where they are decrypted & processed by Replika’s AI engine. Replika cannot employ end-to-end encryption since your plain text messages must be available to train your personal AI on the server-side." "All transmitted data are encrypted during transmission. We use standard Secure Socket Layer (SSL) encryption that encodes information for such transmissions. All stored data are maintained on secure servers. Access to stored data is protected by multi-layered security controls, including firewalls, role-based access controls, and passwords."
Managed to sign up with a password '11111111'.
We cannot confirm if the AI employed in this product is trustworthy, because there is little or no public information on how it works and what user controls exist to make the product safe. At the same time, the potential harm of the app is high, as it collects lots of sensitive data and uses that data to train its AI algorithms.
The app is an AI chatbot that imitates a real partner. There is evidence that this chatbot can get abusive.
"Even though talking to Replika feels like talking to a human being, it's 100% artificial intelligence. Replika uses a sophisticated system that combines our own Large Language Model and scripted dialogue content.
Previously Replika also used a supplementary model that was developed together with OpenAI, but now we switched to exclusively using our own which tends to show better results. We put a lot of focus on constantly upgrading the dialog experience, memory capabilities, context recognition, role-play feature and overall conversation quality."
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
- 5 Things You Must Not Share With AI Chatbots (Make Use Of)
- Creating a Safe Replika Experience (Replika)
- ‘Cyber-Heartbreak’ and Privacy Risks: The Perils of Dating an AI (Rolling Stone)
- AI Friends or Foes? The Privacy Risks for Children with Open AI, ChatGPT and Replika (Lexology)
- Most therapy apps don’t include privacy; Replika AI ‘worst app ever’ (9to5Mac)
- Man 'encouraged' by AI chatbot 'girlfriend' to kill Queen Elizabeth II receives jail sentence (EuroNews.next)
- AI girlfriends are ruining an entire generation of men (The Hill)
- The rise of AI girlfriends is making male loneliness worse and risks ruining a generation of men, a professor says (Business Insider India)
- Replika AI starts sexually harassing users after being abused by others (Stealth Optional)
- AI-Based “Companions” Like Replika Are Harmful to Privacy And Should Be Regulated (Medium)
- ‘My AI Is Sexually Harassing Me’: Replika Users Say the Chatbot Has Gotten Way Too Horny (Vice)
- Men Are Creating AI Girlfriends and Then Verbally Abusing Them (Futurism)
- Italy bans U.S.-based AI chatbot Replika from using personal data (Reuters)
- I tried the Replika AI companion and can see why users are falling hard. The app raises serious ethical questions (The Conversation)
- Replika, a ‘virtual friendship’ AI chatbot, hit with data ban in Italy over child safety (TechCrunch)
- What happens when your AI chatbot stops loving you back? (Reuters)
- Regulator Halts AI Chatbot Over GDPR Concerns (Infosecurity Magazine)
- Can an AI Save a Life? (The Atlantic)
- AI-Human Romances Are Flourishing—And This Is Just the Beginning (Time)
Got a comment? Let us hear it.