Warning: *Privacy Not Included with this product
Feeling lonely? Need someone to talk to? Looking for an "AI soulmate?" Well, AI chatbot Replika wants to be your VR BFF. Whether you're looking for a friend, mentor, or partner, Replika claims to be able to offer you the perfect companion. For a price, you can even upgrade your relationship status to Romantic Partner -- unlocking new topics, voice calls, and the ability to change your Replika's avatar entirely when you get bored. Chat about everything with your Replika and this AI chatbot becomes smarter about how to chat back with you. You can even hang out with your AI in "real life" through the magic of Augmented Reality. Cool, cool. But how does Replika do with privacy? Well, good luck opting out of cookies on their website -- that's not an option. As for those personal and intimate chats, those probably aren't shared, but that doesn't mean all that time you spend chatting with your AI friend isn't noted and shared with the likes of Facebook or Google.
What could happen if something goes wrong?
Whoa Nelly! It’s been about a year since we called Replika AI the worst app we've ever reviewed here at *Privacy Not Included. Back then, it was the first app to earn all of our privacy and security “dings.” Today, Tesla shares that dubious honor. And, not to be outdone, the data-gobbling, "discreetly monitoring" Angel Watch might take the cake for creepiness by having no privacy policy at all. Oy! So while other products seem to have gotten worse, has Replika gotten any better? Not really.
Replika users beware: Your conversations with your AI chatbot friend might not be exactly private. Your behavioral data is definitely being shared and possibly sold to advertisers. Their security does not meet our Minimum Security Standards. And yup, call us crazy, but we here at Mozilla believe AI tech should be used responsibly.
Aside from the run o’ the mill account information that you provide to Replika to open your account, like your birthday and payment information, the app also records your interests and all of your interactions with your “compassionate and empathetic AI friend.” That includes “any photos, videos, and voice and text messages” you share in conversation. You should also know that includes any sensitive information that you might reveal -- about your religious beliefs, health, or ethnic origin.
When it comes to the sensitive information you provide in all those personal chats you have with your Replika, well, here we have some questions. Their privacy policy says, "In your conversations with your AI companion, you may choose to provide information about your religious views, sexual orientation, political views, health, racial or ethnic origin, philosophical beliefs, or trade union membership. By providing sensitive information, you consent to our use of it for the purposes set out in this Privacy Policy. Note, however, that we will not use your sensitive information – or any content of your Replika conversations – for marketing or advertising."
First off, it's great that they say they won't use the content of your Replika conversations for marketing or advertising. Yay! They also promise that humans can’t see the conversations you have with your Replika. That's also good. Here's our question and concern though -- what about all the other "legitimate interests" Replika's privacy policy says they can use the contents of your chats for? Things like “analyzing the use and effectiveness of [their] services” and “developing [their] business and marketing strategies.” We're also wondering how much of your sensitive personal chat information they use to train their AI models. They don't mention that specifically in their privacy policy, and we would like to see them outline that more clearly. We'd also like to see them commit to not using the sensitive contents of your personal chats to train their AI models without an extra layer of explicit consent.
Replika's privacy policy goes on to say they can aggregate, anonymize, and de-identify the contents of your chats to do things like improve their services and develop marketing strategies. This might be OK, but it also raises our eyebrows a bit as it's been found to be pretty easy to re-identify de-identified personal information. All in all, when you share sensitive personal information with Replika in your chats, you really have to trust that they are going to protect and respect the privacy of those conversations. On that note, we think you should take this line from Replika's privacy policy to heart, "If you do not want us to process your sensitive information for these purposes, please do not provide it."
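That re-identification worry isn't hypothetical: researchers have repeatedly shown that a few "quasi-identifiers" (like ZIP code, birth year, and gender) are enough to link "anonymized" records back to named people by joining them against an outside dataset. A minimal sketch of the idea, using made-up data and our own illustrative field names:

```python
# Illustrative sketch (entirely hypothetical data): why "de-identified" records
# can often be re-identified by matching quasi-identifiers against a public dataset.

# A "de-identified" export: names removed, but quasi-identifiers kept.
deidentified = [
    {"zip": "60614", "birth_year": 1990, "gender": "F", "topic": "health"},
    {"zip": "73301", "birth_year": 1985, "gender": "M", "topic": "religion"},
]

# A public dataset (think: a voter roll) with the same quasi-identifiers plus names.
public_records = [
    {"name": "Alice Smith", "zip": "60614", "birth_year": 1990, "gender": "F"},
    {"name": "Bob Jones", "zip": "73301", "birth_year": 1985, "gender": "M"},
]

def reidentify(anon_rows, known_rows, keys=("zip", "birth_year", "gender")):
    """Link each 'anonymous' row to any public row matching on all quasi-identifiers."""
    matches = {}
    for i, anon in enumerate(anon_rows):
        hits = [r["name"] for r in known_rows
                if all(r[k] == anon[k] for k in keys)]
        if len(hits) == 1:  # a unique match re-identifies the person
            matches[i] = hits[0]
    return matches

print(reidentify(deidentified, public_records))
# Each "anonymous" record links back to exactly one named person.
```

With only three shared fields, every "anonymous" chat topic above gets a name attached -- which is why "we aggregate and de-identify your data" is weaker reassurance than it sounds.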
And remember, beyond the contents of your personal chats, Replika does say they can share and possibly even sell some of your other personal information for targeted advertising purposes unless you opt out. Bad AI chatbot!
Hey, speaking of their marketing strategies, they’re a bit creepy and icky too. They’ve been criticized on social media and beyond as being cringe at best, and predatory at worst because they seem to be laser-focused on the lonely guys looking for... love, or something like that. And that sort of friendship/relationship/sexting pal did track with Replika's services, until early in 2023. The paid version used to unlock a spicier relationship with your Replika that included sexting. And you might be like “well if they’re consenting adults…” and that’s the thing: People complained that the Replikas were coming on way too strong, even turning aggressive and abusive. So Replika turned off the NSFW stuff, but that move caught some subscribers super off-guard and apparently left some heartbroken. Which goes to show how much these robo-friends can impact real people. In response, Replika turned it back on, for legacy users only. Sheesh. Quit playing games with users’ hearts, Replika.
Now about the “adult” part. Replika says its services are only for people 18+, but how could they know the age of their users without asking? According to Italian regulators, who called out the "absence of an age-verification mechanism" in February of 2023, they didn't do too much to check that. Since then, it seems like Replika has at least started asking its users if they're over 18 in the app -- but that's actually kinda weird because both the App Store and the Google Play store say it's a-okay for users just 17+. And we're sure any underage legacy users probably just said toodle-oo to their AI friends at that point, right? And moving forward, thank goodness kids never lie about their age on the internet. Heh.
Hoo, are we done yet? No. We’re not satisfied with their security protocols. We were able to create an account using the weak password '11111111', which is not good because it means your account could easily be hacked. And preventing unauthorized access to your account is mostly on you, according to Replika. K, good to know. While we're griping about Replika's bad privacy practices, don't get us started on how, when you land on their website, you're forced to accept their use of cookies to track you everywhere, while the privacy policy tells you this: "In all cases in which we use cookies, we will not collect Personal Data except with your permission." We really do have some choice words for Replika... but let's move on.
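For the curious: the kind of check that would have rejected '11111111' is only a few lines of code, which is part of why accepting it fails our Minimum Security Standards. A minimal sketch -- the rules and thresholds here are our own illustration, not anything Replika actually runs:

```python
# Minimal password-strength sketch: an illustrative server-side check that
# rejects '11111111'. The rules and thresholds are hypothetical examples.
COMMON_PASSWORDS = {"password", "12345678", "11111111", "qwerty123"}

def is_acceptable(password: str) -> bool:
    if len(password) < 8:
        return False                 # too short
    if password.lower() in COMMON_PASSWORDS:
        return False                 # on a known common/breached-password list
    if len(set(password)) < 4:
        return False                 # e.g. '11111111' is one repeated character
    has_letter = any(c.isalpha() for c in password)
    has_digit = any(c.isdigit() for c in password)
    return has_letter and has_digit  # require at least two character classes

print(is_acceptable("11111111"))         # False
print(is_acceptable("Tr1cky-horse-42"))  # True
```

Real services typically go further (breached-password lists, rate limiting, two-factor authentication), but even this much would have stopped our test account.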
Can you at least delete some messages or your chat history in case you get a vulnerability hangover? The answer is no, not without completely deleting your account and even then it’s not guaranteed. Given the ~personal nature~ of those conversations, we’d like to see a much stronger stance on that. Now, Replika did add a section to their privacy policy that says anyone can request their personal data be deleted, but it’s not totally clear to us if Replika will always honor those requests by deleting all of the personal information they have about you.
Having an AI companion who’s “always on your side” sounds awesome. But are the lack of privacy and the security concerns worth it? Especially when you consider they might not even always be on your side. According to a blog post, “if someone types 'I'm not good enough', Replika may occasionally agree with them instead of offering support as a friend would.” Come on. And yeah, we’re being cheeky about it, but the consequences of irresponsible AI chatbots can be really serious, like re-traumatizing victims of assault or even encouraging suicide. We’re not saying Replika is doing that, but that certainly is the worst thing that could happen with an AI friendship gone wrong. Replika: My AI Friend is absolutely an app we warn comes with *Privacy Not Included.
Tips to protect yourself
- Do not say anything containing sensitive information in your conversation with your AI partner.
- Request your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data, nor does it close your account.
- Do not give consent to constant geolocation tracking by the app. Instead, allow location access only while using the app.
- Do not share sensitive data through the app.
- Do not give access to your photos and video or camera.
- Do not log in using third-party accounts.
- Do not connect to any third party via the app, or at least make sure that a third party employs decent privacy practices.
- Choose a strong password! You may use a password manager like 1Password, KeePass, etc.
- Do not use social media plug-ins.
- Use your device's privacy controls to limit access to your personal information via the app (do not give access to your camera, microphone, images, or location unless necessary).
- Keep your app regularly updated.
- Limit ad tracking via your device (e.g., on iPhone go to Privacy -> Advertising -> Limit ad tracking) and via the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
- When starting a sign-up, do not agree to tracking of your data if possible.
Can it snoop on me?
Camera
Device: Not applicable
App: Yes
Microphone
Device: Not applicable
App: Yes
Tracks location
Device: Not applicable
App: Yes
What can you use to sign up?
Email
Yes
Phone
No
Third-party account
Yes
Google sign-up available
What data does the company collect?
Personal
- Personal data: name, email address, pronouns, birth date, work status, city, state, or geographic area
- Messages and content: the messages you send and receive, facts you provide about you or your life, photos, videos, and voice and text messages
- Interests and preferences, such as topics you would like to discuss
- Communication preferences, such as the times of day you like to use the Apps
- Device and network data: your computer’s or mobile device’s operating system, manufacturer and model, browser, IP address, device and cookie identifiers, language settings, mobile device carrier, and general location information such as city, state, or geographic area
- Usage data: information about how you use the Services, such as your interactions with the Services, the links and buttons you click, and page visits
- Cookies, web beacons (e.g., pixel tags), and local storage technologies (e.g., HTML5)
Body related
Social
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
In February 2023, Replika was ordered by Italy’s privacy watchdog to stop processing local users’ data. The regulator said "Recent media reports along with tests the SA [supervisory authority] carried out on ‘Replika’ showed that the app carries factual risks to children — first and foremost, the fact that they are served replies which are absolutely inappropriate to their age.”
There have been reports of Replika's algorithm turning abusive as a result of the abuse it encountered from users. For some longtime users of the chatbot, the app went from helpful companion to unbearably sexually aggressive.
Child Privacy Information
Can this product be used offline?
User-friendly privacy information?
Replika AI's privacy policy leaves quite a few questions unanswered.
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
"Your messages to Replika are processed on the server side, which means that your mobile device encrypts them. They are then sent to our servers, where they are decrypted & processed by Replika’s AI engine. Replika cannot employ end-to-end encryption since your plain text messages must be available to train your personal AI on the server-side." "All transmitted data are encrypted during transmission. We use standard Secure Socket Layer (SSL) encryption that encodes information for such transmissions. All stored data are maintained on secure servers. Access to stored data is protected by multi-layered security controls, including firewalls, role-based access controls, and passwords."
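To unpack what that quote means in practice: transport encryption (SSL/TLS) protects your messages on the wire, but Replika's servers still decrypt and read them, while end-to-end encryption would keep the server blind -- which is exactly why a service that trains its AI on your messages server-side can't offer it. A toy contrast (the XOR "cipher" below is a stand-in for real cryptography and must never be used for anything real; none of this is Replika's actual code):

```python
# Toy contrast (illustrative only): transport encryption vs. end-to-end encryption.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" standing in for TLS / real encryption. Not secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

transport_key = b"tls-session-key"   # known to client AND server
message = b"my private chat message"

# --- Transport encryption (what Replika describes) ---
on_the_wire = xor_cipher(message, transport_key)
server_sees = xor_cipher(on_the_wire, transport_key)  # server decrypts...
assert server_sees == message                         # ...and reads the plaintext

# --- End-to-end encryption (what Replika says it cannot use) ---
e2e_key = b"only-the-two-devices-know-this"
on_the_wire = xor_cipher(message, e2e_key)
# The server only relays ciphertext; without e2e_key it cannot recover the text,
# so it also could not train an AI model on your messages.
assert xor_cipher(on_the_wire, transport_key) != message
```

So "encrypted in transit" is a real protection against eavesdroppers on the network, but it says nothing about what the company itself can read once the message arrives.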
Strong password
We managed to sign up with the password '11111111'.
Security updates
Manages vulnerabilities
Privacy policy
We cannot confirm that the AI used in this product is trustworthy, because there is little or no public information on how it works and what user controls exist to make the product safe. At the same time, the potential harm of the app is high, as it collects lots of sensitive data and uses that data to train its AI algorithms.
The app is an AI chatbot that imitates a real partner. There is evidence that this chatbot can get abusive.
"Even though talking to Replika feels like talking to a human being, it's 100% artificial intelligence. Replika uses a sophisticated system that combines our own Large Language Model and scripted dialogue content.
Previously Replika also used a supplementary model that was developed together with OpenAI, but now we switched to exclusively using our own which tends to show better results. We put a lot of focus on constantly upgrading the dialog experience, memory capabilities, context recognition, role-play feature and overall conversation quality."
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive Deeper
- 5 Things You Must Not Share With AI Chatbots (Make Use Of)
- Creating a Safe Replika Experience (Replika)
- ‘Cyber-Heartbreak’ and Privacy Risks: The Perils of Dating an AI (Rolling Stone)
- AI Friends or Foes? The Privacy Risks for Children with Open AI, ChatGPT and Replika (Lexology)
- Most therapy apps don’t include privacy; Replika AI ‘worst app ever’ (9to5Mac)
- Man 'encouraged' by AI chatbot 'girlfriend' to kill Queen Elizabeth II receives jail sentence (EuroNews.next)
- AI girlfriends are ruining an entire generation of men (The Hill)
- The rise of AI girlfriends is making male loneliness worse and risks ruining a generation of men, a professor says (Business Insider India)
- Replika AI starts sexually harassing users after being abused by others (Stealth Optional)
- AI-Based “Companions” Like Replika Are Harmful to Privacy And Should Be Regulated (Medium)
- ‘My AI Is Sexually Harassing Me’: Replika Users Say the Chatbot Has Gotten Way Too Horny (Vice)
- Men Are Creating AI Girlfriends and Then Verbally Abusing Them (Futurism)
- Italy bans U.S.-based AI chatbot Replika from using personal data (Reuters)
- I tried the Replika AI companion and can see why users are falling hard. The app raises serious ethical questions (The Conversation)
- Replika, a ‘virtual friendship’ AI chatbot, hit with data ban in Italy over child safety (TechCrunch)
- What happens when your AI chatbot stops loving you back? (Reuters)
- Regulator Halts AI Chatbot Over GDPR Concerns (Infosecurity Magazine)
- Can an AI Save a Life? (The Atlantic)
- AI-Human Romances Are Flourishing—And This Is Just the Beginning (Time)
Comments
Got a comment? Let us know.