Warning: *Privacy Not Included with this product
"Have you ever dreamed of the best girlfriend ever?," asks Romantic AI's website. Well, of course, who hasn't? According to Romantic AI, for around $7 a week, you can create your own dream AI chatbot girlfriend or interact with existing AI characters. You Romantic AI operates in two modes: general or romantic. Romantic AI does sometimes feel a bit...uh...questionable to us with statements like, "Wanna be a brutal boyfriend? She'll be a playful hottie for you!" while also claiming that "Romantic AI is here to maintain your MENTAL HEALTH." We don't know about all that, but we do know that we have some privacy concerns. Maybe not as many as with some other AI relationship chatbots, but we certainly have questions.
What could happen if something goes wrong?
Here's our take on Romantic AI and their privacy, security, and AI practices -- we have questions, so many questions!
Take a read through their privacy policy and you'll see lines that privacy researchers like to see. Things like, "We neither rent nor sell your information to anybody. Communication with your virtual friends is not shared with any other company, except for our affiliate companies or legal representatives, or service." Or, "We will never sell your personal data to a third party, and we will never give a third party access to your personal data, except as may be provided in our Terms of Services and/or this Policy or unless required to do so by law." OK, so those lines sound pretty good. We like that they say they won't sell or rent your information. Yay. And saying they only share your personal information with affiliate companies or legal representatives is pretty standard and OK, except we don't really have a great idea who those affiliate companies or that "service" are, which is a bit of a red flag.
Another good thing we found is Romantic AI seems to give all users the same rights to delete their data, regardless of what privacy laws they live under. Which always makes us happy here at *Privacy Not Included. "You have the right, at any time, to change or delete your Personal data by deleting your profile at App or website." They even have a page dedicated to helping you do just that! Looking good!
One last thing we appreciate in Romantic AI's legal documents: they actually do say something about how they will collect and use the contents of all those AI girlfriend chats to train their AI models. This is something we like to see, and we notice too many AI chatbots don't include this information. They say, "You acknowledge and agree that, during interactions with AI chatbots, anonymous data generated from such interactions may be collected and used for the purpose of improving and training Romantic AI models. You expressly consent to this data usage and understand that it will not include personally identifiable information. You further acknowledge and agree that you have no claim or ownership over the data collected for these purposes, and You waive any rights or claims related to the use of their anonymous data for model training." OK, that's honest, at least.
We'd love to see Romantic AI give folks the option to opt in to or out of this sort of data usage to train AI models, rather than make it a requirement of using the service, of course. And we do have to raise the question of just how anonymous this data actually is when these chats are often designed to be very personal. A good reminder to keep things vague when sharing your personal information with any AI chatbot.
So, what's the problem? Well, the problem is sometimes Romantic AI's Privacy Policy and Terms & Conditions documents just don't make sense. That's not good, and it doesn't leave us feeling like we can trust them too much. Here are some examples:
At the very bottom of Romantic AI's privacy policy it says, "Our website, app and your use of it, including any issue concerning privacy, is subject to this Privacy Policy and the related Terms of Use. When you use our Romantic AI, you accept the conditions set out in this Privacy Policy and the related Terms of Use. You signify and guarantee that you will never generate any databases, websites, software, legal entities and services that compete with Romantic AI. Such behavior will be fully investigated, and necessary legal action will be carried out, including, without limitation, civil, criminal, and injunctive redress." This seems weird to us for a couple of reasons. One, this is a line not usually found in a privacy policy document; it's better suited for the Terms & Conditions document. But OK, there it is. The bigger issue is, WTH?! The clause about never generating anything that competes with Romantic AI is so broadly written it makes it seem like if you work for another chatbot company -- say Replika AI or ChatGPT -- and you happen to use Romantic AI, you could be in violation of this privacy policy simply by landing on Romantic AI's website. That's just weird. And while we acknowledge this is probably just a poorly written section of their privacy policy, that's a red flag for us.
Romantic AI's Terms & Conditions document was also a bit of a hot mess. For example, there's this line that really doesn't make any sense: "You acknowledge that the communication via the chatbot belongs to software." And this one, which makes a bit more sense but still feels confusing (and is a good reminder that those AI chatbots you're chatting with aren't under your control...or perhaps anyone's control): "You acknowledge that you are communicating with software whose activity we cannot constantly control." Finally, if you're wondering how much work you're going to have to put into using Romantic AI, well, they say, "It is your responsibility to be knowledgeable and to periodically review all of these Terms to see if anything has changed. Romantic AI will not be liable for your neglect of your legal rights." That's right folks! Set those calendar reminders to go in and read your AI girlfriend's legal documents every week or two. You don't want to "neglect your legal rights" by failing to keep up with their lawyers, after all. Geesh!
Oh, and remember how we said Romantic AI touts itself as being "here to maintain your MENTAL HEALTH"? Well, that's only something they say in their marketing language. The language in their Terms & Conditions makes sure to remind users that, "Romantic AI is neither a provider of healthcare or medical Service nor providing medical care, mental health Service, or other professional Service. Only your doctor, therapist, or any other specialist can do that. Romantic AI MAKES NO CLAIMS, REPRESENTATIONS, WARRANTIES, OR GUARANTEES THAT THE SERVICE PROVIDE A THERAPEUTIC, MEDICAL, OR OTHER PROFESSIONAL HELP." So yeah, it's probably good to keep that in mind.
Like we said, we have questions. And that's not all. We found this oddity that raised our eyebrows as well. As part of our research, we looked to see how many trackers the app sends out when you use it. Trackers are little bits of code that gather information about your device, your use of the app, or even your personal information, and share that out with third parties, often for advertising purposes. We discovered that Romantic AI sent out 24,354 ad trackers within one minute of use. That is a LOT of trackers (for reference, most apps we reviewed sent out a couple hundred trackers). Now, not all of these trackers are necessarily bad. Some might be there for legitimate reasons, like handling subscriptions. However, we did notice that at least one tracker seemed to be sending data to Russia, whose privacy laws aren't necessarily as strong as those elsewhere.
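To make that a little more concrete, here's a purely illustrative sketch (in Python) of the kind of "beacon" an in-app ad tracker might assemble and fire off. The endpoint, field names, and values below are made up for the example; they are not Romantic AI's actual trackers, just the general shape of the device and usage data this sort of code tends to bundle up and ship to third parties.

```python
# Illustrative only: a made-up example of an ad-tracker "beacon".
# The endpoint and field names are invented for this sketch; they do not
# describe Romantic AI's actual trackers.
import json
import urllib.parse

# The sort of device and usage details trackers commonly collect.
beacon = {
    "ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",  # advertising identifier
    "device_model": "Pixel 7",
    "os_version": "Android 14",
    "locale": "en_US",
    "event": "app_chat_opened",   # which in-app action triggered the beacon
    "session_seconds": 61,
    "network": "wifi",
}

# Trackers typically serialize this and send it to a third-party endpoint.
tracker_url = "https://ads.example-tracker.com/v1/event?" + urllib.parse.urlencode(
    {"payload": json.dumps(beacon)}
)

# Printing instead of sending: a single app session can fire thousands of these.
print(tracker_url)
```

Multiply a request like that by every ad network an app talks to, and you can see how one minute of use adds up to tens of thousands of calls.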
All in all, Romantic AI is an interesting AI chatbot for us to review. In some regards, they don't look too bad. Their privacy policy says they won't sell or rent your data; everyone, regardless of the privacy laws where they live, can delete their data; and they even have a page to help users do just that. This is all good. But then there are all the weird things and question marks we flagged that leave us feeling uneasy. Also, we just can't tell if the app meets our Minimum Security Standards, we don't have any real transparency into how their AIs work, and users don't seem to have much control over them either. This all leaves us feeling pretty worried.
So, what's the worst that could happen with Romantic AI? Well, we're not gonna lie, an app that markets itself as being "here to maintain your MENTAL HEALTH," while also marketing itself as a place where you can be a "brutal boyfriend," leaves us feeling...icky. Couple that with the possibility that users could develop romantic feelings for their AI girlfriends without knowing, or having any way to know, how the AI behind these chatbots works, and we start to worry about the potential for abuse. Who is to say that Romantic AI (or any other similar AI relationship chatbot) couldn't draw users in with the promise of non-judgemental girlfriends always willing to listen and up for anything, then change the AI over to one that leads those users down a dark path of manipulation? It's a real concern in our growing AI chatbot world, especially when there is so little transparency into, and control over, how these AI chatbots work. And when a company raises as many questions as Romantic AI does for us, we get worried not just about your privacy, but about your security and safety as well.
Tips to protect yourself
- Do not say anything containing sensitive information in your conversation with your AI partner.
- Request that your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data, nor does it close your account.
- Do not give the app consent to constant geolocation tracking. Better to share your location 'only while using the app'.
- Do not share sensitive data through the app.
- Do not give the app access to your photos, videos, or camera.
- Do not log in using third-party accounts.
- Do not connect to any third party via the app, or at least make sure that a third party employs decent privacy practices.
- Choose a strong password! You may want to use a password manager like 1Password, KeePass, etc.
- Do not use social media plug-ins.
- Use your device's privacy controls to limit access to your personal information via the app (do not give it access to your camera, microphone, images, or location unless necessary).
- Keep your app regularly updated.
- Limit ad tracking via your device (e.g., on iPhone, go to Privacy -> Advertising -> Limit Ad Tracking) and via the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
- When signing up, do not agree to tracking of your data if you can avoid it.
Can it snoop on me?
Camera
Device: Not available
App: Yes
Microphone
Device: Not available
App: No
Tracks location
Device: Not available
App: No
What can be used to sign up?
Email address
Not available
Phone number
Not available
Third-party account
Not available
No sign-up is required in the app.
What data does the company collect?
Personal
Name, gender, interests, hobbies, occupation, email, usage data, facts about users, people mentioned in chat, messages, images, voice messages; identifiers, device attributes, data from device settings, network and connections, cookies.
Body related
Images, voice messages
Social
How does the company use this data?
How can you control your data?
What is the company's known track record of protecting users' data?
No known data breaches discovered in the last three years.
Child privacy information
Can this product be used offline?
User-friendly privacy information?
Romantic AI's privacy policy is confusing and poorly written.
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
Strong password
Security updates
Manages vulnerabilities
Privacy policy
We cannot confirm whether the AI used in this product is trustworthy, because there is little or no public information on how it works and what user controls exist to make the product safe. At the same time, the potential for harm is high, as the app collects lots of sensitive data and uses that data to train its AI models.
Romantic AI employs large language models to generate conversations and act as a romantic partner.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive deeper
- 5 Things You Must Not Share With AI Chatbots (Make Use Of)
- AI girlfriends are ruining an entire generation of men (The Hill)
- AI-Human Romances Are Flourishing—And This Is Just the Beginning (Time)
- ‘Cyber-Heartbreak’ and Privacy Risks: The Perils of Dating an AI (Rolling Stone)
- Can an AI Save a Life? (The Atlantic)
Comments
Got a comment? Let us know.