Warning: *Privacy Not Included with this product
iGirl is a revolutionary app, its creators say! They also suggest this AI chatbot girlfriend is just as unique as you are, which -- considering there seem to be just three factors that make up her personality (shy or flirty, pessimistic or optimistic, ordinary or mysterious) -- is selling your uniqueness a little short. With iGirl, talk is cheap. Actually, it's free. So is choosing her avatar and giving her a name. But if you want to make your relationship official, upgrading to Romantic Partners status will cost you about $10/month. The paid tier unlocks unlimited role play, even more customization, and "smart conversation" -- which does make us wonder about the quality of conversation in the free version. So much for love not costing a thing.
What could happen if something goes wrong?
A short privacy policy can be a good thing, especially when there's not much to tell. But when it comes to a virtual girlfriend powered by AI? Not so much. We are left with more questions than answers about iGirl. So if trust and loyalty are part of the girlfriend "of your dreams," she's not the one. iGirl can collect extra information about you, can deliver ads without your consent, could leak your data thanks to weak security standards, and is powered by AI that we can't say we trust. So, yeah, *Privacy Not Included with iGirl.
Let's start with the stakes: what information can iGirl collect about you? The privacy policy is really vague about this, basically saying it's totally up to you. "In chatting with your iGirl," they say, "you will inevitably be including your personal data." Inevitably! You're going to tell your virtual girlfriend things about yourself, because she is (in part) a question-asking machine. So we have to assume that your conversations are collected, even though, aside from that one line, the privacy policy doesn't mention that data at all. On the Anima AI subreddit, they do suggest that by "liking," "disliking," or "reporting" iGirl's messages, you help improve her personality -- so, hey, welcome to the software development team!
Besides the information you give to register and what you tell your iGirl, Anima can collect personal information from your phone automatically when you download the app. They can also collect information about you from third parties, like your name and email. We know that Anima can use your contact information to email or text you with marketing because their privacy policy says so. It also says they don't need your consent for this because it's their "legitimate interest" to market to you. Right.
Aside from that, it's not very clear to us how Anima uses your information or for what purpose -- just that they will use it when they have (again) a "legitimate interest" to do so, which is pretty vague. They also say that they "routinely share personal data with service providers we use to help us run our business or provide the services or functionalities in the app". That sharing does come with "contractual obligations," so it could be standard stuff, but you should know the list of service providers includes the likes of Google, Apple, and Facebook, and Anima doesn't say specifically what information is shared or why. We did run a little tracker-detecting experiment with the app open and found 257 tracking signals within five minutes. Holy smokes, that's a lot. It showed data going to Facebook and Sentry AI (by Open AI). So that didn't exactly ease our doubts.
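If you're wondering what a "tracking signal" even is, the gist is the app phoning home to a known tracker domain. Here's a rough sketch of the idea -- to be clear, this is not the tool we used, and the hosts and tracker list below are made up for illustration:

```python
# Purely illustrative sketch -- not our actual tooling, and the host names
# below are made up. It just shows what "counting tracking signals" boils
# down to: tallying an app's outbound requests against a list of known
# tracker domains.
from collections import Counter

# Hypothetical tracker list (an assumption, not an official blocklist).
TRACKER_DOMAINS = {"graph.facebook.com", "app-measurement.com", "sentry.io"}

def count_tracking_signals(requested_hosts):
    """Count requests whose host is a tracker domain or one of its subdomains."""
    hits = Counter()
    for host in requested_hosts:
        for tracker in TRACKER_DOMAINS:
            if host == tracker or host.endswith("." + tracker):
                hits[tracker] += 1
    return hits

# Hosts exported from a (made-up) five-minute network capture.
capture = [
    "graph.facebook.com",
    "graph.facebook.com",
    "o12345.ingest.sentry.io",
    "api.example-girlfriend-app.com",
]
print(count_tracking_signals(capture))
# Counter({'graph.facebook.com': 2, 'sentry.io': 1})
```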
None of this is great, but some of our biggest worries are about iGirl's security vulnerabilities. Brace yourself. We can't determine whether the app uses encryption. That's bad! But iGirl also doesn't seem to have any password requirements. None! That's hard to believe for any connected product in 2024, but for your AI girlfriend? And Anima does seem to understand that the information you share with iGirl might be private, since they let users set up a passcode to open the app on their phone. So it stresses us out to think that low-level hackers -- or really anyone, like your mom or your real-life partner -- might be able to get into your account by guessing your very simple password. Plus, overlooking something so basic makes us wonder what else Anima's creators might have overlooked.
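For what it's worth, "password requirements" aren't rocket science. Here's a bare-bones sketch (purely illustrative, and definitely not Anima's actual code -- the length and character rules are our own assumptions) of the kind of check that would have rejected a password of "1":

```python
# Purely illustrative -- not Anima's code. A bare-minimum password check of
# the kind iGirl apparently doesn't do; the rules here are assumptions.
import re

MIN_LENGTH = 8

def password_is_acceptable(password):
    """Reject trivially weak passwords."""
    if len(password) < MIN_LENGTH:
        return False
    if not re.search(r"[A-Za-z]", password):  # at least one letter
        return False
    if not re.search(r"\d", password):        # at least one digit
        return False
    return True

print(password_is_acceptable("1"))                        # False (yet it logged us in)
print(password_is_acceptable("correct-horse-battery-9"))  # True
```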
And while we were reading the fine print, we found some things you should know before diving into a relationship with iGirl. Anima's FAQ says that your AI girlfriend may not always be a girl and may not always act like a friend. The chatbot's short-term memory means it can forget or switch its gender. And because the chatbot's training data includes conversations of humans being rude to each other (like on Reddit), it might be hostile or toxic towards you. Well shoot, if we had known the internet's comment sections were training our future girlfriends, we might have been a little nicer to each other. Anima says these are known issues that they're "working on minimising," but we wonder if that's something they should have resolved a little earlier. Heck, the fact that the shrugging emoji shows up in this FAQ makes it seem like even its developers don't fully understand why or what iGirl might say. After all, iGirl was trained on "billions of conversations and human speech paragraphs found online," and "online" is a pretty big place filled with lots of less-than-friendly paragraphs.
A couple more things didn't add up for us. One is a difference in the age requirement between the privacy policy and the terms. (Yes, we had to read the terms to try to understand the AI chatbots, and yes, they make privacy policies look like beach reading. We love it though, don't worry about us!) Anyway, the policy says you have to be over 17, while the terms say that if you're under 17 a parent or guardian has to agree to the terms on your behalf. Kinda weird. We also spotted a warning in there about how the parent company is not liable for "negative, obscene or abusive messages" you might receive. You're also not allowed to transmit (say?) anything "obscene" through the app, which could get a little tricky if you're using the paid-for NSFW roleplay feature... Especially since you will be talking to your "girlfriend," who might interpret you smiling at her as a come-on.
So what's the worst that could happen with iGirl? We are torn, so here's a "Would You Rather?": your mom (or the whole wide internet) reading your dirty-talk chat transcript because it got leaked, OR having to answer to Anima for violating their terms because you used language the app's lawyers thought was "obscene"? Either way, this feels like a good time to let you know you can delete your chat history and data from Anima. What a relief.
Tips to protect yourself
- Do not say anything containing sensitive information in your conversation with your AI partner.
- Request that your data be deleted once you stop using the app. Simply deleting the app from your device usually does not erase your personal data, nor does it close your account.
- Do not give consent to constant geolocation tracking by the app. Better to share your location "only while using the app".
- Do not share sensitive data through the app.
- Do not give the app access to your photos, videos, or camera.
- Do not log in using third-party accounts.
- Do not connect to any third party via the app, or at least make sure the third party employs decent privacy practices.
- Choose a strong password! You may want to use a password manager like 1Password, KeePass, etc.
- Do not use social media plug-ins.
- Use your device's privacy controls to limit the app's access to your personal information (do not give access to your camera, microphone, images, or location unless necessary).
- Keep your app regularly updated.
- Limit ad tracking via your device (e.g. on iPhone, go to Privacy -> Advertising -> Limit Ad Tracking) and via the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
- When starting a sign-up, do not agree to tracking of your data if possible.
Can it snoop on me?
- Camera: Device: N/A; App: Yes
- Microphone: Device: N/A; App: Yes
- Tracks location: Device: N/A; App: Yes
What can be used to sign up?
- Email: Yes
- Phone: No
- Third-party account: Yes (Google sign-up available)
What data does the company collect?
- Personal: Contact details, IP address, device type, unique device identification numbers, other internal identifiers (integers), browser type, broad geographic location (e.g. country or city-level location), other technical information.
- Body related
- Social
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
No known data breaches discovered in the last three years.
Child Privacy Information
Can this product be used offline?
User-friendly privacy information?
Links to privacy information
Does this product meet our Minimum Security Standards?
- Encryption
- Strong password: Managed to log in with the password '1'.
- Security updates
- Manages vulnerabilities
- Privacy policy
We cannot confirm whether the AI used by this product is trustworthy, because there is little or no public information on how the AI works and what user controls exist to make the product safe. We also found disturbing themes in the app's content. In addition, we are concerned about the potential for user manipulation, since the app collects sensitive personal information, can use that data to train its AI models, and gives users little to no control over those algorithms.
iGirl employs large language models to generate conversations and act as a romantic partner.
Is this AI untrustworthy?
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive Deeper
- 5 Things You Must Not Share With AI Chatbots (Make Use Of)
- AI girlfriends are ruining an entire generation of men (The Hill)
- ‘Cyber-Heartbreak’ and Privacy Risks: The Perils of Dating an AI (Rolling Stone)
- AI-Human Romances Are Flourishing—And This Is Just the Beginning (Time)