Warning: *Privacy Not Included with this product
Talkie Soulful AI is an AI chatbot that lets you "craft your AI buddy from scratch" with an AI that "goes beyond conversation, capturing and sharing moments through pictures with a unique visual flair" so you can "create cherished memories you can relive anytime." We're not really sure about all that, but we do know that we found some pretty sketchy content on Talkie Soulful AI. You'll have to pay about $10 a month to access all the "diverse AI personas" Talkie has to offer. And what about your privacy? Should you trust Talkie with that? Yeah, that's probably not a good idea at all.
What could happen if something goes wrong?
Talkie Soulful AI calls their app a "self-help program." Given how much disturbing content we found, quickly and easily, right from the search function in their app, we're really concerned about this app claiming it's there for "self-help." This might not be a privacy issue exactly, but it is a recurring issue we found with many of the AI relationship chatbots we reviewed. We found too many apps that positioned themselves as self-help, wellness, or mental health AI chatbot apps that contained tons of icky, sketchy, and sometimes downright scary content (think rape, abuse, and underage themes). Just be careful out there!
On the privacy front, Talkie Soulful AI worries us a lot. They say they can sell and share your personal information for targeted advertising purposes. That's not good. They do say they won't use the contents of your messages through the app for targeted advertising, which is good. Talkie's privacy policy indicates that not all users have the same rights to have their data deleted if they want to. That's also not good. And we couldn't confirm if Talkie meets our Minimum Security Standards. All of this leads us to warn users that Talkie could come with *Privacy Not Included.
And then there are our questions and concerns about Talkie Soulful AI's AI models. While it is good that Talkie says they won't use the content of your messages with their AI chatbots for targeted advertising purposes, we couldn't find any clear explanation of how they can use the content of those messages to train their AIs. They do say they can aggregate, anonymize, and de-identify your messages, content, interests, and preferences to analyze trends in the use of Talkie's services. This is pretty common. However, our concern here lies with how well they de-identify and anonymize all that personal information you share about yourself. We always worry about this because it's been found to be relatively easy to re-identify de-identified personal information, so just beware before you overshare too many personal details with your favorite AI chatbot (on any service or platform, not just Talkie's).
There's also this line in Talkie Soulful AI's Terms of Service that seems to grant them pretty broad permissions to use the content you submit through their app: "By uploading any User Content, you hereby grant and will grant Talkie and its affiliated companies a non-exclusive, worldwide, royalty-free, fully paid-up, transferable, sublicensable, perpetual, irrevocable license to copy, display, upload, perform, distribute, store, modify, and otherwise use your User Content in connection with the operation of the Services or the promotion, advertising, or marketing thereof in any form, medium, or technology now known or later developed." Yeah, lines like that were pretty common in the Terms of Service we read for AI relationship chatbots. And yeah, they feel pretty broad and worrisome to us.
Finally, we found no way for users to control or understand how the AI chatbots they interact with on Talkie's app work. This worries us because if you don't know how the AI chatbot works and have no control over it, what's to prevent it from developing a relationship with you and then becoming abusive or manipulative? This is a concern we have with all AI chatbots, and it's why we want to see more transparency and user control in them. Shoot, even Talkie admits in their Terms of Service that, "The Services are based on rapidly evolving fields of AI and machine learning. You acknowledge that our outputs may contain inaccuracies or errors. You are responsible for evaluating the accuracy of outputs for your use case." So, basically, trust nothing and assume nothing when interacting with your "self-help" Talkie AI chatbot.
What's the worst that could happen with Talkie Soulful AI? We're really worried that an app that bills itself as a "self-help program" contains so much easily accessible, disturbing content. If some lonely soul downloaded Talkie Soulful AI looking for a friend to help them feel better and instead ended up talking with harmful AI chatbots that made them feel worse, a lot could go wrong. We do worry about the impact AI models with so little transparency, so little user control, and so much potential for harmful content could have on vulnerable users' mental or physical health. Apps that bill themselves as "self-help programs" should absolutely do better.
Tips to protect yourself
- Do not say anything containing sensitive information in your conversation with your AI partner.
- Request your data be deleted once you stop using the app. Simply deleting an app from your device usually does not erase your personal data, nor does it close your account.
- Do not give consent to constant geolocation tracking by the app. Better to share your location 'only when using the app'.
- Do not share sensitive data through the app.
- Do not give the app access to your photos, videos, or camera.
- Do not log in using third-party accounts.
- Do not connect to any third party via the app, or at least make sure the third party employs decent privacy practices.
- Choose a strong password! You may use a password manager like 1Password, KeePass, etc.
- Do not use social media plug-ins.
- Use your device's privacy controls to limit the app's access to your personal information (do not give access to your camera, microphone, images, or location unless necessary).
- Keep your app regularly updated.
- Limit ad tracking via your device (e.g., on iPhone, go to Privacy -> Advertising -> Limit ad tracking) and via the biggest ad networks (for Google, go to your Google account and turn off ad personalization).
- When signing up, do not agree to tracking of your data if you can avoid it.
Can it snoop on me?
Camera
Device: N/A
App: No
Microphone
Device: N/A
App: Yes
Tracks location
Device: N/A
App: Yes
What can be used to sign up?
Email
No
Phone
No
Third-party account
Yes
Google or Apple sign-up available (on Android and iOS respectively).
What data does the company collect?
Personal
Name, email address, user identifiers, birth date, pronouns, work status, voice messages, text messages, interests, preferences; computer's or mobile device's operating system, manufacturer, model, browser, IP address, device, cookie identifiers, language settings, mobile device carrier, general location, city, state, geographic area; links you click, pages you visit, IP address, advertising ID, browser type.
Body related
Voice messages
Social
How does the company use this data?
How can you control your data?
What is the company’s known track record of protecting users’ data?
No known data breaches discovered in the last three years.
Child Privacy Information
Can this product be used offline?
User-friendly privacy information?
The Talkie Privacy Policy was formatted in an odd manner, and while it was not the most complicated privacy policy we've ever read, it still was not user-friendly.
Links to privacy information
Does this product meet our Minimum Security Standards?
Encryption
"All transmitted data are encrypted during transmission. All stored data are maintained on secure servers. Access to stored data is protected by multi-layered security controls, including firewalls, role-based access controls, and passwords."
Strong password
Only Google or Apple sign-up is available.
Security updates
Manages vulnerabilities
Privacy policy
Is this AI untrustworthy?
We cannot confirm if the AI used by this product is trustworthy, because there is little or no public information on how the AI works and what user controls exist to make the product safe. We also found disturbing themes in the app's content. In addition, we are concerned about the potential for user manipulation, as the app collects sensitive personal information, can use that data to train its AI models, and gives users little to no control over those AI algorithms.
Talkie generates different AI personas to talk with.
What kind of decisions does the AI make about you or for you?
Is the company transparent about how the AI works?
Does the user have control over the AI features?
Dive Deeper
- 5 Things You Must Not Share With AI Chatbots (Make Use Of)
- ‘Cyber-Heartbreak’ and Privacy Risks: The Perils of Dating an AI (Rolling Stone)
- AI-Human Romances Are Flourishing—And This Is Just the Beginning (Time)