Wysa

Touchkin
Wi-Fi

Reviewed on: April 20, 2022


Mozilla's opinion

Voting result: Not creepy

Wysa describes itself as "an emotionally intelligent chatbot that uses AI to react to the emotions you express." Launched in 2016 by founder Jo Aggarwal, who discovered bots were easier to talk to when dealing with her own depression, the app says it offers "a mood tracker, mindfulness coach, anxiety helper, and mood-boosting buddy, all rolled into one." Along with Wysa's AI chatbot, the app also offers in-app text-based chat sessions with real, live therapists. Wysa offers some free access to their AI chatbot. To access more features, users can choose between two subscription options: $99 per year for access to the tools, or $99 per month for access to the tools and coaches.

So, how does Wysa's privacy look? We're so happy to say, Wysa's privacy looks pretty dang good! They seem to be one of the rare mental health apps that isn't looking to make money off your personal information. Good work Wysa!

What could happen if something goes wrong?

There are so many bad-for-your-privacy mental health apps out there in the world that it is depressing. Thank goodness Wysa doesn't seem to be one of them! They don't require any personal identifiers to use their service. They don't request your personal data. They don't share your personal data. They don't sell it either. What?!? It's so refreshing to see a mental health app with strong privacy practices. Good on you, Wysa.

Wysa says they perform no direct marketing, though they may use social media or other marketing channels; no personal data is shared for this purpose. Again, that is very good. Consumer Reports did find in 2021 that Wysa shared some data with Facebook, but from what we can tell, that was likely not personal information. As for all those AI chatbot conversations, Wysa says they will never share that data without user consent.

Wysa says they can share aggregated, de-identified data for analytics and to help improve their AI. We're not too worried about this with Wysa, as they say they don't process geolocation at a level that makes your data identifiable. Still, we feel compelled to mention that it has been found to be relatively easy to re-identify such user data when the location data is more precise.

All in all, Wysa seems to be a breath of fresh air in the mental health app space. They actually take steps to implement privacy and security by design and by default. We absolutely love that here at *Privacy Not Included. Thank you Wysa and please, keep up the good work!

Tips to protect yourself

  • For extra security, you can use the app without registering.
  • Wysa provides the following security tips:
    • Always lock your mobile screen by setting a password. Use strong passwords and keep passwords private. Never leave your device unattended.
    • Always extend your mobile screen password to set an App PIN to keep your conversations with the App private.
    • Always keep your mobile operating system up-to-date.
    • Enable remote access on your devices so you can locate and control them remotely in the event your device gets stolen.
    • Install anti-virus software to protect against virus attacks and infections.
    • Avoid phishing emails. Do not open files, click on links or download programs from an unknown source.
    • Be wise about using Wi-Fi. Before you send personal and sensitive data over your laptop or mobile device on a public wireless network in a coffee shop, library, airport, hotel, or other public place, see if your data will be protected.
mobile Privacy Security AI

Can it snoop on me? Information

Camera

Device: Not available

App: No

Microphone

Device: Not available

App: Yes

Tracks location

Device: Not available

App: No

What can be used to sign up?

What data does the company collect?

How does the company use the data?

Wysa does not request your personal data. If you inadvertently submit any personally identifiable information (including but not limited to an email identifier, location address, or mobile number), they will irreversibly redact that information from their system within 24 hours.

No personal data gets shared. All event data is anonymized ("made cryptic," in Wysa's words) so that no medical or psychological profile can be created at the provider's end. Wysa does not combine or process your personal data with any other available third-party data. Your data, messages, and usage are not used for direct marketing, nor are they sold to advertisers. Wysa will not share or sell your personal data with any third party.

Touchkin will never share your conversation data without your explicit consent. Wysa will always ask your consent before using your name for social proof purposes.

In 2021, Consumer Reports reported that Wysa shared certain data with Facebook, but Wysa has since stopped sharing IDs.

No human has access to or gets to monitor or respond during your chat with the AI Coach.

You may use the App if you are 18 or older. If you are between 13 and 18 years old, your parent or legal guardian must provide their consent before use by writing to [email protected] or [email protected]. This App is not meant for those under 13 years of age.

How can you control your data?

Where not otherwise specified, Wysa retains your data for a maximum of 10 years after the end of your subscription, in line with their information retention policies.

You can also, at any point of time, clear all your transactional data by using the “reset my data” feature available in the App settings.

You can exercise a wide range of rights, including the right of access and the right to erasure.

What is the company's known track record of protecting users' data?

Average

No known privacy or security incidents discovered in the last 3 years.

Child privacy information

The App is not to be used by children under 13. If you are between 13 and 18, read the Privacy Policy and Terms of Service with your parents or legal guardian, and ask them to provide their consent for you to use the app by writing to [email protected].

Can this product be used offline?

No

User-friendly privacy information?

Yes

Wysa provides an extensive privacy FAQ.

Links to privacy information

Does this product meet our Minimum Security Standards? Information

Yes

Encryption

Yes

Wysa uses TLS/SSL encryption for data in transit and AES-256 encryption for data at rest.

Strong password

Yes

Security updates

Yes

Manages vulnerabilities

Yes

Privacy policy

Yes

Does the product use AI? Information

Yes

According to Wysa, "Wysa AI Coach is an artificial intelligence-based 'emotionally intelligent' service which responds to the emotions you express and uses evidence-based cognitive-behavioral techniques (CBT), DBT, meditation, breathing, yoga, motivational interviewing and micro-actions to help you build mental resilience skills and feel better."

The AI Coach will always check whether it has understood you correctly before progressing.

Is this AI untrustworthy?

Can't determine

What kind of decisions does the AI make about you or for you?

Wysa decides what kind of help/content to provide you.

Is the company transparent about how the AI works?

Yes

Wysa provides an extensive product FAQ.

Does the user have control over the AI features?

Can't determine


News

London patients to trial AI chatbot for mental health support
Digital Health
Wysa has an extensive library of on-demand resources to help patients manage their mental health, including cognitive behavioural techniques, meditation, breathing exercises, yoga and motivational interviewing. The NHS trial will provide clinical evidence on the digital health app's ability to improve or maintain mental health symptoms while patients wait for traditional talking therapies. It is hoped that patients will not see a decline in their mood while waiting for treatment and in some cases may improve, which will reduce the number of conventional sessions needed, minimise wait time and improve the recovery rate.
Mental health app Wysa raises $5.5M for 'emotionally intelligent' AI
TechCrunch
It’s hard enough to talk about your feelings to a person; Jo Aggarwal, the founder and CEO of Wysa, is hoping you’ll find it easier to confide in a robot. Or, put more specifically, “emotionally intelligent” artificial intelligence.
Wysa: Mental Health Support
Common Sense Media
Parents need to know that Wysa: Mental Health Support offers a range of tools to address stress and wellness. At the core of the app is a chatbot that uses artificial intelligence (AI) to react and respond to what users express while communicating through its texting-style platform. The chatbot will also suggest videos, articles, exercises, mindfulness techniques and/or other targeted tools depending on the feelings or challenges that are expressed throughout the conversation. There are also tools for better sleep, to relax, to manage anxiety, and to nurture positivity. While some tools are available for free, others are unlocked with purchase of premium subscription.
Do Mental Health Chatbots Work?
Healthline
Chatbots are a viable and seemingly effective method for getting mental health services via your device. The most obvious benefit is convenience, or what some people refer to as “reducing barriers to therapy.” Indeed, the AI platforms that were reviewed (Woebot and Wysa) were very convenient. You can reach out to these clever bots and get help at any time with little commitment.
How Wysa App Helps People With Depression And Anxiety Lead A Stress-Free Life
Times of India
Wysa -- an app -- is trying to help ease people’s mental health. It is making the most of AI in the form of a digital chatbot where people can talk about things bothering them for free while also offering an array of tools to calm someone when they’re stressed.
Meet the Women Founders Behind Shine and Wysa, Two Apps Focused on Mental Health and Self-Care
Yahoo!
There are a lot of topics that people tend to avoid talking about, like the disproportionate lack of mental health care, or why the tech industry is such a boys club. So whenever I come across apps that address these two issues at once, you’d better believe that I’m going to do what I can to make sure more people know about them. As part of this year’s International Women’s Day, Google Play spotlighted mental health apps Shine and Wysa, and I’m excited to help you learn more about these positive lifestyle apps, as well as the women who founded them.
Wysa Review
One Mind Psyber Guide
Wysa is an artificially intelligent (AI) chatbot who can coach users to better cope with daily stresses. Wysa is designed to help with a variety of issues, including depression, anxiety, sleep, issues facing the LGBTQ+ community, and more. Users select the areas they want to work on on the home screen. Users can then choose to chat with Wysa or complete other self-care activities.
I Chatted With a Therapy Bot to Ease My Covid Fears. It Was Bizarre.
OneZero
Makers of therapy bots say they can help manage the ‘tsunami’ of latent mental illness emerging with the stress of the pandemic and unemployment. But are they ready?
Privacy Concerns About Emotional Chatbots
Infosec
Despite their efficiency and potential for commercial deployment, emotional chatbots may also pose numerous risks, including ethical issues, information security threats, and privacy concerns. In this article, we will only focus on privacy concerns raised by emotionally intelligent chatbots.
Peace of Mind...Evaluating the Privacy Practices of Mental Health Apps
Consumer Reports
Mental health apps show many of the same patterns we see elsewhere in data-collecting apps. However, the sensitivity of the data they collect means the privacy practices and policies are even more important—especially during a pandemic where people are relying on these services in greater numbers for the first time. Our evaluation shows how there are multiple ways to evaluate how thoughtfully mental health apps handle user data collection, management, and sharing to third parties.
The Digital Standard Case Study: Mental Health Apps
The Digital Standard
Recent events like the global coronavirus pandemic, the resulting economic crisis, and large scale protests related to the Black Lives Matter movement, have spotlighted rising mental health related harms with marginalized and vulnerable populations. Increased anxiety and upheaval causes both physical and psychological symptoms and can be very distressing. Mental health applications collect sensitive information that can create damaging, irreversible impacts on individuals if shared with third parties, including social stigmatization and additional barriers to future opportunities.
Mental Health Apps Aren't All As Private As You May Think
Consumer Reports
Type “mental health” or a condition such as anxiety or depression into an app store search bar, and you can end up scrolling through endless screens of options. As a recent Consumer Reports investigation has found, these apps take widely varied approaches to helping people handle psychological challenges—and they are just as varied in how they handle the privacy of their users.
Wysa
Mental Health America
Wysa is an AI-enabled Life Coach for mental and emotional wellness. Launched in 2017, the service provides early intervention to high-risk groups through 3 methods: an AI chatbot, a library of evidence-based self-help tools, and messaging-based support from human psychologists.

Comments

Got a comment? Let us know.