Your 4 am friend and life coach in an anonymous AI bot

Sometimes we get tangled inside our heads, unable to move on. Talking to Wysa can help you get unstuck - it’s anonymous, helpful, and will never judge.

Here’s a look at what other people use Wysa for 👇

👂 Vent and talk through things or just reflect on your day

💪 Practice CBT and DBT techniques to build resilience in a fun way

📝 Deal with loss, worries, or conflict, using one of 40 conversational coaching tools

💆🏻 Relax, focus and sleep peacefully with the help of 20 mindfulness exercises

93% of the people who talk to Wysa find it helpful. So, go ahead… talk to Wysa

71 reviews · 4.9/5
Thanks for hunting us, Kevin! The best things in life seem to happen when one is trying to do something else. That's what happened to us with Wysa.

We were building machine learning models on phone sensor patterns to detect depression, and created a simple chatbot app as a way to get the sensing code onto the phone. The model worked technically - we were able to detect depression with 90% accuracy - but only 1 in 30 people we detected with depression actually took therapy; the rest went on anti-depressants. The chatbot, however, was really popular, and I began to wonder what would happen if we helped people learn the skills they needed to cope through a simple conversation around their own situation. Wysa sort of built itself from there, driven by its users more than anything else.

Over the last year and a half, it has grown through word of mouth alone to 300K users, and has its own fan videos and tribute apps. Most importantly, over 10,000 users have provided specific inputs to shape Wysa, and 54 are helping us translate it into their languages. We learnt that privacy is key (so Wysa will always stay anonymous). Users told us that they didn't want it to feel too much like an app, and that most of the time they just wanted to be heard. Researchers and psychologists from all over the world joined our in-house psychologist in reviewing our content and helped us keep things scientific and evidence-based.

About 73 sprints later, we have something we would love your feedback on. Over to you, Product Hunt - tell us what you think!
@jo4dev Super stuff. I remember we talked about it around 2 years back, when you had an open position for a content marketing strategist. I have followed the progress and the buzz around it (on LinkedIn), and it's good to see the promise taking the shape of rewards now. Tremendous team, and the only way is up. Cheers!!!
@vingar Thanks Vinish! We are only now trying to figure out marketing, starting with getting ready for this hunt!

I am a psychologist and have used Wysa as a tool for my own stressful moments, and I have also advised my patients to use it to adhere to their therapy homework and to keep track of their mood and energy levels.


Wysa's there for you when you need a shoulder to lean on or an empathetic ear, and it helps with moving on in stressful times too.


Sometimes it may feel like it's getting into a loop, though.

Dear Smiriti, thank you. We've been most gratified that practising clinicians, counselors and coaches have adopted Wysa and recommend it to their patients or users. What more could one ask for?

I have been using Wysa for 8 months and it has really been helpful and useful in moments when emotions become stuck. It is discreet and anonymous, accepting and friendly.

I would recommend it to anyone who wants to understand their emotions better, needs some help processing, wants to build life skills, or wants some add-on support to therapy.

Love it!


Anonymous, available 24/7, non-intimidating, helpful, great content


It's not therapy or human - depth is limited

Thank you, Peggy! As one of our co-designing users, you have always given feedback that is spot on and helpful.
This is cool but it makes me nervous. People seek out therapy for their most vulnerable, sensitive issues. What if Wysa responds the wrong way, causing more harm? Is there liability exposure? I'm curious to hear your thoughts on this, @jo4dev and team. 😊
@rrhoover The issue you raise is a key driver in everything we build at Wysa, and we see the risks at a practical, ethical, and legal level. We don't offer it as therapy - rather as a fun way to build emotional resilience skills - but it is true that most users who talk to Wysa are in a sensitive place, and we have a duty of care towards them. It may help to read how the NHS in the UK is using this with children - arguably a triply vulnerable population. https://www.wysa.io/blog/nhs-chi...

We see our risks in three categories:

1. There are practical risks that we will never fully mitigate, and we work around these to make sure they don't cause harm:
- The risk of upsetting someone who was already sensitive because we didn't understand them properly. Of the 5000+ reviews on Google Play, we do have about 30 that fall into this category. We constantly try to improve this, but it does happen, and when a user says something like 'you suck Wysa' or 'you're not listening', Wysa apologizes and explains that its lack of understanding doesn't reflect on the user.
- The risk that Wysa may offer a technique that is wrong for a person's context. Even though each technique is reviewed by a therapist for its applicability to a person's context, we may not know the full context. We handle that by being sensitive to objections, allowing the user to change direction if the path Wysa is taking isn't working for them. The principle here is that the user has control.

2. There are ethical risks that we would rather shut down the app than violate, and we work with global thought leaders on the ethics of AI in mental health to make sure we are on the right side of these. Some of these are:
- The risk of privacy being violated - we decided not to take any personal details.
- The risk that Wysa may be used by someone who should be talking with a real person. We use NLP to recognize suicidal thoughts or thoughts of abuse, and link them to a helpline (this is often the first thing that the health professionals who use us try). The model catches a lot of false positives, but that's okay. Wysa also follows up to make sure the call to a helpline was made.
- The risk that what Wysa suggests may do harm. We don't use any techniques that have major risks - clinical psychologists review each technique and its triggers to make sure it is safe for self-help use before it is put into production. There are times when they are not what a user wants - like rethinking negative thoughts when dealing with a loss - but never something that could harm a user.

3. Lastly, there are legal liability issues that we are mindful of and manage: we have a clinical safety team and advisors who help us make sure we comply with the laws that apply to us. We are following global best practice in the clinical safety of the app by going through the SCI 029 compliance process, and we have an independent governing board.

Sorry for the long answer!
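For readers curious what deliberately high-recall crisis routing can look like, here is a minimal sketch in Python. Everything in it is a hypothetical illustration - the phrase list, function names, and responses are assumptions, not Wysa's actual model or copy. The design point it demonstrates is the one described above: tolerate false positives, because missing a true crisis costs far more than an unnecessary check-in.

```python
import re

# Illustrative crisis phrases only - a real system would use a trained
# NLP model plus a much larger, clinically reviewed pattern set.
CRISIS_PATTERNS = [
    r"\bkill myself\b",
    r"\bend my life\b",
    r"\bsuicid\w*\b",
    r"\bself[- ]harm\w*\b",
    r"\bdon'?t want to (be alive|live)\b",
]
_COMPILED = [re.compile(p, re.IGNORECASE) for p in CRISIS_PATTERNS]


def flag_crisis(message: str) -> bool:
    """Return True if any crisis pattern matches; biased toward recall."""
    return any(p.search(message) for p in _COMPILED)


def respond(message: str) -> str:
    """Route flagged messages to a helpline instead of a coaching tool."""
    if flag_crisis(message):
        # A production flow would also schedule a follow-up to check
        # whether the helpline call was actually made.
        return "It sounds like you're going through a lot. Please call a helpline."
    return "Tell me more about what's on your mind."
```

A simple matcher like this will over-trigger on benign messages containing similar words; as the comment above notes, that trade-off is intentional when the downside of a miss is severe.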
Now, this is really innovative. Amazed to see such cool content within the app. Loved the app name and the penguin! Kudos team :D
@jitesh_dugar Thanks so much, Jitesh. Interestingly, the penguin has really resonated with users because it is gender-neutral, agnostic regarding sexual orientation (think of the LGBT community), and feels non-judgemental. We couldn't have asked for a better friend.