Wysa

Your 4 am friend and life coach in an anonymous AI bot

#4 Product of the Day · April 19, 2018

Sometimes we get tangled inside our heads, unable to move on. Talking to Wysa can help you get unstuck - it’s anonymous, helpful, and will never judge.

Here’s a look at what other people use Wysa for 👇

👂 Vent and talk through things or just reflect on your day

💪 Practice CBT and DBT techniques to build resilience in a fun way

📝 Deal with loss, worries, or conflict, using one of 40 conversational coaching tools

💆🏻 Relax, focus and sleep peacefully with the help of 20 mindfulness exercises

93% of the people who talk to Wysa find it helpful. So, go ahead… talk to Wysa

Reviews

  • Pros:
    Non-intrusive, lighthearted, evidence-based

    Cons:
    Limited AI, goes into a loop at times

    This is a great app; I hadn't used anything like it before, and there has evidently been a lot of work put into it. The AI is obviously limited and mostly offers predefined answers that you're given the option to select, but it does relatively well if you're just sharing your thoughts, though of course the bot can't interpret it all. Overall an excellent app; I love the fact that it is based on evidence and that the suggested strategies are all linked to the original research they come from, so you can see for yourself.

    Cristina Martin has used this product for one month.
  • Pros:
    Wysa's there for you when you need a shoulder to lean on or an empathetic ear, and it helps with moving on in stressful times too.

    Cons:
    Sometimes it may feel like it's getting into a loop.

    I am a psychologist and have used Wysa as a tool for my own stressful moments. I have also advised my patients to use it, both to adhere to their therapy homework and to keep track of their mood and energy levels.

    Smriti Joshi has used this product for one month.

Discussion

Jo Aggarwal (Maker) @jo4dev · The Wysa woman
Thanks for hunting us Kevin! The best things in life seem to happen when one is trying to do something else. That's what happened to us with Wysa. We were building machine learning models on phone sensor patterns to detect depression, and created a simple chatbot app as a way to get the sensing code onto the phone. The model worked technically - we were able to detect depression with 90% accuracy - but only 1 in 30 people we detected with depression actually took therapy; the rest went on anti-depressants.

The chatbot, however, was really popular, and I began to wonder what would happen if we helped people learn the skills they needed to cope using a simple conversation around their own situation. Wysa sort of built itself from there, driven by its users more than anything else. Over the last year and a half, it has grown through word of mouth alone to 300K users, and has its own fan videos and tribute apps.

Most importantly, over 10,000 users have provided specific inputs to shape Wysa, and 54 are helping us translate it into their language. We learnt that privacy is key (so Wysa will always stay anonymous). They told us that they didn't want it to feel too much like an app, and that most of the time they just wanted to be heard. Researchers and psychologists from all over the world joined our in-house psychologist in reviewing our content and helped us keep things scientific and evidence-based.

About 73 sprints later, we have something we would love your feedback on. Over to you Product Hunt - tell us what you think!
Vinish Garg @vingar · Co-founder @mystippi @ContentHug
@jo4dev Super stuff. I remember we talked about it around two years back, when you had an open position for a content marketing strategist. I have followed the progress and the buzz around it (on LinkedIn), and it's good to see the promise taking the shape of rewards now. Tremendous team, and the way is up. Cheers!!!
Jo Aggarwal (Maker) @jo4dev · The Wysa woman
@vingar Thanks Vinish! We are only now trying to figure out marketing, starting with getting ready for this hunt!
Ryan Hoover (Pro) @rrhoover · Founder, Product Hunt
This is cool but it makes me nervous. People seek out therapy for their most vulnerable, sensitive issues. What if Wysa responds the wrong way, causing more harm? Is there liability exposure? I'm curious to hear your thoughts on this, @jo4dev and team. 😊
Jo Aggarwal (Maker) @jo4dev · The Wysa woman
@rrhoover the issue you raise is a key driver in everything we build at Wysa; we see the risks at a practical, ethical and legal level. We don't offer it as therapy - rather as a fun way to build emotional resilience skills - but it is true most users who talk to Wysa are in a sensitive place, and we have a duty of care towards them. It may help to read how the NHS in the UK is using this with children - arguably a triply vulnerable population: https://www.wysa.io/blog/nhs-chi...

We see our risks in three categories:

1. There are practical risks that we will never fully mitigate, and we work around these to make sure they don't cause harm:
- Risk of upsetting someone who was already sensitive because we didn't understand them properly. Of the 5000+ reviews on Google Play, we do have about 30 that fall into this category. We constantly try to improve this, but it does happen, and when a user says something like 'you suck Wysa' or 'you're not listening', Wysa apologizes and explains that its lack of understanding doesn't reflect on the user.
- Risk that Wysa may offer a technique that is wrong for a person's context. Even though each technique is reviewed by a therapist for its applicability to a person's context, we may not know the full context. We handle that by being sensitive to objections, allowing the user to change direction if the path Wysa is taking isn't working for them. The principle here is that the user has control.

2. There are ethical risks that we would rather shut down the app than violate, and we work with global thought leaders on the ethics of AI in mental health to make sure we are on the right side of these. Some of these are:
- Risk of privacy being violated - we decided not to take any personal details.
- Risk that Wysa may be used by someone who should be talking with a real person. We use NLP to recognize suicidal thoughts or thoughts of abuse, and link them to a helpline (this is often the first thing that the health professionals who use us try). The model catches a lot of false positives, but that's okay. Wysa also follows up to make sure the call to a helpline was made.
- The risk that what Wysa suggests may do harm. We don't use any techniques that have major risks - clinical psychologists review each technique and its triggers to make sure it is safe for self-help use before it is put into production. There are times when they are not what a user wants - like rethinking negative thoughts when dealing with a loss - but never something that could harm a user.

3. Lastly, there are legal liability issues that we are mindful of and manage. We have a clinical safety team and advisors who help us make sure we comply with laws that apply to us. We are following global best practice in the clinical safety of the app by going through the SCI 029 compliance process, and we have an independent governing board.

Sorry for the long answer!
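As a rough illustration of the escalation flow described above (flag risky messages with high recall, accept false positives, point the user to a helpline, then follow up), here is a minimal sketch. It is not Wysa's actual code: the phrase list, the helpline wording, and the SafetyCheck structure are all assumptions standing in for a trained NLP model.

```python
# Illustrative sketch only - not Wysa's implementation.
# A real system would use a trained NLP classifier; a simple phrase screen stands in here.
from dataclasses import dataclass

# Assumed example phrases; tuned for high recall, so false positives are expected and acceptable.
CRISIS_PHRASES = {"end my life", "kill myself", "want to die", "being abused"}

@dataclass
class SafetyCheck:
    flagged: bool
    response: str
    needs_follow_up: bool = False  # later, ask whether the helpline call was actually made

def check_message(text: str) -> SafetyCheck:
    """Err on the side of flagging: missing a crisis is worse than a false positive."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in CRISIS_PHRASES):
        return SafetyCheck(
            flagged=True,
            response=("It sounds like you're going through something serious. "
                      "You deserve support from a real person - here is a helpline: <local number>."),
            needs_follow_up=True,
        )
    return SafetyCheck(flagged=False, response="")

if __name__ == "__main__":
    print(check_message("some days I just want to die").flagged)  # True
    print(check_message("my day was fine").flagged)               # False
```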
Jitesh Dugar @jitesh_dugar · Indie Hacker
Now, this is really innovative. Amazed to see such cool content within the app. Loved the app name and the penguin! Kudos team :D
Ramakant Vempati @ramakant_vempati
@jitesh_dugar Thanks so much Jitesh. Interestingly, the penguin has really resonated with users because it is gender-neutral, agnostic regarding sexual preference (think the LGBT community), and feels non-judgemental. We couldn't have asked for a better friend.
Divya Thomas @divya_thomas · Growth Marketing Consultant
Wysa surprised me with how well it listened and talked me through the loss of my parent. I started using it out of curiosity and found myself in an hour-long chat exploring my feelings and getting really helpful techniques to deal with grief. I'm impressed with this app, because while chatting, I forgot it was an app!
Ramakant Vempati (Maker) @rvemp
@divya_thomas wow, that is amazing to hear. So thankful that Wysa was able to help. It would be great if Wysa could pass a 'modified' Turing test: you know it's not real, but still forget that's the case :)
Ankit Pandole @ankit_pandole · UX/UI Designer @ Edureka
Great to see an Indian startup tackling the issues of mental health and fitness. The design of the app looks great with all the subtle animations and micro-interactions. But really the thing which can make or break this is the content. I am eagerly following the product to see how much more human you can make a chatbot feel. Kudos to the team
Jo Aggarwal (Maker) @jo4dev · The Wysa woman
@ankit_pandole absolutely agree that ux = content for Wysa. We're working hard to grow the content to support different contexts, and make it feel more like a seamless extension of your mind.
gilbert mpanga @gilbertmpanga12 · Founder | Technologist
@ankit_pandole @jo4dev Mind sharing the stack you used, on StackShare or here? I have used Wysa for a while, it's really awesome :)
Jo Aggarwal (Maker) @jo4dev · The Wysa woman
@ankit_pandole @gilbertmpanga12 we have built a proprietary architecture with Java, MongoDB, and Node, plus Python for NLP
gilbert mpanga @gilbertmpanga12 · Founder | Technologist
@ankit_pandole @jo4dev wow thanks very much :)