Zenia

Make at-home exercising safe and fun

23 followers

Zenia is a mobile fitness app that uses machine learning, computer vision, and motion tracking to help improve your yoga poses. Zenia aims to recreate the intimate experience of practicing yoga with an instructor.
This is the 2nd launch from Zenia.

Zenia for Android

World’s first AI yoga app. Now on Android
The most advanced performance-tracking fitness app for Android is finally here!
Zenia allows you to practice yoga safely whenever you feel like it. Through the front camera, the app recognizes movements and gives real-time feedback on the quality of practice.
Zenia for Android gallery images
Launch tags: Android, iOS, Health & Fitness
Launch Team


Aleksei Kurov
Hey PH 👋 I’m Alexey, CEO and one of the creators behind Zenia (https://www.producthunt.com/post...), and I truly believe that yoga can change people’s lives. Zenia was named Health and Fitness App of the Year on Product Hunt in 2019 🧘 Since we launched the iOS app 10 months ago, we’ve had lots of requests for an Android version... and here it is! So, for all you Android users out there – time to get on the mat and do yoga 😀

Zenia is the first motion-tracking yoga app on the Android platform. We’ve put a tremendous amount of effort and love into this product to help Android users really get into yoga. Get Zenia on Google Play (https://get.zenia.app/0BrF/76e1006e) and try these awesome new features:

🆕 Teacher practices mode – a new way for teachers and students to communicate in real time. The app analyzes movements in real time and collects the results on one board, helping the instructor interact with clients more closely. This mode is further divided into two parts:

🔶 For yoga instructors. We've developed a whole new level of interaction with students. With Teacher practices mode, instructors can interact with a larger audience simultaneously while still maintaining efficiency and personalization, without losing sight of students' mistakes. It’s even easier to add new exercises in yoga, fitness, Pilates, ballet, and other activities. Zenia classes bring more interactive workouts to the marketplace, while the instructor receives payment for each student taking their class and gives feedback via a micro CRM.

🔶 For users. Users receive feedback and guidance in the instructor’s voice. No additional hardware or Zoom-like software is needed. You can send postures and results to the instructor and receive feedback directly from them, making your practice at home as safe and effective as possible.
And all this goes hand in hand with:

🕺 16 key points of the human body detected in real time, even against a cluttered background
👨‍💻 Unique back-curvature control technology
🧘 Certified yoga teachers
🚣 The famous Boat challenge

And yes, all of this is completely confidential! Your practice is private: although the front camera is used to recognize movements, the app doesn’t store data or record video.

Now, I’d love to hear what you think. Send a quick comment to let me know your thoughts on the updates, or anything else on your mind.

P.S. For the entirety of October, we’re offering registered Android users the first month free, with full access to all courses and sequences, with the promo code PRODUCTHUNT ✌️
Yaroslav Yar
What updates are you planning in the near future?
Karina Bayramova
@new_user_2884c77950 Hi Yaroslav! Great question – we’re always excited to tell people about our plans :) We’re going to roll out a new practice mode and make Zenia available on large-screen devices. We’re also constantly improving the technology itself: Zenia will recognize more joints and spine curvature, so the feedback after each practice will become even more accurate.
Dmitry Z
Hi PH, I’m Dmitry, CTO and one of the founders. Let me explain a little about how we create our practices and how the content part of the app works from a dev point of view. All practices are designed by certified yoga teachers. When we decide to add more practices, we ask the teachers, and they send us a "script" of the training. Our internal team then converts it into a "machine-readable format". This data format allows us to track the correctness of asanas and give real-time feedback to users. We call it a "domain-specific language", but for yoga and fitness content. We deliver our content separately from the application version, so we can update practices even for users who don’t want to update the app, and react quickly to user feedback.
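Zenia's actual format isn't shown here, but the idea Dmitry describes – a teacher's script converted into data the app can check against – could look roughly like this minimal sketch. All names, poses, and tolerances below are hypothetical illustrations, not Zenia's real DSL:

```python
from dataclasses import dataclass

# Hypothetical machine-readable pose "script": a pose is a set of
# joint-angle constraints with tolerances (illustrative, not Zenia's format).
@dataclass
class AngleConstraint:
    joint: str            # joint the angle is measured at
    target_deg: float     # ideal angle from the teacher's script
    tolerance_deg: float  # how far off still counts as correct

@dataclass
class Pose:
    name: str
    hold_seconds: float
    constraints: list

def check_pose(pose, measured_angles):
    """Return per-joint feedback: True if the measured angle is
    within tolerance of the scripted target."""
    feedback = {}
    for c in pose.constraints:
        angle = measured_angles.get(c.joint)
        feedback[c.joint] = (
            angle is not None
            and abs(angle - c.target_deg) <= c.tolerance_deg
        )
    return feedback

warrior_ii = Pose(
    name="Warrior II",
    hold_seconds=30.0,
    constraints=[
        AngleConstraint("front_knee", 90.0, 15.0),
        AngleConstraint("back_leg", 180.0, 10.0),
    ],
)

# Front knee is within tolerance; the back leg is too bent.
print(check_pose(warrior_ii, {"front_knee": 95.0, "back_leg": 150.0}))
# {'front_knee': True, 'back_leg': False}
```

Keeping the pose definitions as plain data like this is what makes the "content delivered separately from the app" approach work: new scripts can ship without an app update.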
Amira Mansour
@dmitry_z1 That's quite cool! I was wondering how you train your AI and keep it up to date in terms of what users need? I've developed a tool for training AIs and would love to discuss it with you. And yes, GREAT product – on my way to use it 🙈
Denis Sokolov
Hi PH, I'm Denis, Head of R&D 🔬 at Zenia, and I'd like to share some of the technical aspects we encountered while developing the Android version of Zenia.

From a tech point of view, our app has two large parts: a body-position recognition system and a virtual coach. The recognition system finds 16 joints on the human body 🕺 in real time. It’s based on a convolutional neural network with the Stacked Hourglass (https://arxiv.org/abs/1603.06937) architecture, which we've optimized for mobile devices. We use PyTorch to train the models on our custom labeled dataset.

For model inference on iOS we use Core ML, but for Android the choice wasn’t so easy. 🧑‍🔬 Android runs on a huge variety of hardware, which I guess is one of the reasons there are so many inference frameworks out there: MACE, ncnn, SNPE, PyTorch Mobile / Caffe2, TFLite, and many others. There are also a lot of ways to handle the camera, from the raw Camera API to ML Kit. We wanted a solution that would be flexible, fast to implement, and as performant as possible. 📹

That’s why we picked MediaPipe. First of all, it now looks like the “official” way to run complex model graphs on Android. It has a lot of awesome utilities for working with video streams and moving data between GPU and CPU, and it balances performance against abstraction very well. It works at the native level and is open source. 📈

To reach our performance goals we optimized our CNN even further, and torchprof helped immensely here. Our model conversion pipeline is PyTorch -> onnx2keras -> TFLite. We profile the original model with torchprof and measure the converted model with the TFLite benchmark on device. This is only a small part of what we encountered in the process, so we’ll publish a blog post about it.

Now, for the virtual coach part.
Conceptually, it resembles a game engine because it handles all of the interaction with the user: it checks the correctness of postures, controls the flow of the practice, plays sounds and videos, and gives audio feedback. These functions need to be shared between platforms. On iOS the engine was implemented in Swift, so it’s pretty hard, if not impossible, to reuse on Android. We decided to use Kotlin as our main language for Android, and luckily, we arrived at a time when Kotlin Multiplatform had matured enough to be used in production. We use it for all shared logic, and it helps us iterate and improve the experience even faster.

I hope that was helpful to you guys. 🧘 I'd be happy to answer your questions – and meet me at the Boat challenge :)
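The game-engine-style flow Denis describes – check the posture every frame, advance the practice, emit feedback – could be sketched like this. The structure and all names here are my assumption for illustration, not Zenia's actual engine (which is Kotlin Multiplatform); Python is used just to keep the sketch short:

```python
# Hypothetical virtual-coach practice loop: step through scripted poses,
# check correctness on every frame of recognition output, and queue
# feedback cues (these would be audio in a real app).
class VirtualCoach:
    def __init__(self, practice):
        self.practice = practice  # list of (pose_name, frames_to_hold)
        self.index = 0            # which pose we are on
        self.held = 0             # consecutive correct frames for this pose
        self.cues = []            # feedback emitted so far

    def on_frame(self, pose_is_correct):
        """Feed one frame of recognition output into the flow."""
        if self.index >= len(self.practice):
            return "done"
        pose, need = self.practice[self.index]
        if pose_is_correct:
            self.held += 1
            if self.held >= need:
                self.cues.append(f"Great, {pose} complete!")
                self.index += 1
                self.held = 0
        else:
            self.held = 0  # the hold must be continuous
            self.cues.append(f"Adjust your {pose}")
        return "done" if self.index >= len(self.practice) else "in_progress"

coach = VirtualCoach([("mountain pose", 2), ("boat pose", 1)])
for correct in (True, False, True, True, True):
    state = coach.on_frame(correct)
print(state)       # "done"
print(coach.cues)  # one correction cue, then two completion cues
```

Modeling the flow as a small state machine like this is also what makes it natural to share between platforms: it has no UI or platform dependencies, only pose-correctness in and cues out.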
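On the recognition side, the thread doesn't show how the 16 detected joints turn into posture feedback, but a common step in pose-estimation apps is computing the angle at a joint from three keypoints. A minimal sketch, with the coordinate format and joint triples assumed rather than taken from Zenia's API:

```python
import math

def joint_angle(a, b, c):
    """Angle at keypoint b (in degrees) formed by points a-b-c,
    e.g. hip-knee-ankle for the knee angle. Points are (x, y) tuples."""
    ang1 = math.atan2(a[1] - b[1], a[0] - b[0])
    ang2 = math.atan2(c[1] - b[1], c[0] - b[0])
    deg = math.degrees(ang1 - ang2) % 360.0
    return deg if deg <= 180.0 else 360.0 - deg

# A straight leg: hip, knee, ankle roughly collinear -> ~180 degrees.
print(joint_angle((0.0, 0.0), (0.0, 1.0), (0.0, 2.0)))  # 180.0
# A right-angle bend at the knee -> ~90 degrees.
print(joint_angle((0.0, 0.0), (0.0, 1.0), (1.0, 1.0)))  # 90.0
```

Angles like these are what a pose script can then compare against target values with tolerances, which is how the per-joint real-time feedback Denis mentions could be produced.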
Виталий Суворов
Cool app, good luck guys!
Karina Bayramova
@new_user_288a50b57b Thanks for your support 🙏
Matt
This technology seems incredibly powerful, congratulations! Can you use it for other sports, like workouts or golf?
Karina Bayramova
@youngallien Hey Matt! Thanks for the question. We'll introduce more types of workouts in the future. We’re going to start with fitness, pilates, and ballet.
Olga Samoilova
Congrats, team!