Agshin Rajabov

EmotionSense Pro
We chose TensorFlow.js for facial expression detection because it runs entirely in the browser, eliminating the need for server-side processing. That was critical to keeping EmotionSense's core promise: full local processing and zero data exposure. Other libraries we evaluated either fell short on in-browser performance or required server dependencies that conflicted with our privacy-first approach.
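To give a sense of what fully client-side inference looks like, here is a rough sketch of per-frame expression classification with TensorFlow.js; the model path, the 48×48 grayscale input shape, and the emotion label set are illustrative assumptions rather than our exact pipeline.

```typescript
// Sketch: in-browser facial expression inference with TensorFlow.js.
// Model path, input resolution, and labels below are assumptions for illustration.
import * as tf from '@tensorflow/tfjs';

const EMOTIONS = ['angry', 'disgust', 'fear', 'happy', 'sad', 'surprise', 'neutral'];

async function classifyFrame(video: HTMLVideoElement, model: tf.LayersModel): Promise<string> {
  // Everything happens inside tf.tidy, so intermediate tensors are disposed
  // and no pixel data ever leaves the browser.
  const scores = tf.tidy(() => {
    const input = tf.browser.fromPixels(video)  // RGB frame from the <video> element
      .resizeBilinear([48, 48])                 // assumed model input resolution
      .mean(2)                                  // collapse RGB to grayscale
      .expandDims(2)                            // restore channel dim -> [48, 48, 1]
      .expandDims(0)                            // add batch dim -> [1, 48, 48, 1]
      .div(255);                                // normalize to [0, 1]
    return model.predict(input) as tf.Tensor;
  });
  const probs = await scores.data();
  scores.dispose();
  // Return the label with the highest predicted probability.
  const best = probs.indexOf(Math.max(...Array.from(probs)));
  return EMOTIONS[best];
}

// Usage: load the model once, then run per frame entirely client-side.
// const model = await tf.loadLayersModel('/models/expression/model.json'); // assumed local path
// const label = await classifyFrame(videoEl, model);
```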
We used Figma for all our UI/UX design because it’s incredibly intuitive for fast iterations and cross-team collaboration. Unlike other design tools, Figma made it easy for both designers and non-designers on our team to jump in and leave feedback in real time. Its browser-based platform also aligned perfectly with our privacy-first, no-install philosophy.
For speech sentiment analysis, OpenAI's models offered the strongest combination of accuracy and flexibility among the options we tried. We evaluated a few other NLP APIs, but OpenAI stood out for detecting nuanced emotional tone in conversation while keeping integration seamless. It helped us build a smarter emotional layer into EmotionSense without overengineering.
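For a sense of how that kind of integration might look, here is a minimal sketch that asks OpenAI's chat completions endpoint to label the tone of a transcript; the model name, prompt wording, and label set are illustrative assumptions, not our production configuration.

```typescript
// Sketch: classifying the emotional tone of a speech transcript via the OpenAI API.
// Model name, prompt, and label set are assumptions for illustration.
import OpenAI from 'openai';

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function analyzeTone(transcript: string): Promise<string> {
  const response = await client.chat.completions.create({
    model: 'gpt-4o-mini', // assumed model choice
    messages: [
      {
        role: 'system',
        content:
          'Classify the dominant emotional tone of the following utterance. ' +
          'Answer with a single word: positive, negative, neutral, or mixed.',
      },
      { role: 'user', content: transcript },
    ],
    temperature: 0,
  });
  return response.choices[0].message.content?.trim() ?? 'neutral';
}

// Usage: feed in the transcript produced by the speech pipeline.
// const tone = await analyzeTone('I am really excited about how the demo went!');
```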