Snap lets you run instant usability tests with AI personas. Drop in a prototype or website, and your custom AI personas will test them like real users. Get results in minutes with transcripts, recordings, and actionable recommendations.
This is intriguing. We’re running unmoderated user testing on a large website next week using usertesting.com.
It would be interesting to run the exact same tests on Snap and compare results. Would that be relatively cheap to do with 5 users and 2 tasks/journeys?
Congrats to @erictli and the whole Snap team on the launch!
This is a really interesting approach — the ability to run instant usability tests with AI personas could be a game changer for validating functionality, user flows, and overall experience before launch.
Curious to know — can we define our ICPs so the AI personas align with our target audience profiles? That would make the insights even more precise and actionable.
Very grateful that the design space is getting such a valuable addition to the ecosystem. Hopefully this saves companies thousands of wrong turns and gets value to market faster.
Replies
Snap
Hey everyone, I’m Eric, co-founder of @Snap. Thanks to @garrytan for hunting 🚀
Before this, I was a designer at companies like Uber and Bread, where I saw firsthand how valuable usability testing is in improving product quality. I also saw how rarely teams have time to do it.
Recruiting participants, scheduling sessions, conducting interviews, and analyzing recordings can take weeks. As a result, most teams only research a small percentage of features (if at all).
Snap lets you run instant usability tests with AI personas built from your real users.
- Build a persona in seconds by uploading interview transcripts or describing your target audience
- Drop in a prototype link, upload images, paste a website URL, or use our Figma plugin
- Your AI personas will test the design like real users, thinking out loud as they decide what to do next
- Get results in minutes with transcripts, recordings, and actionable recommendations
We're seeing evidence in academic research that AI users can simulate real user behavior with surprising accuracy. We believe this will help teams make user-centered decisions in the 95% of cases where research wouldn’t have happened before. If AI is making it faster than ever to build, it should also make it faster to learn.
We'd love your thoughts, feedback, or ideas for how you'd like to see us evolve Snap!
P.S. If you prefer to do research with real users, we also have a product for that.
@garrytan @erictli Can I tweak the AI’s suggestions if it misreads my tone?
Snap
@masump - great question! You can customize the task instructions and the screen-by-screen questions you ask. You can also provide additional context to the AI (e.g., tell it what to focus on, what is a placeholder, etc.).
You can also run the test with multiple variations of each persona. Just like with human participants, different AI participants may interpret your designs and tone differently.
MacPaw
This could really save time. Just an idea for you: a question template for user interviews would be great to attach to your trial, something that helps uncover real user habits and shape an AI persona that’s genuinely useful in testing. Not all user interviews are equal, unfortunately.
Snap
@mskatecheng great idea!
Our default templates focus on basic usability interview questions (e.g., What are your impressions of this screen? What would you do next, and why? Is there anything missing or confusing?). And our personas can be generated from files you upload or from a description you provide.
We have a few things on the roadmap that might help with this. Will be sure to move them up.
- Using AI to analyze prototypes before the test is created, so we can suggest customized questions for your specific use case.
- Building a persona-setup user interview that you can complete (or send to a real user) to customize your persona. We've seen in academic research that long-form interviews are one of the most accurate ways to create personas.
Snap
@caruben - interesting idea! Feel free to shoot me an email at eric@getversive.com, happy to help facilitate this.
Sellkit
Well this one sounds really exciting! And brilliant ofc. AI-driven usability testing could save designers days of manual feedback loops.
Snap
@roozbehfirouz thanks! Our goal is to make it easier for designers and builders to incorporate user-centered feedback into their development process. User tests become a lot more useful when they take minutes instead of weeks.
Agnes AI
Finally, someone’s making usability testing actually doable for busy teams! Building AI personas from real user transcripts is such a clever shortcut.... I am wondering if you’ve seen any surprising insights vs. manual tests? Btw Congrats team!!
Snap
Thanks, @cruise_chen! Yes, exactly - most teams can only do usability tests on the most critical projects because they don't have weeks to run and analyze them.
Our goal is to minimize surprises and make these as similar to manual tests as possible. One surprising similarity is the amount of variation you get from running the same test multiple times with the same AI persona. Just like with human users, you can test with the same type of target user, and they may have different recommendations or take different actions. So, we added the ability to run multi-user tests where we give you a consolidated report with the takeaways across participants.
Kill Ping
Snap
@agzee thank you! Yes, that's a core part of the product for us!
You can describe the ICP, and we'll generate a more structured, fully-featured persona from that description. Or you can upload interview transcripts, presentations, or documents you have, and we'll use those to generate a persona.
You can create personas that represent multiple ICPs or sub-segments within your ICP. You can even build personas to mirror specific customers.
Deepwander
Very nice! I'll be using this.
Snap
Thanks, @vhpoet! Appreciate you ❤️
NUMI
Snap
@harrison_telyan thank you!! This means so much coming from a founder like you.
UI Bakery
Love how it helps teams validate designs faster without recruiting testers. Great launch!
Snap
Thanks @vladimir_lugovsky!
This actually works, guys! Feels like Playwright MCP in action, but extracting relevant data from every step. Great job, team! 👏
Already got some great feedback. It would be amazing if I could click 'Continue' on the task I assigned. It stopped before completing the full goal, probably because my mandatory analysis step takes a bit of time and it hit a timeout on your end.
Snap
@miguel_casteleiro thanks for giving it a spin! Yes, on website tests, we have a timeout on the free plan. We're still refining, but will likely increase these limits soon!
Love the idea of being able to continue a test. Will add that to the roadmap!
@erictli thanks mate. Now if I could call this from Cursor through MCP, that would be great as well! (just an idea here)
(Also: if you could use the AI to improve the questions the system should answer based on the goal given by the user. Right now you have 3 standard questions (bottom of the page), but most users (like me) struggle to create new ones.)
Snap
@miguel_casteleiro the second part is on our roadmap. We're 100% going to use AI to customize the creation of questions based on what you're testing.
Will need to think a bit more about the Cursor MCP. Could be really interesting to conduct these tests on local code (like UX linting), but need to figure out the best way to make that possible.