Usability testing, with product experts.

Would you recommend this product?
No reviews yet
I've done some Opentests and the results have been awesome. Very in-depth videos with actionable insights.
@_jacksmith thank you for all the continued feedback. It's been immensely helpful.
@_jacksmith agree with Joe. You've been incredibly helpful. I can't wait to be able to repay the favor.
@_jacksmith Thanks Jack! Side note: If anyone is looking to apply to become a product expert, here is the link:
As someone who has used "Mechanical Turk"-powered usability testing services before, I can safely say that they are very practical but mostly useless: 90% of random people just tell you what you probably want to hear, and only 10% point out useful stuff. Opentest seems like it can eliminate this issue while remaining practical. I see you advertise the Facebook engineer, which is logical; it would be great to see the other experts and maybe choose a specific expert for the test. I'm personally interested in getting feedback from a designer/usability expert.
@kaansoral thanks for the feedback! Yes, the ability to pick a specific type of tester (designer, engineer, product manager) is definitely something we're working on. Let me know how I can help you run your first usability test with Opentest :-)
Hey @kaansoral! Great feedback. That is a perfect example of one of the multiple pain points we are trying to address: lack of trust that you'll get value, and too much time sifting through the nonsensical stuff. We apologize for the lack of detail about the Expert network. We are in the process of creating a standalone page that describes who the Experts are in more detail, with more examples. To make sure you get what you need if you sign up, there is currently a section in the sign-up flow where you can specify additional details for your test. We review every test before we assign it and will make sure you get the type of Expert you feel you'd get the most value from! We have some top-notch designers in the queue right now. Hand-picking from a list requires significant volume on both sides of the network. We are working our way towards that and will certainly make it available once we lock in that usability-test-to-Expert equilibrium!
@_shahedk @joethomas_x thank you both for the replies, sounds great! Can't wait to use Opentest once I have some free time. From experience, receiving feedback is probably much harder than giving it :D
@kaansoral @_shahedk Very, very true. But as I'm sure you've learned, openness and humility are cornerstones of success. Looking forward to matching you with an expert.
@joethomas_x love the idea. Just submitted an expert application.
This is great! It's more of a consultant/mentor response than a potential-customer one, which I think is a great distinction worth calling out.
@afhill I was going to say the same thing. This isn't "Usability Testing" in the traditional sense and I think it's a bit of a misnomer. Nothing beats putting your product in front of real potential users and seeing how they use it.
It's a great point @afhill. We have toyed with positioning our offering in that way, but the ability to explain what we do within an elevator pitch became significantly more muddled when we started using the terms micro-consulting and micro-mentoring. @callmeed At our core as a company, and at the core of our Expert offering, a usability test is fundamentally our approach to the feedback: a video of Experts as they walk through a specified flow or experience. All the feedback the Experts give is aimed at improving the usability of the site. They just happen to be wearing multiple hats in this scenario: as a plain ole user, and as someone who works in the space who can not only point out the problems but offer a solution as well :) We don't disagree about seeing actual users interact with the product.
Hey Product Hunters! Jumping in to give you a little more color. Everyone understands the incredible value usability testing delivers. But most who have done it probably also feel the pain of doing it. It currently takes too much time to set up, gather users, run interviews/usability tests, and summarize the most valuable and actionable feedback. We here at Opentest believe it is time for usability testing to go through its next phase of evolution. Our first step is launching our Expert product today, where companies can request to have their User Onboarding, User Interface, Mobile Responsiveness, Product Teardown, etc. reviewed by our network of product Experts. These Experts, who have built products at companies such as Facebook, Google, Udacity, and Slack, will record themselves and their screen as they walk through your product and give detailed feedback. Every delivered test includes an ~15 minute video, a Question & Answer summary, bulleted Action Items, and links to Additional Resources that have helped Experts build better products in the past. We've been running many Expert usability tests in stealth mode until this point and have received tremendous feedback on the value they deliver to companies, especially those earlier in the product lifecycle. We view the Expert product as the first step in making usability research as simple and seamless as possible. Would love to answer any and all questions you have. The tougher the question the better :)
@joethomas_x awesome, love it. Some questions to prod around with... Who are the experts you have, and how many? What are their backgrounds? I know you mentioned some big companies, but details would be cool. Also, how are the experts split, e.g. design vs. developer vs. community experts? What about the issue of people designing a product for a very specific use case/group of people? What is to say that you will be the right ones to test it out? I totally think that getting startup people in and playing around with things is really valuable anyway, because there are common themes that are often spotted. But if it's a very niche product, how can fair judgement be given? (Better yet, how do you plan to tackle that? Allow community members to become experts themselves after a vetting process?)
Hey @bentossell, as a fair warning, this is a long response. Brace yourself. You have asked some very good questions. Our expert network is pretty modest in size right now (about 20 people), but we have a wide array of talent: engineers, designers, PMs, founders. The variation in experience this particular group embodies is pretty impressive, and we've been very careful to handpick this first group of individuals. It's actually been much tougher for us to exercise caution and hold back when wanting to bring someone into the earliest fold. I am certain you and other Product Hunters understand how important it is to grow slowly and bring in the best if you wish to build a high-quality network. That is exactly what we're trying to do. :-) We've decided to conceal specific identities for now to respect their anonymity, although we plan on building out a reputation piece in the near future depending on how this run goes. That being said, I'm sure many Product Hunters will be curious who is in the network, so here are some name drops I can offer: Joaquim Verges, senior Android engineer at Twitter and creator of Falcon Pro; James Mudgett, previously product at Path, worked on Kong with Dave Morin as his boss, currently working on his own venture; Caleb Ogden, currently at Snapchat as an early front-end engineer/designer, with extensive experience not only building companies but also understanding the stock market quite well; and Ilie Ciorba, designer at Robinhood, previously at Mixbook, a true Mission hipster with a love for good coffee and biking. You have actually hit on a key problem we want the Opentest Expert platform to solve. More specifically, our aim is to increase the likelihood that we match a company with a tester who can help them discover an unknown unknown (something the client didn't know they didn't know).
This is vastly different from most usability testing with regular users, where the likelihood you hit an unknown unknown is very low and random. Testing with regular users is absolutely useful, but it accomplishes something much different. You almost never *expect* to discover something unknown about your flow during this kind of testing (although it's awesome when you do). You typically reaffirm that something works or doesn't, and when it doesn't, it's usually pretty obvious. Since this is a scheduling problem (and I love scheduling problems), I'm currently in the very early stages of formulating a scheduling algorithm that essentially cross-references features of a product maker (designer, likes bikes, into cars, etc.) with those of a company (early stage, automobile industry, targets 20-somethings, etc.). How we obtain this data is another interesting problem, but this is the soul of the Opentest Expert platform, and I expect it to be a while before the matches it makes produce magical results. For now our scheduling algorithm is being substituted by a growing Opentest Slack community, where we not only discuss who would be best to take on which tests, but also knowledge-share specific articles on product usability and constantly work to elevate our own understanding of how people use products. In the future, it only makes sense for this community to live on our own platform with, as you mentioned, members who are experts themselves and help vet other Experts. If you are a product maker and this community sounds interesting, please apply:
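To make the cross-referencing idea above concrete, here is a minimal sketch of one way such matching could work: score each expert by the overlap between their feature tags and a company's tags, then pick the highest-scoring expert. This is purely illustrative, not Opentest's actual algorithm; the names, tags, and Jaccard-overlap scoring are all assumptions.

```python
# Hypothetical tag-matching sketch (NOT Opentest's real algorithm):
# score an expert by how much their tags overlap a company's tags.

def match_score(expert_tags, company_tags):
    """Jaccard similarity between an expert's tags and a company's tags."""
    expert_tags, company_tags = set(expert_tags), set(company_tags)
    union = expert_tags | company_tags
    return len(expert_tags & company_tags) / len(union) if union else 0.0

def best_expert(experts, company_tags):
    """Return the expert whose tags best overlap the company's tags."""
    return max(experts, key=lambda e: match_score(e["tags"], company_tags))

# Illustrative data, echoing the examples in the comment above.
experts = [
    {"name": "designer_a", "tags": ["design", "bikes", "coffee"]},
    {"name": "engineer_b", "tags": ["android", "cars", "fintech"]},
]
company = ["automobile", "cars", "20-somethings"]
print(best_expert(experts, company)["name"])  # engineer_b (shares "cars")
```

Real-world versions would need weighting (an industry match matters more than a hobby match) and, as the comment notes, a way to collect the tags in the first place.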
@vhmth thanks so much for your answer!! Nice to know you have the focus on quality early on. Do you think you'd ever adopt some kind of user rating system? Yeah, I agree it shouldn't just be would-be users testing things, as startup folks with different backgrounds can actually identify issues that others may not. It's important to have a mix, but also to have someone interested in the specific space. I'm sure the algorithm will be a complex one, but I like the idea that you all just discuss it in Slack to see who is up for it :)
@bentossell we absolutely must have user ratings. Otherwise it will be near impossible to hold folks accountable for meeting some minimal bar of quality. Ratings are interesting when it comes to usability feedback, though. We noticed early on that the most rewarding feedback to give a tester was whether a company actually implemented a specific action item they gave (which is why you may notice that we really emphasize the action items on our demo feedback page). The first version of our rating system will revolve around this premise, and we hope to see massive tester retention when it launches. It is insanely gratifying to come back to a site or app and see that it has changed because of your feedback and knowledge. :-)
@vhmth totally agree!
This is cool. Get quick feedback on your onboarding, site nav, and other parts of your product from experts in the form of a usability test. Ask the maker @_shahedk anything! :)
@nivo0o0 thanks for hunting us! I started working on Opentest with @vhmth and @joethomas_x a few months back. Our goal is to help companies early in the product building cycle by giving them directed product feedback. Ask me or the team any questions you have!
@_shahedk @nivo0o0 @vhmth @joethomas_x this looks awesome. Nice job guys
@gbaroth Thanks Greg! Would love to see you apply to be one of the experts.
@joethomas_x @gbaroth We always go down the rabbit hole of onboarding copy and this product is incredible! So simple, yet potentially so powerful. Very excited to start incorporating this into things. Great work.