Are you facing false positives during user interviews?

Dan Müller
1 reply
I often find a huge difference between what people say and how they really behave. I’m talking here about user research and feedback: for example, a user explains how they use a specific feature, and then we find out from application data that they never used it at all. Outputs from user interviews and “hard” analytics data are two different worlds. In my experience, people often want to be seen as “better” and also want to satisfy you, so they are typically more positive about the product than reality warrants.

Knowing this … does it make sense to interview your users at all? And how do you avoid the risk of false positives from interviews? We at https://aimful.io are really struggling with this 🤷‍♂️

Replies

Max Korpinen
Interviews are really difficult. One "fix" is just to ask better questions. Instead of asking about theory, ask behavioral questions: "Tell me about the last time you used this feature." If you want to prepare really well, ask the same question of someone on your team or in your network who you know is really experienced with the feature. Jot down how they respond and use that as a benchmark for a "perfect" answer. Unstructured interviews are very close to a coin toss in terms of how well they predict someone's actual work performance. We're building a product (hireproof.io) to help companies run more insightful interviews, feel free to check it out, we also have a free plan available :)