As a Mixpanel product leader, I could go on and on about the importance of using product analytics in my work. (And I have.) But that doesn’t mean I don’t recognize the value of user interviews for learning how people use my products.
Combining product analytics and user interviews in product development is not a novel approach, but I’ve found that the real key to success is knowing when to use which method and how to make sense of all the data you collect.
Here are some of my best tips for using these two tools together, along with some pointers on how to avoid common pitfalls.
Why do you need to conduct both user interviews and product analytics?
Before I get into how to balance product analytics and user feedback, I’ll explain why I, the wonky Mixpanel guy, believe both are critical.
It is the job of a product team to understand customer pain points and identify opportunities to solve them, and the two data-gathering methods we’re discussing here frequently aid in these processes in different—and complementary—ways. User interviews are ideal for delving deeply into a customer’s problem. You get to ask them questions and observe their responses and actions. Product analytics, on the other hand, examines aggregate user behavioral patterns.
Both methods can surface signals about what problems exist in a customer journey; analytics helps you decide which problems to prioritize, while interviews are uniquely suited to uncovering why something is a problem for a user.
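To make the prioritization point concrete, here’s a minimal sketch of ranking funnel steps by drop-off so the biggest leak gets investigated first. The step names and counts are entirely made up for illustration; in practice these numbers would come from your analytics tool, not be hardcoded.

```python
# Hypothetical funnel counts: users reaching each step of a checkout flow.
# The names and numbers are illustrative, not real product data.
funnel = [
    ("viewed_cart", 10_000),
    ("started_checkout", 6_000),
    ("entered_payment", 5_400),
    ("completed_purchase", 2_700),
]

def dropoff_report(steps):
    """Return (from_step, to_step, drop_rate) tuples, worst drop-off first."""
    drops = []
    for (prev_name, prev_n), (name, n) in zip(steps, steps[1:]):
        drop_rate = 1 - n / prev_n
        drops.append((prev_name, name, round(drop_rate, 3)))
    return sorted(drops, key=lambda d: d[2], reverse=True)

report = dropoff_report(funnel)
print(report[0])  # ('entered_payment', 'completed_purchase', 0.5)
```

The worst drop-off (here, half of users abandoning between payment entry and purchase) is where you’d point your next round of user interviews to learn the “why.”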
For solution-building, the “flow state” of user interviews and analytics
The same combination of user interviews and analytics that yielded your product problem hypothesis can frequently yield a solution hypothesis. From there, user interviews are your best friend if you want a quick sense of whether the solution hypothesis is heading in the right direction before you build it. Our designers would frequently create a quick Figma prototype with one to three options and then have a few users walk through it while performing tasks.
At this stage, relying on product analytics isn’t practical: you’d first need to build some variation of an MVP, which usually takes far longer than Figma prototypes and user calls.
Analytics comes back in at the next stage. Suppose you’ve shipped a solution hypothesis in your live product and want to know whether it actually solved the problem. Once again, you need analytics to get that aggregate view.
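That aggregate check often boils down to comparing a conversion rate before and after the release. Here’s a minimal sketch with made-up event counts; a real analysis would use your analytics tool’s experiment or funnel reports and account for statistical significance, which this toy comparison skips.

```python
# Hypothetical event counts before and after shipping the fix.
before = {"exposed": 8_000, "converted": 1_200}
after = {"exposed": 7_500, "converted": 1_500}

def conversion(counts):
    """Share of exposed users who converted."""
    return counts["converted"] / counts["exposed"]

lift = conversion(after) - conversion(before)
print(f"Conversion moved from {conversion(before):.1%} "
      f"to {conversion(after):.1%} ({lift:+.1%} absolute)")
```

If the metric moves in the right direction at scale, you have evidence the solution hypothesis held; if it doesn’t, it’s back to interviews to find out why.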
In a nutshell, user interviews show you what users say they’ll do (and what a handful actually do in front of you), whereas product analytics show you what users actually do at scale.
So you need both: first one, then the other, then back again… and so on.