How to get answers quickly and avoid features that flop
Lessons from the one-question surveys of Instagram, LinkedIn & Trainline
Get these a week early at growthdives.com ✨

One of the most common things I hear from founders is:
“We keep launching features that have no impact. They just flop and we don’t know why.”
This is what’s known as a feature factory, something Marty Cagan covers in his book Inspired.
It’s where teams operate in a constant state of busyness, launching feature after feature. Those features often end up having no impact on core product metrics.
One way to stop this is to test assumptions before building something: to uncover what we’re silently assuming when we think we have a ‘good’ idea to build, and to check whether those assumptions are actually true, de-risking the idea.
However, user research can be tricky for a number of reasons. One of them is response rates, i.e. getting people to give you feedback at all. Survey response rates, for instance, typically sit at a measly 5–30%.
The key question, then, is: how do you make sure your questions get answered? How do you engage the lower-intent cohorts who don’t want to speak to you or fill out long forms?
Enter: one-question surveys.
I first learned about these from Teresa Torres, Product Discovery Coach and author of Continuous Discovery Habits. I went on her Assumption Testing course last year and loved it (highly recommend her book and courses).
I learned that one-question surveys are used to test assumptions. The benefit is that they’re simple and sit inside the user experience itself. Those two things lead to higher response rates than a survey sent via email (like these).
They’re also quick to launch and, if placed in the right spot, quick to collect data from: those higher response rates mean you can gather plenty of responses in a short time.
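To make “used in the user experience” concrete, here’s a minimal sketch of what an in-product one-question survey could look like as a React component. Everything here is hypothetical rather than taken from any of the companies mentioned: the question text, the answer options and the /api/survey-response endpoint are placeholders, just to show how small the surface area of one of these surveys can be.

```tsx
import { useState } from "react";

// Hypothetical one-question survey: a single prompt, a few answer buttons,
// and a dismiss option. It renders inline in the product UI rather than
// arriving by email, which is what drives the higher response rates.
type Props = {
  question: string;
  options: string[];
  onAnswer: (answer: string) => void; // e.g. send the response to your analytics tool
};

export function OneQuestionSurvey({ question, options, onAnswer }: Props) {
  const [done, setDone] = useState(false);

  if (done) return null; // disappear once answered or dismissed

  return (
    <div role="dialog" aria-label="Quick question">
      <p>{question}</p>
      {options.map((option) => (
        <button
          key={option}
          onClick={() => {
            onAnswer(option); // record the single data point
            setDone(true);
          }}
        >
          {option}
        </button>
      ))}
      <button onClick={() => setDone(true)}>Not now</button>
    </div>
  );
}

// Example usage, testing an assumption about why users hesitate at checkout.
// The endpoint and copy are illustrative only:
// <OneQuestionSurvey
//   question="What nearly stopped you booking today?"
//   options={["Price", "The options were confusing", "Nothing"]}
//   onAnswer={(answer) =>
//     fetch("/api/survey-response", {
//       method: "POST",
//       body: JSON.stringify({ answer }),
//     })
//   }
// />
```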
Other common one-question surveys include exit surveys, ‘where did you hear about us’ (WDYHAU) surveys, net promoter score (NPS) surveys, brand perception surveys and employee pulse surveys.