3 examples of one-question surveys from Instagram, Pinterest & more

How do you get your assumptions tested? Here’s how.

Rosie Hoggmascall
UX Collective

--

App logos for Instagram, Pinterest and OneSignal

Too many product teams think they do enough user research when they don’t. They keep launching features that have no impact.

Then they do it over and over again until they have a product with loads of features but no significant user growth.

One way to stop this is to test assumptions before building something. To get feedback early and often through user research.

User research can be tricky for a number of reasons. One of those is response rates, i.e. getting people to give you feedback.

The average response rate across all surveys, for instance, is a measly 5–30%.

That’s low.

The key question to answer is how do you make sure your questions get answered? How do you engage the lower intent cohorts who don’t really care about your business?

Enter: one-question surveys.

I first learned about these from Teresa Torres, Product Discovery Coach and author of Continuous Discovery Habits. I went on her Assumption Testing course earlier this year and loved it (highly recommend her book and courses).

Screenshot of a one-question survey: “Have you ever worked as a software engineer?”
Example of a one-question survey. Idk about you, but I remember what my past roles were so won’t be picking ‘I don’t know’ anytime soon 😂

I learned that one-question surveys are used to test assumptions. The benefit is that they are simple and embedded in the user experience. These two things lead to higher response rates than a survey sent via email (like these).

They’re also quick to launch and quick to collect data (if they’re put in the right place). You can get lots of responses within a short time due to those higher response rates.

You may have seen:

How satisfied were you with the service today? 😃 😊 😒 😠

Other common one-question surveys include: exit surveys, where did you hear about us surveys (WDYHAU), net promoter score (NPS) surveys, polls and consumer or employee pulse surveys.

There are a few key rules about one-question surveys that can help boost response rates and get you accurate feedback:

  • The question needs to be simple (if someone needs to read it twice, that’s bad)
  • They ask about actual behaviour (no ‘could you’ or ‘would you’ statements)
  • Ideally, they need to be embedded in the user experience (if you ask somewhere that is too far removed you may impact your results)
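
The “embedded in the user experience” rule boils down to logging each answer together with the context it was asked in, so results stay interpretable. A minimal TypeScript sketch, with purely illustrative names (nothing here is from a specific product):

```typescript
// Illustrative sketch: record one-question survey answers with UX context.

type SurveyResponse = {
  question: string;
  answer: string;
  context: string; // where in the UX the survey appeared
  answeredAt: number;
};

const responses: SurveyResponse[] = [];

// Record one answer, tagged with the context it was shown in.
function logResponse(question: string, answer: string, context: string): SurveyResponse {
  const r = { question, answer, context, answeredAt: Date.now() };
  responses.push(r);
  return r;
}

// Response rate = answers collected / times the survey was shown.
function responseRate(shown: number): number {
  return shown === 0 ? 0 : responses.length / shown;
}

logResponse("Are you interested in this post?", "Yes", "feed:after-suggested-post");
logResponse("Are you interested in this post?", "No", "feed:after-suggested-post");
console.log(responseRate(10)); // 2 answers from 10 impressions = 0.2
```

Tagging each answer with its context is what lets you later check whether the survey really did sit close to the thing you were testing.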

They’re so subtle that users don’t even think twice about answering them.

Because of this, it was super hard for me to find examples for this article. I’d always complete them quickly and think:

oop — I could have screenshotted that 🤦‍♀

However, I’ve managed to find you three great examples of one-question surveys from top tech companies, spanning B2B SaaS and B2C mobile apps, to study and learn from. I’m particularly interested in:

  • Where are the questions in the user experience?
  • What does the UI look like?
  • How is copy used to increase conversion?
  • What happens after the question?

So, let’s dig in.

Instagram

I was scrolling on Instagram last night, and something caught my eye.

A one-question survey under a suggested post in my feed:

Are you interested in this post?

Screenshot of Instagram feed with annotations showing placement of one-question survey under post

The best thing about this question is its positioning. It is right after the content in question and catches users right in the moment of thinking either:

That post was rubbish 😒

That post was good 😊

Which makes the data it collects valid and reliable. It is clear what content the question is asking about, and a user’s opinion is still fresh in their mind.

One-question surveys work best as close to the thing you are testing as possible. The further away you get, the less likely you are to gather the right data.

In terms of UI, someone could miss this module, especially as it is hard to see in dark mode. However, it is relatively unintrusive, which I like.

Pinterest

Now, this one I’ve known about for a while.

At the end of every Pinterest recommendation email, there’s a Y/N one-question survey:

Was this email useful?

Screenshot of the base of Pinterest’s email with the one-question survey above the footer in dark mode

It is placed below the call to action “see more pins” but above the footer (where the T&Cs and unsubscribe live).

Because it is at the bottom of an email, it will likely get lower response rates than questions in the user experience.

Analysis of the UX of answering Pinterest’s one-question survey

I also wonder how it being at the bottom of the email impacts the validity of the results. E.g. if I open the email and the content isn’t good, I’d delete it instead of reading to the bottom.

Perhaps this means that the answers to this question are skewed positive.

In any case, Pinterest may use this data for two things:

  • To measure the success of their emails (now that open rates are not that useful, companies are finding better ways to measure email engagement)
  • To feed a recommendation algorithm (to make each subsequent recommendation better and more personalised for me).

We know that the latter is true, as tapping either yes or no takes us to a web screen which says:

Thanks for your feedback!

This will help us improve the recommendations that we send to you in the future.

Screenshot of the congrats screen after answering Pinterest’s one-question survey

What I like is that this messaging is friendly and transparent. I know how I’m helping Pinterest and what my inputs are being used for.

As well as helping with their brand, this also makes me feel good. It makes me feel like I’m getting a personalised experience and that I’m helping the team at Pinterest build better products.

Thanking users is a great way to increase your response rates over time. If users have a good experience, they’re more likely to help you in future.
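
Pinterest’s first use case above, measuring email success from yes/no answers, is really just aggregation. A hypothetical TypeScript sketch (the campaign names and data shape are made up for illustration):

```typescript
// Hypothetical aggregation of "Was this email useful?" answers per campaign.

type EmailAnswer = { campaign: string; useful: boolean };

// Share of "yes" answers per campaign: a proxy for email success
// when open rates alone say little.
function usefulnessByCampaign(answers: EmailAnswer[]): Map<string, number> {
  const totals = new Map<string, { yes: number; all: number }>();
  for (const a of answers) {
    const t = totals.get(a.campaign) ?? { yes: 0, all: 0 };
    t.all += 1;
    if (a.useful) t.yes += 1;
    totals.set(a.campaign, t);
  }
  const rates = new Map<string, number>();
  for (const [campaign, t] of totals) rates.set(campaign, t.yes / t.all);
  return rates;
}

const answers: EmailAnswer[] = [
  { campaign: "weekly-pins", useful: true },
  { campaign: "weekly-pins", useful: false },
  { campaign: "weekly-pins", useful: true },
  { campaign: "board-ideas", useful: true },
];
console.log(usefulnessByCampaign(answers));
```

Even a rough tally like this gives a per-campaign signal that is harder to game than opens, with the caveat (noted above) that bottom-of-email answers may skew positive.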

Next, a more complex flow from a B2B SaaS company: OneSignal.

OneSignal

OneSignal is a messaging platform (like Braze but more affordable) used by the likes of Eventbrite and Volkswagen. You can send email, push notifications, in-app messages and even SMS.

It is no surprise that a platform that lets you send one-question surveys in your product takes its own advice and uses them too.

Last week, I was setting up some push notifications from OneSignal and came across the question:

What kind of Automated Messages do you send to your users?

With this multiple choice question, you’re able to select from 5 dropdowns.

In terms of placement, the question appeared in the middle of a large white pop-up while I was drafting some push notification copy.

The nice thing here is the avatar and name:

Liberty from OneSignal

The use of an actual image in the avatar makes it feel like one of their team is asking us a question. It personalises this intrusion a bit more.

One thing that hindered this popup is the wording. I had to read the question twice before answering.

The phrase ‘Automated Messages’ is what made me double take, as I wasn’t sure if ‘messages’ meant both push and email. To me, a message is a text or a DM. I also don’t know what they count as a ‘transactional’ versus a ‘promotional’ email.

Any lack of understanding of the question will mean your results are not valid. You’re not testing your assumption about what content I send if I don’t understand the question.

In any case, once I answered I was met with two more popups before being directed to a Calendly link.

The second step:

What would you like to see improved about Automated Messages?

I found this tricky, thought for a while, then gave an answer. Again, the definition of ‘Automated Messages’ is what I was hung up on.

Then, I was thanked and asked to book a 1:1 feedback call. After I tapped ‘schedule’ I was sent to a Calendly booking page titled ‘Automated Message Feedback & Discovery’.

Great use of Calendly to try and recruit users for more detailed discovery calls. 10/10.

Stakeholder pushback for one-question surveys can be “we won’t get anything useful from this”, so recruiting users can be a way to get buy-in for your research.

However, I couldn’t be bothered to book a call so exited this screen.

In my view, the first and second question were a liiiiiitle too tricky to warrant me giving them so much of my time in a call. Perhaps if they’d made the questions easier and the flow shorter, with a nice lil’ Amazon voucher offer, I may have booked a call.

In any case, great use of an initial question with two fast follow-ups for more detail. Improvements to be made, but better to start collecting data and honing your execution over time than not researching at all.

To summarise: just get started

Too many product teams think they do enough user research when they don’t.

One-question surveys are an excellent way to get over this obstacle. They’re quick to launch and quick to collect data (if they’re put in the right place).

Your first shot won’t be perfect, but you can improve your execution over time. Here are some pointers to remember when you launch your one-question surveys:

  • Keep them in the user experience to ensure you’re keeping them contextual and not biasing results to higher intent email readers (like Pinterest)
  • Keep the wording simple: use language that your customers use. Don’t confuse them with terminology you only use internally
  • Fast follow for more information or a 1:1 call invite: like OneSignal, ask for more detail or link to a Calendly invite. Test the messaging here and consider a reward for participating
  • Thank your users for their help: to build up trust, make them feel involved and — fingers-crossed — increase your response rates in the future.

It may be scary, but it is 100% worth it to test your assumptions and get some data in the door.

Try it and let me know how it goes in the comments 💫

--

UX, monetisation, product-led growth | Writing to get thoughts down on paper & free up some brain space ✍️🧠