The one-question surveys from Figma, LinkedIn & Google

User research doesn’t have to take an age.

Rosie Hoggmascall
UX Collective

--

Company logos for Figma, Google and LinkedIn

A few months back, I published a collection of one-question surveys I’d come across out in the wild. I distinctly remember it took me ages to collect enough good examples to write a full article about it.

Then something weird happened…

Since then, I’ve seen SO many.

Maybe people are doing more user research (I doubt it).

Most likely it's confirmation bias: once you notice something once, you see it everywhere. My mind was on the lookout for any survey it could find.

These slippery little things are SO hard to capture. They blend in so well I often ignore them.

But, I’ve managed to capture three excellent examples of one-question surveys from top tech companies across B2B and B2C: Figma, Google (specifically Google Search) and LinkedIn.

I’ve picked these three as they’re textbook examples of how to do user research well.

So, let’s dig in and uncover six things that make a great one-question survey design.

Favourites first: Figma.

Figma

I’m SO excited I managed to screenshot this one.

Half a year ago, I saw another one of these. But I’d already ignored it and tapped off the screen. When I went back, it was gone.

Not this time.

I was searching for an event tracking Figjam on mobile app attribution I’d created for a client. I typed ‘attribution’ into my search bar.

No results. Funny, as I could have sworn I'd put 'attribution' in the title. Apparently not. But then something caught my eye.

Bottom right of the screen, there was a little module blending into the user experience:

Quick research question! What are your thoughts on this statement?

Finding the files I need is easy.

Analysis of Figma’s search results on desktop showing the one-question survey bottom right

Then, a labelled 1–5 scale, from 1 (strongly disagree) to 5 (strongly agree).

Zoomed in screenshot of Figma’s one-question survey

What’s excellent about this execution is:

  • The question is simple (it takes me <5 seconds to decide)
  • The scale only goes to 5
  • The scale is labelled
  • The copy is super friendly
  • I'm clear how many questions there are: 'Question 1 of 2'

However, quite possibly the best part is that it's triggered at a relevant point in the UX.

It’s in a specific, high-traffic page in the experience (to get valid data in the door quickly). I reckon this is the search experience team researching the ease of search & quality of results. Perhaps their assumption is something like:

🔍 We think finding files is difficult for users

Perhaps because they see a high % of queries with no results in the data. Or a low success rate of enter search query → click on results.

I'd be surprised if this ran for longer than a couple of days: just enough to get a signal before putting more resource into solving the problem. Saving precious time and money.

After answering, there's a fast-follow second question for more detail. This one is a free text entry.

Analysis of Figma’s one-question survey

It is marked as optional, which is nice.

I also like the ‘We’d love to know…’ default text in the free text entry field. Feels friendly.

Zoomed in screenshot of the second screen of Figma’s one-question survey

Then there is another optional field:

Are you open to sharing more about this with Figma Research in the future?

What’s great about this is that the team can not only get valid results in a matter of days, but they also have a list of user research leads for customer interviews if they want to dive deeper into the problem.

Zoomed in screenshot of the second part of Figma’s one question survey showing my typed out response to their question

My only negative in this experience is that the module itself was quite hard to see: just white on white, and only ~10% of the total page on desktop. I get that Figma doesn't want to intrude, but I'd love to help answer more of these.

Next, onto the first one-question survey I've noticed for one of Google's biggest products: Search.

Google Search

Now this is a funny story.

I needed new slippers. It's cold here in the UK, and my toesies are cold.

I also like hiking. And I saw an influencer wearing these cool packable slippers that zip together, from the outdoor brand Merrell, something I'm now convinced I need in my life.

So I head to Google in Safari on my phone to try and find the ones.

Screenshot of Google Search results for ‘merrell packable shoes’

I didn't actually find the ones I was looking for right away. What I also didn't see was the one-question survey at the bottom of the screen.

Analysis of Google Search’s one-question survey at the bottom of the search results

I tapped into one of the search results and then thought,

Wait, what did I just see?

I head back to the search results and it's GONE.

I think:

Noooo, my chance is gone

So I turn to my partner and hastily say:

Gimme your phone

With some confusion, he complies. I go to Safari, type 'merrell packable shoes', and hit enter.

Right at the same time as the search results load, the prompt shows.

SCORE.

It was useful seeing it a second time, as I noticed there was very little delay: perhaps a second after the results showed, the survey was there.

Zoomed in screenshot of Google Search one-question survey

Again, good points here:

  • There are only five options
  • Each option is labelled
  • It is a super simple question, a no-brainer to answer
  • The question is placed at the moment the user decides whether the results are good or bad — meaning it will collect valid data

The only missed opportunity is that they haven’t asked for any more info.

Next, onto consumer social: LinkedIn.

LinkedIn

I was scrolling on LinkedIn the other day and saw a small module below a post:

Did you find this post valuable?

[Yes] [No] [Not Sure]

Zoomed in screenshot of LinkedIn’s one-question survey

You may have noticed these too if you're a LinkedIn user, as I've seen about four now. Always positioned right under the post they're asking about.

Zoomed in screenshot of LinkedIn’s one-question survey second screen asking for more details

After tapping your answer, you're asked to 'let us know why' and given the chance to select descriptive words about the author and topic: things like useful, insightful or funny.

Following that, you’re thanked and told what the research is for:

Your response helps us improve the feed experience for everyone

It's a nice touch, as it makes the user feel involved, as if they're helping to improve the experience together with you.

Analysis of LinkedIn’s one-question survey

What's nice is it all feels so smooth. The modal is tidy, neat and blends into the experience nicely. I think it stands out a bit better than the Google and Figma examples, but I suppose the screen here is less busy than Google Search results and more compact than Figma on desktop.

Again, the rules of great 1-question surveys are followed here:

  • Easy to answer question
  • Few options to choose from
  • Single select first question
  • Quick fast follow second question for more detail
  • Thanks for taking part

I do wonder if the word ‘valuable’ means that the results will be hard to dissect. For instance, I’ve been seeing a lot more memes on LinkedIn lately that I wouldn’t describe as valuable, but they are funny and I’d like to get more of them.

In any case, great execution. Would love to know more about their research question…

To conclude: great surveys all round

Quick round of applause here for Figma, Google and LinkedIn. All follow six important steps for great user research with one-question surveys:

  1. Simple question that is easy to answer
  2. Placed discreetly in a relevant place in the user experience
  3. Single-select answer (with no more than 5 options)
  4. If a sliding scale, the options are clearly labelled
  5. Fast follow with a more detailed question for extra information
  6. Thank your user, ideally letting them in on the goss of what you’re researching

I do still find the surveys easy to miss. Perhaps they were too subtle in the experience, and you'll only get answers from users who are slower or have higher attention to detail. But there's a fine line between interrupting the user experience and getting data in the door.

What do you think? Seen any good examples to add lately? Let me know in the comments 💫
