Five tips to improve the rigor of your UX surveys
As designers and strategists, we are great at sharing narratives to explain our process and justify design decisions. Unfortunately, this isn’t enough.

Despite interweaving qualitative user research into our designs, it’s often cold, hard, quantitative evidence that drives decision-makers to take action.
How can we improve our UX research process to validate designs with persuasive data? A good way to ensure your research isn’t based on outliers, but is representative of a wider customer sample, is to use a survey.
Writing surveys should be easy, right? It’s just putting together a few questions to test a hypothesis. Unfortunately, with the advent of free platforms like SurveyMonkey, Typeform, and Google Forms, people with minimal training are pushing out surveys left, right, and center. A low barrier to entry, combined with tight timeframes, means that UX designers release surveys without realizing they contain common mistakes and bias.
Poorly written questions result in poor data.
I respond to surveys every week and am always surprised by how many well-known consultancies and organizations overlook best practices.
Here are five ways to reduce bias and avoid common UX survey writing errors:
1. Never-Should-You-Ever… Serve mains before appetizers
Ease in. In a face-to-face interview, you wouldn’t go for the kill straight away; the same goes for surveys. Warm up your respondents with general ‘positioning’ questions, then lead into more comprehensive inquiries. Asking for age and gender straight away can irk your participants. Cross-check demographic questions for necessity: unless the answers will directly impact your design, don’t ask what you don’t need to know. Segmenting customers by their decision-making will tell you much more than traditional age-based divisions. If you simply ensure your designs adhere to WCAG 2.1 (the Web Content Accessibility Guidelines), the age and gender of your users should have minimal impact on your product.
Ease in. In face-to-face interviews, you wouldn’t go for the kill straight away, the same goes for surveys. Warm-up your respondent with general ‘positioning’ questions, then lead into more comprehensive inquiries. Asking age and gender straight away can irk your participants. Crosscheck demographic questions for necessity, unless they will directly impact your design, don’t ask what you don’t need to know. Segmenting customers by their decision making will tell you much more than traditional age-based divisions. If you simply ensure your designs adhere to WCAG 2.1 (Web Content Accessibility Guidelines), the age and gender of your users should have minimal impact on your product.