Jean Saung

A general guide to moderated usability testing: Non-leading questions & prompts

Updated: Apr 13, 2020

Originally published on Medium.


The goal of this article is to share some general best practices that I learned, adapted, and compiled through conducting moderated usability tests on web products and mobile apps over the past few years. These notes are mostly examples and things to keep in mind whether you are exploring the competitive landscape, understanding user pain points, or testing a new feature.



General Interview & Usability Testing Script:

“Thank you for participating in our usability study. We are researching the usability of a __________ platform for ___________ called __________. There are no wrong answers, and if you are not comfortable with answering a question, feel free to skip to the next one. If at any time you feel uncomfortable during the study, feel free to stop and leave the study. We are testing the product, not your actions. This interview is for us to gain insights from your experience and to improve on existing features for this product and for the user experience overall. Your answers to these questions remain strictly confidential. Please do keep what you see of the product during this session confidential as well. Lastly, please think aloud during the study so we can understand your thoughts. Any questions? Ready to begin?”

Starting usability studies with an ice-breaker or background interview helps participants feel more comfortable speaking about their experiences. Depending on the study, have participants fill out an intake form about their demographics and any activities and behaviors related to your research. You can then ask follow-up questions referencing the intake form during the interview.


Conducting the Usability Study: Non-Leading Prompts


These questions guide users through tasks during the usability study. They are neutral and non-leading, yet specific enough to elicit targeted, actionable responses from your test participants. These questions also reinforce the script: any answer users provide is a correct answer, because their experiences are their authentic experiences.

“Expecting” Questions: Help to understand how your designs match the context of your users. Their expectations may correlate with the apps they currently use or have used in the past. It is important to survey or interview your participants and consider their previous experiences and preconceptions when you weigh their responses.

  • What do you expect this button to do?

  • Where do you expect this arrow to lead?

  • Is this screen what you were expecting?


“What Stands Out?” Questions: Help to understand the visual hierarchy of your designs to each user. Are users noticing and prioritizing the elements that you intended?

  • What do you notice/gravitate towards/want to interact with first? Second?


“Anything Extra?” Questions: Help to identify elements that distract from the experience, are repetitive, or are otherwise not an intuitive part of the experience you are designing.

  • Is there anything in this feed that you can do without? If anything at all?

  • Is there anything on this card that you think is extra? If anything at all?


“Anything Missing?” Questions: Help to identify if there are basic elements that users are expecting that are not present in your design.

  • Is there anything in this feed that you think is missing? If anything at all?

  • Is there anything on this card that you think is missing? If anything at all?


“Any other thoughts or comments?” Questions: Give room for the user to voice any overall thoughts that may be forming as they are performing the task. Usually asked at the end of a task or sequence of visuals, or at the end of the test session.

  • Do you have any other thoughts or comments about this feature?

  • Do you have any other feedback for the app experience overall?


Generally, if you give the participant a few extra seconds to think, and can sit in silence a little longer than might feel comfortable, users will likely have some interesting insights to share with you.


Dos and Don’ts:


Don’t: Ask a user what they want. Asking a user what they want can sometimes lead participants to give you an answer they think you might want to hear, or to make random suggestions that they don’t feel strongly about just to have an answer for you.

Do: Ask participants to recount or show you their current workflow. Identify pain points by asking them to share experiences, and note the ones that cause frustration or inconvenience. If the participant has not yet used your product, or you are still learning about the problem space, ask them how they “hack” experiences, create workarounds, or adapt a series of tools to accomplish a task.


Don’t: Ask a user what they think they would do. This is another hypothetical question that asks a user to predict their own behavior, and such predictions are bound to be inaccurate.

Do: Ask about their past and current behavior when completing a task on your platform, or what they do on analogous/similar apps. Best of all, give them a relevant task and observe their actions.

When interpreting their responses, particularly for actions users claim to have performed, cross-reference these insights with the record of the user’s interactions within your product from an analytics or interaction-tracking platform. A user’s memory of their past actions, or of how frequently they performed them, is not always accurate.
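
For instance, if your analytics tool can export raw interaction events, a quick cross-check before a session might look like the minimal sketch below. This is only an illustration: the events.csv file, its user_id and event_name columns, the rate_movie event, and the participant data are all hypothetical placeholders, and the mismatch threshold is an arbitrary choice.

```python
# Minimal sketch: comparing self-reported behavior from an intake form
# against logged product events. All file names, column names, and event
# names here are hypothetical placeholders for your own analytics export.
import csv
from collections import Counter

# Intake-form answers, e.g. "How many times did you rate a movie last month?"
self_reported = {"p01": 10, "p02": 0, "p03": 3}

# Count how many times each participant actually triggered the event.
logged = Counter()
with open("events.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["event_name"] == "rate_movie":
            logged[row["user_id"]] += 1

# Flag participants whose recollection diverges sharply from the logs,
# so you can probe those tasks more closely during the session.
for user_id, claimed in self_reported.items():
    actual = logged[user_id]  # Counter returns 0 for missing keys
    if abs(claimed - actual) > max(2, claimed // 2):
        print(f"{user_id}: claimed {claimed}, logged {actual} -> follow up")
```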


Don’t: It may be tempting, but don’t tell participants what the feature is for.

Do: Share just enough information to set the scene for them to accomplish a task, e.g. “You just watched a movie; how would you go about rating the movie in this app?” Or provide information about the steps they have completed to arrive at the screen you want to test: “You just completed an order and you want to make a quick adjustment to your order.”


Don’t: Ask users questions starting with “Why…?”, e.g. “Why did you choose that one over the other?”

Asking a user “Why…” can sometimes come across as confrontational or as questioning their judgement. It may also result in an answer that is general or conceptual rather than concrete and actionable, e.g. “I like it better than the other one…” or “It seemed like the right choice…”

Do: Rephrase the question to start with “What…?”, e.g. “What prompted you to choose this one over the other?” Asking “What…” helps users think of the specific elements that contribute to their experience.


Don’t: Assume you understand why a user reacts a certain way, how they made a decision, or what motivates them.

Do: Follow up one question with another clarifying question, e.g. “You said this button makes you nervous; what about this button makes you nervous?” You can pursue their responses with a number of follow-up questions until you reach an answer that is more concrete.


Putting It All Together:


Example 1: You can test wireframe layouts with paper prototypes (you can even ask users to rearrange the components or write and draw on the paper). Guide user feedback with questions like “What stands out to you on this card?” (tells you about user priorities and content hierarchy) and “Anything extra? Anything missing?” (tells you about user expectations).


Example 2: Create clickable prototypes to test the user flow. Decide on tasks for your users to accomplish that explore the new feature and require multiple steps to complete, e.g. “You just completed on-boarding; now you want to find a movie you would like to watch that is available on your favorite streaming platforms.” Ask follow-up questions: “You seem hesitant to click this button; what is on your mind as you decide whether or not to click it?” (tells you about user expectations and preconceptions, as well as their emotions).

Last but not least, when interpreting user feedback and translating it into design suggestions for further testing, do so in consideration of other factors like the business goals and technical bandwidth of your product team. As a UX designer, you are the advocate for the user, the spokesperson for business needs, and the translator for your developers.

That’s the general guide I have thus far; I am still learning as I go. Please feel free to comment below if I have missed anything, if you’d like me to elaborate on a certain area, or if you have experiences you’d like to share about moderated usability testing questions and prompts!
