User-testing toolkit

Learn how to run activities to get feedback from test users

User testing involves testing a website or product with users to get an understanding of people’s thought processes, actions, and frustrations. The goal is for those responsible for designing a product to get a first-hand look at how people experience the tool when they use it for their own purposes. You can test something that already exists or something that you are in the process of creating.

Usability testing refers to evaluating a product or service by testing it with representative users. Typically, during a test, participants will try to complete typical tasks while observers watch, listen and take notes. The goal is to identify any usability problems, collect qualitative and quantitative data and determine the participant's satisfaction with the product. — Usability.gov

Why do user testing

If you’re creating something that you want people to use, then it’s critical to design it around the needs of users (based on user research), and then test it with real people to see if you have succeeded.

Just like it’s hard for us to judge our own writing, it’s also hard for us to judge our own design work. You may think you have created something perfect, only to discover that people can’t seem to figure out how to use it. It’s better to discover that sooner rather than later!

User testing is used in the private sector to understand the consumers of a specific product. It’s even more important in the public sector, because governments have a responsibility to serve everyone.

As described in our library of user personas for open data, there are many different types of users of open data. Some have lots of data and technical skills while others don’t. Some want raw data while others want summaries. People have all sorts of motivations, including curiosity, government accountability, advocacy, journalism, and business. Getting feedback from a diverse range of test users can help you ensure that you’re meeting this wide range of needs.

Where it fits in with open data

You might use user testing to get feedback on digital tools, products, or new data platforms.

When?

There are several times when user testing could be useful:

  • At the beginning of your tool/product design process, to get feedback on an existing website, product, or process
  • Once you have developed a prototype, to get feedback on that first iteration to help improve the final product
  • After a public release, to inform planning for future improvement efforts

Who?

Who should conduct the user testing? A variety of people in government might be relevant or interested:

  • Open data program manager (or equivalent)
  • IT or digital staff
  • Staff who deal with communications or community engagement
  • Staff who deal with public records
  • Public-facing departmental staff
  • 311 operators

Who should your test users be? If you have created user personas for your open data program or project, then try to find test users who are representative of each of those personas. If you haven’t created personas yet, we recommend doing that first.

Limitations of user testing

User testing is just one way of trying to make your open data projects and portals user-centered. It’s not a replacement for other co-design methods, which you should also use.

Additionally, you should only do user testing if people’s feedback will actually be incorporated. So, before user testing, make sure to secure a commitment from decision-makers that user feedback will be taken into account. If you were to do user testing on something for which no further changes will be allowed, then that could leave the test users feeling like their time was wasted.

The only exception might be if you want to use the user testing as a way of demonstrating to decision-makers that users have valuable feedback that should be incorporated during a design process — but if this is your intention, make sure that is made clear to the test users upfront.

How to do user testing

Here is one set of steps for user testing. See the “Further reading” section below for additional methods, as well as details on issues like recruiting participants and synthesizing feedback.

1. Introduce yourself

Say something like:

“Hi, I’m __. We are working on a project to improve the __ website, and we’re interviewing current and potential users in order to understand more about how they use the [website/tool]. If you are interested in participating, we would love to hear about your experience. The interview will take twenty minutes or less, and all feedback will be anonymous. Can we talk?”

2. Arrange the group

You will need three roles for your user testing:

  • One person will be the test user
  • One person will be the question-asker
  • One person will be the note-taker

It is possible to do this with only two people — with the question-asker also taking notes — but this is much more challenging. Try as hard as you can to have separate people asking questions and taking notes.

If you are conducting multiple interviews, it’s a good idea to let people experience both the question-asker and note-taker roles.

3. Introduce the task(s)

Say something like:

“Next we will need to use a computer and web browser. I will ask you to complete a couple of tasks. This is not a test of you, and there are no right or wrong answers. As you complete the tasks, I’d like you to narrate what you’re thinking, what you’re looking for, what you expect to find, and your thoughts and opinions about what you’re seeing. Do you have any questions for me?”

4. Do the task

Now, ask the person to complete several tasks, such as finding certain information. Make sure to start with a blank browser window, to see how the user lands on your site in the first place. (Are they going to the home page, or arriving at a specific page via a Google search?)

Alternatively, instead of asking them to perform specific tasks, you could ask your test users to explore your website or product and use it as they wish. If you do this, you should still ask them to narrate their thoughts out loud, and try to get them to talk about what they are drawn toward, what frustrations they have (if any), and what questions they have.

How this should work:

  • As the user tries to complete the tasks, they should say out loud what they are thinking and why they are making the choices that they are making. This may seem awkward, but it’s really important to keep them talking.
  • If the user stops saying what they are thinking, ask questions like: “What do you notice? Does it make sense to you? What are you looking for?”
  • Take notes and document where they get stuck or are having difficulty. Also try to see where on the screen they are focusing their attention.
  • The question-asker should not take over control of the computer or give directions on how to do anything.

Here is an excellent example, courtesy of Sonja Marziano, of how to ask the right questions, focused on user needs and experiences:

  • What you want to know: First impressions of your website or tool
    Bad question: “Do you like this homepage?”
    Good question: “Review the homepage. What do you think this website does? Who do you think is the target audience?”
    Reason: “Do you like this homepage?” is too general — it’s a yes-or-no question.
  • What you want to know: Whether completing a task is easy
    Bad question: “Click the ‘Buildings’ link and select the Building Permits dataset, then sort by date and find the type of the most recent permit.”
    Good question: “Find the most recent building permit issued. What type of permit was it?”
    Reason: The first question tells people how to do it. That’s not good, because you’re trying to find out how people would do it on their own.
  • What you want to know: How to improve the user experience or discover issues
    Bad question: “How would you change this to make it better?”
    Good question: “What did you find confusing? What would be less confusing?”
    Reason: The second question is a bit more specific. That said, at the very end, you could still ask a catch-all question about any other suggested improvements.

After they are done with the activity, ask them some general questions, including what they found challenging.
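The note-taker can record observations however they like: on paper, in a shared document, or in a spreadsheet. If you would like a structured digital log that makes it easier to spot recurring problems across participants, here is one minimal, purely illustrative sketch in Python (the field names and sample entries are just examples, not a prescribed format):

```python
# Illustrative sketch only: one way to log user-testing observations so that
# recurring problems stand out when you review notes across participants.
from collections import Counter
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    participant: str         # anonymous ID, e.g. "P1"
    task: str                # the task you asked them to complete
    stuck_at: Optional[str]  # where (if anywhere) they got stuck
    quote: str               # what they said out loud at that moment

# Example entries (made up for illustration)
observations = [
    Observation("P1", "Find the most recent building permit", "dataset search",
                "I'm not sure what to type here."),
    Observation("P2", "Find the most recent building permit", None,
                "Oh, there's a Permits category."),
    Observation("P3", "Find the most recent building permit", "dataset search",
                "Is this under Buildings or Permits?"),
]

# Tally the places where participants got stuck, most common first.
stuck_points = Counter(o.stuck_at for o in observations if o.stuck_at)
for place, count in stuck_points.most_common():
    print(f"{count} participant(s) got stuck at: {place}")
```

However you record them, the point is the same: capture what happened and where people struggled, so patterns are easy to see when you synthesize feedback later.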

5. Conclude

Say something like:

“Thank you so much for taking the time to share your thoughts today. Do you have any other questions for us?”

6. Iterate

Using the feedback that you got from your user testing, you can then go and make changes to your website, tool, or product. While this alone should be an improvement, it would be even better to then do another round of user testing. This serves two purposes: (1) to see whether your changes addressed people’s challenges; and (2) to see whether your changes accidentally created any new usability problems.

You can do as many rounds of user testing as you want! Also, you can always do additional user testing after an initial public launch.

Further reading

This is not an exhaustive guide to user testing. We encourage you to explore resources others have created, especially for more details and ideas on methods. Here are some suggestions: