Coordinating and Performing the Study

Pointers for Facilitators

Facilitators play an important role in explaining the test beforehand, walking through the test with the participant, and taking notes for analysis. Before testing, follow your institution’s policy for research with human subjects, which is usually administered by an Institutional Review Board (IRB).

  • Always get consent from a test participant.
  • Have a data protection plan for personally identifying information, such as names, identification numbers, email addresses, telephone numbers, and birth dates.
  • If a participant decides not to complete a test during the session, that’s okay; they can stop at any time.
  • Help test participants understand that they are helping you; the interface is being tested, not them.
  • Stick to the time limits for each activity.
  • Ask participants to think aloud while completing test activities. This helps note-takers understand participants’ thoughts as they occur and as they attempt to work through issues they encounter. It also elicits real-time feedback and emotional responses that are important for understanding whether an interface works.
  • Take good notes: capture as much detail as possible, and record what participants say out loud as they move through the tasks. This will help with analysis later.
  • Capture subjective observations you notice during the test, such as signs of frustration, confusion, or satisfaction.
  • Be neutral during a test. Simply watch and listen to participants. When they ask a question, reply with “What do you think you would do?” or “I’m more interested in what you think you would do.”
  • Don’t jump in and help. This will bias the results. If someone wants help, either reassure them that they can’t make any mistakes, or end the task and move on to the next one.

This is an example template for a consent agreement.

Publishing Your Results – You need to submit an application to your institution’s Institutional Review Board (IRB)!

If you plan to publish your research results, be sure to comply with your institution’s Institutional Review Board requirements. This will promote the safety and well-being of human participants, ensure adherence to the ethical values and principles underlying research, and alleviate public concerns about the responsible conduct of research.

Resources

With Measurable Usability Goals – We All Score (Usability.gov)
Explains how to set goals that are precise, measurable, and task-focused, and includes a template for documenting goals.

Decision Frames: How Cognitive Biases Affect UX Practitioners (Nielsen Norman Group)
How a fact is reported makes a big difference in how a report’s consumers perceive it; this page explains framing bias and how to avoid over-influencing decision making.

Markless, S., & Streatfield, D. (2013). “Thinking about evidence.” Evaluating the impact of your library (2nd ed.). London: Facet Publishing.
Chapter 8, section 8.4, “Ethical Evidence-gathering.”

Communicating User Research Findings (UX Matters)
Advice on how to choose reporting formats, with excellent tips on getting the most important points across in a strong, effective way.

Making Usability Findings Actionable: 5 Tips for Writing Better Reports (Nielsen Norman)
Five things to focus on in reporting results to convey the most important changes to be made quickly, and to inform future efforts.

Usability Test Report (Usability.gov)
This downloadable report template in Word format has explanations and examples for each feature of the report.

Respectful Interactions & Understanding Participants

  • About Users: Overview
    The About Users section of the University of Cambridge’s Inclusive Design Toolkit focuses on understanding different types of user capabilities and how they affect product interaction in general.
  • Inclusive design toolkit by the Government of Ontario
    The Canadian province of Ontario has created an inclusive design toolkit for working respectfully with people who have different accessibility needs. Its inclusive design cards can help you identify interactions and processes for being respectful to people of all abilities.

Resources Needed for User Testing

The resources you will need to conduct user testing depend on what is available at your institution beyond the people coordinating the testing. As you move forward with putting your design into practice, think about what you need in terms of these three types of resources:

  • Test Participants
  • Recruitment Incentives
  • Technology

Test Participants

You will need at least five participants from each user group (e.g., undergraduates, graduate students, research faculty). For guidance on interacting with users, see the Working with Test Participants component of this guide.

Sometimes you will not be able to get all the users you need. Some institutions will focus on only one specific group, undergraduates for example. Who you choose should depend on context and what you are testing; for example, testing Zotero could focus solely on graduate students.

You don’t always need to test every user group. Depending on the goals you set for whatever you are testing (for example, a faculty-focused or student-focused tool), five participants may be enough, and you don’t need someone from every group. Remember, any user testing is better than none at all.

Recruitment Incentives

Recruitment incentives are a key way to get test participants to engage in your study. What you can offer depends on your overall budget and the amount of time required for each test.

Examples

  • Gift cards ($5-10)
  • Cup of coffee
  • Piece of pizza
  • Doughnuts
  • Candy
  • Grocery Store Gift Cards
  • Library Swag, e.g. awesome mugs and pens

Basic Technology Needed to Perform Testing

  • Desktop or laptop computer
  • Wireless internet access
  • Chrome, Firefox, or Safari
  • Mouse
  • Microphone
  • Eye Tracking software

Assistive Technology

You should always try to do user testing with someone who uses an assistive technology device; it helps make systems more inclusive for people of all abilities. At a minimum, test with a screen reader. Below are some recommendations and how-tos:

Video Screen Capture Tool

It is best practice to collect user testing results using video screen capture software. The video files will help you review testing results and reference specific patterns. Note: if a participant objects to screen recording, you can use a stopwatch or the timer on your phone to capture task times instead. Below are some recommendations for screen capture tools.

Resources

Talking with Participants During a Usability Test (Nielsen Norman)
Focuses on the importance of facilitators talking less and learning more when working with users. A good resource for thinking through facilitator behavior during usability studies.

User Guidelines for Usability Research (Nielsen Norman)
Provides guidelines for test implementers assuming the role of observer or notetaker. Topics include how to observe a research session and how to take notes, with an example of what user-testing notes look like.

How to find great participants for your user study (Google Ventures)
Advice from a Google Ventures UX researcher with how-to information for recruitment. Includes a worksheet for devising a user screening tool.

Recruiting Test Participants for Usability Studies (Nielsen Norman)
Although this Nielsen Norman page is geared mainly towards hiring an outside recruiter, with information on the probable costs, it also includes a link to a comprehensive report on recruitment.

5 Steps to Usability Testing (Yale University)
If you want a usable site, you have to test. Learn how to conduct a usability test in 5 steps.

How to Conduct Usability Testing from Start to Finish (UX Mastery)

Checklist for Planning Usability Studies (Nielsen Norman Group)
Planning a user test? Follow these 9 steps to make sure you are prepared.

Running a User Test

Determine where to conduct testing and get your resources in order

Figure out whether you are going to run your test formally or use the hallway method outlined below. Testing people in their natural information-seeking environment is preferred, but running tests in a quiet space works as well.

Recruit Test Participants

Getting people to participate in your test requires outreach. Think about the following before you try to engage them:

  • Where are your users?
  • How do you reach them?
  • What do you need to communicate to them?
  • What do you have to offer them?
  • What librarians, archivists, faculty, and other academic staff or contacts do you know who can help you spread your message?

Hallway Method

“One creative user testing method which can make recruiting participants easier is ‘hallway’ usability testing. For instance, in the university environment, if students are your users of interest you can often set up your user study in a cafeteria, library, commons room or hallway and select people as they walk by.” (Selecting and Recruiting User Test Participants).

  • Prepare testing space and computer
  • Approach a person and ask if they would like to participate in user testing
  • Provide an overview of what the test is about and why you are running it
  • Try to select participants who match the types of users you need
  • Be prepared to engage in testing right away

Formal Recruitment

  • Reach out to potential participants by email, phone, email lists, social media, flyers, or ads in student newspapers
  • Select specific people that match the group(s) to be tested
  • Schedule a time and place to conduct testing
  • Make sure test participants know the intention and purpose of the test

Perform the test

  1. Make sure the location is selected and secured and the testing instruments are set up
  2. Introduce the participant to the test and have them sign the consent agreement
  3. Give the participant tasks to complete and make sure they think aloud concurrently while working through each activity (see below)
  4. Watch and take notes on the participant’s actions and comments, and note how long each task takes (a minimal logging sketch follows this list)
  5. Give the participant their incentive
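
If a participant has declined screen recording and you are timing tasks with a stopwatch (see the note in the Video Screen Capture Tool section), a short script can stand in for the stopwatch and a notes document. This is a minimal, hypothetical sketch, not part of the original guide: it assumes Python 3 is available on the facilitator’s laptop, and the file name, column order, and sample task are illustrative only.

    # Minimal sketch (illustrative, not from this guide): time each task and
    # log facilitator notes to a CSV file for later analysis.
    # Assumes Python 3; the file name and columns are arbitrary choices.
    import csv
    import time
    from datetime import datetime

    def run_task(task_name, notes_file="session_notes.csv"):
        """Time one task and append the facilitator's notes as a CSV row."""
        input(f"Press Enter when the participant starts: {task_name}")
        start = time.time()
        # Type observations while the participant works; press Enter when they finish.
        notes = input("Observations (press Enter when the task ends): ")
        elapsed = round(time.time() - start, 1)
        with open(notes_file, "a", newline="", encoding="utf-8") as f:
            csv.writer(f).writerow([datetime.now().isoformat(), task_name, elapsed, notes])
        print(f"Logged '{task_name}' after {elapsed} seconds.")

    if __name__ == "__main__":
        run_task("Find a peer-reviewed article from the library homepage")

Run it once per task; the resulting CSV gives you timestamped task durations and comments to review alongside your handwritten notes.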

What is concurrent thinking aloud?

Concurrent thinking aloud is a technique used to understand participants’ thoughts as they interact with whatever is being tested. A facilitator uses prompts like “keep talking” or “mmhmm” to keep the participant talking about their process, encouraging them to keep their stream of consciousness going until the end of the task. This captures real-time feedback and emotional responses.