How to Do Usability Testing Without a Budget

This post is about doing usability testing on a public library website without a budget for usability testing, and without any extra software beyond a simple survey tool and Google Analytics.
 
So, to start at the beginning, my team completely overhauled our old library website design, and for good reasons:
 
  1. It looked very outdated, because it was basically a lightly modified version of the website we created in the early 2000s.
     a. It was, for the most part, all static HTML pages.
  2. The design wasn’t responsive, making it a headache for patrons on mobile or tablet devices, and for staff members trying to help those patrons.
  3. Certain parts of the website did not meet accessibility standards.

 
Those were the biggest reasons for the switch, but as you can imagine, there were plenty of others.

 
When you are creating a new website, after you think you are done, but before it goes live, you generally want to do usability testing. The only catch is you don’t have a budget. You can’t purchase software to help, you can’t offer gift cards for participation, and you definitely can’t hire outside people to do it for you. Of course, one option is to not do it, hope that everything is fine, and fix any problems later. But my boss and I really didn’t want to do it that way. Since I have worked for different parts of government for almost six years, in three different jobs, I’ve gotten used to doing things on a shoestring (read: no) budget. I always wish I had the money to do things the “right” way, but there never is any. So, I channel Tim Gunn and make it work.
 

Setting Up the Test

 
We decided to use scenario testing. It is one of the best ways to test whether individuals can find information easily.
 
The first thing I did was to identify which audiences we definitely wanted for this process. We determined we wanted feedback from seniors, visually impaired users, and non-native English speakers. We sought these groups in addition to a random sampling of the general population.
 
To create the test, we brought up the list of our users’ top tasks (and how they varied by audience). This is an important document that served as a guiding light for the redesign process. After we looked at our top tasks, we created scenarios based on those tasks.
 
For example, one of our users’ top tasks was finding the hours for their branch. So, we came up with a scenario where users would need to find that information. The scenario was: the tester got an email from the library saying that an item they requested had come in, and they want to know whether they will be able to pick up the item after they get out of work.
 
After we came up with the scenarios, we had to decide how many people we wanted from each group for the data to be meaningful. Many people in the user experience profession say that you don’t need more than five or so people per group. They suggest that after five people you mostly get a repeat of the same opinions and perspectives.
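That rule of thumb traces back to Nielsen and Landauer’s problem-discovery model: if one participant uncovers roughly 31 percent of the usability problems in an interface (their published average, not something we measured), the share found by n participants is about 1 − (1 − 0.31)^n. A quick back-of-the-envelope check in Python, just to illustrate the curve:

    # Expected share of usability problems uncovered by n participants,
    # using Nielsen & Landauer's rule-of-thumb discovery rate of ~31% per person.
    DISCOVERY_RATE = 0.31

    for n in range(1, 9):
        found = 1 - (1 - DISCOVERY_RATE) ** n
        print(f"{n} participants: ~{found:.0%} of problems found")

At five participants the model already puts you around 85 percent of problems found, and each person after that mostly re-finds problems you have already seen.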
 

Recruiting Participants

 
So, with all of that in mind, we reached out to the City. We talked to their Elderly Commission, their Office of Immigrant Advancement, and their Disability Commission to see if they knew of anyone who would be a good fit for testing. For various reasons, we were unable to get participants from the latter two departments. However, we were able to get volunteers from the Elderly Commission, which was great. When the Disability Commission recruitment didn’t work out, we also reached out to a local school for people with visual impairments. We were fortunate in that a few of the Elderly Commission volunteers were also non-native English speakers.
 
The rest of our test participants were recruited through “ambush.” Three or four people from my team would walk around the library with laptops or iPads and ask people if they wanted to spend ten minutes helping us test out the new website. As we did this, we tried to get a variety of ages and ethnicities so that we weren’t only talking with white people, or only high school students. Of course, we weren’t in control of everything; our participants still had to agree to take the test.
 

Doing the Testing and Analyzing the Results

 
I instructed our testers how to run the test. I told them:
  • not to give hints,
  • to return to the home screen after each question,
  • to let participants know they could give up on a task,
  • to take notes on the paths the participants took, and
  • to time the responses.
I also made sure they instructed the participants to narrate their actions and thoughts aloud so we could understand their choices.
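We took our notes on paper, but if you want to keep observations structured for later comparison, one row per participant per scenario is enough. This is only a hypothetical sketch (the field names and sample row are made up, not our actual data):

    import csv

    # Hypothetical log format: one row per participant per scenario.
    FIELDS = ["round", "participant_group", "scenario", "completed",
              "seconds", "path_notes"]

    rows = [
        {"round": 1, "participant_group": "senior", "scenario": "find branch hours",
         "completed": True, "seconds": 95,
         "path_notes": "home > Locations > branch page"},
    ]

    with open("usability_log.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)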
 
After each day of testing, the team met up to discuss our findings. When we found concerns, we discussed making changes, which turned into a makeshift kind of A/B testing. We would test with one group of participants, discuss the problems we saw and possible solutions, and implement those solutions right before we tested the website with a new group of participants, comparing the results to the previous round. It wasn’t scientific, but we thought it was better than not doing it.
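If you logged the attempts in a file like the one sketched above, comparing one round of testing against the next can be as simple as summarizing completion rate and median time per scenario. Again, this is a hypothetical sketch, not a statistically rigorous A/B test:

    import csv
    from collections import defaultdict
    from statistics import median

    # Group attempts by (round, scenario) and summarize each group.
    attempts = defaultdict(list)
    with open("usability_log.csv", newline="") as f:
        for row in csv.DictReader(f):
            attempts[(row["round"], row["scenario"])].append(row)

    for (rnd, scenario), group in sorted(attempts.items()):
        completed = [r for r in group if r["completed"] == "True"]
        rate = len(completed) / len(group)
        med = median(int(r["seconds"]) for r in completed) if completed else None
        print(f"Round {rnd} | {scenario}: {rate:.0%} completed, "
              f"median {med}s among finishers")

If the completion rate goes up and the median time goes down for the scenario you changed, the change probably helped.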
 
We got meaningful results from this approach. It allowed us to make changes that improved how well users were able to complete tasks. It’s hard to say if we would have gotten anything more meaningful with more resources, but having results is better than not having anything. That’s how I was able to do usability testing on a new website without a budget.

