As I write this, I’m flying home from the Agile2018 conference in San Diego, produced by Agile Alliance. I curate content for Agile Alliance and act as the product owner for the organization’s website, so I end up doing a variety of things at the conference.
A few months back we ran some surveys to get feedback on agilealliance.org. One common theme from those surveys was that the site was hard to navigate. Unfortunately, we weren’t able to dig deeper into what exactly people meant when they said that.
While at Agile2018, I wanted to try some usability testing of the website to see if we could figure out why people found the site difficult to navigate. I also wanted to get some practical experience performing usability testing, as I haven’t had the opportunity to do much of it before.
Joe, the web content manager, helped me with the usability testing. Over the course of the conference, we spoke with 10 people, twice the five participants usually suggested for usability testing.
Here’s a description of how we approached the usability tests and what I learned as a result.
How we approached usability testing
A main point of usability testing is to reach out to your users to get their feedback. This past week offered an excellent opportunity to do that because the conference drew together over 2300 people. Finding volunteers was not going to be a problem.
Every afternoon, Joe and I parked ourselves next to a big monitor in an open foyer outside a group of meeting rooms. This area provided a place for people to stop by, recharge their phones, have a good chat, or learn more about Agile Alliance activities.
We asked for volunteers on Twitter and included notes about what we were doing in the daily conference email.
We also tried to recruit interested victims – oops, volunteers – throughout the course of the day.
As hinted in those messages, we used a “small token of our appreciation” to entice people to participate. At Agile2018, drink tickets are a valuable currency, so we gave a drink ticket to every volunteer. Think of the drink ticket as the equivalent of us buying the volunteer a coffee.
Most volunteers were genuinely interested in providing feedback, and the drink ticket was a nice little extra incentive. Two of our volunteers did usability testing in their day jobs, so they were especially interested (and provided some good tips on how we could improve our approach to the usability testing).
When someone stopped by to participate in the usability test, we had them sit down and use my Mac to navigate around the website. We projected the computer onto the big screen so that Joe and I could see what was going on while our volunteer navigated through the site.
I’d ask each volunteer whether they used the site frequently. If they did, I asked them to demonstrate what they typically do on the site.
If they didn’t use the site much, I’d ask them to think about a topic they wanted to learn about at the conference and then ask how they would go about finding more information on that topic. I’d also ask them to narrate what was going through their head as they worked and to call out anything surprising or unexpected that happened. (I should note I wasn’t always successful in remembering to explicitly ask that question.)
We’d then watch as the volunteer navigated the site. I facilitated the conversation and Joe took notes about what the volunteer said and what was going on the screen.
Joe also noted things we identified that needed to be fixed. Some items the volunteer pointed out; others Joe or I caught as a result of seeing someone new go through the site.
We let the volunteer determine how long the session was based on how much time they wanted to spend. Usually the sessions lasted somewhere between 15 and 30 minutes.
Lessons learned
This was a great exercise. Not only did we get a better understanding of how people use Agile Alliance’s website, but I also learned a great deal about how to do (and how not to do) usability testing.
I won’t go through the findings about the site itself, but I do want to highlight some of the things I learned about usability testing in general.
Consider people’s technology preferences. I used to be a PC user but switched to Macs a few years ago. Somewhere along the way, I forgot the struggles I had getting adjusted to the Mac when I first switched. We faced that this week with a couple of volunteers who were PC users.
If I were to do it over, I would have had both a Mac and a PC available, or at least a mouse for people to use with my Mac. (Volunteers struggled the most with the trackpad on the Mac.)
Give people specific tasks to do, but make those tasks relevant. We let our volunteers identify the task they were going to do so that it would be relevant for them.
If we had asked them to do some random task they weren’t really curious about, they might not have put as much thought into it, or they might have tried harder to say what they thought we wanted to hear rather than actually completing the task.
When we asked people to show us how they used the site, we got a better idea of the paths that returning visitors had found through the site.
When we talked with someone who hadn’t used the site much, we were able to see how someone new to the site figured things out, and how hard it was for people to do that.
Ask people to narrate what they’re thinking as they work. I tried to explicitly ask each volunteer to narrate what they were trying to do, why they were trying to do it, and what they were thinking as they did it. It’s helpful to know what questions come to mind as they work through the site, and it helped us see the site through their eyes.
Resist the urge to help people find something. Sometimes a volunteer who was doing a good job of narrating their thoughts would ask a question such as “How would I…?”
You’re trying to find out how people actually use the site when you’re not around, so the worst thing you can do is answer that question rather than watch how they figure it out for themselves. Watching them struggle, and seeing how they worked it out, provides a great deal of insight into what you can do to improve the usability of your product.
We did have one volunteer, who happens to work in UX, call me out on this when I was about to direct him to a certain point.
Follow up on unexpected events or surprises. I could have dug deeper when a volunteer said things like “that’s interesting” or “I didn’t expect that to happen.”
If you hear phrases like that, ask what they mean and why it was surprising. You can often identify cases where the product is working as designed but is doing something the volunteer didn’t anticipate or doesn’t find intuitive.
In our case, several of our volunteers built websites for a living, and they were not shy about pointing out things that did not seem intuitive; some even suggested different design choices.
Resist the urge to explain or defend your design choices. When someone questions your design choices or suggests a different approach, make note of the feedback and say thank you, but don’t try to explain why it is the way it is.
People using your product don’t care. They only care that they can use it for what they’re trying to get done. Use the feedback as a starting point to see if you can change the way things are done to make it easier to use.
In some cases you will be able to change things. In other cases, you may not be able to do exactly what the volunteer suggested, but you can use that input to find a different way that works within whatever constraints you faced.
It can be refreshing, and terrifying, to see the site through your users’ eyes. You get so used to how things are supposed to work with your product that you don’t realize how confusing it is for people unfamiliar with the site. It gives you a fresh view of things and can reinvigorate your efforts to make your product better.
What are your usability testing insights?
The best time to do usability testing is probably while you are still building your product, so that you can hit the mark right off the bat. My experience this past week tells me there is still great value in doing usability testing on an existing product, because it helps you identify opportunities for improvement.
What lessons have you learned from usability testing and what tricks or tips do you have? Please share them in the comments.
Addendum August 23, 2018
I ran across an article from Andy Warr that shared 10 usability testing pro tips. There are some great tips there, some I was already aware of and a couple that would have been really helpful, including the note template he suggests as #5.