Doing a usability test involves observing a number of people perform some tasks using your web site. Normal procedure would be to repeat the same test with a few users to get a feel for the different approaches they use, and the problems that they encounter. Three to six users should be a sufficient number to identify common user expectations and the main obstacles that your web site presents to their completing the tasks.
The roles
You will need some users, a facilitator to present the tasks, and an observer (or two) to make notes. If you have the facilities, it may be helpful to video the session (i.e. the computer screen) and / or have a live relay to another room where other interested parties can observe the test.
Roles - the facilitator
To present the tasks to the user, and to encourage them to think aloud whilst using the web site so that the observer can note what the user is thinking and their route through the web site. After each task the facilitator should note down the user's response and ask them whether they are happy that they have completed the task successfully (and if not, why not).
Guidance:
Most people find it difficult to think aloud whilst doing a task. There are a few ways in which you can encourage the user in this test to verbalise their thought processes:
- When the user is deciding what link to click, ask them which options they are deciding between.
- When they have decided, ask them why they are choosing that particular option - you might be surprised at their reasons for choosing one link over another.
- If the user falls quiet and / or looks puzzled, ask them what they are thinking, or if they're looking for something in particular. Please refrain from making suggestions.
- When the user has just arrived at a new page, ask them whether it was what they were expecting.
The observer may assist in eliciting comments from the user.
Roles - the user
The users may be from your intended audience for the site, and this would help in finding out exactly what they are expecting from your site. However, in terms of finding out how easy your site is to navigate, the users can be anybody willing to help out. They do not need any special web skills; indeed, it may be better if they are not experienced in writing web pages - a "Joe Average" standard of web browsing will be sufficient and will reflect the wide variety of ways in which people use web sites.
The user will need some briefing before the test, along the following lines:
"There are a number of tasks for you to complete that involve finding information on the University of Poppleton web site. The aim is to test how easy the web site is to use, NOT to test your skill in getting around web sites. Please feel comfortable to take your time when you make your choices, and to think aloud and communicate with the other team members."
"Try to think aloud whilst you're looking at each page and deciding what to click on. The observer needs to be able to keep a record of how you respond to the pages, what you think about when looking at the page, and how you decide where to go next.
Please don't hurry, this is not a test of your skill, and the other team members will need time to make notes."
If it's true, it may be helpful to add that you didn't design the site, so they need not worry about hurting the designer's feelings. Users can be reluctant to criticise, or may try to say what they think you want to hear, neither of which helps the objective of the test.
Getting started:
You could include instructions on how to get to the test web site, if you don't already have it ready on screen.
Roles - the observer
To observe the user, recording their behaviour and noting their comments. Pay attention to the way the user approaches the task, the route that they take through the web site and what they say about their choices. What are their expectations and what frustrates them?
The observer can also assist the facilitator by asking the user to explain their decisions, or by prompting them to think aloud when they lapse into quietness. Please refrain from making suggestions.
During each task, note down the route that the user took to find the information, and any comments they made about why they chose a particular link or what frustrated them in the tasks.
The tasks
The tasks that we used to run a usability test on a University web site were of three types:
- finding information or simple facts.
- finding two sets of information in order to make a comparison.
- making a judgement, which involved finding enough information to reach a conclusion, and being satisfied that there was sufficient information on which to base it.
The tasks can be ones that you've set, based on your expectations of what someone would want to use your web site for, and / or you can allow users to choose some tasks of their own. By allowing users to choose a few of their own tasks you will gain some insight into what they expect from your web site, which could be different from what you thought.
Here are the tasks we set, with some notes on why they were chosen.
- You have a colleague at Poppleton, Dr Piercemuller, who works in the Science & Engineering Support Unit. What number would you dial from Nottingham if you wanted to speak to them?
This is a straightforward task to begin with. It is interesting to note which route the user takes to find the information - staff directory, or Department list.
- What is the latest time tomorrow that you could make an InterLibrary Loan (ILL) request from the Document Supply Unit in the main library?
Again, a fairly straightforward task with a few possible routes to finding the answer. In this case the information was repeated on two different pages, one of which only gave partial information.
- On which road is Poppleton Court hall of residence?
Another relatively straightforward information finding task, but the information was on a part of the web site not run by the same web team, so different conventions were in use in the navigation system.
- Which course has the higher entrance requirements, French Studies or German Studies?
Users have to find the same fact for two Departments and compare them. The information may be provided by the centre, by the School to which they both belong, or by individual Departments.
- You know that there are a couple of interesting bands playing on campus soon and you want to know which is the cheaper gig, Gorillaz on 29th June or Ian Hunter on 5th June?
Users have to find two facts and compare them.
- Do you think the sports facilities at the University are any good? (It would help to ask the user to say what criteria they are using.)
Users have to decide what their criteria are, and find enough information upon which to base their judgement.
- Would you have a better chance of doing further study after a degree in Chemistry or Physics?
Users have to make a judgement for both degrees based upon information provided centrally and / or by the Departments, and then compare their results.
Materials
It may help to have briefing sheets for your facilitator, observer and user. Here are the sheets we used in our usability test that you may like to adapt for your own purposes:
- Facilitator notes
- Observer notes
- User notes
Drawing conclusions
A usability test is always a learning experience. It can be eye-opening to see how other people use the web, and how much your design might be based on your own experience of the web alone. You may be disappointed at how users got on with your site, but try to focus on what you need to do to make it better.
Don't be overly concerned by users going momentarily astray and then retracing their steps to find another route. If they recognise that they've gone off the path, and can get back to try another route without being frustrated, then the problem is not too serious.
Pay more attention to users' actions than their opinions. After all, you're doing a usability test, not a survey, and the benefit is in seeing what people actually do.
Make a summary of the main problems you saw and any immediate ideas of how to solve them as soon as possible after the test. At this point, try to focus on the specific behaviour or problem you identified, rather than on sweeping generalisations about the site. This will help when you come to identify some potential solutions.
There are a few categories of problem you are likely to find.
- Users can't find the words that they are looking for.
This may be easy to solve by changing some of the labels that you are using for links, e.g. 'Staff listings' instead of 'Internal phonebook'. The users themselves may have provided a couple of suggestions whilst thinking aloud during the test.
- Users are surprised by what they get when following a link.
This could be an indicator of one of two related problems:
a) The text for the link may mean something to you as a developer or as a member of the Department providing the web site, but it may not carry the same meaning for your users. Consider using different text for the link.
b) The user is coming at the information from a different perspective. This may mean some reorganising of information in your site. For example, if a user is clicking on "Study" to find out about the library (because they are studying) and you have put the information under "Services" (because it's a service you provide) then you may need to rearrange your site to meet users' expectations rather than to reflect your own view of the organisation's hierarchy.
- The link they wanted was on the page but they couldn't see it.
Maybe there is just too much noise on the page hiding the important links, or maybe the link could be better positioned or more visible. Be careful, though, that other links aren't adversely affected by boosting the prominence of the one that happened to be needed in your test.
- Users are expecting an element on the page to behave in a certain way, but it doesn't - commonly icons or logos not linking to the relevant home page.
Remember that users have come to your site from somewhere else (maybe into the middle of it) and may want to jump right out as soon as they've found their answer, so make sure you let them do that, or your site will become frustrating.
Revising your site
Most problems identified in usability testing can be fixed by tweaking rather than by a full redesign of the site. However, if you are looking at a fundamental problem in information organisation, or a misunderstanding of what users want from your site, now's the time to get it on the table and start to look at a solution that's informed by what the users have shown they want or expect from your site.
Address the biggest problems first, and the ones that all or most users found problematic.
When making alterations to the site, beware of the temptation to solve a problem by adding instructions or more information next to the link. The problem can more often be fruitfully addressed by removing distracting or competing information.
When your revisions have been made, get some more users in and run another test. This is always going to be a cyclical process, and at every stage real users are the best way to identify how well your site is meeting its objectives.