What’s the problem?
The Sutton website team has been testing concepts for updates to the council website. The challenge is deciding which design is the most effective to take into a beta testing phase.
One of the methods we’re using to validate effectiveness is a first-click test. This is an activity that captures where on a web page a user first clicks or taps when completing a task. It’s a useful method to understand if users find a website clear and navigable.
Why is click testing important?
Research shows that when users follow the right path on their first click, they complete the task successfully around 87% of the time. Success drops to 46% when the first click leads down the wrong path.
These statistics echo the feedback we’re getting about the current website design. We’re hearing that the current design is unclear and often leads users to the wrong areas. The result is a frustrating experience as users are unable to complete their tasks. With this in mind, new designs for the Sutton website need to be thoroughly click tested to ensure users can successfully find what they need.
How do we conduct a click test?
The image below shows two variations of a homepage for the website. Each has a slightly different set of elements and navigation structure. We want to know which variant users find clearest to navigate.
To test usability we set users a simple task: indicate where on the interface you would go to renew a parking permit. The image below shows the responses we received as a heatmap.
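For readers curious how a click heatmap is built, the basic step is binning each user's first-click coordinates into a coarse grid and counting how many clicks land in each cell. The sketch below shows this with illustrative coordinates and cell size; none of the numbers are real test data.

```python
# Minimal sketch: bin first-click positions into a grid for a heatmap.
# Click coordinates and the cell size are illustrative assumptions.
from collections import Counter

CELL = 100  # size of each heatmap cell in pixels (assumed)

# Each response is the (x, y) position of a user's first click
clicks = [(150, 90), (160, 95), (620, 300), (155, 40)]

def bin_clicks(clicks, cell=CELL):
    """Count how many clicks fall into each cell of a coarse grid."""
    return Counter((x // cell, y // cell) for x, y in clicks)

heatmap = bin_clicks(clicks)
print(heatmap)  # cells with high counts become the 'hot' areas of the map
```

In a real test, a tool would render these counts as colour intensity over a screenshot of the page; the counting logic underneath is the same.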
Once we’ve received enough responses, the next step is to analyse the data.
Here we’re looking at three things to confirm whether a design is clear to navigate:
Did users click or tap in places that might lead to successful task completion? In the parking permit heatmap above, we see that Variant A generated clicks in areas unrelated to the task. Variant B, however, received a more accurate distribution. An irregular click scatter might signal that a design is causing uncertainty. In this case, the labelling and structure should be reconsidered.
How quickly did users decide where to click or tap? In the heatmap example above, the average click time for Variant A was 10 seconds. For Variant B it was 7 seconds. A longer decision time may indicate the design of the page could be simplified.
How confident did users feel that their click or tap would lead to task completion? To determine this we asked users to rate how confident they felt on a scale. In the parking permit example, users tended to feel more confident in their decisions using Variant B. Lower levels of confidence suggest the labelling or design of the interface is unclear.
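The three checks above can be reduced to three numbers per variant: the share of clicks landing on a target area, the average decision time, and the average confidence rating. The sketch below computes them from hypothetical responses; the field names, target bounding box, and all values are assumptions for illustration only.

```python
# Minimal sketch of the three first-click metrics over hypothetical data.
# Field names, the target region, and all values are illustrative.

# Each response: first-click position, decision time (s), confidence (1-5)
responses = [
    {"click": (150, 90), "time_s": 6.0, "confidence": 4},
    {"click": (620, 300), "time_s": 12.0, "confidence": 2},
    {"click": (155, 95), "time_s": 9.0, "confidence": 5},
]

# Assumed bounding box of the element that leads to task success
TARGET = (100, 50, 250, 150)  # left, top, right, bottom

def in_target(click, box=TARGET):
    """Return True if a click falls inside the target bounding box."""
    x, y = click
    left, top, right, bottom = box
    return left <= x <= right and top <= y <= bottom

n = len(responses)
accuracy = sum(in_target(r["click"]) for r in responses) / n
avg_time = sum(r["time_s"] for r in responses) / n
avg_confidence = sum(r["confidence"] for r in responses) / n

print(f"accuracy {accuracy:.0%}, avg time {avg_time:.1f}s, "
      f"confidence {avg_confidence:.1f}")
```

Comparing these three figures across variants, as we did for Variant A and Variant B above, is what turns raw click data into a usability decision.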
In this example we are confident that Variant B generally outperformed Variant A for usability. This does not mean the job is finished. We now need to test how our shortlist of design variants performs when users complete other tasks, for example "where would you go to pay a council tax bill?". Depending on what we discover, we may decide to redesign any areas of concern. In this case, we'd repeat the first-click process outlined above, but focus more attention on the problem areas.
When we’re finished, we’ll use the evidence to help us decide which website design to focus on in beta.