
Learning from User Testing


By Paddy Breslin on Apr 29, 2017

Valuable lessons from user testing sessions

I am by no means an expert in user testing, but I have learned some really valuable lessons from the sessions I have been involved with. I want to share these, and hopefully my experience will be of some benefit to you in testing your own systems and products. As a UX designer I place enormous value on time spent with users.

Get buy-in from the decision makers at the start

Evaluating any system with users is utterly pointless if there is no prospect of any changes ever being made to it. Nobody would user test a ten-year-old car: it’s no longer in production and its technology is out of date. Equally, if the website or product we are testing is being discontinued or is unlikely to evolve further, then testing may be of little use.

More often, however, the scenario we face is that the decision makers are unwilling to invest in making changes. This can be a legacy of waterfall development: once it’s built, it’s built. These situations force a decision. Do we try to convince the boss to get involved and sanction the fixes required? Or do we accept that the decision is made, no alterations will happen, and live with the product as it is?

Pushing for change

There is another way, however, which is to guerrilla test. This involves conducting simple user tests without necessarily getting the backing of senior management. Unsanctioned testing is risky and can cause friction within the organisation. However, if the problems are serious and management are unconcerned, a guerrilla test can be a great way to gain clarity and highlight important issues. Sometimes it’s better to ask for forgiveness afterwards than to seek permission beforehand, especially if the tests show real users struggling with what are perceived to be simple tasks.

Ideally, all testing should be video recorded, with screen capture plus the subject’s face and voice. This evidence is vital if you are trying to open a discussion and convince people to fix usability problems.

Set your goals: what do you want to achieve?

Regardless of how we are going to conduct our testing, the major work begins before any users are involved. We have to ask ourselves what we are trying to find out. What metrics are we trying to improve: better user engagement, more task completions? Can we identify problems people are having with our system? Are the steps unclear, does the language make sense, are there elements that simply don’t function? It’s important to keep in mind that user testing will not solve all these problems in one go; it is a process that should be repeated periodically.

Outlining and agreeing on goals is important from the start: we need a baseline to compare our results against. Without one we cannot measure changes or identify areas to improve.

User Goals

It is important to talk about the user’s goals from their point of view.

“I want to fill in this text field and then click on this button” - This is not a user goal.

“I want to submit this form quickly so I can go to lunch”  - This is a user goal.

An overarching target should be to identify the problems people face in a realistic context. What are they trying to achieve, and why? Can they do it easily? Are there roadblocks on their journey? Pick actual scenarios people would encounter, and have your testers run through these.

Business Goals

From the business point of view, avoid looking only at the bottom line in user testing. There should be more to this than simple sales or conversion numbers. If it is an e-commerce website, can the user find the product they want even if they don’t purchase it? For an in-house CRM system, can the employee quickly bring up a customer file to speed up a support call? Can a factory machine operator complete a given task quickly and safely?

Real world test conditions

Don’t waste valuable time with real users on things you already know. Simple usability problems should be resolved without any testing: a heuristic review can be done first to fix the obviously nonconforming parts of the system. Industry standards should also be adhered to during development; symbols, terms, and language should all follow established norms. Avoid the temptation to redesign something that is already universally understood. For example, the ‘envelope’ icon means ‘click here to email’ in any language.

For our online classified ad system, a combination of the user goals and business goals gave us a clear picture of what the ideal customer will want to do on our website. From this we created various scenarios for their journey:

- User registers for an account (business goal)

- User places a car for sale (user goal & business goal)

- User searches for a new car to buy (user goal)

With these we were then able to run through a mock user test and ensure we had everything ready to go.

The unknown unknowns

User testing should show you things you have not even begun to contemplate. In one test session, we could see users struggling in the places we knew were a problem. However, we were surprised to see them stuck at a point we felt worked well.

To place a ‘car for sale’ ad, the user could input the car registration number. The system would pre-populate the ad with that car’s details (make, model, colour, etc.), as it was linked to the national motor vehicle registry.

However, all of our testers missed this and instead filled in the details manually, which greatly increased the time it took to create their ads. After the test we spoke to each user and asked them about this section. They told us that the button next to the car registration field was labelled ‘SUBMIT’, and they thought it was for submitting the completed ad rather than the car registration number.


The Cookie Monster - Prepare and practice

Don’t be tempted to cut corners here. In my first user testing session, for example, we forgot to clear the cookies between subjects. This meant that for the next person coming through, the online form was filled in very quickly thanks to autofill, so we could not accurately assess how difficult the form was for them. Remember, if you only have five people coming to a session, one invalid test is a 20% loss of data.

Getting people in from outside to test your product can be time-consuming and often expensive. Don’t waste the opportunity by failing to work out the test flow in advance; make sure you run through a mock test with someone first.

- Do my questions make sense?

- Does the software work?

- Is the audio clear?

Don’t test to find out what you already know

It’s tempting to throw in a scenario that will prove something you already know is a problem. I understand very well that convincing the boss you are not wasting your time with this stuff can be important, and it can be an easy win to show a user struggling with some part of the system. However, if something is that obvious, it should be possible to demonstrate it and present it to stakeholders without the need to test it.

It’s important that as much as possible of what you learn from this process is new; you want to be surprised by the results. You should hope to have a substantial list of previously unknown problems to fix by the end.

Film in the camera

It’s an obvious one, but worth emphasizing. Test the tech and then test it again, and have a checklist of the items you need, especially if you are going off-site: chargers, phones, cables, correct web links, script, consent forms, etc.

Make it fun

It is important that the test subjects are relaxed, so keep the setup friendly, easygoing, and open.

Make sure to emphasise to your testers that the system is under scrutiny, not them. Making mistakes is not a problem: the more we find out about how the product is working, the better we can make it.

Keep the conversation going, and ask them to think aloud as they work through the steps. I find the best insights come from the words they say casually, more so than from where they click or what they look at.

Good luck. Feel free to contact me for advice on user testing. @paddy_be
