# Learn User Experience Testing

Learn how to test your app with real humans.
## Why?
This is probably the most important and useful skill any "creative technologist" has.
Well-conducted user experience testing helps you ensure, at each stage of development, that what you are planning to build is going to solve the problem for your users that you set out to solve.
Done properly, it prevents you from wasting time, effort and money on guessing what your users want, need or understand, by getting them to critique your product as you go rather than retrospectively.
Testing usability and following usability guidelines is not something we do to restrict ourselves, or because we're told to. Understanding usability is about ensuring users can use our product effectively.
## What?
User experience testing is a means of collecting feedback from users on whether the UX/UI presented allows them to achieve their desired goal in the fewest steps.
It should be performed from the very beginning of a project and throughout, in a cycle of:

- gathering feedback on your designs from users ->
- adapting your designs in light of user feedback ->
- getting feedback on your amended designs to see if they now meet the users' needs ->
- once you are satisfied the users' needs have been met, you can begin to build the agreed designs.
User experience testing cannot be performed as a 'token gesture' and still produce insightful results. Care must be taken to follow the guidelines properly so that the results are not invalidated by confirmation bias.
## How?
There are many approaches to user experience testing; which you choose will depend on your product, stage of product development, budget, client base, etc.
The first stage of user testing you may wish to conduct is known as the discovery phase. For more information on this stage see: https://www.gov.uk/service-manual/agile-delivery/how-the-discovery-phase-works
To perform usability testing you may use one or more of the following methods:

- Interviews
- Surveys/questionnaires
- Usage pattern analysis using "analytics" data. For more on this see: https://github.com/dwyl/learn-analytics
- Screen recordings #1
- App usage/demo video capture
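Analytics data can show you where users abandon a journey without you ever meeting them. As a minimal sketch (the event-log shape, page paths and funnel steps below are invented for illustration, not taken from any particular analytics tool), you could count how many distinct users reach each step of a key flow:

```python
from collections import defaultdict

# Hypothetical event log: one record per page view, as an analytics export might provide.
events = [
    {"user": "a", "page": "/"}, {"user": "a", "page": "/signup"},
    {"user": "b", "page": "/"}, {"user": "b", "page": "/signup"},
    {"user": "b", "page": "/welcome"},
    {"user": "c", "page": "/"},
]

funnel = ["/", "/signup", "/welcome"]  # the user journey we care about

# Collect the distinct users who reached each step of the funnel.
users_per_step = defaultdict(set)
for e in events:
    if e["page"] in funnel:
        users_per_step[e["page"]].add(e["user"])

for step in funnel:
    print(step, len(users_per_step[step]))
```

A sharp drop between two steps points at the screen worth investigating in your interviews.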
Ensure that when you conduct tests you test across devices (e.g. mobile, tablet and desktop) so that your data reflects all of your users' experiences.
## How to conduct interviews
- Create a script to run through with your testers
- Decide who you will interview
- Find somewhere appropriate to conduct your interviews
- Decide on a time/date to perform your research
- Decide who in the team will conduct the interviews
- Decide how many interviews you wish to do
- Decide how you will record the findings of your interviews
### 1. Writing a script
A script is a helpful way of ensuring interviews are delivered in a consistent way (if different team members conduct them) and that the interviewer can feel relaxed because they know what they've got to say. It's important that scripts aren't read like an autocue, as you want the tester to feel relaxed too, so making them more like a normal conversation is beneficial.

However, if you are anxious about missing something out, one way to deal with this is to tell the participant: "I'm going to read through this part of the process to make sure I don't leave anything out." Telling them what you are doing and why can alleviate worries they may have constructed otherwise: you're reading it so they benefit from all of the information, not because this is a formal setting and you're avoiding their eye contact!

Make sure the team familiarises themselves with the script before they start. The script doesn't need to be followed word for word, but team members should bear in mind that certain word choices are important in order not to influence the tester's responses.

Here is a script outline that might be of use for some projects: https://docs.google.com/forms/d/e/1FAIpQLSfM2Uaje8gpBde-RqU01DlxrGUEsWZrjiy8yTKmYaeJYAZUuw/viewform
The script came from this 5 minute Google tutorial: https://www.youtube.com/watch?v=0YL0xoSmyZI. Remember to always tailor your script to what you are testing, e.g. you may not need as many introductory questions as this one, depending on what you are aiming to find out from your testing at this stage.
These are some of the key points from it and other resources (see the list at the bottom of this readme):
#### Introduction
- Ensure your screen is set to something non-distracting (not a view of the product) during the introduction to the session, e.g. a blank Google search page
- Thank the tester for taking part
- Explain the context of the research, e.g. "We're asking people to try using a website we're working on to make sure it works as intended..."
- Explain what they should expect, e.g. "In a minute I will show you the application and I'll ask you some questions about it; it will take about 10 minutes..."
- Explain what is expected of them, e.g. "As you go through the application I want you to think aloud, to voice whatever you're thinking or feeling as you go through it." This "think aloud" method is crucial for getting inside a user's head.
- Encourage honesty and help them relax by letting them know there are no right or wrong answers; you are not testing them: "Please be honest, this is a test of the application, not of you."
- Invite questions: "If you have any questions as you go through the test, feel free to ask them. I may not be able to answer them right away, since we're interested in how people do when they don't have someone sitting next to them to help, but I'll make sure to answer them at the end."
- Inform the tester if you are going to record any of the test (e.g. screen recording or audio recording) and what the recording will be used for: "It will only be seen/heard by people working on the site; it will not be used outside of the project. Recording the test helps me as I don't have to take as many notes..."
- Ask them if they have any questions after your introduction, before you start showing them the application.
#### Initial Questions
- Ask basic questions about the participant that are easy for them to answer and not too sensitive. This helps them warm up and also helps you learn a bit more about them as a user, e.g. "What's your occupation?"
- You might want to learn more about their habits with technology, to gain an understanding of how technologically competent they are (giving context to their responses), or whether they're familiar with the device you are testing your product on in that test, e.g. are they an Android or iOS user?
#### Exploring your wireframes / product
- For a homepage exploration you might prompt them with these questions before showing them the page: "Look at this page and tell me what you think about it..." "Whose site do you think it is?" "What strikes you about it?" "What do you think you could do here?" "What do you think this site is for?"
- For testing a specific flow or user journey, tell them about the task you want them to try and complete. You can read the task aloud and give the participant a written copy for their reference. The task should give the user a scenario to follow; this might include any relevant background details about them, why they are on your site, how they came to know about the site and what they are looking to do on the site.
- Reiterate that you want the participant to think out loud as they go along.
- Ask them about other applications they might use that relate to your product. Consider carefully whether you do this before or after showing them your own application. The order in which you do things will prime the tester to have whatever you have just discussed on their mind; consider whether this is helpful in the context of your research or not.
- Wrap up and ensure to ask your tester if they'd like to ask you anything. It's not only polite but also sometimes reveals subjects that the test failed to capture: "I think that's everything, do you have any questions for me?"
- Be comfortable with silence. When someone is exploring the application, don't fill the awkward silences by telling them what to do next; wait for them to talk. Responses like "I'm not sure what to do now, where does this page go?" are really useful because they show you that your designs are not self-explanatory.
- Avoid leading questions and questions that can only be answered yes/no. Open-ended questions encourage people to give more detail and depth of analysis.
- Be mindful of the language you use to respond to the tester. If a tester is talking about whether they like the app or not, saying "ok" in confirmation rather than "good" is more neutral. Saying "good" would suggest that you are pleased with their response and may influence them to give other answers they think you would like to hear.
### 2. Who
If you have existing users for your product, reach out to them and ask if any of them would like to participate in some research to help improve the product. If you don't have any existing users yet, aim to test your application on people who represent your target user group. Even when you do have an existing user base, testing with people who are new to your product can offer a different perspective to those who already know it and what it does.

If you are struggling to find participants, try encouraging people by informing them of how long it takes to participate (10 minutes is reasonable), or offer something to thank them for participating, e.g. buy them a coffee if you are in a coffee shop. Just be mindful that you don't want what you offer as a thank-you to influence the responses they give. You may get a better response rate for participation from people who are on their own.
### 3. Where
Key factors for determining where to conduct your interviews are:

- Proximity and comfort: how relaxed will your interviewee feel where you're meeting them? Is it easy for them to get to? Would they feel relaxed in that environment? Is there somewhere comfortable for you both to sit? Is it too noisy? Or, if what you're discussing is potentially sensitive, is the environment private enough?
- Practicality: what space do you have available for your time/budget? Do you have a meeting room that you can use? If not, what about a public space like a coffee shop, or somewhere relevant to your product, e.g. a leisure centre if your product is a fitness application?
- Bias: will the location of your interview influence your interviewee's responses in any way? E.g. would a staff member respond differently to a question about procrastination at work when asked in their open-plan office in front of their colleagues, compared to a setting outside their office, or a private space?
### 4. When
Choose a time that suits both you and your interviewees. If you're looking to source interviewees on the day, pick a time when your chosen location is likely to have plenty of people around, e.g. office hours for an office; would people be happier to talk on their lunch break, or are they likely to leave the office altogether? Does the day of the week or time of year make a difference? Are there any key events within your company or in the public eye that would be good to coordinate your research with? E.g. conducting research in January to coordinate with 'Dry January' when researching a product aimed at people interested in low/no alcohol drink options.
### 5. Who will conduct interviews
One or two people is sufficient (you don't want to make your tester feel uncomfortable). Everyone involved in your product should take a turn at some point (developers too!)
### 6. How many interviews
According to Jakob Nielsen, up to 85% of core usability problems can be found by observing just 5 people using your application (see https://www.youtube.com/watch?v=0YL0xoSmyZI and https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/). The reason for this is that the core usability problems are easy to spot for people who are new to your application (and difficult for you to spot, as you are too close to the product to see how real people perceive it for the first time). The 'magic 5' approach suggests that you find out the most from the first person you speak to, then a little less from the next person, and so forth.
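The Nielsen Norman Group article linked above models this diminishing return with the formula found(n) = N(1 - (1 - L)^n), where N is the total number of usability problems and L (about 31% in their studies) is the share a single tester uncovers. A quick sketch shows why 5 testers already gets you to roughly 85%:

```python
def fraction_found(n, l=0.31):
    """Fraction of usability problems found by n testers, assuming each tester
    independently uncovers a share l of them (Nielsen's L of about 31%)."""
    return 1 - (1 - l) ** n

# Each extra tester adds less than the one before.
for n in range(1, 7):
    print(n, round(fraction_found(n), 2))
```

The curve flattens quickly, which is the argument for running several small rounds of testing (and fixing issues in between) rather than one big study.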
### 7. Recording the findings of your interviews and interpreting them
Agree before you conduct your interview who is going to take notes and in what format or whether you're going to record the session (the screen or audio). Think about how you will later collate your results to see patterns from them. Once you have all of your findings discuss them with your team to find trends and see how you can make improvements based upon them.
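One lightweight way to spot patterns when collating is to code each observation with a short issue tag and count how many participants hit each one. A minimal sketch (the tags and session data below are made up for illustration):

```python
from collections import Counter

# Hypothetical coded notes: participant -> set of issue tags observed in their session.
sessions = {
    "p1": {"missed-signup-button", "confusing-nav"},
    "p2": {"confusing-nav"},
    "p3": {"missed-signup-button", "confusing-nav", "slow-load"},
    "p4": {"confusing-nav"},
}

# Count how many participants encountered each issue.
issue_counts = Counter(tag for tags in sessions.values() for tag in tags)

# Most frequently observed issues first - discuss these with the team.
for tag, count in issue_counts.most_common():
    print(f"{tag}: seen by {count} of {len(sessions)} participants")
```

Using sets per participant means an issue is counted once per person, not once per occurrence, which matches how you would read the results of a handful of interviews.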
Testing with small sample sizes like 4 or 5 people is worthwhile when often the alternative is no testing at all. However, you shouldn't rely on statistics from such small groups: if 1 in 4 people take issue with something, don't treat it as '25% of users' and therefore something that must be changed. Consider what the issue was for that one person and interpret it with your own knowledge to decide whether you should change the designs based on their response.

Remember that small and simple amends are just as good as, if not better than, a total redesign. Just because you have areas to improve doesn't mean you should start again from scratch. It's also better to make your existing product work than to add new feature after new feature when the original product isn't working well for users yet.

If you have made changes to the existing product and people are reacting badly to them, remember that aversion to change is normal. Give it a bit of time to allow users to adjust to the new designs, or leave assistance mechanisms in place to point people in the right direction and teach them the new way of doing things.
Be conscious of bias: think about how every aspect of your set-up (time/location, interviewee demographic, script etc.) could influence the responses you gathered in conducting your research. Could interviewing loved ones about your idea expose only the positives, because they don't want to give you negative feedback for fear of hurting you?
## Resources
- A demo of a 25 minute usability test: "Rocket Surgery Made Easy" by Steve Krug, Usability Demo: https://youtu.be/QckIzHC99Xc
- Jakob Nielsen: "Mobile Usability Futures" (Google Talk, 1hr): https://youtu.be/sELOUAmFHjA - a talk on usability in general rather than usability testing specifically; notes from this talk have been integrated into this readme.