Role: Product Designer
Timeline: 10 weeks
Team Members: 4 Product Designers
Advisors: Ava Sazanami, Evan Sarantinos
This past spring, I was part of a Directed Research Group (DRG) within the University of Washington Department of Human Centered Design & Engineering (HCDE) and the Seattle Coronavirus Assessment Network (SCAN). SCAN offers a COVID-19 at-home test kit service that delivers to residents of King and Pierce Counties in the Seattle area.
THE PROBLEM
The Kit Activation step is a crucial part of the process: it ensures that test samples are not mixed up in the lab. In this step, users take a picture of the barcode on their test tube and fill out an online form with their personal information and barcode number.
If a kit has not been activated and a mix-up occurs in the lab, it is much harder to identify whose sample is whose, and it takes longer for the user to get their results.
SCAN wants the testing process to be as quick as possible for its users, including while samples are being analyzed. When the lab cannot verify a user's results, the process is delayed, and results may even be reported incorrectly.
By comparing SCAN's RedCap form submissions with the number of kits ordered, we found that 20% of test-takers did not complete this step, a relatively large share for such an important part of the process. From this, we asked ourselves the question:
How can we better understand the kit activation process and improve the kit activation rate?
METHODOLOGY
To understand kit activation and the overall procedure of the SCAN study, the entire research team received COVID-19 test kits and completed them as if we were participants in the study. I noted pain points during the kit activation process, such as a tendency to push it off until later or to skim the instructions, and carried these observations into our study design and our questions for participants.
(Fun fact: although I was well aware of the problem we were trying to solve, I actually became one of our target users as I almost forgot to activate my own kit... good for our research, bad for my ego when I admitted it to the team.)
To target participant recruitment toward those who have historically had issues with the kit activation process, we analyzed a summary of SCAN's Kit Activation Compliance, breaking it down by demographic data. Through this process we identified a higher incompletion rate among people of color and among participants aged 50 and up, as seen in Figures 1 and 2.
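For illustration, here is a minimal sketch of how a breakdown like this could be computed. The column names and sample rows below are hypothetical placeholders, not SCAN's actual data or schema, which we only accessed through summary spreadsheets:

```python
# Hypothetical sketch: one row per kit ordered, with a flag for whether
# the kit was ever activated. These columns and values are invented.
import pandas as pd

kits = pd.DataFrame({
    "participant_id": [1, 2, 3, 4, 5, 6],
    "age_group":      ["18-29", "50+", "30-49", "50+", "18-29", "50+"],
    "race_ethnicity": ["White", "POC", "POC", "White", "POC", "POC"],
    "activated":      [True, False, True, True, False, False],
})

# Incompletion rate per group: share of kits that were never activated.
incompletion_by_age = 1 - kits.groupby("age_group")["activated"].mean()
incompletion_by_race = 1 - kits.groupby("race_ethnicity")["activated"].mean()

print(incompletion_by_age)   # in this toy data, highest for 50+
print(incompletion_by_race)  # and for participants of color
```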
We conducted this usability study to observe how users interact with the boxes and to learn more about possible reasons they may be skipping the Activate Kit step. We had users complete the entire testing process (opening the box, taking their COVID test, and so on) on video with us, and we asked them questions post-test.
We coded each interview in an Excel sheet to identify insights and key themes in the data. We did this by organizing each participant's demographic data alongside their reasons for enrolling in the study, suggestions about the kit activation step, pain points, when the activation step was noticed, reasons for activation or non-activation, the first thing they noticed, and positive points.
This data was then organized into insights sorted by how often each insight appeared, as shown in Figure 5. These insights were then categorized into key findings to begin generating recommendations.
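As a rough sketch of the frequency count behind Figure 5 (the codes and counts here are invented for illustration; our actual coding was done by hand in Excel):

```python
# Illustrative only: tally how often each coded insight appears across
# interviews, then sort by frequency. The codes below are made up.
from collections import Counter

coded_interviews = [
    ["skimmed instructions", "activated late", "wanted fewer steps"],
    ["skimmed instructions", "did not notice activation step"],
    ["activated late", "skimmed instructions"],
]

insight_counts = Counter(
    code for interview in coded_interviews for code in interview
)

for insight, count in insight_counts.most_common():
    print(f"{insight}: {count}")
```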
FINDINGS & RECOMMENDATIONS
FINDING 1: MISINTERPRETING THE "ACTIVATE KIT" STEP
We observed in our interviews that participants would read but misinterpret the purpose of the "Activate Kit" step. We believe this issue comes from the step's current wording in the pamphlet, which can be seen in Figure 6.
The current wording makes it sound as though participants only need to activate the kit in order to receive their results, which makes the step a lower priority for them.
RECOMMENDATION 1.0: CHANGING THE WORDING
We recommended changing the wording of this step to emphasize that it should be completed immediately, not just when the participant is looking for results.
This could be done by using immediate wording such as “Activate Now”, to create urgency and bring attention to the step.
RECOMMENDATION 1.1: SWITCHING ORDER OF THE STEPS
Switching the order of steps one and two would ensure that participants do not forget to take the barcode photo, so even if they do choose to activate later, they will have the information they need.
FINDING 2: EFFICIENT ORDER OF MATERIALS
When opening the kit, participants tended to focus on the swab and tube first (since their goal was simply to take the test), rather than activating their kit and reading the instructions.
Two participants did not read the Quick Start Guide. We believe this is more likely for people who already have experience with COVID tests, especially self-swab tests, which these participants confirmed; they skimmed the Quick Start Guide in order to get the test over with quickly.
RECOMMENDATION 2.0: ADDING A BREAKABLE STICKER
A recommendation is to add a breakable sticker around the opening of the tube as another reminder to activate the kit. That way, the users would have yet another chance to activate their kit before opening the tube and taking the test.
RECOMMENDATION 2.1: CHANGING THE ORDER
Participants interacted with the kit materials in the order each item was placed in the box: two participants noticed the tube and swab before the activation step, since those items sit on top of the instructions.
We recommended arranging the materials in the kit in the order they are used during the test. For example, the instructions would sit on top, then the tube and swab, with the rest of the items below.
RECOMMENDATION 2.2: ADDING A COLORED "CARD"
To increase the perceived importance of the Quick Start Guide, we recommended adding a brightly colored activation "card" placed on top of all of the materials in the box. The card would force test-takers to interact with the reminder, preventing them from getting tunnel vision and reaching straight for the swab and tube.
However, it should be noted that this suggestion may not be accessible for individuals with color blindness (1-8% of the population), and would need to be tested as it might not have the intended effect.
FINDING 3: CONVENIENCE AND COMFORT
All of our participants said they chose SCAN because it was a convenient and comfortable way to take the test at home. Some of our participants faced limitations that prevented them from getting an in-person test, as shown in the figure below:
RECOMMENDATION 3: TARGETING OUTREACH
Because of limitations like these, we recommended targeting outreach toward the communities that currently benefit most from the at-home tests, including bilingual and multicultural communities.
In addition, it is important to avoid overwhelming participants with text messages and to reduce content overload. Being spammed with texts noticeably made users more stressed about the entire process, when it should feel as easy and relaxed as possible.
FINDING 4: NOTICEABLE POPULATION OF BILINGUAL PARTICIPANTS
The participants in our study were primarily bilingual, exposing possible language barriers in the kit activation process. Despite being bilingual, all of our participants chose English as the language of their kits, and one participant was not even aware that a translated kit was an option.
RECOMMENDATION 4: MARKET TRANSLATED KITS BETTER
This language barrier could be one cause of misinterpretation of the kit activation step, and by advertising translated kits more effectively, SCAN can offer bilingual individuals a better user experience. Completing the kit in their primary spoken language limits the translation errors that can occur when using the English kits.
It is also important to ensure the kit activation step itself is properly translated, again to limit errors due to a language barrier. Because a language barrier was not the primary focus of our study, we believe this demographic of multilingual participants should be studied further to better understand their specific needs and pain points in the SCAN at-home testing process.
FINDING 5: OVERALL SATISFACTION
Despite the confusion associated with the kit activation step, participants expressed that the steps to collect a nasal swab were overall easy to understand and complete. We also observed this in our interviews, where participants skimmed the text and felt comfortable enough to complete the test without fully reading the pamphlet. Some participants even expressed that the process could be done in fewer steps, and that "if anyone is able to read, it is clear".
RECOMMENDATION 5: REDUCING TEXT IN PAMPHLET
We believe there is still room for improvement in the overall process. By smoothing out issues with kit activation, we can reduce the room for error for participants. We also recommended reducing the amount of text in the pamphlet, to discourage skimming and keep the process as simple as possible.
RETROSPECTIVE
Finally, we believe some moderator bias likely affected how participants acted while on calls with us. As with all usability studies, the Hawthorne Effect (the tendency for people to act differently when they know they are being observed) may have led participants to act the way they believed we wanted them to.
NEXT STEPS
Since the study lasted only the length of a quarter, we were not able to gather data on the effect of implementing our recommendations into the kit. Thinking toward the future, if I were still on the team, our success metrics might include:
When activating their kit, users have to fill out a form on RedCap, so we could track how many kits were sent in and compare that number to the activation responses.
This would show whether people are visiting the form but still treating it as a lower priority or too much work, or whether they are simply not seeing the form and brushing it off at first glance. A minimal sketch of this comparison is shown below.
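As a hedged illustration of how that metric could be tracked over time, assuming hypothetical counts rather than real SCAN numbers:

```python
# Hypothetical sketch of the proposed success metric: compare the number
# of kits sent in against RedCap activation submissions. The counts are
# placeholders, not SCAN data.

def activation_rate(kits_sent_in: int, activation_forms: int) -> float:
    """Share of returned kits that were also activated on RedCap."""
    if kits_sent_in == 0:
        return 0.0
    return activation_forms / kits_sent_in

baseline = activation_rate(kits_sent_in=1000, activation_forms=800)      # ~80% before changes
post_change = activation_rate(kits_sent_in=1000, activation_forms=920)   # hoped-for improvement

print(f"Baseline activation rate: {baseline:.0%}")
print(f"Post-recommendation rate: {post_change:.0%}")
```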
This was my first time participating in research, so I felt a bit of imposter syndrome joining this research group as a freshman among sophomores and juniors already studying Human Centered Design & Engineering. At first, it was difficult to ask questions and be open about my inexperience in user testing, since it felt like everyone already knew what they were doing. However, as I put in more effort to be an active participant in our discussions and be more open with my team members, I realized that even the sophomores and juniors weren't completely familiar with user research. It is normal and valid to feel intimidated by a group of upperclassmen, but I knew it was important to remind myself that we were all there to learn, and that staying intimidated would only hold me back.
I am also proud of how much I learned this past quarter: I gained a lot of insight into how usability tests are run and into the different methods we used to analyze data and come up with solutions. One of the reasons I love design is the real-world impact it can have, and in this study, I felt that I made meaningful contributions to my community! I hope that SCAN will see positive results once they implement our recommendations into their COVID-19 at-home test kits.