emmabydesign.

Post-test Actions

Thriva | 2023

Thriva are revolutionising healthcare, giving anyone the opportunity to see what’s going on inside their body through self-administered blood testing. The focus of this project was to drive customer retention through a more personalised post-results experience.

The Problem

As a squad, the key business outcome we were looking to drive was an increase in LTV:CAC* over a period of 6 months. A key driver of LTV is customer retention. Using customer data alongside qualitative and quantitative insights, we identified four high-level problems impacting retention:


➡️ It’s hard to understand how behaviours impact blood results over time

➡️ It’s unclear what to do about a failed test and why it happened

➡️ There’s not much to motivate, reward & recognise achievement

➡️ It’s unclear what to do after a set of blood test results
(this was ultimately chosen as the problem area to focus on)


Our app is where most customers view their results, so we decided to prioritise our efforts on this experience rather than our web dashboard.


*Lifetime Value : Customer Acquisition Cost

Existing homepage teardown

Existing results page teardown

Ideation for potential ‘daily check-in’ concept

Testing

Working closely with our Content Designer, User Researcher and Product Manager, we formed two ‘How might we’ statements to focus our efforts:


➡️ Encourage customers to return to the app daily

➡️ Present their next steps in a clear, actionable way


These helped us to come up with designs to test, in collaboration with our Engineers and Data team. Our Researcher ran these as an unmoderated test on UserTesting.com, with participants who were not existing Thriva customers. The test validated all three of our hypotheses:


1️⃣ Introducing a ‘Progress’ tab will make it clearer to users where to go when they want to take action following their results – ✅ Validated


2️⃣ Being able to mark their recommended activities as complete will make users more motivated to take action – ✅ Validated


3️⃣ Presenting users with a daily ‘streak’ incentive will encourage them to return to the app daily – ✅ Validated

Prototype screens (unmoderated test)

Interviews

Several recommendations emerged from our user testing, which we incorporated into the next prototype to test with our own customers.
The key recommendations we followed were:


➡️ Explore alternative name for progress tab

➡️ Simplify the activity cards so the action to complete is clearer

➡️ Introduce motivational messaging when someone completes a task

➡️ Explore the idea of marking an activity as complete directly in the ‘progress’ tab, without having to open a pop-up

➡️ Implement a maximum number of cards in the ‘updates’ tab to reduce overwhelm and ‘banner blindness’


I put together an improved prototype taking these findings into account, then worked with our User Researcher to ensure our discussion guide for the interviews covered our key goals for the next research phase. I then moderated a couple of these interviews, and took notes in some others.

Prototype screens (customer interviews)

Key findings

The feedback we received from customers on this new prototype was encouraging for the most part, with some more helpful insights into further improvements to be made:


1️⃣ Customers were positive about the new collapsed results component at the top of the homepage – they liked being able to access this quickly


2️⃣ There wasn’t enough distinction between the ‘Do’ and ‘Learn’ tabs – some customers felt there was repetition between the two, and it wasn’t immediately clear what ‘Learn’ contained


3️⃣ The content of the ‘Do’ tab was clear, and the grouping of interventions felt focused on their results


4️⃣ The intervention information pop-up provided the expected content, and customers liked the links out to evidence-based articles, but they did want to see more of a link to the result that the recommendation was based on


Many halo insights were raised and recorded as a part of this project. The content and relevance of the personalised recommendations to a customer’s specific health needs was mentioned multiple times, flagging the need for a clinical review of the content and also a technical review of how these recommendations are surfaced for each customer.

“Tells you everything that you need in that box up there. I guess it draws your attention more to your results, which is really important for me to know” (Participant B)


“I think that learn tabs really quite important. I’m just, I’m not sure why I missed it when I went into your next steps. I just didn’t see it.” (Participant E)


“If I knew that I was gonna click on the page and it was gonna recommend the same thing every time, let’s say I did a test and didn’t do another one for another six months, then I probably wouldn’t visit this page again” (Participant A)

Build

As the build phase was due to begin, a new business need was raised, which required us to move the ‘next steps’ portion of the work into the next business period for further iteration and improvement. This meant we had to pivot our build strategy to be much more lightweight in the short term.


We decided to focus our efforts on surfacing more relevant actions on the app homepage, as well as signposting customers to their next steps in the existing format (at the bottom of the results screen) directly from the homepage.


Together with our Content Designer, we came up with a series of ‘timely action cards’ to surface various information to the customer at different points in their test cycle and overall customer journey. I then worked with our Backend Engineer to develop a prioritisation framework for these cards, ensuring that a maximum of 3 were shown at any one time. This was in response to customer feedback that the homepage could easily become overwhelming if too much information was presented at once.
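The prioritisation logic described above can be sketched roughly as follows. This is a minimal, hypothetical illustration, not Thriva's actual framework: the card names, priority values, and relevance flags are all invented for the example, and the only behaviour it demonstrates is the one stated in the text (filter to relevant cards, rank them, cap the homepage at three).

```python
from dataclasses import dataclass

# Cap from the customer feedback: never show more than three cards at once.
MAX_VISIBLE_CARDS = 3

@dataclass
class ActionCard:
    name: str
    priority: int      # lower number = more important (illustrative scale)
    is_relevant: bool  # e.g. based on where the customer is in their test cycle

def select_cards(candidates: list[ActionCard]) -> list[ActionCard]:
    """Filter to relevant cards, rank by priority, and cap at three."""
    relevant = [c for c in candidates if c.is_relevant]
    relevant.sort(key=lambda c: c.priority)
    return relevant[:MAX_VISIBLE_CARDS]

# Example: five candidate cards, four relevant — only the top three are shown.
cards = [
    ActionCard("Results ready", 1, True),
    ActionCard("Book next test", 2, True),
    ActionCard("Complete profile", 4, True),
    ActionCard("Read article", 3, True),
    ActionCard("Kit reminder", 0, False),  # not relevant at this point in the journey
]
shown = select_cards(cards)
print([c.name for c in shown])  # ['Results ready', 'Book next test', 'Read article']
```

In practice a framework like this would also need tie-breaking rules and per-card eligibility logic, but the core idea is the same: a ranked shortlist with a hard display cap.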

Designs for homepage build

Prioritisation matrix for timely action cards

Project outcomes

As previously mentioned, the ‘next steps’ screen was identified as an opportunity to be a part of a future piece of work. This then became a key feature in our new membership offering which was developed and launched 6 months after this project ended.


Due to shifting priorities in the Data team, we were unable to obtain detailed insights on the performance of the ‘timely action’ cards and new results component. However, we remain optimistic that this is a vast improvement to the customer experience, giving much clearer direction on what to do next when customers launch the app.