Jun 18, 2024 Digital BIAS

UX Optimisation: Google HEART Framework Experiments by Stage

The HEART framework is central to how we track and evaluate user experience (UX) optimisation across our websites and apps. We’ve moved beyond simply liking something, or applying our own bias, earned or not, to putting decisions in the hands of the buyer. We employ the Growth-Driven Design (GDD) approach across our digital product range.

Sure, your product must be on-brand, with the right messaging and positioning to speak to your website and app visitors, but there’s no guarantee you will get that right the first time. The GDD approach to UX design relies on testing and experimentation to reach the best result.

GDD is about design and dev sprints, while the HEART framework is about UX research, user behaviour, and the customer experience of your product. Together, they make for a powerful web product with a defined path to further evolution. In this article, I will break down a series of experiments you can run at each stage of the framework, so you can understand why a one-off front-end redesign is a waste of time and resources.

Happiness Stage

For a B2B SaaS or fintech application, focusing on the Happiness stage of the HEART framework involves measuring user satisfaction, perceived ease of use, and Net Promoter Score (NPS). Here are some experiments and methods user experience designers can use to gather these metrics:

Experiments for Measuring User Satisfaction

1. User Surveys:

  • Post-Interaction Surveys: After users complete key tasks, prompt them with a short survey asking about their satisfaction with the process.
  • Periodic Surveys: Send out regular surveys (e.g., quarterly) to gauge overall satisfaction with the product. Include questions about specific features and overall experience.

2. Net Promoter Score (NPS):

  • NPS Surveys: Periodically ask users the NPS question: "How likely are you to recommend our product to a friend or colleague?" Follow up with an open-ended question to understand the reasons behind their score.

3. In-App Feedback:

  • Feedback Widgets: Implement feedback widgets within the app where users can rate their experience and leave comments.
  • Feature-Specific Feedback: After users interact with new or updated features, prompt them to provide feedback on their experience.

Experiments for Measuring Perceived Ease of Use

1. Usability Testing:

  • Moderated Usability Tests: Conduct sessions where users perform tasks while being observed. Collect qualitative data on their ease of use and any difficulties they encounter.
  • Unmoderated Usability Tests: Use tools like UserTesting to gather data from users performing tasks independently. Analyse the results to identify common pain points.

2. Task Completion Surveys:

  • Post-Task Surveys: After users complete specific tasks, ask them to rate the ease of use on a scale (e.g., 1 to 5). Include open-ended questions to gather detailed feedback.

3. Heatmaps and Session Recordings:

  • Heatmaps: Use tools like Hotjar to visualise where users click, scroll, and spend the most time. Identify areas where users struggle or hesitate.
  • Session Recordings: Analyse recordings of user sessions to observe their interactions and identify usability issues.
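The post-task ratings and usability-test observations above reduce to two simple numbers: task completion rate and average ease-of-use score. A minimal Python sketch, using hypothetical session records (the task name and 1–5 scale are illustrative):

```python
# Hypothetical session records: task outcome plus a 1-5 post-task ease rating.
sessions = [
    {"task": "create_invoice", "completed": True,  "ease": 4},
    {"task": "create_invoice", "completed": True,  "ease": 5},
    {"task": "create_invoice", "completed": False, "ease": 2},
    {"task": "create_invoice", "completed": True,  "ease": 3},
]

# Share of sessions where the user finished the task.
completed = [s for s in sessions if s["completed"]]
completion_rate = len(completed) / len(sessions)

# Mean of the post-task ease-of-use ratings.
avg_ease = sum(s["ease"] for s in sessions) / len(sessions)

print(f"completion rate: {completion_rate:.0%}")  # 75%
print(f"average ease (1-5): {avg_ease:.2f}")      # 3.50
```

Tracked over time, these two figures tell you whether design changes are actually making key tasks easier.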

Experiments for Measuring Net Promoter Score (NPS)

1. NPS Surveys:

  • In-App NPS Surveys: Periodically display NPS surveys within the app to capture real-time user sentiment.
  • Email NPS Surveys: Send NPS surveys to gather feedback from users who may not be active in the app.

2. Follow-Up Interviews:

  • Interviews with Promoters and Detractors: Conduct follow-up interviews with users who gave high and low NPS scores to understand the reasons behind their ratings and gather deeper insights.

3. Sentiment Analysis:

  • Text Analysis of Feedback: Use sentiment analysis tools to analyse open-ended responses from NPS surveys and other feedback channels. Identify common themes and areas for improvement.
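If you compute NPS yourself rather than relying on a survey tool, the arithmetic is straightforward: classify 0–10 responses into promoters (9–10), passives (7–8), and detractors (0–6), then subtract the detractor percentage from the promoter percentage. A minimal Python sketch (the sample scores are illustrative):

```python
def nps(scores: list[int]) -> float:
    """Compute Net Promoter Score from 0-10 survey responses.

    Promoters score 9-10, passives 7-8, detractors 0-6;
    NPS = %promoters - %detractors, ranging from -100 to +100.
    """
    if not scores:
        raise ValueError("no survey responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores), 1)

# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))  # 30.0
```

Note that passives dilute the score without counting against it, which is why the follow-up interviews with promoters and detractors matter more than the headline number.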

Implementation Example

Goals, Signals, and Metrics

  • Goal: Increase user satisfaction and perceived ease of use.
  • Signals: Positive survey responses, high NPS scores, and fewer usability issues.
  • Metrics:
    • Average satisfaction score from surveys.
    • NPS score and distribution of promoters, passives, and detractors.
    • Task completion rates and time taken to complete tasks.
    • The number of usability issues identified and resolved.

Example Table

 

| Dimension | Goal | Signal | Metric | Target |
| --- | --- | --- | --- | --- |
| Happiness | Increase user satisfaction | Positive survey responses | Average satisfaction score | Increase from 4.0 to 4.5 |
| Happiness | Improve NPS | High NPS scores | NPS score | Increase from 30 to 40 |
| Task Success | Enhance ease of use | Fewer usability issues | Task completion rate | Increase from 85% to 95% |
| Task Success | Reduce task completion time | Faster task completion | Average time to complete tasks | Reduce from 5 minutes to 3 minutes |

By conducting these experiments and tracking the relevant metrics, UX teams can gain valuable insights into user satisfaction and ease of use, ultimately improving the overall user experience of your B2B SaaS or fintech application. Next, let’s look at the engagement stage of the framework.

Engagement Stage

For the Engagement stage of the HEART framework, focusing on metrics that measure user behaviour and how users interact with and consume content can provide valuable insights into their level of engagement. Here are some experiments you could consider:

Experiments for Measuring Content Engagement

1. Personalised Content Recommendations:

  • Experiment with different algorithms to recommend personalised content based on the user's viewing history, preferences, and behaviour.
  • Measure the impact on metrics like the number of videos watched per week, dwell time, and bounce rate.

2. Gamification and Rewards:

  • Implement a gamification system that rewards users for engaging with content (e.g., watching videos, sharing, commenting).
  • Measure the impact on the number of videos watched, shares, and overall engagement time.

3. Content Discovery Features:

  • Test different content discovery features, such as trending sections, curated playlists, or category-based browsing.
  • Measure the impact on the number of new videos/content discovered and consumed.

4. Social Sharing Prompts:

  • Experiment with prompting users to share content they enjoyed at strategic points (e.g., on content completion, right after they finish watching a video).
  • Measure the impact on the number of shares and potential new user acquisition.

5. Content Formatting and Presentation:

  • Test different content formats (e.g., long-form vs. short-form, interactive vs. static) and presentation styles.
  • Measure the impact on dwell time, bounce rate, and overall engagement.

6. Notifications and Reminders:

  • Experiment with different notification strategies (e.g., frequency, timing, personalisation) to remind users about new or relevant content.
  • Measure the impact on the number of videos watched, dwell time, and user retention.

7. User-Generated Content:

  • Encourage and incentivise users to create and upload their own content.
  • Measure the impact on the number of uploads, shares, and overall community engagement.
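Several of these experiments share the same measurement step: aggregating a session log into videos watched per user, bounce rate, and dwell time. A rough Python sketch, using made-up session data and an assumed bounce definition (no videos watched and under 30 seconds of dwell):

```python
from collections import defaultdict

# Hypothetical session log: (user_id, videos_watched, dwell_seconds)
sessions = [
    ("u1", 3, 540), ("u1", 0, 12), ("u2", 5, 900),
    ("u2", 1, 180), ("u3", 0, 8),
]

# Videos watched per distinct user over the period.
per_user = defaultdict(int)
for user, videos, _ in sessions:
    per_user[user] += videos
videos_per_user = sum(per_user.values()) / len(per_user)

# Treat sessions with no videos and under 30s of dwell as bounces.
bounces = sum(1 for _, v, d in sessions if v == 0 and d < 30)
bounce_rate = bounces / len(sessions)

avg_dwell = sum(d for _, _, d in sessions) / len(sessions)

print(f"videos per user: {videos_per_user:.1f}")  # 3.0
print(f"bounce rate: {bounce_rate:.0%}")          # 40%
print(f"avg dwell: {avg_dwell:.0f}s")             # 328s
```

Run the same aggregation for the control and variant groups of each experiment, and the difference between the two is your measured impact.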

Implementation Example

Goals, Signals, and Metrics

  • Goal: Increase user engagement with the platform's content.
  • Signals: Users consuming more content, sharing content, and spending more time on the platform.
  • Metrics:
    • Number of videos watched per week
    • Number of uploads
    • Number of shares
    • Dwell time
    • Bounce rate

Example Table

 

| Dimension | Goal | Signal | Metric | Target |
| --- | --- | --- | --- | --- |
| Engagement | Increase content consumption | Users watching more videos | Number of videos watched per week | Increase by 20% |
| Engagement | Encourage content creation | Users uploading more content | Number of uploads | Increase by 15% |
| Engagement | Foster content sharing | Users sharing more content | Number of shares | Increase by 25% |
| Engagement | Improve content stickiness | Users spending more time on content | Dwell time | Increase by 10% |
| Engagement | Reduce content abandonment | Users not leaving after viewing content | Bounce rate | Decrease by 15% |

By conducting these experiments and tracking metrics like time spent, you can gain valuable insights into user engagement with your platform's content. This information can then be used to refine your content strategy, improve the user experience, and ultimately drive higher engagement and retention. Another powerful way to improve the UX of your website or app is to look into the adoption stage of the HEART framework.

Adoption Stage

For the Adoption stage of the HEART framework, focusing on metrics like upgrades to the latest app version, new subscriptions, and purchases by new users, here are some experiments you could consider:

1. Onboarding Experience Optimisation:

  • Experiment with different onboarding flows and tutorials to help new users understand the value proposition and key features of your app.
  • Test various messaging, visuals, and interactive elements to guide users through the initial setup and encourage them to upgrade or subscribe.
  • Measure the impact on new subscription rates and upgrades from the free to the paid version.

2. Feature Announcements and Promotions:

  • Run A/B tests with different in-app messaging strategies (e.g., banners, modals, push notifications) to promote new features or subscription plans.
  • Experiment with the timing, messaging, and incentives offered to encourage users to upgrade or make purchases.
  • Measure the impact on upgrade rates, new subscription signups, and in-app purchase conversions.

3. Pricing and Packaging Experiments:

  • Test different pricing models, subscription tiers, and bundled offerings to understand user preferences and willingness to pay.
  • Experiment with free trials, discounts, or limited-time offers to incentivise upgrades or new subscriptions.
  • Measure the impact on conversion rates for different pricing and packaging options.

 

4. Referral and Viral Loops:

  • Implement referral programs or viral loops that incentivise existing users to invite new users, who can then be prompted to make purchases or subscribe.
  • Experiment with different reward structures, messaging, and sharing mechanisms to optimise the referral process.
  • Measure the impact on new user acquisition, as well as the conversion rates of referred users to paid plans or purchases.

5. Targeted Remarketing Campaigns:

  • Segment users based on their behaviour, preferences, and engagement levels, and run targeted remarketing campaigns to encourage upgrades or purchases.
  • Experiment with different messaging, offers, and channels (e.g., email, push notifications, in-app messages) for remarketing.
  • Measure the impact on conversion rates for upgrades, new subscriptions, and in-app purchases among different user segments.
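For the A/B tests above, you need a way to judge whether a difference in upgrade conversion between two variants is real or noise. One common approach, sketched here in Python with illustrative numbers, is a two-proportion z-test (your experimentation platform may well compute this for you):

```python
from math import sqrt, erf

def conversion_lift(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an upgrade-prompt A/B test.

    Returns (lift, p_value): the absolute difference in conversion
    rate (variant B minus variant A) and a two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; p-value is the two-tailed area beyond |z|.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Illustrative: variant B (new modal) vs. variant A (existing banner)
lift, p = conversion_lift(conv_a=50, n_a=1000, conv_b=80, n_b=1000)
print(f"lift: {lift:+.1%}, p-value: {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the lift is unlikely to be chance; with small samples, run the test longer before declaring a winner.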

Implementation Example

Goals, Signals, and Metrics

  • Goal: Increase the adoption of paid plans, subscriptions, and in-app purchases.
  • Signals: Users exploring premium features, engaging with upgrade prompts, and initiating purchase flows.
  • Metrics:
    • Upgrade rate from free to paid version
    • New subscription signups
    • In-app purchase conversion rate
    • Referral conversion rate

Example Table

| Dimension | Goal | Signal | Metric | Target |
| --- | --- | --- | --- | --- |
| Adoption | Increase paid upgrades | Users exploring premium features | Upgrade rate from free to paid | Increase from 5% to 10% |
| Adoption | Boost new subscriptions | Users engaging with subscription prompts | New subscription signups | Increase by 20% |
| Adoption | Drive in-app purchases | Users initiating purchase flows | In-app purchase conversion rate | Increase from 2% to 4% |
| Adoption | Encourage referrals | Users sharing referral links | Referral conversion rate | Increase from 10% to 15% |

These experiments and metrics can give you valuable insights into user adoption patterns and help you optimise your strategies to encourage more users to upgrade, subscribe, and make purchases within your app.

Retention Stage

For the Retention stage of the HEART framework, focusing on metrics like the number of active users, renewal rates, and repeat purchases, here are some experiments you could consider:

1. Personalised Content Recommendations:

  • Experiment with different algorithms to recommend personalised content, features, or products based on users' usage patterns, preferences, and behaviour.
  • Measure the impact on metrics like the number of active users, session duration, and repeat purchases.

2. Gamification and Rewards:

  • Implement a gamification system that rewards users for engaging with the app or site (e.g., points, badges, leaderboards).
  • Measure the impact on the number of active users, session frequency, and repeat purchases.

3. Subscription Reminders and Renewal Offers:

  • Test different strategies for subscription reminders and renewal offers (e.g., timing, messaging, incentives).
  • Measure the impact on renewal rates and customer lifetime value (LTV).

4. User Feedback and Support:

  • Experiment with different channels and methods for collecting user feedback and providing support (e.g., in-app surveys, live chat, knowledge base).
  • Measure the impact on user satisfaction, active users, and renewal rates.

5. Referral Programs:

  • Test different referral program structures and incentives to encourage users to invite their friends or colleagues.
  • Measure the impact on new user acquisition, active users, and repeat purchases.

6. Notifications and Reminders:

  • Experiment with different notification strategies (e.g., frequency, timing, personalisation) to remind users about new content, features, or promotions.
  • Measure the impact on active users, session frequency, and repeat purchases.

7. User Communities and Social Features:

  • Introduce user communities, forums, or social features to foster engagement and a sense of belonging.
  • Measure the impact on active users, session duration, and user retention.
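Renewal rate and customer lifetime value can be estimated with back-of-the-envelope arithmetic. The Python sketch below assumes a deliberately simple LTV model, average revenue per user divided by churn (expected lifetime = 1 / churn); real LTV models are usually more involved, and all the figures are illustrative:

```python
def renewal_rate(renewed: int, due: int) -> float:
    """Share of subscriptions due in a period that actually renewed."""
    return renewed / due

def simple_ltv(monthly_arpu: float, monthly_churn: float) -> float:
    """Rough customer lifetime value: ARPU divided by monthly churn."""
    return monthly_arpu / monthly_churn

rate = renewal_rate(renewed=600, due=1000)  # the 60% baseline above
churn = 1 - rate                            # treated as monthly churn

print(f"renewal rate: {rate:.0%}")                        # 60%
print(f"LTV at $50 ARPU: ${simple_ltv(50, churn):,.0f}")  # $125
```

Re-running the arithmetic at the 75% renewal target shows why retention experiments compound: halving churn doubles LTV under this model.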

Implementation Example

Goals, Signals, and Metrics

  • Goal: Increase user retention and encourage repeat usage.
  • Signals: Users actively engaging with the app/site, renewing subscriptions, and making repeat purchases.
  • Metrics:
    • Number of active users
    • Renewal rates
    • Repeat purchase rate
    • Customer lifetime value (LTV)

Example Table

 

| Dimension | Goal | Signal | Metric | Target |
| --- | --- | --- | --- | --- |
| Retention | Increase active usage | Users engaging with the app/site | Number of active users | Increase by 15% |
| Retention | Encourage subscription renewals | Users renewing their subscriptions | Renewal rates | Increase from 60% to 75% |
| Retention | Drive repeat purchases | Users making additional purchases | Repeat purchase rate | Increase by 20% |
| Retention | Improve customer lifetime value | Users staying longer and spending more | Customer LTV | Increase by 25% |

By running these experiments against the right metrics, you can gain actionable insights into user retention patterns and optimise your strategies to encourage continued usage, subscription renewals, and repeat purchases. HEART is a powerful choice among UX optimisation frameworks. Now, let’s dive into the final stage, Task Success, to understand how users and website visitors achieve the outcomes they desire.

Task Success

For the Task Success stage of the HEART framework, focusing on metrics like search results success, time to upload, and profile creation completion, here are some experiments you could consider:

1. Search Relevance Testing:

  • Experiment with different search algorithms and ranking factors to improve the relevance of search results.
  • Measure the impact on metrics like click-through rate (CTR) on search results, number of refinements needed, and task completion rate for finding desired information.

2. Search Query Analysis:

  • Analyse user search queries to identify common patterns, pain points, and areas for improvement.
  • Experiment with query suggestions, auto-complete, and other search aids to help users articulate their needs more effectively.
  • Measure the impact on successful query formulation and task completion rates.

3. Upload Flow Optimisation:

  • Test different upload interfaces, progress indicators, and error-handling mechanisms.
  • Measure the impact on metrics like time to upload, upload success rate, and user satisfaction.

4. Profile Creation Funnel Analysis:

  • Identify potential drop-off points in the profile creation process using funnel analysis.
  • Experiment with different form designs, field placements, and instructions to streamline the process.
  • Measure the impact on profile creation completion rate and time to complete.

5. Onboarding Experience Testing:

  • Test different onboarding flows, tutorials, and guidance mechanisms for new users.
  • Measure the impact on task completion rates for common onboarding tasks, such as profile setup, feature exploration, and initial engagement.

6. Error Handling and Feedback:

  • Experiment with different error messaging, inline validation, and feedback mechanisms during task flows.
  • Measure the impact on task completion rates, user frustration, and the ability to recover from errors.

7. Accessibility Testing:

  • Ensure that task flows are accessible to users with disabilities by testing with different assistive technologies and adhering to accessibility guidelines.
  • Measure the impact on task completion rates and user satisfaction for users with disabilities.
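The funnel analysis described for profile creation boils down to comparing how many users survive each step. A minimal Python sketch with hypothetical step names and counts; here the photo-upload step shows the largest drop-off and would be the first candidate for the form-design experiments above:

```python
# Hypothetical funnel: users remaining at each step of profile creation.
funnel = [
    ("started profile", 1000),
    ("added name/email", 850),
    ("uploaded photo", 500),
    ("completed profile", 420),
]

# Compare each step with the one before it to find drop-off points.
start = funnel[0][1]
for (step, count), (_, prev) in zip(funnel[1:], funnel):
    drop = 1 - count / prev
    print(f"{step:<20} kept {count:>4}  step drop-off {drop:.0%}")

print(f"overall completion: {funnel[-1][1] / start:.0%}")  # 42%
```

The per-step drop-off, not the overall completion rate, tells you where to experiment first.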

Implementation Example

Goals, Signals, and Metrics

  • Goal: Improve task success rates and reduce user frustration.
  • Signals: Users successfully completing tasks, faster task completion times, and positive user feedback.
  • Metrics:
    • Search results click-through rate (CTR)
    • Task completion rate for finding information
    • Time to upload
    • Upload success rate
    • Profile creation completion rate
    • Time to complete profile creation
    • Task completion rates for onboarding tasks
    • User satisfaction scores

Example Table

| Dimension | Goal | Signal | Metric | Target |
| --- | --- | --- | --- | --- |
| Task Success | Improve search relevance | Users finding desired information | Search results CTR | Increase by 15% |
| Task Success | Streamline upload process | Faster and successful uploads | Time to upload; upload success rate | Reduce time by 20%; increase success rate to 98% |
| Task Success | Simplify profile creation | More users completing profiles | Profile creation completion rate | Increase from 70% to 90% |
| Task Success | Enhance onboarding experience | Users exploring features | Task completion rate for onboarding tasks | Increase from 60% to 80% |
| Task Success | Improve error handling | Users recovering from errors | User satisfaction scores | Increase from 3.5 to 4.2 (out of 5) |

That brings us to the end of our look at the HEART UX framework and how you can apply, test, and experiment with it on your business's app or website. These examples can help you understand task success rates and identify areas for improvement, ultimately enhancing the user experience and helping users accomplish their goals more effectively.

I hope you enjoyed the article and are thinking differently about your user experience, web and app design approach, and how you apply the budget. For help with an app or website, please talk to our team through the form in the footer below.
