By Katia Kremneva, Lead Data Scientist, and Linzi Ricketts, Senior Product Manager
How can we test the impact of a tailored user experience?
How can we personalise content recommendations to reflect a user's interests, making it more likely that they will engage with our content? Previously, we had struggled to find a way of accurately testing the user impact of personalised content recommendations on our sites. This was largely because tools like Google Optimise are better suited to testing design or small-scale changes than personalised content.
Our in-house recommendation engine is based on behavioural data that we collect from users (e.g. page views, saved items and ratings), and recommendations update as soon as new behavioural data becomes available. We integrated Google Optimise directly onto the page via its JavaScript API, which triggers different logic within the widget depending on whether the user is in a specific experiment and variant. If the user belongs to the test group, the Recommendations API is called and the most relevant content is displayed in the user's hub. Otherwise, the user sees a default set of content.
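The widget logic described above can be sketched roughly as follows. This is an illustrative sketch, not our production code: the experiment ID, the `fetchRecommendations` function and the default content are placeholders, and only the variant check mirrors the pattern Google Optimise exposes via `window.google_optimize.get(experimentId)` once its snippet has loaded.

```javascript
// Placeholder experiment ID and fallback content -- illustrative only.
const EXPERIMENT_ID = 'EXPERIMENT_ID_PLACEHOLDER';
const DEFAULT_CONTENT = ['default-item-1', 'default-item-2'];

// Resolve which variant the user was bucketed into. `optimize` stands in
// for window.google_optimize; it is undefined if the snippet hasn't loaded.
function getVariant(optimize) {
  return optimize ? optimize.get(EXPERIMENT_ID) : undefined;
}

// Decide what the hub widget should render: variant '1' (the test group)
// calls the Recommendations API, anything else falls back to the default set.
async function contentForHub(optimize, fetchRecommendations) {
  if (getVariant(optimize) === '1') {
    return fetchRecommendations(); // personalised items from the Recommendations API
  }
  return DEFAULT_CONTENT; // control group, or Optimise unavailable
}
```

Falling back to the default content whenever Optimise is absent means the widget degrades gracefully if the experiment script is blocked or slow to load.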
We had planned to run the A/B test for a month to confirm our hypothesis that personalised content drives higher engagement; however, after two weeks we reached a clear result. The conversion rate for users who were shown the content most relevant to them was 3.3 times higher than for users shown the default content. As a result, we updated our logic so that personalised recommendations are now displayed to 100% of users who visit their saved items hub.
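For clarity, the 3.3x figure is a relative uplift: the variant's conversion rate divided by the control's. A minimal sketch, with made-up visitor and conversion counts chosen purely to illustrate the arithmetic:

```javascript
// Conversion rate: conversions divided by visitors exposed to the variant.
function conversionRate(conversions, visitors) {
  return conversions / visitors;
}

// Relative uplift of the test variant over the control.
function lift(variantRate, controlRate) {
  return variantRate / controlRate;
}

// Hypothetical counts -- NOT our real test data.
const variant = conversionRate(33, 1000); // 3.3% converted
const control = conversionRate(10, 1000); // 1.0% converted
const uplift = lift(variant, control);    // ~3.3x, matching the shape of our result
```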
After the successful A/B test in the user's hub, we are planning to roll out personalised recommendations to other areas of the site, and to test them with other Immediate brands and communication channels.