Analysis Setup
A/B testing compares two versions of something on your site or app to determine which performs better. To start, decide which two versions you want to compare. To measure the difference in impact, you first need to define goal metrics or KPIs; those metrics are what you use to evaluate performance.
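As a minimal illustration of what evaluating a KPI boils down to, the sketch below (Python, with made-up numbers) compares the conversion rate between the two groups and computes the relative lift of the Variant over the Control; all figures are hypothetical.

```python
# Hypothetical session and conversion counts for each group
control_sessions, control_conversions = 10_000, 420
variant_sessions, variant_conversions = 10_000, 465

control_rate = control_conversions / control_sessions   # 4.20%
variant_rate = variant_conversions / variant_sessions   # 4.65%

# Relative lift of the Variant over the Control
lift = (variant_rate - control_rate) / control_rate
print(f"Control: {control_rate:.2%}, Variant: {variant_rate:.2%}, lift: {lift:+.1%}")
```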
A good starting point is to think about why you planned this modification. Was it to:
- Increase the click rate on a CTA?
- Reduce the exit rate or increase page consumption?
- Increase the engagement rate with a specific element?
Think of a specific objective, which may be different from the macro objective of your website.
- For example, a test on a product page could have "Add to Cart" as the primary KPI and "E-commerce conversion" as a secondary objective.
Do not set more than 2-3 objectives: with more, you won't be able to make a clear decision at the end of the A/B test. The more objectives you measure, the more likely your results are to show random fluctuations (see the sketch below).
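Why does this happen? If each objective is evaluated independently at a 95% confidence level, the chance that at least one of them looks "significant" purely by chance grows quickly with the number of metrics. A minimal illustration, assuming independent metrics:

```python
# Chance of at least one false positive across k independent metrics,
# each tested at significance level alpha (illustrative only)
alpha = 0.05
for k in (1, 2, 3, 5, 10):
    p_any_false_positive = 1 - (1 - alpha) ** k
    print(f"{k:>2} metrics -> {p_any_false_positive:.0%} chance of a spurious 'win'")
```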
Learn how to assess the performance of your testing KPIs.
Analyzing the test results
Impact Quantification
The first step is to take a critical look at the results of the A/B test. Starting with Impact Quantification, you can get a view of the session-level metrics for each group.
To do so, open Impact Quantification and, in Comparison mode, apply your testing segments in the Analysis context: Control and Variant. Note down the answers to the following questions:
- Are the results significant?
- What is the effect on conversion and revenue?
- Are visitors in the Variant more likely to stay on the site? Do they have deeper journeys and spend more time on site?
- How likely are they to achieve the page goal? (Are they more likely to add something to their bag or enter the booking funnel?)
- Is the test improving some indicators without translating into an overall improvement in conversion? If so, you need to analyze why. Now is the time to drill down and run a deeper analysis in the rest of the Contentsquare modules.
Note: Impact Quantification doesn't display significance levels for the Revenue and Cart comparisons. Check the significance of the difference in these metrics in your A/B testing tool, or run a quick manual check like the one sketched below.
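For a manual sanity check of significance on a rate metric (for example conversion rate), a common approach is a two-proportion z-test. The sketch below uses `statsmodels` and hypothetical counts; it illustrates the general method, not how Contentsquare or your A/B testing tool computes significance (for a per-session revenue metric you would instead compare means, e.g. with Welch's t-test).

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical conversion counts and session counts per segment
conversions = [420, 465]      # Control, Variant
sessions = [10_000, 10_000]

# Two-sided z-test for the difference between the two conversion rates
z_stat, p_value = proportions_ztest(count=conversions, nobs=sessions)
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")
# A p-value below 0.05 is the usual threshold for calling the difference significant
```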
Page Comparator
Page Comparator is useful for getting an overview of each segment's activity on a given page: did the test impact any of the key UX metrics on the page?
To start your analysis, open Page Comparator and, in Comparison mode, apply your testing segments in the Analysis context: Control and Variant. Then, favorite the page you want to analyze and look at the following metrics:
- Check the exit/bounce rate of the page to see whether your test is helping retain visitors on the page
- Look at the scroll rate to understand whether the change had an impact on how far visitors scroll through the page
- Compare the two segments' activity rates to see whether users in the Variant are more active on average and more likely to engage with the page content
- Apply different conversion goals (linked to the predefined objectives of your test) to understand how the test is impacting the micro-conversions of your page: are your visitors now more likely to achieve the goal you've set? (A rough illustration of how such page-level metrics are computed appears after this list.)
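As a rough illustration of what these page-level metrics represent, the sketch below computes exit rate, average scroll rate, and activity rate per segment from a hypothetical page-view table. The column names and metric definitions are assumptions made for the example, not Contentsquare's exact formulas.

```python
import pandas as pd

# Hypothetical data: one row per view of the tested page
views = pd.DataFrame({
    "segment":     ["Control", "Control", "Variant", "Variant", "Variant"],
    "is_exit":     [True, False, False, False, True],   # did the session end on this view?
    "scroll_rate": [0.35, 0.80, 0.90, 0.60, 0.40],      # share of the page scrolled
    "was_active":  [False, True, True, True, False],    # any interaction on the page?
})

summary = views.groupby("segment").agg(
    exit_rate=("is_exit", "mean"),
    avg_scroll_rate=("scroll_rate", "mean"),
    activity_rate=("was_active", "mean"),
)
print(summary)
```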
Journey Analysis
Use Journey Analysis to understand whether the test has affected the key journeys on the site: do the journeys differ between the two segments?
Open Journey Analysis in two browser tabs: in one, set your Analysis context to the Control segment, and in the other, to the Variant. By looking at the journeys after the page where the test is running, you can understand:
- Where do your visitors go after seeing your test? Are there any notable differences in the browsing journeys?
- Is the change aiding their overall navigation, or disrupting their journeys by creating stumbling blocks? Are they seeing any unexpected pages (e.g., error pages)?
- Are there any looping behaviors: are they more likely to loop back between the page you're running your test on and other key pages? (One way to quantify these journey differences from raw data is sketched after this list.)
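If you export raw journey data, differences like these can also be quantified directly. The sketch below compares the distribution of the page seen immediately after the tested page, and a simple loop rate, per segment; the table layout and page names are hypothetical, not a Contentsquare export format.

```python
import pandas as pd

# Hypothetical page views, ordered within each session
events = pd.DataFrame({
    "session": ["s1", "s1", "s2", "s2", "s2", "s3", "s3"],
    "segment": ["Control", "Control", "Variant", "Variant", "Variant", "Variant", "Variant"],
    "page":    ["product", "cart", "product", "error", "product", "product", "cart"],
})

# Page seen immediately after each view of the tested page ("product")
events["next_page"] = events.groupby("session")["page"].shift(-1)
after_test = events[events["page"] == "product"].dropna(subset=["next_page"])

# Distribution of the next page, per segment
next_page_share = (
    after_test.groupby("segment")["next_page"]
    .value_counts(normalize=True)
    .rename("share")
)
print(next_page_share)

# Loop rate: share of sessions that view the tested page more than once
views_per_session = events[events["page"] == "product"].groupby(["segment", "session"]).size()
loop_rate = (views_per_session > 1).groupby(level="segment").mean()
print(loop_rate)
```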
Zoning Analysis
How does interaction with elements on the page differ between the two versions? With Zoning Analysis you can get an in-depth understanding of user behavior on the page.
Open Zoning Analysis and activate Comparison mode by clicking the Compare button. By default, the two zonings have the same Analysis context applied; change the default segments to match your test segments. Analyze the following metrics:
- Look at the click rate and attractiveness rate. How did the test impact the attractiveness of the key element? Is the element you're testing getting more or fewer clicks?
- Check how the hover rate, time before first click, and click recurrence are trending. Did the test have an impact on interactions in any other way?
- Compare the engagement rate with the rest of the content on the page. Did your test impact the interactions with other key elements on the page?
- Look at the conversion rate of the element you're testing, selecting the goal of your test as the conversion goal. Are users in the Variant having an easier time finding what they're looking for? (A rough illustration of such zone-level metrics appears after this list.)
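To make these zone-level metrics concrete, the sketch below derives hover rate, click rate, click recurrence, and conversion per click for one zone per segment from a hypothetical per-session table. The column names and metric definitions are assumptions for illustration, not Contentsquare's internal formulas.

```python
import pandas as pd

# Hypothetical per-session interactions with one zone (e.g. the tested CTA)
zone = pd.DataFrame({
    "segment":   ["Control", "Control", "Control", "Variant", "Variant", "Variant"],
    "hovered":   [True, True, False, True, True, True],    # session hovered over the zone
    "clicks":    [0, 1, 0, 1, 2, 0],                        # clicks on the zone in the session
    "converted": [False, True, False, True, True, False],   # session reached the test goal
})

def zone_metrics(g: pd.DataFrame) -> pd.Series:
    clicked = g["clicks"] > 0
    return pd.Series({
        "hover_rate": g["hovered"].mean(),
        "click_rate": clicked.mean(),
        # Average clicks among sessions that clicked at least once
        "click_recurrence": g.loc[clicked, "clicks"].mean(),
        # Share of clicking sessions that reached the test goal
        "conversion_per_click": g.loc[clicked, "converted"].mean(),
    })

summary = zone.groupby("segment")[["hovered", "clicks", "converted"]].apply(zone_metrics)
print(summary)
```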
Monitor your test performance
Create a custom Dashboard to monitor the ongoing performance of your A/B test.
- Create a widget for page-level metrics (such as time on page or exit rate) and compare your Control and Variant segments.
- Create widgets for conversion rates (such as viewing a subsequent page, or transacting) and apply the Control and Variant segments.
- Create widgets for specific zones that may differ between versions, and monitor the interactions with these zones (click rate, click recurrence, conversion per click).
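Outside the dashboard, the same kind of monitoring can be reproduced on a periodic export. A minimal sketch, assuming a hypothetical daily export of sessions and conversions per segment, plotting the daily conversion rate of each group:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical daily export: sessions and conversions per segment per day
daily = pd.DataFrame({
    "date":        pd.to_datetime(["2024-05-01", "2024-05-01", "2024-05-02", "2024-05-02"]),
    "segment":     ["Control", "Variant", "Control", "Variant"],
    "sessions":    [5000, 5000, 5200, 5100],
    "conversions": [210, 230, 215, 245],
})
daily["conversion_rate"] = daily["conversions"] / daily["sessions"]

# One line per segment to watch how the gap evolves over the life of the test
for segment, grp in daily.groupby("segment"):
    plt.plot(grp["date"], grp["conversion_rate"], marker="o", label=segment)
plt.legend()
plt.ylabel("Conversion rate")
plt.title("Daily conversion rate by test segment")
plt.show()
```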
Go further
Learn how to move from analyzing your test results to making data-driven decisions.