The goal of running an A/B test is to gather data to drive decision-making. Here are some of the factors to take into account before making a launch/no-launch decision:
- Do you need to make a tradeoff between different metrics?
Ex. If the engagement rate goes up but revenue goes down, should you launch?
- What is the cost of launching this test?
Ex. What are the costs for developing and maintaining the new feature? Can the expected gain cover them?
- What is the downside of making a wrong decision?
Ex. What is the opportunity cost if you forgo a change that has real impact?
Understand the practical significance of your test
The key is to establish not only statistical significance, but also to decide how big a difference in the main KPIs actually matters from a business perspective: is the difference worth the cost of making this change? In other words, what change is practically significant?
Depending on your business, a 0.2% change in revenue-per-user might be practically significant; in other cases that change might be too small, and you are only looking for changes that improve the metric by 10% or more.
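One way to make this concrete is to check the confidence interval on the lift against your practical-significance threshold, not just against zero. Below is a minimal sketch in Python; the function name, the conversion counts, and the threshold are all hypothetical, and the threshold here is an absolute difference in conversion rate:

```python
import math

def lift_decision(conv_a, n_a, conv_b, n_b, practical_threshold, z=1.96):
    """Compare variant B against control A.

    conv_a / conv_b: conversion counts; n_a / n_b: units per arm.
    practical_threshold: the smallest absolute lift worth launching
    (an assumption you set from your business context, e.g. 0.002).
    Returns (observed diff, 95% CI, statistically significant?,
    practically significant?).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    diff = p_b - p_a
    # Normal-approximation standard error of the difference in proportions
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    lo, hi = diff - z * se, diff + z * se
    stat_sig = lo > 0 or hi < 0           # CI excludes zero
    prac_sig = lo >= practical_threshold  # even the worst case clears the bar
    return diff, (lo, hi), stat_sig, prac_sig
```

A result can be statistically significant (CI excludes zero) yet not practically significant (the lower bound sits below your threshold), which is exactly the "flat practical significance" column of the matrix below.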
| | Main KPI Negative | Main KPI Flat | Main KPI Positive |
|---|---|---|---|
| Practical significance Negative | Iterate, perform deeper analysis on the page or use another insight to do a test | Iterate, perform deeper analysis on the page or use another insight to do a test | Magnitude of change may not be sufficient to outweigh other factors such as costs. Do a deeper ROI-based analysis. |
| Practical significance Flat | Iterate, perform deeper analysis on the page or use another insight to do a test | Iterate, perform deeper analysis on the page or use another insight to do a test | Magnitude of change may not be sufficient to outweigh other factors such as costs. Do a deeper ROI-based analysis. |
| Practical significance Positive | Repeat the test with more units to gain more statistical power | Repeat the test with more units to gain more statistical power | Launch |

Example of a decision-making matrix for understanding practical and statistical significance
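The "repeat the test with more units" cells hinge on statistical power: a rough sketch of the standard two-proportion sample-size formula shows how many units per arm a rerun might need. The baseline rate, the minimum detectable effect, and the z-values (~95% confidence, ~80% power) are assumptions to adjust to your own context:

```python
import math

def sample_size_per_arm(p_base, mde_abs, alpha_z=1.96, power_z=0.84):
    """Approximate units needed per arm to detect an absolute lift of
    `mde_abs` over baseline conversion rate `p_base`.

    alpha_z ~ two-sided 95% confidence, power_z ~ 80% power
    (assumed defaults; standard two-proportion approximation).
    """
    p_var = p_base + mde_abs
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    return math.ceil((alpha_z + power_z) ** 2 * variance / mde_abs ** 2)

# e.g. detecting a 0.5pp lift over a 5% baseline takes roughly
# 31,000 units per arm
n = sample_size_per_arm(0.05, 0.005)
```

Halving the effect you want to detect roughly quadruples the required sample, which is why "flat" results often call for a rerun with more traffic rather than an immediate no-launch.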
- What if your test is flat or losing?
You analyze the treatment and the results across segments, and improve your hypothesis. Then you test again! Be prepared to run many test rounds for each page.
Also be aware of cannibalization effects and side effects from the changes you make. Even a subtle change can affect how visitors perceive your site and how they behave on it.
Use case: Client X ran a test on the visualization of their color swatches. They were surprised to see that the test was losing, even though the design was considered an industry best practice and clicks on the color swatches increased. After an analysis in Contentsquare, they realized this was due to a decrease in the visibility of reviews. Reviews acted as social proof and had a considerable impact on conversion.
Tips:
- We recommend keeping an A/B test follow-up document where you can easily record key contextual information.
- One step at a time: one A/B test = one modification. Don't change too many things on the same page; if you do, you won't be able to isolate the real impact of your modification.
- Are you sure your modification can have a relevant impact?
- Be sure to run your test when users are behaving "normally" (e.g., we don't recommend implementing an A/B test during Black Friday or other sales periods).
Iterate
- What if you end up with a winning test? You test again!
Having an "iterative mindset" is key to a successful CRO program. It's rare to get everything perfectly right on the first try, so plan from the beginning to collect learnings and then iterate.
Conversion optimization is a systematic, repeatable process. You test, measure the impact of your test, then analyze again, look into a different part of the page, build a new hypothesis, and test again. If you don't iterate, you could move on to the next project and miss out on some great opportunities.
Note: Learn how to set up and run your analysis in Contentsquare.