Learn about the different key performance indicators available when using the Error Analysis module.
Sessions with errors
The number of sessions that have encountered an error.
Sessions with error after click
See errors that are linked to clicks to help you better correlate the error to user frustration.
Use the 'Sessions with error after click' column to see the number and percentage of sessions with errors that occurred after clicks.
A session with an error after click occurs when, on a pageview, a user clicks an element at least three times and each of these clicks is followed by an error within 2 seconds.
You can sort errors linked to clicks to prioritize the errors impacting your business and improve your users' experience.
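As a rough illustration of that rule (the event shapes and names here are hypothetical, not Contentsquare's data model), a pageview qualifies when at least three clicks on an element are each followed by an error within 2 seconds:

```python
def has_error_after_click(clicks, errors, min_clicks=3, window=2.0):
    """Illustrative sketch of the 'error after click' rule.

    clicks, errors: timestamps (in seconds) recorded on one pageview for
    one element. A click 'qualifies' if some error occurs within `window`
    seconds after it; the pageview matches when at least `min_clicks`
    clicks qualify.
    """
    qualifying = [
        t for t in clicks
        if any(0 <= e - t < window for e in errors)
    ]
    return len(qualifying) >= min_clicks

# Three clicks, each followed by an error within 2 seconds: matches the rule.
print(has_error_after_click([1.0, 5.0, 9.0], [2.0, 6.5, 10.5]))  # True
```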
Note
Sessions with errors after click are sorted by percentage unless there are fewer than 30 sessions. Errors with fewer than 30 sessions are sorted by a ranking or weighting of attributes.
Missed opportunity ($)
The revenue lost as a result of the difference in conversion rate (e-commerce) between sessions that encountered the error and those that didn't.
The Missed opportunity metric only applies to e-commerce accounts and can only be applied to an e-commerce goal.
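The article does not spell out the exact revenue formula. One plausible sketch, assuming the missed opportunity is the conversions lost to the conversion-rate gap valued at an average order value (an illustrative model, not Contentsquare's documented calculation):

```python
def missed_opportunity(sessions_with_error, cr_without_error, cr_with_error,
                       average_order_value):
    # Assumption: revenue lost = conversions lost due to the conversion-rate
    # gap, valued at the average order value. Illustrative only; not
    # Contentsquare's documented formula.
    lost = sessions_with_error * (cr_without_error - cr_with_error)
    return lost * average_order_value

# e.g. 100 error sessions, 65% vs. 54% conversion rate, $80 average order:
print(round(missed_opportunity(100, 0.65, 0.54, 80.0), 2))  # 880.0
```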
Impact on goal (%)
The difference in conversion rate between sessions that encountered the error and those that didn't.
Lost conversion (#)
The number of conversions lost as a result of the difference in conversion rate (reference goal) between sessions that encountered the error and those that didn't.
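This definition reduces to a small formula: the number of sessions that hit the error, multiplied by the conversion-rate gap. A minimal sketch, using the mobile figures from the FAQ table at the end of this article:

```python
def lost_conversions(sessions_with_error, cr_without_error, cr_with_error):
    # Conversions lost: the sessions that encountered the error, multiplied
    # by the conversion-rate difference versus error-free sessions.
    return round(sessions_with_error * (cr_without_error - cr_with_error))

# 600 sessions with errors, 17% conversion without the error, 16% with it:
print(lost_conversions(600, 0.17, 0.16))  # 6
```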
Statistical Significance
Hover over the symbol displayed in the Missed opportunity and Lost conversion columns to see whether the value is statistically significant or not.
"Not significant" means the segments and/or the conversion-rate difference are too small for us to confidently say the lack of conversions is correlated to the error.
Note
Statistically significant missed opportunities are prioritized first.
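Contentsquare does not document which statistical test it applies. A common way to test whether a conversion-rate difference like this is significant is a two-proportion z-test, sketched here with illustrative numbers:

```python
import math

def two_proportion_z_test(conversions_a, n_a, conversions_b, n_b):
    # Standard two-proportion z-test (illustrative; not necessarily the test
    # Contentsquare uses). Returns the z statistic and two-sided p-value.
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Small segments: 65 of 100 error-free sessions converted vs. 54 of 100
# sessions with the error. The gap looks large, but the samples are small.
z, p = two_proportion_z_test(65, 100, 54, 100)
print(p > 0.05)  # True: "not significant" at the usual 5% level
```

With ten times the traffic and the same rates, the same gap becomes significant, which is why small segments are flagged as "not significant".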
Lost Conversion Sum on a page group level
This is the sum of conversions lost as a result of the difference in conversion rate (reference goal) of those who encountered an error on the page group, versus those who didn't.
FAQs
Why don't the Lost Conversions on mobile and desktop add up to the total?
When analyzing the Lost Conversion sum on mobile and desktop separately, you might wonder why the individual results don't add up to the combined device total. This is not a mistake; it is expected behavior due to a statistical phenomenon known as Simpson's Paradox.
Let's take a look at an example in Contentsquare's Error Analysis:
| Device | Sessions with Errors | Conversion Rate without Error | Conversion Rate with Error | Lost Conversions |
| --- | --- | --- | --- | --- |
| Mobile | 600 | 17% | 16% | 6 |
| Desktop | 100 | 65% | 54% | 11 |
| Both | 700 | 40% (combined) | 21% (combined) | 133 |
You might expect the sum of Lost Conversions to total 17 (6 + 11), not 133. However, this is an example of Simpson's Paradox: the results of separate groups (mobile vs. desktop) behave differently once combined. The overall average is dominated by the mobile results, which include significantly more sessions with errors (600) and a much lower conversion rate than desktop, inflating the combined Lost Conversion figure.
Because of this statistical phenomenon, we recommend always looking at such metrics by group (like device) and in total, to uncover both sides of the story.
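The table's arithmetic can be checked directly. This short sketch reproduces the per-device and combined Lost Conversion figures (lost = sessions with errors × conversion-rate gap) and shows why they don't sum:

```python
# Reproduce the Lost Conversion figures from the table above.
rows = [
    # (device, sessions with errors, CR without error, CR with error)
    ("Mobile", 600, 0.17, 0.16),
    ("Desktop", 100, 0.65, 0.54),
    ("Both", 700, 0.40, 0.21),  # combined rates, dominated by mobile
]

for device, sessions, cr_without, cr_with in rows:
    lost = round(sessions * (cr_without - cr_with))
    print(f"{device}: {lost}")

# Mobile: 6, Desktop: 11, Both: 133 -> 6 + 11 != 133 (Simpson's Paradox)
```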