By William Storey and Nam-Anh Tran

As anyone who works in digital marketing knows, a key part of continuously improving media performance is running tests of changes in spend or activity to determine how much influence we can expect on our KPIs.

However, carrying out a test is only half the job: without a solid methodology for measuring its true impact, the exercise becomes pointless. In this blog, we will explain two methods for measuring the impact of media tests and show how you can use them to gain maximum insight from your media experiments.


Intro to the test

In this blog, we will be referring to a test we ran for a telecommunications service provider. This test involved a reduction in spend for two paid search channels, Generics and Shopping; the UK was split into three test regions:

  • 25% of the UK would be a control group where no testing would take place.
  • 25% of the UK would be given no spend for Shopping.
  • The remaining 50% would be given no spend for both Shopping and Generics.

The performance of sales was monitored throughout the test period in order to determine how much value Generics and Shopping were bringing to the business.

There were two stages to this analysis:

  • Before the test started, estimating the effect the cut in spend would have on overall sales.
  • Creating baselines that forecast what sales performance would have looked like had no test been run, to serve as a comparison.

Incremental uplift

Using converting and non-converting user journeys that contained Generics and Shopping, we could calculate the incremental value of each channel by comparing average path conversion rates with and without it. We could also calculate the incremental uplift these channels provided to various other channels. This meant we could estimate not only how many conversions we would lose from Generics and Shopping directly, but also what sort of halo effect we could expect on other channels, allowing us to determine which channels would be most affected and to create a forecast.
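To make the comparison concrete, here is a minimal Python sketch under illustrative assumptions: the journey data is made up, and `incremental_uplift` and `halo_uplift` are hypothetical helpers for the path-level conversion-rate comparison described above, not our production code.

```python
# A toy illustration of path-level incremental uplift. All data here is
# synthetic; real journey data would come from attribution exports.
import pandas as pd

# Each journey is the ordered list of channels a user touched, plus whether
# that journey ended in a conversion.
journeys = pd.DataFrame({
    "path": [
        ["Generics", "Brand Paid Search"],
        ["Shopping", "Organic"],
        ["Organic"],
        ["Brand Paid Search"],
        ["Generics", "Organic", "Brand Paid Search"],
        ["Shopping"],
    ],
    "converted": [1, 1, 0, 1, 1, 0],
})

def incremental_uplift(df: pd.DataFrame, channel: str) -> float:
    """Average conversion rate of paths containing `channel` minus the rate
    of paths without it -- a simple proxy for the channel's incremental value."""
    contains = df["path"].apply(lambda p: channel in p)
    return df.loc[contains, "converted"].mean() - df.loc[~contains, "converted"].mean()

def halo_uplift(df: pd.DataFrame, cut_channel: str, other_channel: str) -> float:
    """Uplift that `cut_channel` adds within journeys that also touch
    `other_channel` -- a rough read on the halo effect."""
    sub = df[df["path"].apply(lambda p: other_channel in p)]
    return incremental_uplift(sub, cut_channel)

for ch in ["Generics", "Shopping"]:
    print(f"{ch}: direct uplift = {incremental_uplift(journeys, ch):+.2f}")
print(f"Generics halo on Organic = {halo_uplift(journeys, 'Generics', 'Organic'):+.2f}")
```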

The halo effect showing incremental uplift

In the chart above, we estimated how many conversions we would lose from each of these channels if we were to cut Generics by 50%, 75% and 100%. The final results of the test confirmed this ordering: Organic was the most affected channel, followed by Brand Paid Search, then Display Remarketing, and so on. The overall loss in volume was within 15% of the forecast.

Baselines

We wished to know, for each of our test regions, how online and offline sales would have looked had no cuts to spend been made.

To do this, we used a forecasting technique to project past behavioural patterns into the future, as displayed below. The dashed lines are 80% confidence intervals, meaning we expected the number of sales to fall within these bounds 80% of the time.

Baseline forecasting
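We haven't named the exact forecasting technique here, but as one plausible stand-in, the sketch below fits a seasonal SARIMAX model from statsmodels and extracts an 80% interval; the sales series, model order and test window length are all illustrative assumptions.

```python
# A sketch of baseline forecasting with 80% intervals. The daily sales
# series below is synthetic; a real baseline would be fit on pre-test data.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
dates = pd.date_range("2023-01-01", periods=120, freq="D")
# Synthetic pre-test sales with a weekly pattern plus noise.
sales = pd.Series(
    200 + 10 * np.sin(np.arange(120) * 2 * np.pi / 7) + rng.normal(0, 5, 120),
    index=dates,
)

# Fit on the pre-test period, then project over a 4-week test window.
model = SARIMAX(sales, order=(1, 0, 1), seasonal_order=(1, 0, 1, 7)).fit(disp=False)
forecast = model.get_forecast(steps=28)
baseline = forecast.predicted_mean      # the "no test" expectation
bounds = forecast.conf_int(alpha=0.2)   # 80% interval, like the dashed lines

print(baseline.head(3))
print(bounds.head(3))
```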

We then monitored the actual sales volumes during the test against this forecast baseline. The results were conclusive.

As the graph below shows, online sales gradually decreased at the start of the test and then, once the test ended, slowly crept back up to the volumes they would normally have reached.

Test vs forecast
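The read-out can be as simple as counting how often observed sales fall below the lower 80% bound and summing the shortfall against the baseline. The sketch below is self-contained with made-up numbers; in practice the baseline and bounds would come from the forecast, and the actuals from the test regions.

```python
# Comparing test-period sales against the forecast baseline. All numbers
# here are synthetic stand-ins for the real forecast and observed series.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
days = pd.date_range("2023-05-01", periods=28, freq="D")
baseline = pd.Series(200.0, index=days)         # forecast mean
lower = baseline - 12                           # lower 80% bound
actual = baseline - 20 + rng.normal(0, 5, 28)   # observed sales, dipping

below = actual < lower
shortfall = (baseline - actual).clip(lower=0)

print(f"Days below the 80% lower bound: {int(below.sum())} of {len(days)}")
print(f"Estimated sales lost vs baseline: {shortfall.sum():.0f}")
```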

This allowed us to safely conclude that Generics and Shopping spend had a significant effect on performance, proving the value of these channels.

Summary

Hopefully, the two analysis pieces above have made you question whether your testing procedure is as statistically rigorous as it could be.

  • Are you currently launching into tests with no idea of the predicted impact? Maybe you could benefit from incremental uplift analysis so that you know what to expect.
  • Do you find you are struggling to create accurate control groups that are representative of your customer base? Perhaps a baseline to compare your results against would help you visualise the true impact of your test. If we can help your tests have a greater impact on your media planning (we think we can!), please do get in touch.
