Cost optimization thanks to A/B testing

A/B testing does more than reveal which variant performs best. It also makes it possible to optimize costs.

Foto.com

Which visual generates the most conversions? Which design format delivers the best results in terms of conversions, or the greatest reduction in the time between two orders? As a Facebook advertiser, Foto.com decided to challenge the classic approach it had been using for its weekly product highlight. The analysis proved highly informative.

Goals

→ Determine the design format best suited in terms of performance and cost

→ Forecast the performance uplift from adapting the designs to each country and language version

Results

→ Clear insights into what makes a design perform best

→ A guideline, based on the analysis, for the designs best suited to each country and language version

A/B testing

As a specialist in performance marketing, blue2purple ran an A/B test to compare several types of designs during a Facebook campaign for Foto.com. The goal was to measure the performance of each design and attribute any difference in performance to the element that distinguished them.

The ultimate aim was to define the type of design to use in future campaigns for each language version and country, and to highlight the insights that lead to the highest performance. This had to come through clearly in the results of the analysis.

Methodology

For this A/B test, we compared two designs that were identical in content and targeting but differed in execution: one was delivered as a video, the other as a photo carousel, so their performance could be compared directly.

This provided valuable insights for future brand campaigns and made it possible to determine the best designs for each region and language.
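Concretely, deciding whether one format genuinely outperforms the other comes down to comparing the click-through rates of the two variants. The sketch below shows one common way to do that, a two-proportion z-test; the figures, variant names and the test itself are illustrative assumptions, not the actual method or data used by blue2purple.

```python
# Minimal sketch: is the CTR difference between two ad variants significant?
# All numbers below are hypothetical placeholders, not campaign data.
import math

def two_proportion_ztest(clicks_a, impressions_a, clicks_b, impressions_b):
    """Two-sided z-test for the difference between two click-through rates."""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    p_pool = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical results for one region: video vs. carousel.
ctr_video, ctr_carousel, z, p = two_proportion_ztest(
    clicks_a=480, impressions_a=60_000,   # video variant
    clicks_b=300, impressions_b=60_000,   # carousel variant
)
print(f"video CTR {ctr_video:.2%} vs carousel CTR {ctr_carousel:.2%}")
print(f"uplift {(ctr_video / ctr_carousel - 1):.0%}, z = {z:.2f}, p = {p:.4f}")
```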

Results

A higher relevance score for the Dutch version

The results showed that, for the same designs, the relevance score was generally higher among Dutch-speaking Belgians and in the Netherlands than among French-speaking Belgians.

The carousel format better suited to the French-speaking target

Again based on the relevance score, we could see that videos were better suited to Flanders and the Netherlands, whereas the carousel performed better in Wallonia. And last but not least, the click-through rate analysis showed that, depending on the region, the video outperformed the carousel by up to 61%!

Insights

Not only did this test enable us to choose more relevant designs for future campaigns, it also allowed costs to be optimized by favouring the highest-performing format in each region. The cost per visit from the carousel campaigns was up to 50% lower in some countries, whereas video-generated visits were cheaper in other regions.
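To illustrate that cost logic, the short sketch below computes the cost per visit (spend divided by visits) for each format in each region and keeps the cheaper one; the regions and figures are hypothetical examples, not taken from the campaign.

```python
# Hypothetical (spend in €, visits) per region and format; the comparison simply
# divides spend by visits and prefers the format with the lower cost per visit.
campaign = {
    "Flanders":    {"video": (2_000, 5_200), "carousel": (2_000, 3_900)},
    "Wallonia":    {"video": (2_000, 3_100), "carousel": (2_000, 4_600)},
    "Netherlands": {"video": (2_000, 5_600), "carousel": (2_000, 4_100)},
}

for region, formats in campaign.items():
    costs = {
        fmt: spend / visits            # cost per visit = spend / visits
        for fmt, (spend, visits) in formats.items()
    }
    best = min(costs, key=costs.get)
    summary = ", ".join(f"{fmt}: €{cpv:.2f}" for fmt, cpv in costs.items())
    print(f"{region}: {summary} -> prefer {best}")
```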

The most important thing to keep in mind is that A/B testing makes sense when you take country specifics into account. What works in French-speaking Belgium doesn't necessarily work in the Netherlands. It is vital to use these campaign insights to rework upcoming strategies.
