A/B testing reveals which creatives perform best. It also helps optimize costs.
Which visual generates the most conversions? Which design format delivers the best results in terms of conversions, or the greatest reduction in the time between two orders? As a Facebook advertiser, Foto.com decided to challenge the classic approach it was using for its weekly product highlight. The results of the analysis proved highly informative.
→ Determine the design format best adapted in terms of performance and cost
→ Forecast the performance uplift from adapting designs per country and language version
→ Gain clear insights into what makes up the highest-performing designs
→ Define a guideline based on the analysis of the designs best adapted to each country and language version
As a performance-marketing specialist, blue2purple carried out A/B testing to compare several types of designs during a Facebook campaign for Foto.com. The goal was to measure the performance of each design and attribute any difference in performance to its distinguishing element.
The results of the analysis had to make the final goal clearly visible: defining the type of design to use in future campaigns for each country and language version, while highlighting the insights that enable the highest performance levels to be reached.
For this A/B test, we compared two designs that were identical in content and targeting but differed in creative execution: one design ran as a video, the other as a photo carousel.
This produced valuable insights for future brand campaigns and made it possible to determine the best designs per region and language.
A higher relevance score for the Dutch version
The results showed that, for identical designs, the relevance score was generally higher among Dutch-speaking Belgians and in the Netherlands than among French-speaking Belgians.
The carousel format better suited to the French-speaking target
Again with respect to the relevance score, we saw that videos were better suited to Flanders and the Netherlands, whereas the carousel performed better in Wallonia. Last but not least, the click-through rate analysis highlighted that video performance was, depending on the region, up to 61% higher than the carousel's!
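A per-region CTR comparison like this one can be sanity-checked with a two-proportion z-test before declaring a winner. The sketch below uses only hypothetical impression and click counts, chosen to illustrate an uplift on the scale reported here; they are not Foto.com's actual campaign numbers, and the source does not state which statistical test, if any, was used.

```python
from math import sqrt, erf

def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for a difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pooled CTR under the null hypothesis that both formats perform equally
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical counts for one region (illustration only, not real campaign data)
ctr_video, ctr_carousel, z, p = two_proportion_ztest(290, 10_000, 180, 10_000)
print(f"video CTR {ctr_video:.2%} vs carousel CTR {ctr_carousel:.2%}")
print(f"uplift {(ctr_video / ctr_carousel - 1):.0%}, z = {z:.2f}, p = {p:.4f}")
```

Running the test per region, rather than on the pooled totals, is what makes regional differences like Flanders versus Wallonia visible in the first place.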
Not only did this test enable us to choose more relevant designs for future campaigns, it also allowed costs to be optimized by favoring the highest-performing format per region. The cost per visit from the carousel campaigns was up to 50% lower in some countries, while video-generated visits were cheaper in other regions.
The most important thing to keep in mind is that A/B testing makes sense when you take country specifics into account. What works in French-speaking Belgium doesn't necessarily work in the Netherlands. It is vital to use these campaign insights to rework upcoming strategies.