Engaging digital content is not only more enjoyable for your audience but also helps drive more conversions. Great creatives lift traffic and CTR while lowering cost metrics such as CPC and CPA.
The challenge many companies face is achieving consistent results and understanding what it is about their creatives that makes them perform well and resonate with the audience, since there are so many factors at play.
After working with many growing accounts, we’ve found that unlocking scale takes a combination of scientific testing and creative expertise. By simplifying your variables and learning what works through testing, content creators can focus their experiments on the elements of a creative that make the biggest difference in results.
As with any rigorous experiment, a scientific approach minimizes experimenter bias and gives creative testing a defined, repeatable process that improves results. The approach goes as follows (a short code sketch of the full loop appears after the list):
- Question
  - What is the goal or outcome this test is trying to achieve?
  - E.g. Which creative angle drives more purchase conversions?
- Hypothesis
  - From that question, form a hypothesis that the test will prove true or false.
  - E.g. Creative angle A will outperform creative angle B in terms of purchases.
- Structure
  - Based on the question and hypothesis, design the test structure that will verify the hypothesis, and identify the key metrics to record to determine whether it is true or false.
  - E.g. An A/B test to determine which creative drives more purchases.
- Plan
  - Before launch, create a plan of action for every possible outcome of the test. This keeps your decisions data-driven once results come in:
  - If creative angle A wins, use this angle in the main campaign.
  - If creative angle B wins, use this angle in the main campaign.
  - If there is no clear winner, use both in the main campaign or re-evaluate the test structure.
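To make the pre-commitment concrete, here is a minimal sketch of the four steps as code. Everything in it is illustrative: the `CreativeTest` structure, the `decide` function, and the 20% margin for declaring a "clear" winner are our own assumptions, not part of any ad platform.

```python
from dataclasses import dataclass

@dataclass
class CreativeTest:
    question: str    # the question the test answers
    hypothesis: str  # the claim the test will prove true or false
    metric: str      # the key metric recorded to judge the hypothesis

def decide(purchases_a: int, purchases_b: int, margin: float = 0.2) -> str:
    """Apply the plan committed to before launch. The 20% margin for a
    'clear' winner is an arbitrary illustrative threshold; it assumes the
    test has already gathered enough conversions to be trusted."""
    if purchases_a >= purchases_b * (1 + margin):
        return "Angle A wins: use it in the main campaign"
    if purchases_b >= purchases_a * (1 + margin):
        return "Angle B wins: use it in the main campaign"
    return "No clear winner: run both, or re-evaluate the test structure"

test = CreativeTest(
    question="Which creative angle drives more purchase conversions?",
    hypothesis="Creative angle A will outperform creative angle B on purchases",
    metric="purchases",
)
print(decide(purchases_a=120, purchases_b=85))  # -> Angle A wins
```

Writing the decision rule down before launch is the point: once results arrive, there is no room to rationalize a favourite creative into the winner's seat.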
Generally speaking, many Facebook marketers still use the old strategy of adding a handful of creatives to an adset and seeing which one gets the best results. This approach is heavily biased: the ad auction favours whichever ad performs well early, so the other creatives never get a fair chance to produce reliable results.
However, Facebook Ads Manager has accurate testing built in via its A/B Test tool. With it, you can test individual creatives, adsets, or campaigns. As part of the setup, you select the success metric most relevant to your campaign objective, such as purchases. The tool then ensures that each variant receives an equal share of spend and comparable audience quality, and declares the winner based on the lowest cost per result for the metric you chose.
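The winner selection itself is simple arithmetic: divide spend by results and take the lowest. A rough sketch of that calculation follows; the variant names and figures are made up, and this is not the Ads Manager API.

```python
def cost_per_result(spend: float, results: int) -> float:
    """Cost per chosen success metric, e.g. cost per purchase."""
    return float("inf") if results == 0 else spend / results

# Equal spend per variant, mirroring the tool's even budget split.
variants = {
    "creative_a": {"spend": 500.0, "purchases": 25},  # $20 per purchase
    "creative_b": {"spend": 500.0, "purchases": 20},  # $25 per purchase
}

winner = min(
    variants,
    key=lambda v: cost_per_result(variants[v]["spend"], variants[v]["purchases"]),
)
print(winner)  # -> creative_a (lowest cost per purchase)
```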
Tests need a minimum of 50 conversions per week for stable results, so aim for 100 conversions per week to have more confidence in the observed performance. Run the test for at least 4 consecutive days; 7 days is more accurate because it reduces weekday variation bias.
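As a rough pre-flight check, you could encode those thresholds before trusting a result. The function name and messages below are our own; only the 50/100 conversion and 4/7 day figures come from the guidance above.

```python
def test_readiness(weekly_conversions: int, days_run: int) -> str:
    """Gate a creative-test readout on volume and duration thresholds."""
    if days_run < 4 or weekly_conversions < 50:
        return "Too early: results are unstable, keep the test running"
    if days_run < 7 or weekly_conversions < 100:
        return "Usable, but a full 7 days and 100+ conversions give more confidence"
    return "Good: enough volume and a full week to smooth weekday variation"

print(test_readiness(weekly_conversions=60, days_run=5))
print(test_readiness(weekly_conversions=120, days_run=7))
```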
Creatives with a lot of ad-level data are likely to outperform new creatives. New creatives carry no existing data or learning, which puts them at a disadvantage when tested against an established creative.
Still confused?
Book a call with one of our experts to find out how we can help you with Creative Testing