A/B testing (also called split testing) is the process of running two or more concurrent versions of content on the same medium, with the same desired outcome, to determine which version is most effective. Often A/B testing is used on webpages, apps, search ads, paid social media posts, email marketing campaigns, and other digital marketing platforms. In fact, 66% of companies test multiple landing pages on their websites. However, it can be equally effective for offline marketing efforts — particularly direct mail, sales letters, or print ads.
A/B testing can make a huge difference to your marketing efforts and, in turn, to your lead generation, sales, and bottom line. Comparing your A and B versions with statistical analysis enables you to isolate what works and what doesn’t. If something doesn’t work, cut it. If it does work, replicate it and continue to refine it with further A/B testing. Ultimately, A/B testing helps you maximize your marketing results and stops you from spending money on ineffective marketing.
Why you should A/B test
A/B testing has many advantages, but ultimately it boils down to the fact that you will get higher returns for lower expenditure. It weeds out ineffective marketing, and enables you to design more effective marketing campaigns.
A/B testing arms you with the hard data you need to make and support your marketing decisions. It takes the guesswork out of marketing by allowing you to build and test hypotheses. A/B testing should be used continually to refine and improve content at every level of your marketing.
A/B tests produce real-world results
According to Optimizely.com, using A/B testing:
- Discovery Communications increased viewer engagement by 6%
- Sony increased purchases by 20%
- comScore increased lead generation by 69%
- Secret Escapes doubled their conversion rate
These results are not out of reach for your marketing teams. These five simple steps will help you refine your content through A/B testing:
Determine what you want to test
The first step in setting up an A/B test is, of course, deciding what you want to test. Testing digital marketing content is generally less expensive and more common than testing content for traditional marketing media. You may also want to target underperforming content, or use A/B testing to determine whether your content is in fact underperforming.
Should you test onsite or offsite content?
You may choose to A/B test onsite content, which includes sales copy on your website and any content that has a call to action (CTA). Or you may instead A/B test offsite content such as search ads, landing pages, social media ads, or email marketing campaigns. In either case, the fundamental principles remain the same.
As part of this first step, you also need to determine your goals. You should already have a clear idea of your marketing objectives, but if not then ask yourself: do I want to generate leads, create a mailing list, drive ecommerce sales, or direct traffic?
Create your test content
Once you know what you want to test, and have identified your goals, it is time to generate the content. Some things that you may want to consider experimenting with are:
• The design of the content, including the:
  • location of the CTA
  • location of any relevant buttons, links, or forms
  • images, graphics, and videos you’re pairing with your content
  • colour, layout, or structure of the piece
• The length, tone, and style of the content
• The wording of titles, headings, and the CTA
Run your test
Tests should be run simultaneously in order to minimize the outside variables that could impact the results. Testing one page today and the other tomorrow leaves a cloud of uncertainty hanging over the results. Even seemingly small factors like the time of day or date can have a massive impact on online traffic. Add to this the myriad of unforeseen circumstances that can affect website traffic, and it becomes clear that you need to A/B test simultaneously. This is also why you will want to extend your A/B test for as long as possible – a larger sample size means more accurate results.
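Whether a larger sample size has made the difference between your variants trustworthy can be checked with a standard significance test. Below is a minimal sketch using a two-proportion z-test; the function name and the visitor/conversion counts are illustrative, not from any particular analytics tool.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative numbers: 120 conversions from 2,400 visitors (A)
# versus 150 conversions from 2,350 visitors (B).
z, p = two_proportion_z_test(120, 2400, 150, 2350)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below the conventional 0.05 threshold suggests the difference is unlikely to be noise; with small samples the same gap in conversion rates would not reach significance, which is why longer tests give more reliable answers.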
Analyse your A/B test
You can either A/B test from the get-go, which will give you a direct comparison between your A and B content, or launch the A/B test after your primary content has already been up and running. In either case, it is important to compare not absolute figures, but percentages and relative statistics relating to your goals. For example, ‘Clicks’ is obviously an important metric, but if one piece of content has more traffic, then click-through rate (CTR) may be a more accurate way to determine which copy is better.
Analytics services offer a seemingly endless range of metrics to compare, but arguably the most important are:
• Impressions: how many people saw the content
• Clicks: how many people saw the content and then clicked it
• CTR: how many people clicked the content as a percentage of people who saw it
• Engagements: how many people engaged with the content
• Goal conversions: custom goals (such as leads or sales) set up prior to beginning your A/B test
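The point about comparing rates rather than raw counts can be sketched as follows; the counts are hypothetical and chosen so that one variant "wins" on raw clicks purely because it received more impressions.

```python
# Hypothetical raw counts for two variants: A gets more clicks only
# because it was shown to more people.
variants = {
    "A": {"impressions": 10_000, "clicks": 450, "conversions": 36},
    "B": {"impressions": 6_000, "clicks": 330, "conversions": 33},
}

for name, m in variants.items():
    ctr = m["clicks"] / m["impressions"]   # clicks as a share of impressions
    cvr = m["conversions"] / m["clicks"]   # conversions as a share of clicks
    print(f"{name}: clicks={m['clicks']}, CTR={ctr:.1%}, conversion rate={cvr:.1%}")
```

Here variant A has more clicks (450 vs 330), but variant B has the higher CTR (5.5% vs 4.5%) and the higher conversion rate, so on a like-for-like basis B is the stronger piece of content.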
Build your hypotheses
Once you’ve collected your data you can infer the causes of divergent results. Then, using these inferences, form a hypothesis and use it as the basis of further A/B tests, or even A/B/C tests. Analyse these further tests to confirm or refute your hypothesis, then create better content, and test again. For example, according to WishPond, “Eric Siu from Treehouse found his cost per acquisition (CPA) was stubbornly floating around $60. Adding the word “free” to the ads decreased his CPA by $17 per start up.” In another example, NeilPatel.com “had four fields in their entry form: ‘Name’, ‘Email’, ‘URL’ and ‘Revenue’. Removing ‘Revenue’ from the entry form rocketed its conversion rate by 26%.”
Offline A/B testing
A/B tests for traditional marketing channels are generally more expensive and more time consuming, but they can reveal important insights. For example, sending out two versions of a sales letter with different wording or imagery, or the same letter to two different audiences, can reveal a great deal of information that will allow you to refine future letters.
Tracking offline A/B testing metrics can be a bit more complicated, but it can be done. To track metrics, include with your sales letters different phone numbers, email addresses, web addresses, or QR codes that relate to either the A or B piece of content and then tally up the number of responses for each.
A/B test your sales funnel
While 72% of online retailers test their call to action buttons, only 49% test the performance of their checkout process. It’s also important to use A/B testing to determine the effectiveness of the entire consumer journey, including:
- Online or offline ads
- Landing pages
- Ecommerce checkout pages
- Welcome or thank you email correspondence
A/B testing your entire consumer journey will ensure that you make your marketing and online content as streamlined and effective as possible, improving user experience, generating leads, and ultimately increasing your sales.