Split Testing

The standing joke in the marketing community is that, contrary to popular belief, A/B testing stands for “Always Be Testing”. Joking aside, this isn’t far from the truth. Given how easy it is to change the layout of anything digital, if you aren’t continually trying to improve your creatives, either A) you don’t care about your business that much, or B) you don’t need the extra work in the first place.

Split testing is the process of testing one iteration of your creative against another variation to find out which version performs better. Do more website visitors convert to paying customers if your checkout button is green or red? Does more of your traffic sign up to your newsletter with a one-step or a two-step signup process?

The process of split testing isn’t limited to a simple A/B test like the checkout button experiment above. Split testing can also take the form of multivariate testing, where you test multiple combinations of different elements against each other. The goal is to find a winner, whether in terms of clicks, signups, sales, or any other KPI you need to measure.

I can’t count how many split tests I’ve run over the years, and every single one ended with improved metrics, without fail. I concede that the whole process is a self-fulfilling prophecy: you’re looking for a winner, so by default, when you find that winner, your results will get better. Still, I prefer to back real data over hunches. Split testing uncovers the weaknesses of my marketing strategies, shows me what I’m doing right, and is an invaluable tool for confirming (or disproving, as the case may be) any hypotheses I’ve made about my strategies.

For most of you, split testing will take the form of A/B tests, that is, two variations tested directly against each other.
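To make the mechanics concrete, here is a minimal sketch of how visitors might be split between two variants. It assumes you identify each visitor with some ID (a cookie value, for instance); the assignVariant helper and the 50/50 split are illustrative choices of mine, not any particular tool’s API.

```typescript
// Minimal A/B assignment sketch: hash a visitor ID so the same visitor
// always lands in the same bucket on every visit.
function assignVariant(visitorId: string): "A" | "B" {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple rolling hash
  }
  return hash % 2 === 0 ? "A" : "B"; // 50/50 split between the two variants
}

// Example: decide which checkout button color this visitor sees.
const variant = assignVariant("visitor-12345");
const buttonColor = variant === "A" ? "green" : "red";
console.log(`Variant ${variant}: ${buttonColor} checkout button`);
```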

What should you be testing?

Any campaign or online element can be broken down into some reasonably standard ingredients. You’ll have an opening title or headline, some images, your sales copy, a call to action, and, in most cases, a signup form.

Headline

Your headline is your opening hook, so conventional wisdom is to keep it short and to the point. In some markets, however, concise just won’t cut it, so testing the actual length of your headlines is a must. Test the tone of your headlines. Are you problem-solving? Is there urgency involved?

Does your copy have a positive or a negative tone? “Stop wasting money” versus “Get more for your money.” If you don’t test, you won’t know, and if my tests are anything to go by, the results may mean the difference between a Ferrari and a Fiat.

Images

Color, or black & white? Do your images show people using your product? Do you use one image or many? How well does your image contrast with your text? Have you tested your color scheme? Geography should also play a big part in your split testing; an image that resonates with one region’s audience may fall flat with another’s. Changing an image is as simple as <img src="image-1.jpg"> or <img src="image-2.jpg">. Not exactly rocket science, is it?
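If you want to see how little code that swap involves, here is a rough sketch; the file names and the coin-flip assignment are purely illustrative, and a real test would remember each visitor’s variant (or let your testing tool handle it) so they always see the same image.

```typescript
// Sketch: pick an image variant for a visitor and build the tag to render.
// A real test would store the choice (cookie, localStorage, testing tool)
// so the same visitor always sees the same image.
const imageByVariant = {
  A: "image-1.jpg", // e.g. the product shot in color
  B: "image-2.jpg", // e.g. the product shot in black & white
} as const;

function pickImageTag(): string {
  const variant = Math.random() < 0.5 ? "A" : "B"; // simple 50/50 coin flip
  return `<img src="${imageByVariant[variant]}" alt="Product photo">`;
}

console.log(pickImageTag());
```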

Content

This one’s a bit more complicated and will be determined to a great extent by what it is you’re actually “selling.” Will long-form content outperform short-form content, or vice-versa? Can your product or service adequately be described with short-form copy, or does it need long-form? The rule of thumb here: if your copy answers all the questions your potential customers might ask, it’s probably long enough, but test it all the same!

Something that can be tested quite easily, and that I see overlooked a lot, is the actual layout of your copy. Have you broken down your content into easy-to-digest chunks? Are you drawing attention to all the critical features of your product or service with headings or bullet points? Next time you run a campaign, spare some time for copy layout tests as well.

Signup Forms

Statistics show that the more information you ask for in your signup form, the fewer signups you’ll get, so confine your forms to what you need to qualify the lead. If you don’t need a telephone number, don’t ask for it. Your signups will be much more willing to provide additional information once they’ve already signed up with you. This tendency is what’s known as the Zeigarnik effect, and what it boils down to is this: people are more likely to finish something they’ve already started doing, so make it as easy as possible for them to start.

The Zeigarnik effect also spills over into the actual signup process. Are you just presenting a signup form as a pop-up, or do you link to it from a signup button?

Networks

Not all networks will perform the same for your campaigns, so testing where you run your ads should be a significant part of your split test strategy. Different sites appeal to different audiences, and all audiences have different expectations. Do your ads perform better in the morning, afternoon, or evening? What age groups convert best for your product or service? Does a more visual ad perform better, or do text ads convert more? If this seems like a lot to consider, do keep in mind that split testing for all these various parameters is surprisingly simple. If you can set one campaign to run in the afternoon, you can schedule another to run in the morning. Simple!

Call To Action – CTA

As the thing that leads your visitors to your checkout pages, your Call To Action is probably the most crucial element to split test. You might have the most compelling headlines, fantastic images, and content from a guru copywriter, but if your CTA isn’t prominent and easy to spot, nobody is going to click. Styles come and go, and keeping up with the latest trends in layouts and colors is crucial to creating a compelling call to action.

It’s not just the style of the CTA you need to test here. You need to check the text, as well. Which of the two elements below will perform better?

[Image: two CTA button variations; the second reads GET YOUR DISCOUNT]

My money would be on the second one, GET YOUR DISCOUNT. The beauty of split testing, though, is that you don’t need to make any assumptions. Run your tests and let the data lead you to the best results. You might like orange as a color, but if an orange CTA receives a 2% click-through rate and a blue one gets 3%, your new favorite color is blue.
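To “let the data lead you” rather than eyeballing raw percentages, it helps to check that the gap between two variants isn’t just noise. Here is a rough sketch using a standard two-proportion z-test; the visitor and click counts are invented purely for illustration.

```typescript
// Sketch: compare two CTA variants with a two-proportion z-test.
// The counts below are invented purely for illustration.
function zTest(clicksA: number, visitorsA: number, clicksB: number, visitorsB: number): number {
  const rateA = clicksA / visitorsA;
  const rateB = clicksB / visitorsB;
  const pooled = (clicksA + clicksB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (rateB - rateA) / se; // roughly, |z| > 1.96 means significant at the 95% level
}

// Orange CTA: 200 clicks from 10,000 visitors (2%); blue CTA: 300 from 10,000 (3%).
const z = zTest(200, 10_000, 300, 10_000);
console.log(`z = ${z.toFixed(2)}`); // about 4.5 here, so blue's lead is very unlikely to be chance
```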

Roundup

The assumptions you make about your product or service are always going to be your starting point, but nothing is absolute. Launch your campaigns and test every single element until you find the winners.

Make sure to run your variation tests simultaneously. Conversions will naturally vary on different days. Set your variations to run on the same days for the most accurate results.

Don’t confine yourself to a single test. Just because you’ve tested your headlines doesn’t mean you can skip testing the other elements in your campaign.

Don’t test more than one element variation at a time. If you’re split-testing CTAs, don’t split test images as well. If you do, you won’t know which element is responsible for any change in results.

The process isn’t complicated, and all the tools you need to run split tests are readily available. The most challenging thing can sometimes be as simple as getting out of your own way and trusting the data. We can all make the mistake of assuming we know all there is to know about our target audience, but until we test our assumptions, we’re in the dark!
