Split Testing Can Increase Conversion Rates
Split testing is a conversion-optimization method that splits your audience between two or more variations of a message to see which gets the best response. Different visitors see different versions of the message, and the results are tracked to determine which version produces the best return.
We wanted to learn more about conducting split tests (also known as A/B tests), and so we corresponded with Paras Chopra, founder of Wingify, a company that produces Visual Website Optimizer, a split-testing tool for ecommerce merchants and other web operators.
Chopra cited the example of a holiday A/B test conducted recently by MedaliaArt, an online art gallery specializing in Caribbean and Latin American art. “MedaliaArt put up a holiday sale where they offered 5 to 55 percent discounts on all paintings,” he said. “They wanted to determine the best location on the home page to put the message so as to optimize for bounce rate.”
The challenge for the company was to determine where to show the message. “Displaying it prominently on the home page will make more visitors notice it, but some visitors may find it too intrusive and leave the site immediately,” said Chopra. “On the other hand, putting it at a not-so-noticeable location may have no effect at all.”
An Example Split Test
For its split test, MedaliaArt created two versions of its home page, each displaying “Holiday Sale” in a different location. One version represented what Chopra calls an “in-your-face ‘Holiday Sale’ message displayed in big, red font prominently on the homepage.”
The second version was a sidebar “Holiday Sale” message in a smaller font.
“Usually, split testing tools track conversion rates (the percentage of visitors completing a desired action),” said Chopra. “But, to track the bounce rate, MedaliaArt defined a click on any link on the home page as a conversion. Thus a conversion rate of, for example, 40 percent corresponded to a 60 percent bounce rate (100 percent less 40 percent).”
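The relationship Chopra describes (bounce rate as the complement of a click-on-any-link “conversion”) can be sketched in a few lines of Python. The function name and figures here are illustrative, not part of MedaliaArt’s actual tooling:

```python
def bounce_rate(conversion_rate):
    """Bounce rate is the complement of the click-through conversion rate.

    With a click on any home page link defined as a conversion,
    every visitor either converts or bounces, so the two percentages
    must sum to 100.
    """
    return 100 - conversion_rate

# A 40 percent click-through conversion rate implies a 60 percent bounce rate.
rate = bounce_rate(40)
```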
The first batch of conclusive results was available within two weeks.
“Clearly, the in-your-face, prominent promotional message has a dramatically lower bounce rate (60 percent) than the sidebar one (76 percent),” said Chopra. “The reduction in the bounce rate, 16 percentage points, or 21 percent relative to the sidebar version, is statistically significant (at a 95 percent confidence level), so the in-your-face variation clearly represents the better version. The improvement in bounce rate means more interest by visitors in the paintings they are selling and potentially more sales.”
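The significance claim can be checked with a standard two-proportion z-test, which is the usual way split-testing tools compare two rates. The article does not report visitor counts, so the sample sizes below (500 visitors per variation, giving the reported 76 and 60 percent bounce rates) are an assumption for illustration:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test for the difference between two rates.

    Returns the z statistic; |z| > 1.96 indicates significance at
    the 95 percent confidence level (two-tailed).
    """
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical traffic: 500 visitors per variation,
# 380 bounces (76%) for the sidebar vs. 300 bounces (60%) in-your-face.
z = two_proportion_z(380, 500, 300, 500)
significant = abs(z) > 1.96
```

With samples of this size, the 16-point gap is significant by a wide margin; with much smaller samples the same rates could fall short of the 1.96 threshold, which is why tools wait for enough traffic before declaring a winner.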
Without split testing, the company could never have known the optimal position for its promotional message. And fears that a prominently displayed promotional message might backfire by irritating visitors proved not to be an issue.
How To Improve A/B Test Results
Chopra had a suggestion for MedaliaArt (and other ecommerce merchants considering A/B testing). “Also include a variation without the ‘Holiday Sales’ messaging. If MedaliaArt had included such a variation, it would have provided a benchmark to see the effect of the sales message, irrespective of the position.”
Chopra said MedaliaArt could also have used different versions of text in addition to different home page positions. “Maybe a message with the word ‘discount’ (such as ‘55 percent discount on paintings this holiday season’) would have worked better than the default message (‘Holiday Sale’). And, optimizing for bounce rate is fine, but a better metric would have been to measure and optimize for sales, which is what really matters to an ecommerce site.”
Chopra said he believes split testing is the only way to really know what will work and what won’t. “It is essential to check assumptions related to promotional messages, checkout process, product category ordering, buy now button, and more.”
He suggested that merchants be a little adventurous and test radically different home page designs and ideas. “You can always choose to include only a small percentage of traffic and can disable non-performing variations at the click of a button.”