The process for setting up an A/B test will vary based on the software you’ve chosen. However, there are certain steps you should follow regardless of the platform you use, which we’ll outline here.
Decide What You Want to Test
Try to narrow your focus down as much as possible for your test. While it is possible to test multiple elements on a page at once, doing so will keep you from pinpointing exactly which change caused any shift in conversion rates.
Let’s say you think your “start now” button isn’t converting well. You may decide to test the current version against a version that is a different color. That’s your test: button colors. Avoid complicating things by changing the button’s text or location at the same time, because even if the new version works, you won’t know why.
Choose the Variation(s)
What do you want to change in your test version? In the example above, it’s the color of a button. If your current button is purple, but you want to try green, the green button will be your variation.
If you’re only testing a small factor like button color, you can always add a third variation (or more!). For example, you could test your current purple against both green and blue to see if color can make a difference in conversions.
One thing to keep in mind: the more variations you create, the more site visitors you’ll need to achieve statistical significance (more on that below).
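To make that relationship concrete, here is a rough sketch of the standard normal-approximation formula for how many visitors each variant needs. The function name is illustrative, and the hard-coded z-scores assume the common defaults of 95% confidence and 80% power; real A/B testing tools may use different methods.

```python
import math

def sample_size_per_variant(baseline_rate, target_rate):
    """Rough visitors needed per variant to reliably detect a lift
    from baseline_rate to target_rate, using the normal-approximation
    formula with 95% confidence (z = 1.96) and 80% power (z = 0.84)."""
    z_alpha, z_beta = 1.96, 0.84
    p1, p2 = baseline_rate, target_rate
    p_bar = (p1 + p2) / 2  # pooled rate under the null hypothesis
    n = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
         + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(n / (p1 - p2) ** 2)

# Detecting a lift from 2.17% to 4.17% takes roughly 1,200 visitors
# per variant; smaller lifts require dramatically more.
print(sample_size_per_variant(0.0217, 0.0417))
```

Note that this is the count *per variant*: testing purple against both green and blue means filling three buckets of that size instead of two, which is exactly why extra variations inflate the traffic you need.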
Start Your Test
Use your tool of choice to create your test. Double-check that it works in a few different browsers – and that visitors can actually see both versions – before you leave it to run for a while.
Waiting is the hardest part of an A/B test. No one wants to sit around and wait for data to be collected, but for most tests, you’ll have to do that for at least two weeks.
If you have an enormous number of site visitors, you might not have to wait that long, especially if your test version starts outperforming the control right out of the gate. But it’s still best to give the test time to “level out,” just in case there are coincidences or other factors at play.
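If you want a rough estimate of how long to wait, you can divide the total sample you need by your daily traffic, while keeping the two-week floor mentioned above. A minimal sketch (the function name and the two-week floor as a hard minimum are illustrative choices, not a fixed rule):

```python
import math

def days_to_run(visitors_per_variant, variants, daily_visitors):
    """Estimate test duration: total required sample divided by
    daily traffic, with a two-week floor so weekday/weekend swings
    have a chance to average out."""
    total_needed = visitors_per_variant * variants
    return max(14, math.ceil(total_needed / daily_visitors))

# A high-traffic site fills its sample quickly but still waits
# two weeks; a lower-traffic site may need a month or more.
print(days_to_run(1200, 2, 1000))   # high traffic -> the 14-day floor
print(days_to_run(15000, 2, 1000))  # lower traffic -> 30 days
```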
Look for Statistical Significance
As you prepare to end your A/B test, look for results that are statistically significant – that is, differences large enough that they are unlikely to be the result of random chance.
If your A version of a page converted at 2.17% while the B version converted at 2.19%, the result is probably not significant. That 0.02-percentage-point gap could easily be random noise, and might only represent a few cents of revenue, or a single lead among thousands.
On the other hand, if your A version converted at 2.17% and the B version converted at 4.17%, this probably is significant. Depending on the amount of traffic or leads, this two-percentage-point difference could represent hundreds of dollars, and maybe even more in the long term.
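Rather than eyeballing the gap, you can compute a p-value for the difference between two conversion rates. The sketch below uses a pooled two-proportion z-test, one common approach (your testing tool may use a different method), and the visitor counts in the example are assumed purely for illustration.

```python
import math

def two_proportion_p_value(conversions_a, visitors_a,
                           conversions_b, visitors_b):
    """Two-sided p-value for the difference between two conversion
    rates, using the pooled two-proportion z-test (normal
    approximation)."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool)
                   * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Assuming 10,000 visitors per version:
# 2.17% vs 2.19% -> large p-value, likely just noise
print(two_proportion_p_value(217, 10000, 219, 10000))
# 2.17% vs 4.17% -> tiny p-value, a real difference
print(two_proportion_p_value(217, 10000, 417, 10000))
```

A p-value below 0.05 is the conventional cutoff for calling a result significant; with the assumed traffic, the 0.02-point gap falls far above it and the two-point gap falls far below it.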