A/B testing is all the rage, and with good reason. Incremental improvements to conversion rate can result in big gains to top-line revenue for high-traffic sites. And it doesn’t take a major investment to get started. Just tool up, craft some hypotheses, launch a test, and watch the data.
If you’re new to A/B testing, let’s set some expectations. It’s not a magic wand.
Properly run A/B tests typically span 30-60 days. And they’re incremental in nature: if you load too much change into a single variation, you won’t know which change drove the outcome.
A/B testing requires large traffic volumes to yield trustworthy results. Even with 10,000+ site visitors a month, you’d need a lift of at least 9% before the test result becomes statistically trustworthy.
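Curious where a number like that comes from? Here’s a minimal sketch of the standard two-proportion power calculation in Python. The baseline conversion rate, test length, significance level, and statistical power below are illustrative assumptions, not figures from this article; swap in your own traffic and conversion numbers.

```python
from math import sqrt
from scipy.stats import norm

def minimum_detectable_lift(visitors_per_variant, baseline_rate,
                            alpha=0.05, power=0.80):
    """Smallest relative lift a two-variant test can reliably detect."""
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided significance threshold
    z_power = norm.ppf(power)           # desired statistical power
    # Absolute difference in conversion rate detectable with this much traffic
    mde_abs = (z_alpha + z_power) * sqrt(
        2 * baseline_rate * (1 - baseline_rate) / visitors_per_variant
    )
    return mde_abs / baseline_rate      # express it as a relative lift

# Example: 10,000 visitors a month split 50/50 across a two-month test,
# with an assumed 20% baseline conversion rate (both numbers are hypothetical).
lift = minimum_detectable_lift(visitors_per_variant=10_000, baseline_rate=0.20)
print(f"Minimum detectable relative lift: {lift:.1%}")  # roughly 8%
```

Under those assumptions, a two-month test on 10,000 monthly visitors can only reliably detect a relative lift of around 8%; lower traffic or a lower baseline conversion rate pushes that threshold even higher.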
A/B testing tells part of the story. It’ll tell you whether a flashing yellow button gets users to click, but it won’t tell you if the same tactic erodes their perception of your brand’s trustworthiness. Sometimes what's best for your brand is not what "wins" the test.
On the flip side, UX design:
Can mean denser upfront work.
At Slide UX, we offer guided programs, which are excellent options if you've got time to take part in the 'doing'. You'll learn not only about your users, but also the techniques product teams need to know. Not to mention, they reduce your costs.
Seeks to reveal the "why" behind user reactions, not just the "what".
Covers more than just a digital interface: What do users experience before & after their interactions? How do they react to competitors? What alternatives do they consider? What other pain points might you be able to solve?
Can lead to both small gains AND large shifts.
Our clients often report large conversion gains as a result of our work, even though we're not a “conversion rate optimization” firm. Our holistic methods quickly surface aha moments that would take years of testing to uncover. You'll see many of those stories here.
Our advice? A/B testing is a great technique, and the insights it offers can be thrilling. But it’s only appropriate when you’ve got a good baseline experience to optimize, and it definitely doesn’t replace the need for qualitative user insight, AKA talking to people!