"We got a 300% lift in revenues because we switched 'add to cart' button from green to orange!"
I read stuff like this all the time. It's not only horribly misinformed, but it can be damaging for business owners who run baseless split tests that lead to little or no real insight.
For ecommerce entrepreneurs who pride themselves on making the most out of every visitor, it's infuriating to be sold on conversion optimization tools (and invest in them), only to find after your first split test that there were no real lifts at all.
The problem is, there's a good deal of information that misleads the reader into thinking "If I just swap a headline, or change a button color, or 'fix' my call to action, the revenue will start pouring in!"
But that's far from true. In fact, a study conducted by popular split testing software VWO found that 6 out of 7 A/B tests did not provide a statistically significant improvement.
This is very likely related to another of their findings: the time invested to create a test, including research, was usually less than 5 hours.
While that speaks volumes about how easy the VWO platform is to use, it doesn't look very good for site owners, especially those who profess to take their visitors' online behavior seriously.
These misconceptions about conversion rate optimization are reflected when you see what people test first on the platform.
If you're wondering why I'm calling out Calls To Action & Headlines as misinformed, let me present you with a scenario...
Let's say, on a whim, you A/B test the button on your product page so one version says "Checkout" and the other says "Add To Cart," and for some reason "Add To Cart" ends up getting 60% more cart adds.
While that's great, you have to wonder:
- Why did this happen?
- How can you carry the success of this test over to other areas of the site?
- Did you learn anything about your visitors that can improve or clarify your offering?
If you can't answer those questions, was your test really a success? And let's say the test lost... what hypothesis did it invalidate? Sometimes knowing what doesn't work can be more beneficial than knowing what wins.
But if you're just blindly testing things without input from actual users or based on cold, hard analytics data, you'll never gain the momentum necessary to create a testing plan that leads to real growth.
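Before celebrating a lift like the one in that scenario, it's also worth checking that the result isn't just noise. Here's a minimal sketch in Python of a standard two-proportion z-test; the traffic and conversion numbers below are hypothetical, not from any real test:

```python
from math import sqrt, erfc

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal distribution
    return z, p_value

# Hypothetical: "Checkout" got 50 cart adds from 1,000 visitors,
# "Add To Cart" got 80 from 1,000 (a 60% relative lift)
z, p = two_proportion_z_test(50, 1000, 80, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the lift clears the conventional p < 0.05 bar, but with a tenth of the traffic the exact same 60% lift would not, which is why raw percentages alone tell you very little.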
Misconception #1: Conversion Rate Optimization is All About A/B Testing
While A/B testing gets the most attention, it is neither the beginning nor the end of conversion rate optimization.
I've seen plenty of situations where the store owner insists upon A/B testing a button color while rejecting the notion that the lack of visible sizing options or shipping & refund policies is what prevents more visitors from buying.
Conversion rate optimization is about understanding the actions that lead to purchasing behavior, so you have to start by asking whether your visitors' basic needs are met.
This graphic, which shows conversion rate upside down, was published by Bryan Eisenberg back in 2007 and is still a perfectly good model for what you should be focusing on, and in what order.
Functional - Is it clear what the site does & who it's for? How easy is it for your visitor to determine your store's value proposition upon landing on the site?
Accessible - Can they get to where they're going quickly? Are the categories clearly labeled? Is the search function clearly visible?
Usable - Are the font sizes easy to read? Do navigation items look like navigation items? Do magnified photos actually show larger product shots?
Intuitive - Can you click on the things that look clickable? Are you asking users to adapt to a new buying modality just to purchase from you? Are common items where they're "supposed" to be?
Persuasive - Does the copy & photography evoke an "experience" rather than simply purchasing a product? Are you leveraging social proof like testimonials or high social shares? Are you creating urgency or showing scarcity to increase demand?
In some situations, like when sizing information isn't easily accessible on the product page, I'd say it's ok to do iterative testing, because the page is meeting a basic need and will be better off with it in the long run. Just be sure to benchmark your data before & after to see if there's a noticeable difference after the change was made.
Also, just so we're on the same page: I'm not suggesting you ignore A/B testing, but rather that you know when it's appropriate to implement it.
When it comes to big changes on high-stakes pages (like headline testing on the home page) or different approaches to design or copy, by all means use an A/B test to see what performs better.
Misconception #2: Inconclusive Tests Are Losers And Should Be Ignored
If your test was based on a well-researched hypothesis, then a failing or inconclusive test can help you move on and focus on more important things, or try something radically different.
Take these 6 examples offered by Groove HQ, like headline tests or authority logo tests, where the results were inconclusive.
Provided the tests weren't called too early, it's safe to assume that neither the verbiage in these headlines nor the logos of their customers play a huge role in their prospects' buying decisions.
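"Called too early" is really a sample-size question: did enough visitors see each variation to detect the lift you care about? Here's a rough sketch using the standard two-proportion approximation at roughly 95% confidence and 80% power; the baseline rate and target lift below are hypothetical:

```python
from math import ceil

def required_sample_size(base_rate, relative_lift, alpha_z=1.96, power_z=0.84):
    """Rough per-variant sample size needed to detect a relative lift
    at ~95% confidence and ~80% power (two-proportion approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2  # average rate across both variants
    n = ((alpha_z + power_z) ** 2 * 2 * p_bar * (1 - p_bar)) / (p2 - p1) ** 2
    return ceil(n)

# Hypothetical: 5% baseline conversion rate, hoping to detect a 10% relative lift
print(required_sample_size(0.05, 0.10))
```

The punchline is how large the number gets: detecting a modest lift on a low baseline rate can require tens of thousands of visitors per variant, far more than many stores send through a test before declaring a winner.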
But let's also look at the headline test a little more closely... aren't they basically saying the same thing, just worded a little differently?
Entrepreneurs have a tendency to get locked into a core message and start playing with semantics, thinking it's a magic combination of the same types of words that will unlock a windfall of revenue.
But if you actually take the time to consider that an inconclusive test like this means semantics aren't really making a difference, you free yourself up to simplify your message, or try a completely different approach.
Consider the headline GrooveHQ is testing now: Looks like they learned from their inconclusive tests, and got straight to the core of what they're about.
Misconception #3: Conversion Rate Optimization Is Dark Magic That Only Seasoned Professionals Know How To Wield
While I do think at some point you should hire a professional to build a long-term testing plan, a lot of the conversion rate optimization process comes from finding the clogs in your data and creating a plan to make improvements.
Sean Ellis, founder of Qualaroo, says in this article on Optimizely:
"You should begin your testing research by looking at where to test. Use your analytics to uncover the following:
- Top 5 highest bounce rate pages
- Top 5 abandonment points in your funnel
- Top 5 most valuable pages to your business
[...] The pages with the highest bounce rate signify a page where visitors aren’t finding what they’re looking for, or are frustrated by not being able to take an action that they want to. For abandonment points, look at places on your site where you lose the most traffic."
He then goes on to recommend using survey tools like Qualaroo to capture real visitor feedback, but also to conduct one-on-one interviews and other forms of qualitative research to understand why these pages "leak."
While a professional may have more experience, it certainly doesn't take an expert to look at a report and ask questions to find out "why" people are leaving.
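To show how unmagical this triage really is, here's a minimal sketch of the kind of report Ellis describes: ranking pages by bounce rate from an analytics export. The page paths and numbers below are made up for illustration:

```python
# Hypothetical analytics export: one row per page, with sessions and bounces
pages = [
    {"page": "/",         "sessions": 12000, "bounces": 5400},
    {"page": "/products", "sessions": 8000,  "bounces": 2400},
    {"page": "/shipping", "sessions": 1500,  "bounces": 1200},
    {"page": "/cart",     "sessions": 3000,  "bounces": 600},
    {"page": "/checkout", "sessions": 2500,  "bounces": 250},
    {"page": "/returns",  "sessions": 900,   "bounces": 700},
]

# Compute each page's bounce rate, then rank to find research candidates
for row in pages:
    row["bounce_rate"] = row["bounces"] / row["sessions"]

top_5 = sorted(pages, key=lambda r: r["bounce_rate"], reverse=True)[:5]
for row in top_5:
    print(f'{row["page"]:<12} {row["bounce_rate"]:.0%}')
```

In this made-up data, the shipping and returns pages float to the top, which is exactly the kind of "why are people leaving here?" question anyone can start asking without hiring an expert.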
There are a lot more misconceptions about conversion rate optimization, like the idea that you can "test faster" or that statistical significance means validity, but we can get into those in future articles.
For now, I'd love to know what your thoughts, experiences, and struggles with conversion rate optimization have been, and to help bring some clarity and get you on the right path moving forward.