Split-testing — when you create variations of your website and test which ones perform the best — is quickly becoming all the rage among marketers and entrepreneurs.
Sometimes called A/B testing, split-testing is the tool of choice for most “growth hackers,” marketing people focused on attracting more users, conversions and the like. Many potential investors and angels will be impressed when they hear that you’re running split tests, and you sound 100 times smarter at a networking event when you mention that you’re testing your ideas.
But ego-stroking aside, split-testing can be an incredibly valuable tool for increasing conversion rates and hitting your goals. You can use it to validate a lot of the assumptions and ideas you have as a team, and get real data on what works for your business.
I’m not going to tell you how to split-test your website. Instead, I’m going to tell you how to prepare for it, psychologically. Although split-testing is not difficult, you do have to hop into it with the right frame of mind to get results. Here are five things to consider when getting started on split-testing your website:
Nothing is true unless it’s tested.
If you’re human, you likely have an opinion on design and copy. Everyone does. When it comes to running tests the key is to assume nothing and test everything. Any big idea that your team proposes is an assumption that should be tested.
For example, say you have an e-commerce website and you notice most visitors go directly to your “About” page before completing the signup process. You instantly assume that displaying more information on the homepage will make people sign up faster. But before you hire a designer, you should first brainstorm 10 to 15 ways you can improve the copy on your current homepage to achieve the results you want. Next, run 10 to 15 different experiments testing those ideas using programs like Optimizely, Visual Website Optimizer or Google Website Optimizer. Soon, you’ll have real data to give to your designer, who can then help make intelligent decisions on how to improve your site.
Congratulations! You’ve just saved hundreds of dollars and hours of your time by running a few simple split tests.
Variables must be tested independently.
Split-testing is also known as A/B testing for a reason. Make sure you’re testing your assumptions in a mathematically valid way. Let’s say your co-worker believes a different headline and a lighter background will lead to more conversions. To test that theory, run your tests one at a time against the original design: one test for the background color and one test for the headline. If you change several variables in a single experiment, you won’t be able to tell which one caused the difference.
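Keeping variables independent also means each experiment needs its own random split of visitors. Here is a minimal sketch in Python of a stable 50/50 assignment for one experiment; the function name and experiment label are illustrative, not part of any tool mentioned in this article:

```python
import random

def assign_variant(visitor_id, experiment="headline-test"):
    """Bucket a visitor into control ('A') or variant ('B') for a
    single experiment. Seeding on the experiment name plus visitor ID
    keeps the assignment stable across page loads, while different
    experiments split the same audience independently."""
    rng = random.Random(f"{experiment}:{visitor_id}")
    return "A" if rng.random() < 0.5 else "B"
```

Because the seed includes the experiment name, a visitor’s bucket in the headline test says nothing about their bucket in the background-color test, which is what lets you read each result on its own.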
Changes won’t stick if the team doesn’t agree.
Okay, you’ve run some tests and collected great data on why your team should make some specific changes. However, your team of designers does not believe you’ve run your tests correctly, or perhaps they trust their own intuition as designers more than your data. In short, you have wasted weeks of split-testing because your team was not ready for your insights.
Before you get started, you have to prepare your team’s mind for testing. Make sure your entire team is on board, and that they understand every decision is an assumption to be validated. Your entire team must be ready for the results of a test, no matter how surprising they may be.
Tests need traffic to be accurate.
This is very important. When you begin split-testing and you start to see results, it can be super exciting. If you see that your proposed orange button is beating the other variations, you immediately want to call your co-workers and scream, “I told you so!” However, my advice is to put the phone down and back away slowly from the screen. Try not to watch your results in real time. You need to reach statistical significance — the point where enough people have participated in your split tests — before you make any decisions. Optimizely gives you an alert once you’ve collected enough data to choose an accurate winner. If you need more traffic to run your tests faster, consider using Google AdWords or Facebook Ads to generate more clicks.
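To give a sense of what reaching statistical significance involves, here is a minimal sketch of a two-proportion z-test in Python. This is the standard textbook check for whether two conversion rates genuinely differ; it is not necessarily the exact calculation Optimizely performs, and the sample numbers below are made up:

```python
import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's conversion rate
    differ significantly from control A's?"""
    p_a = conv_a / n_a          # control conversion rate
    p_b = conv_b / n_b          # variant conversion rate
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 200 of 4,000 control visitors converted,
# vs. 260 of 4,000 who saw the orange button.
z = z_test_two_proportions(200, 4000, 260, 4000)
significant = abs(z) > 1.96  # |z| > 1.96 roughly equals 95% confidence
```

Until |z| clears that threshold, the “winning” variation you see in real time may just be noise, which is exactly why you should stop refreshing the results page.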
Split-testing has to start somewhere.
The hardest part about split-testing is knowing where to start. In fact, that decision alone can be so overwhelming that most choose not to start at all. Don’t let that happen. I’d suggest spending an hour looking through Google Analytics (or whatever traffic-measurement tool you use) to discover trends and generate intelligent hypotheses on why you’re making the sales that you are. Perhaps many people who end up purchasing your products are coming from a certain blog, or maybe the majority of your first-time visitors come through a particular landing page. Look for some of those trends and make a list of assumptions. Then, simply build experiments around that list.
It’s okay if you bump your head or you have to run a few failed experiments to get it right. Split-testing is not only a way to discover new insights, but it’s also a tool that you can use to validate all of the assumptions you have about your product, about your visitors and about your business. The beauty of it is that you can only get better by testing.
Arielle Patrice Scott is the Marketing Director of Storenvy. Storenvy is a fast growing online store builder and marketplace for independent stores online. Arielle blogs about marketing, startup culture and e-commerce on her blog, TheArielle.com. Prior to Storenvy, Arielle was the founder of GenJuice, a content recommendation service.
The Young Entrepreneur Council (YEC) is an invite-only nonprofit organization composed of the world’s most promising young entrepreneurs. The YEC recently published #FixYoungAmerica: How to Rebuild Our Economy and Put Young Americans Back to Work (for Good), a book of 30+ proven solutions to help end youth unemployment.
Image via andrew.zerick/Flickr