SAN FRANCISCO — A/B testing may put you to sleep. But it’s critical to the modern enterprise and will save you lots of grief, misconceptions, and money.
That’s the message from a CloudBeat 2013 session on A/B testing, the practice of presenting users with two versions of something and measuring which one performs better.
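The statistical core of an A/B test is a comparison of conversion rates between two groups. As a minimal sketch (the function name and all the numbers here are hypothetical, not from the panel), a two-proportion z-test using only the standard library looks like this:

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical traffic: 200 conversions out of 10,000 visitors for A,
# 260 out of 10,000 for B
p_a, p_b, z, p = ab_test(200, 10_000, 260, 10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p = {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the difference is unlikely to be random noise; real A/B testing products such as Optimizely wrap this kind of calculation behind a point-and-click interface, which is exactly the accessibility the panelists argue for.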
The panel was moderated by Bernard Golden, senior director of Dell’s cloud computing enterprise solutions group, and included Elizabeth Allen, product manager at digital media analytics firm Metamarkets and a former IGN data analyst, and Dan Siroker, chief executive and cofounder of Optimizely.
A/B testing brings science to the debate over what a company should do for its users. Without it, companies typically steer themselves by the HIPPO, or what Siroker calls the “highest-paid person’s opinion.”
It’s a simple idea, but A/B testing runs into a lot of resistance. Most people don’t know how to do it properly, and in the past, it required enlisting the support of web developers — a scarce resource for many companies, said Siroker, who was director of analytics for the 2008 Obama presidential campaign and has a new book out, A/B Testing: The Most Powerful Way to Turn Clicks Into Customers.
Siroker said he started his company to build the analytics and testing product he wished he’d had in 2008. He said that companies have been slower than they should be at adopting A/B testing. Allen agreed.
“I had to figure out what A/B testing was” in her former job at game news publisher IGN, Allen said. “It changed the way we thought about the potential of data analytics.”
After Allen introduced A/B testing, company insiders could test their hypotheses about what the audience wanted, and IGN became much more responsive in its development processes.
“It was a big change for IGN in driving a data-driven culture,” she said.
She said that the company had a promotion running for a YouTube show on the right side of its web page, but few people clicked the button to subscribe. Allen suggested moving the button to the top of the page. The result: a 65,000 percent increase in clicks. But there were some turf battles, as Allen discovered too late that she should have warned the ad operations person about the change.
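The arithmetic behind a lift figure like that is straightforward: a 65,000 percent increase means the click count grew to roughly 651 times its original level. As a sketch with made-up numbers (IGN’s actual counts weren’t disclosed):

```python
def percent_lift(before, after):
    """Relative change in a metric, expressed as a percentage."""
    return (after - before) / before * 100

# Hypothetical counts: 10 clicks before the button moved, 6,510 after
print(percent_lift(10, 6_510))  # 65000.0
```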
“When we implemented the service for A/B testing, it was a challenge to get people to know we had it and they should use it,” Allen said.
What you need, Siroker said, “is a champion for A/B testing.”
Marketers have to get on board with the testing so that companies know what to test. That’s why automating the process, or making it easy to self-test, matters a lot, Siroker said.
“In my past, from my experience at Google, I know that no engineer wants to help a marketer with their job,” he said. He also said that A/B testing is like kryptonite for those who don’t want to change.
Over time, Allen said, “We had to keep proving the value and proving it again.”