A/B testing platform leader Optimizely announced on December 20, 2017, that it would sunset its free Starter plan on February 28, 2018.
We estimate that this change will leave around 70,000 websites* without a tool for testing and tweaking the site experience.
Here’s the official message Optimizely sent to customers via email:
We are notifying you about an important change to our pricing plans that will impact your Optimizely account. As of February 28, 2018, our Starter plan option will no longer be available as the Starter plan is based on a version of Optimizely that is being sunsetted.
You will be able to view and run experiments through your Starter plan through February 28, 2018. After this date, you will still be able to access your Optimizely account and results data, however any actively running experiments will be paused, and you will not be able to launch any new experiments or campaigns.
The Optimizely X platform provides organizations with the necessary tools to experiment across every channel, application and feature. We hope you will take this opportunity to consider moving to our new SaaS platform, Optimizely X. Our Starter option was designed as an easy way for customers to get started with Optimizely as they began their journey with experimentation. Over time, we’ve found that the companies that see the greatest success are the ones who are able to commit to experimenting over a longer term, in an ongoing and iterative fashion. For that reason, we will be focusing our efforts and now offer annual plan options going forward.
If you’d like to learn more about our annual subscription plans, please visit optimizely.com/plans.
The company states that it will discontinue the “old” platform, Optimizely Classic, leaving users with the option to move to Optimizely X, a solution heavily focused on enterprise customers.
Optimizely X positions itself as an experimentation platform. It offers versatile features such as product recommendations, personalization, and server-side testing, but comes with a high price tag that most users of Optimizely’s free plan will not be able to afford. At this point, it is unclear what will happen to the paid legacy plan types: Bronze, Silver, and Gold.
While the decision is a blow to customers of the free plan, it is an interesting indication of how Optimizely has shifted focus.
In its early days, the company aggressively promoted its free A/B testing platform to establish market dominance, but it has since been moving further and further away from its roots.
The introduction of Optimizely X in 2016 heralded a shift from A/B testing to more sophisticated experimentation, and Optimizely has sought to position itself as a platform for continuous optimization and personalization.
Sunsetting the free plan was only a matter of time, and yet the decision to make such a cut tells us more about the future of conversion optimization than Optimizely cares to admit.
A/B testing is dead
For years, it seemed A/B testing was the universal answer to conversion optimization. The marketing world believed that a scientific approach to testing would revolutionize how we do business and that A/B testing would allow any website to become a cash-making machine.
The only catch is that setting up a scientifically valid experiment is more complex than we were led to believe — and in 2017, reality finally caught up with the industry.
It wasn’t a lack of effort or a lack of time that derailed many users’ tests; it was a lack of knowledge about statistics. The truth is, A/B tests are often set up incorrectly, polluted by external factors, or never reach statistical significance. Not to mention those that are properly conducted but are effectively worthless because they have no sustainable impact on business and growth.
Trained statisticians would tear the average marketer’s A/B test to shreds, and it seems Optimizely has finally decided to call time of death on oversimplified CRO attempts. After all, bad results reflect badly on the tool itself, and with Optimizely X the company seems eager to distance itself from its past.
Why is A/B testing so error-prone?
For starters, statistical significance (loosely, the level of confidence with which you can declare a winning variation) is an issue. On average, an online experiment needs more than 25,000 visitors to reach significance, which means many experiments fail due to insufficient data.
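To get a feel for where figures like that come from, the visitors a classical two-variation test needs can be estimated up front with a standard power calculation. Here is a minimal sketch in Python (the function name and the example conversion rates are illustrative, not drawn from Optimizely):

```python
import math
from statistics import NormalDist

def sample_size_per_variation(baseline, mde, alpha=0.05, power=0.80):
    """Visitors needed per variation to detect an absolute lift of `mde`
    over a `baseline` conversion rate with a two-sided z-test."""
    p1, p2 = baseline, baseline + mde
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
                 + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Detecting a lift from a 3.0% to a 3.6% conversion rate
# (a 20% relative improvement):
n = sample_size_per_variation(0.03, 0.006)
```

With these illustrative numbers, each variation needs on the order of 14,000 visitors, so the test as a whole needs close to 28,000, in the same ballpark as the figure above. Smaller baseline rates or smaller lifts push the requirement up sharply.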
In addition, things often go wrong in the preparation phase. Using classical statistical techniques requires setting a sample size and committing to a minimum detectable effect in advance, a crucial step that is frequently overlooked. Many testers also fail to realize that testing several goals and variations at once can increase errors. And, finally, data pollution may cause false positives.
It’s also important to understand that an A/B test requires more than just creating a tweaked variation of the original page. It’s about identifying a problem and providing a hypothesis, backed by data, and then testing a new experience based on that foundation.
In other words, A/B testing is a rich person’s game. It requires enough traffic, experience, and information to test quickly and with high confidence. If you don’t have the necessary resources to set up A/B tests correctly, you are better off avoiding the risk of operating under false assumptions.
Okay, A/B testing is hard, but why did Optimizely opt out?
No one can deny that A/B testing has the potential to make an impact, but it is not the jack-of-all-trades CRO marketers had hoped it would be. It seems Optimizely wants to shield itself from dissatisfied users who simply can’t achieve what they had hoped for. In this light, the push toward more sophisticated experimentation seems inevitable.
However, the above email does not explain why Optimizely has chosen to move to an enterprise-only business model. After establishing market dominance in website optimization, it seems Optimizely has decided to no longer cater to a large chunk of its original user base, leaving many looking for an alternative.
In fact, with Optimizely X, the company hints at where we should all be looking next: personalization. Creating a custom user experience and tailoring the site to each user individually is something no longer reserved for big-data companies. Although personalization is often associated with complex algorithms, some Optimizely competitors have already made it available to SMEs.
What are the alternatives to Optimizely?
If your heart is really set on A/B testing, take a look at the free version of Google Optimize. It offers a powerful editor for both visual editing and more advanced changes through code editing. Google Optimize is deeply integrated into the Googleverse, which makes performance analysis using Google Analytics and optimization for AdWords campaigns particularly easy.
Potential downsides: you can run no more than three concurrent experiments, and the tool doesn’t offer audience targeting, so audience-based experimentation and personalization are not its strengths.
VWO offers an extensive feature palette that covers all kinds of optimization use cases.
If you are looking for a testing platform that is more powerful than Google Optimize, VWO has got you covered for $49/month. Its optimization and personalization solution will set you back $299/month.
There is a catch, though. Because the VWO snippet loads asynchronously, fast-loading websites might show a FOUC (flash of unstyled content) that could potentially pollute experiments. Still, having all the tools — testing, surveys, and personalization — wrapped up into one platform certainly facilitates optimization.
If A/B testing has not brought you any luck in the past, it might be a good idea to invest in a tool that helps you better understand user behavior.
Starting at $89/month, Hotjar offers a wide range of solutions, such as heatmaps, visitor recordings, and feedback polls that help you uncover hidden opportunities. Bear in mind that you might need additional tools to put your new ideas into action.
If you are just looking for a new A/B testing tool, we recommend going with Google Optimize. Even though it is limited to three concurrent experiments, that should be plenty for most low-traffic websites.
For marketers who feel their tests might be failing due to a lack of insights into user behavior, a heatmap and user recording tool such as Hotjar will be a good option.
You might also want to check out Best Personalization Services — a website that lists, reviews, and rates different personalization solutions.
* Using https://builtwith.com/ we analyzed the number of websites that installed the Optimizely script between March 1, 2015 and August 1, 2016 — the time range when the free Starter plan was available. Next, we subtracted the number of paying Optimizely users (as estimated by market insiders), leaving us with about 70,000 websites affected by the change.
Yvonne Koleczek is CMO at Unless. She writes about personalization and growth marketing.