So we set out to find proof that companies are successfully leveraging data to get real results: more revenue or lower costs.
That search will culminate in our upcoming DataBeat event next Monday and Tuesday, May 19 and 20. The event's sole focus is real cases in which businesses get ahead by harnessing and analyzing all of their data, then turning the results into profit with the help of a new generation of analytics tools.
And we found quite a few. So don’t despair — instead, consider coming to DataBeat next week and get a front-row seat to hear how some of the best businesses in Silicon Valley and beyond are using data to get ahead.
Note: Fewer than 50 tickets remain for DataBeat. Grab yours today.
We’re not aware of any other event out there that has focused on real cases of data usage like this.
Here are just 10 of the stories we’ll showcase at DataBeat that could very well change the way you do business:
Case study #1: 500% increase in conversion
MindJet, a San Francisco company that makes collaborative work management software, says it used software from business-intelligence company Looker to increase lead conversions among trial users by a whopping 500 percent. At DataBeat, MindJet data scientist Anna Gordon will describe how she did it: She used Looker to track visitors to the company's trial product site, watching how often they logged in and what kinds of data they accessed, then ran a regression analysis to predict each visitor's propensity to buy the company's software. The sales staff zeroed in on those most likely to buy and converted them to paying customers at five times the previous rate, the company says.
As part of her research, Gordon also noticed that MindJet customers became less engaged with the product after the first two weeks of use, so she built a re-engagement strategy that also helped boost loyalty. She has since spread use of the Looker product across her marketing, product, and finance teams. The new infrastructure let her keep her data on premises at MindJet, keeping it secure. It also let her change queries immediately and gave her more granular control over the data than she could get with other tools she'd used, such as Jaspersoft or Tableau. Looker also aggregated data from the various databases MindJet used, from Salesforce to NetSuite, Heroku, MongoDB, MySQL, and Eloqua, sparing her the laborious work of writing SQL queries for each silo. MindJet says 80 percent of Fortune 500 companies are now its customers.
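Gordon's approach, scoring trial users on engagement signals and ranking them by predicted propensity to buy, can be sketched as a simple logistic regression. The features, user names, and numbers below are invented for illustration, not MindJet's actual model or data:

```python
import math

# Illustrative trial-user data: (logins_per_week, datasets_accessed, converted).
# These numbers are made up for the sketch; MindJet's real features differ.
trials = [
    (1, 0, 0), (2, 1, 0), (1, 1, 0), (8, 5, 1),
    (7, 4, 1), (3, 1, 0), (9, 6, 1), (2, 0, 0),
]

def sigmoid(z):
    z = max(min(z, 60.0), -60.0)  # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

# Fit a logistic regression with plain stochastic gradient descent.
w = [0.0, 0.0]  # weights for the two engagement features
b = 0.0
lr = 0.1
for _ in range(2000):
    for logins, datasets, converted in trials:
        p = sigmoid(w[0] * logins + w[1] * datasets + b)
        err = p - converted
        w[0] -= lr * err * logins
        w[1] -= lr * err * datasets
        b -= lr * err

def propensity(logins, datasets):
    """Predicted probability that a trial user will buy."""
    return sigmoid(w[0] * logins + w[1] * datasets + b)

# Sales can then rank leads and contact the most likely buyers first.
leads = {"alice": (8, 4), "bob": (1, 0), "carol": (4, 2)}
ranked = sorted(leads, key=lambda name: propensity(*leads[name]), reverse=True)
print(ranked)  # → ['alice', 'carol', 'bob']
```

The ranking, not the raw probability, is what matters here: it tells the sales team whom to call first.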
Case study #2: 85% revenue increase
LinkedIn’s Simon Zhang, director of business analytics, will explain at DataBeat how Tableau’s data visualization software helped the social networking company grow its business services revenue by 85 percent year over year. LinkedIn did this by getting the easy-to-use analytics tools into the hands of key sales and other employees and having them pull data from multiple sources, from Salesforce to Hadoop.
Case study #3: $1 million cost savings
NYSE’s chief data officer Steven Hirsch will talk about how the leading stock exchange has saved at least $1 million by storing its more than 10 petabytes of data in Hadoop, a fast-growing tool companies use to manage large volumes of data. Hadoop is a cost-effective way to store petabytes of data and then run analytics on it. Previously, NYSE stored its data in more expensive legacy storage products, where it was difficult to pull data sets for use by modern big data applications. The legacy storage also required database administrators to load and transform data sets.
The new Hadoop-based technology, by contrast, gives the company’s data scientists self-service access to the data. Beyond the savings from using less expensive products, NYSE has also realized “massive time savings.”
Case study #4: $16,500 savings per employee
News Corp will talk about the cost savings it expects from RelateIQ’s “relationship-intelligence” software. News Corp chief technology officer Paul Cheesbrough will join RelateIQ co-founder and chief technology officer Adam Evans in a fireside chat about the new rollout of this technology. While Cheesbrough won’t be ready to reveal specific numbers, RelateIQ reports that its customers in general are seeing the following:
- 2 hours saved in status meetings per week
- 3.3 hours saved logging contacts per week
- 5+ total hours saved per employee per week = $300+ saved per employee per week
- $16.5k saved per employee in a year
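The yearly figure follows from the weekly one. A back-of-the-envelope check, assuming a fully loaded employee cost of about $63 per hour (our assumption, not a RelateIQ figure):

```python
# Back-of-the-envelope check of RelateIQ's reported metrics.
# The $63/hour loaded labor cost is an assumption, not a RelateIQ figure.
hours_saved_per_week = 5.0    # "5+ total hours saved per employee per week"
hourly_cost = 63              # assumed fully loaded cost per employee-hour
weekly_savings = hours_saved_per_week * hourly_cost  # ≈ $315, matching "$300+"
yearly_savings = weekly_savings * 52                 # ≈ $16.4k, matching "$16.5k"
print(weekly_savings, yearly_savings)
```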
Case study #5: 75% reduction in customer support requests
AirBnB, the fast-growing company that lets people open up their homes for short-term stays, will explain how it harnessed huge amounts of data from its customer feedback logs to drive changes to its product, driving revenue upward. At DataBeat, AirBnB’s head of data science, Riley Newman, will explain how the company’s initial popularity led to a barrage of customer support requests, and why the company decided to make lowering that contact rate a key metric of success for its product team.
Over the course of a couple of months, the team built out its own infrastructure on top of the support desk software Zendesk, which allowed the customer support data to be shared between teams. This fostered better internal communication about how to improve the site. The company managed to drive down customer support contacts by 75 percent in three months. AirBnB’s Newman said the effect was to make the site more scalable, which led to more revenue.
Case study #6: 10-15% increase in average order value
Bonobos, a men’s clothing shopping site, will explain how its bet on an all-cloud architecture from technology provider GoodData helped it increase revenue. In one example, on the Monday after Thanksgiving last year (Cyber Monday), the largest online shopping day of the year, Bonobos found that two-thirds of the way through the day its revenue was trending upward but still behind its sales goals. Using GoodData, the company’s employees viewed the real-time distribution of the dollar values of all of its orders, and the team formulated promotions at the pricing tiers customers appeared to find most attractive.
“We were able to boost average order value by ten to fifteen percent,” said David Glueck, the company’s senior director of data science and engineering.
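The real-time view Glueck describes, an order-value distribution used to pick promotion tiers, can be sketched in a few lines. GoodData computes this on live data at scale; the order values and $50 tier width here are invented for illustration:

```python
from collections import Counter

# Invented Cyber Monday order values in dollars; not Bonobos' real data.
orders = [45, 88, 92, 110, 95, 60, 150, 98, 105, 52, 97, 89, 240, 101, 93]

def tier(value):
    """Bucket an order into a $50-wide pricing tier, e.g. 92 -> 50 ($50-$99)."""
    return (value // 50) * 50

# The histogram a live dashboard would show.
histogram = Counter(tier(v) for v in orders)

# The most popular tier is where a promotion is likeliest to move volume,
# e.g. a "spend $N more for free shipping" offer pitched just above it.
popular_low, count = histogram.most_common(1)[0]
print(f"most orders fall in ${popular_low}-${popular_low + 49} ({count} orders)")
```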
Case study #7: $300,000 in upfront cost savings
Domino’s Pizza found that it could save more than $300,000 in upfront costs in its IT department by replacing its legacy technology with software from a big data company called Splunk. It used the Splunk software to collect and monitor data from its servers and applications more cost-effectively. The savings led Domino’s to expand its use of Splunk across the organization, including in its business and marketing units, which Domino’s says let its employees collect online sales data logged across its network of more than 10,000 stores.
The Splunk product allowed Domino’s to more quickly fix network bandwidth and latency hiccups, Internet connection problems, and payment processing issues. For marketing, Domino’s can analyze the data to determine, say, which promotional coupons are proving popular among customers, and how to adjust those that aren’t working as well, in order to bring in more revenue. Previously, Domino’s marketing department had to pull reports from the company’s data warehouse, but those came well after the fact, too late for meaningful action to be taken.
The infrastructure changes helped Domino’s realize its most successful Super Bowl campaign in 2013.
Case study #8: 5X increase in Facebook referrals
The Guardian, the third largest English-speaking newspaper website in the world, with 5 million unique visitors per day, built an in-house analytics system called Ophan. The system helped it improve its search engine optimization (SEO) and drive more traffic through social media like Facebook, thus boosting revenue through the advertising on the resulting page views.
Graham Tackley, the Guardian’s head of architecture, will explain how his team built Ophan on top of Elasticsearch, an open source search and analytics engine. The technology allows employees across the company, including editors, journalists, and the SEO team, to see in real time how users are interacting with the content. The engine helped the team see that Facebook was better than Twitter at driving traffic back to the Guardian’s site, and which headline styles and topics worked best when posting articles to the company’s Facebook pages.
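At its core, what Ophan surfaces, which referrer drives how much traffic to which article, amounts to a running aggregation over a stream of pageview events. A minimal pure-Python stand-in (Elasticsearch does this at scale over an index; the event data here is invented):

```python
from collections import defaultdict

# Invented pageview events of the kind Ophan indexes in Elasticsearch.
events = [
    {"path": "/politics/a", "referrer": "facebook.com"},
    {"path": "/politics/a", "referrer": "facebook.com"},
    {"path": "/sport/b", "referrer": "twitter.com"},
    {"path": "/politics/a", "referrer": "facebook.com"},
    {"path": "/sport/b", "referrer": "google.com"},
    {"path": "/culture/c", "referrer": "facebook.com"},
]

# Tally traffic per referrer, and per (referrer, article) pair, as events arrive.
by_referrer = defaultdict(int)
by_referrer_article = defaultdict(int)
for e in events:
    by_referrer[e["referrer"]] += 1
    by_referrer_article[(e["referrer"], e["path"])] += 1

top = max(by_referrer, key=by_referrer.get)
print(top, by_referrer[top])  # → facebook.com 4
```

The same tallies, broken down by headline variant, are what tell an editor which phrasing is winning on Facebook right now.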
Incorporating these lessons, the Guardian’s team drove a five-fold increase in Facebook referrals over the last few months. And over the last two years, unique visitors have increased by more than 50 percent, to just over 100 million last month, with improved analytics a “significant factor” in that increase, according to Tackley.
Case study #9: New lines of business
RMS, a 1,200-employee Silicon Valley company that helps insurance and financial companies model their exposure to catastrophic risk, realized it could grow its revenues by offering customers more real-time access to its risk modeling. So it set out to build a cloud-based product on top of MongoDB, a database that allows RMS to store and easily access its more than 100 billion documents a year, representing hundreds of terabytes of data.
At DataBeat, RMS executives will talk about how, previously, its customers had no easy way to access data scattered across their various business units. With the rollout of RMS’s new product, customers can access it all in one place and build their own products without RMS’s involvement. Further, by getting clearer upstream and downstream views into the risks of various industries, RMS is using the platform to push into new lines of business, including non-catastrophic modeling, as well as serving the health and aviation industries.
Case study #10: Smarter marketing spend
MarketShare, a company that helps marketers allocate their spending across different types of media, will explain at DataBeat how it used a technology based on Hadoop to help its customers increase results, thus boosting its own business.
MarketShare got help from Altiscale, a Silicon Valley company that delivers Apache Hadoop as a cloud service. Altiscale helped MarketShare track data from up to 150 different sources simultaneously to build a sophisticated predictive model for its customers. The model gives those customers insight into results per dollar for TV, online, and print campaigns, so they can decide where to spend their marketing dollars.
It also lets the customers customize the modeling themselves. MarketShare says it works with 70 of the Fortune 500 companies, including Adobe.
Stay tuned for more DataBeat program announcements in the coming week. In the meantime, check out the event details — including our full program and lineup of 70 data visionaries — on the DataBeat website.
Seats are very limited. We expect a full sellout in the next few days. Reserve yours now!