How your bottle of beer found its way into the fridge when you sat down last weekend to watch a football or basketball game on TV is hardly of consequence when you’re looking to relax. What you know is that you picked up a six-pack at your local market. But the circuitous route it traveled, and the IT that activated the supply chain before you popped off the top, is a story worth telling.
Anheuser-Busch InBev SA/NV, commonly known as AB InBev, is a multinational drink and brewing company based in Leuven, Belgium. The original InBev global brands are Budweiser, Corona, and Stella Artois. Its international brands are Beck’s, Hoegaarden, and Leffe. The rest, including Michelob Ultra, are categorized as local brands. According to Statista Research, the world’s largest beer producer is projected to report 2021 sales of more than $46 billion, twice that of its top competitor, Heineken Holdings ($22 billion).
AB InBev and its IT group, Bees, make sure all of their customers, 6 million small-to-medium-sized retailers in 150 countries, have enough of those bottles and cans on hand, so you can select the brand you prefer off the shelf at any time of day.
Bees is an ecommerce and software-as-a-service (SaaS) company (iOS, Android, and web) created by AB InBev in 2016. As of Q3 2021, it had 2.1 million SMBs as monthly active users, more than $13 billion in gross merchandise value (GMV) through the first nine months of 2021, and more than 1.6 million orders per week. Bees’ application empowers users to place orders anytime, anywhere. They can order an array of products, earn rewards where applicable, gain insights on how their business is performing, and have flexibility with respect to delivery dates and times. The app also enables personalized order recommendations to each customer, all powered by cutting-edge AI models.
Bees, which isn’t an acronym but a play on many “b’s” (business-to-business, beverages, Busch, etc.), claims to be the first team to enable revenue management to run autonomously via machine learning.
“We have direct distribution in many markets,” Jason Lambert, senior vice president of product management at Bees, told VentureBeat. “The way we always used to sell the beer was physically showing up once a week (at each location); you know, putting in an order, meeting with the retail owner; then, a few days later, the order would arrive. Customers would use Excel files to do analysis. So we’re tasked with reimagining what this new route-to-market model looks like.”
Bees was able to look at that entire value chain and update it “so we still have representatives going in and visiting, but their tasks and objectives are different,” Lambert said. “So where it once was, ‘Hey, place an order now,’ the customers themselves now are empowered to order what they want, when they want. Now the business development reps go in and are tasked with helping these customers grow and do testing that will yield better business outcomes for them.”
How the AI factors into the solution
Andrew Murray, senior VP of Revenue Analytics and Finance, said that Bees supplies ML and AI tools to solve specific customer problems, and the selection of those tools depends on the problem at hand. “It is easy to get lost in AI hype and theory, but I think we’ve done a good job of letting the customer pain point we’re trying to solve determine the sophistication and complexity of the data science we apply,” Murray said. “As a rule of thumb, we start with simplicity and layer on complexity only where it can truly enhance the solution for our customers.”
Bees’ ML and AI models provide digital customers with daily, personalized inventory replenishment recommendations, with ideas for new products to try, along with promotional offerings. Arena-ai was engaged to handle the AI recommendation engine.
The development and deployment of AI
In an interview, a group of senior leaders at Bees, including Lambert and Murray, along with Pratap Ranade, CEO of Arena-ai, shared insights for data architects, technologists, and developers alike:
VentureBeat: What AI and ML tools are you using specifically?
Ranade: Specifically, Arena builds machine-learning systems underpinned by Active Learning, a newer branch of AI that we think is going to be transformational. Essentially, active learning systems work by being “curious” – they continuously try new things, making them better at adapting to and operating in a fast-changing, complex world. Most AI today is passive. It learns by looking at billions of examples of labeled data, like tagged faces. Active learning systems improve through interactions with their environment (both real and simulated environments); think Neo from “The Matrix” learning Kung Fu by fighting Morpheus over and over in a virtual dojo.
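The "curiosity" Ranade describes can be illustrated with a minimal pool-based active learning loop using uncertainty sampling. This is a generic sketch of the technique, not Arena-ai's system; the synthetic dataset, the logistic-regression model, and the query budget are all assumptions made for illustration.

```python
# Minimal pool-based active learning via uncertainty sampling.
# Illustrative only: dataset, model, and budget are invented here.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

labeled = list(range(10))                      # small labeled seed set
pool = [i for i in range(500) if i not in labeled]

model = LogisticRegression(max_iter=1000)
for _ in range(20):                            # each round: train, then query
    model.fit(X[labeled], y[labeled])
    probs = model.predict_proba(X[pool])
    uncertainty = 1 - probs.max(axis=1)        # low max-prob = high uncertainty
    pick = pool[int(np.argmax(uncertainty))]   # most uncertain pool example
    labeled.append(pick)                       # "ask the environment" for its label
    pool.remove(pick)

accuracy = model.score(X, y)
```

The key contrast with passive learning is the query step: rather than training on a fixed labeled corpus, the model chooses which examples to learn from next, spending its labeling budget where it is least certain.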
Using active learning, our software provides personalized, adaptive intelligence to help AB-InBev’s customers automatically receive personalized limited-time offers.
VentureBeat: Are you using models and algorithms out of a box — for example, from DataRobot or other sources?
Murray: Open source models are a great jumping-off point that allows us to spin something up quickly and establish an initial benchmark for model performance. However, we found we always needed to make some adjustments to these models given the size and complexity of our business. After layering heuristic business rules onto these initial models, we continue to push ourselves to evolve our approach and build new and increasingly robust ways to achieve performance improvement of our models. We’re never fully satisfied with our results, and continue to eagerly explore ways to make incremental improvements that can further unlock value for our customers and improve our service to them through ML.
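The pattern Murray describes, an open-source model as a benchmark with heuristic business rules layered on top, can be sketched as follows. The toy data, the minimum-order rule, and the case-rounding rule are invented for illustration and are not Bees' actual logic.

```python
# Hedged sketch: off-the-shelf baseline model + heuristic business rules.
# All data, rule names, and thresholds below are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy weekly order history: [week_index, prior_units] -> units ordered
X_train = np.array([[1, 10], [2, 12], [3, 9], [4, 15], [5, 14]])
y_train = np.array([11, 12, 10, 14, 15])

baseline = LinearRegression().fit(X_train, y_train)

def recommend(features, min_order=6, case_size=12):
    """Baseline prediction, then business rules layered on top."""
    raw = float(baseline.predict(np.array([features]))[0])
    qty = max(raw, min_order)                  # rule: never suggest below minimum
    qty = case_size * round(qty / case_size)   # rule: round to whole cases
    return int(qty)

suggestion = recommend([6, 13])
```

The division of labor matters: the open-source model supplies the statistical estimate quickly, while the rules encode domain constraints (minimum orders, case packs) that a generic model knows nothing about, which is the "adjustment" step Murray says was always necessary.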
Ranade: A lot of the technology we’re using at Bees is at the bleeding edge of machine-learning research, which requires us to build advanced and custom machine learning systems. Out-of-the-box models and autoML systems like DataRobot are fantastic at democratizing access to machine learning and making it easy and inexpensive to deploy, but they are not well suited for places where a higher-performing model matters. Bees operates in 13 distinct markets, selling a complex product and customer portfolio, against a changing backdrop of shifting consumer preferences, price elasticity, and supply-chain shocks exacerbated in a post-COVID-19 macro landscape. For the use cases we’re tackling with the Bees team, the incremental impact of algorithmic selling is so significant that it more than justifies the development and fine-tuning of advanced active learning models.
That being said, we are huge fans of open source ML tooling and are power users of many of the biggest frameworks – e.g., PyTorch, Scikit-Learn, Pandas, etc. – pushing these tools as far as they can take us and filling in the gaps ourselves whenever it is necessary.
VentureBeat: What cloud service are you using mainly?
Ranade: Like many companies, we rely heavily on cloud compute and storage, specifically AWS. Our core machine learning capabilities leverage several of their IaaS offerings. We store data securely in S3 with object-level encryption. We rely on distributed compute engines like Athena to process that data at scale and orchestrate our own containers for training and inference workloads via Elastic Container Service (ECS). Streaming technology like Kinesis allows thousands of our data ingestion workers to collect terabytes of data per day without data loss. Where off-the-shelf components don’t meet our evolving business needs, we have built custom software.
Murray: Building on what Pratap just mentioned, on the ML and AI side of our infrastructure, our stack is a fairly straightforward implementation of a Lakehouse architecture. As we are a small and growing team, we try to use managed services as much as possible to make efficient use of our human capital. Specifically, we are on Azure and utilize many of their managed services such as Data Factory, Data Share, DevOps, Event Hub, Blob containers, ADLS Gen 2, etc. Databricks is our primary analytics and data science engine.
VentureBeat: Are you using a lot of the AI workflow tools that come with that cloud?
Murray: We’ve absolutely tried them, but our use cases are so specific that it’s been hard to take advantage of these workflow tools to accomplish our objectives. At the end of the day, the challenge is that once you start to make adjustments, you can lose some of the intended functionality that the workflow tools were designed to deliver. So we’ve found you can quickly end up in a “no man’s land,” with a model that is not custom enough to get the exact functionality you want but is also unable to leverage the out-of-the-box efficiency the tools can deliver. Again, this gets back to the point that the nature of the problem should dictate the tool to be used.
VentureBeat: How much do you do yourselves?
Murray: As a relatively young team continuing to grow rapidly, we have to take a pragmatic approach to build vs. buy decisions, and we do so on a project-by-project basis. Bees has the great fortune of working every day with the operating businesses at ABI and with analytics teams that sit within the business at ABI as well. So we benefit from the scale, reach, and detailed understanding of our customers and their businesses. That said, we know we have areas in our tech analytics space where we need to continue to learn and grow to accomplish our big ambitions. Sometimes that learning is best facilitated working with a strategic partner like Arena, sometimes that is through building and being willing to learn through doing, and sometimes we decide that the capability is not as core to our value proposition or there is no competitive advantage to be gained from it, and so we rely on more transactional third-party arrangements. To me, the key is that we have clarity on what we are trying to accomplish and continue to revisit this allocation of human capital and resources to optimize our delivery.
VentureBeat: How are you labeling data for the ML and AI workflows?
Murray: For the core use cases my team is currently focused on, our data sets are typically very tractable financial or clickstream data. ABI’s commercial and analytics teams have dedicated a lot of time and thought to things like clusters for our customer base. So what we are working on today doesn’t really require as much labeling as other areas of ML, such as text or image. Of course, new use cases may emerge which will require us to upskill in labeling. When that moment arrives, we’ll approach it with a similar mindset of simplicity, focused on solving the problem at hand.
VentureBeat: Can you give us a ballpark estimate on how much data you are processing?
Ranade: As Jason mentioned, Bees processes 1.6 million orders per week, across 13 countries, and a whopping $13 billion in GMV flowed through the platform in the first 9 months of this year alone. These orders are composed of over 30,000 distinct SKUs, across over 2 million customers. Arena combines data from these transactions at the customer × SKU level with streaming in-app behavioral data to power both our deep learning and active learning models. All in, our models process more than 10 billion records each day and update in near real time, operating on streaming event data at over 50 Mbps. In terms of output, our models deliver 78 million personalized outputs to retailers through the Bees platform each day, updating and adapting every hour.
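The customer × SKU grain Ranade mentions is the familiar step of rolling raw transactions up to one row per customer-product pair before they feed a model. A small pandas sketch with invented column names and data shows the idea:

```python
# Illustrative only: rolling raw orders up to the customer x SKU grain.
# Column names and values are invented, not Bees' actual schema.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": ["c1", "c1", "c2", "c2", "c1"],
    "sku":         ["bud", "stella", "bud", "bud", "bud"],
    "units":       [24, 12, 36, 24, 12],
})

# One row per (customer, SKU): totals and order counts become features
# at the grain the recommendation models consume.
features = (
    orders.groupby(["customer_id", "sku"])
          .agg(total_units=("units", "sum"),
               n_orders=("units", "count"))
          .reset_index()
)
```

At Bees' scale (over 2 million customers times over 30,000 SKUs), the same aggregation runs on distributed engines rather than a single pandas DataFrame, but the output grain is the same.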