One area of AI that’s red-hot is virtual agents — smart software that companies are building to chat with their customers through text, voice, or a web chat box. IBM says it has emerged as the only serious provider of this technology for the enterprise. Rob Thomas, general manager at IBM overseeing data and AI, recently sat down with VentureBeat founder Matt Marshall for an interview.
Thomas called the rest of the virtual agent providers “fireflies,” since there are so many of them, and predicts that in three years 100% of companies will have virtual agents.
Here’s an edited transcript of our conversation:
VentureBeat: IBM is using the Watson brand for its AI offerings, but it covers so much that it can be confusing. Can you break it down for us?
Rob Thomas: There is a misperception of what Watson actually is. Watson is three things:
One, it’s a set of tools for companies that want to build their own AI. So if you’re a builder, you need a studio for building models, you need a place to deploy them, [and] you need to build and manage the lifecycle of your models and understand how decisions are being made. You need human features like speech and voice and vision.
Probably the least-known fact is that 85% of the work that happens in Watson is open source. People are building models in Python, deploying in TensorFlow — that type of thing.
Second, Watson is a set of applications. We’ve seen some problems that are common enough where we said “Let’s package this up into an application so that the average human can buy it and use it.” So, there’s Watson Assistant for Customer Service — the Royal Bank of Scotland is a customer example. We’ve got something called Watson Discovery, which is basically understanding your unstructured data, or semi-structured data. Think of that as text indexing, understanding documents, understanding PDFs. We’ve got RegTech with Watson for things like know-your-customer, anti-money laundering, and operational risk. There’s Watson Planning and Budgeting.
Third, Watson is embedded AI, which makes it easy for us or another company to embed our AI in their product. A good example of that is where we’ve worked with a company called LegalMation. They’re trying to automate the whole legal discovery process. They would say that they can now do in a couple of hours what it takes a lawyer 30 days [to do], because they’ve embedded Watson in their application to do document discovery. So this third part is just embedding AI in other applications.
VentureBeat: The virtual assistant market is really hot. Would you say Watson Assistant is the most successful of your AI applications?
Thomas: Not necessarily. Watson Discovery has been out for a few years because people have been trying to get more out of their data for a long time. But Watson Assistant is probably the hottest area. Virtual agents are probably one of the few things that most companies don’t have, and I’m confident saying 100% of companies will have in the next three years.
VentureBeat: How do you distinguish your assistant from other chatbot applications?
Thomas: Watson Assistant is a virtual agent. I would distinguish that from chatbots, which are mostly rules-based engines. That’s not what we do with Watson Assistant. At the core of it is a model for intent classification. So we do a really good job of understanding intent. Just based on the questions you ask, we can get a feel for what you’re trying to accomplish. That’s kind of the secret sauce.
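To make the distinction concrete: a rules-based chatbot matches exact patterns, while intent classification scores what the user is trying to accomplish. The sketch below is purely illustrative and assumed, not Watson Assistant's actual model; it uses simple word overlap against invented example phrases, where a production system would use a trained statistical model.

```python
# Illustrative intent classifier (assumed toy example, NOT Watson's model):
# score each known intent by word overlap with the user's utterance.
from collections import Counter

# Hypothetical intents and training phrases, invented for this sketch.
INTENT_EXAMPLES = {
    "reset_password": ["I forgot my password", "reset my password please"],
    "check_balance": ["what is my account balance", "show my balance"],
}

def classify_intent(utterance: str) -> str:
    words = set(utterance.lower().split())
    scores = {}
    for intent, examples in INTENT_EXAMPLES.items():
        # Count how often each word appears across this intent's examples.
        vocab = Counter(w for ex in examples for w in ex.lower().split())
        scores[intent] = sum(vocab[w] for w in words)
    return max(scores, key=scores.get)

print(classify_intent("please reset my password"))  # reset_password
```

The point of the sketch is that nothing here depends on the user phrasing a question one fixed way, which is what separates intent models from keyword-triggered rules.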
VentureBeat: How would you describe the competition you have in the virtual agent area?
Thomas: It’s a $2.5 billion market … and so it’s gotten people’s attention.
What’s interesting about it is that there really are no big players, except for us. There are thousands of fireflies that do one piece of it. It’s an incredibly fragmented market.
I could start a chatbot company and launch in two weeks, because it’s actually that easy to do the basics. It’s much harder to do anything beyond that. In probably half of the cases where a customer is using Watson Assistant, that customer started with some off-the-shelf chatbot, and then they see they need more. Now they want to do this multi-channel [chat, voice, email, etc.], or now they want to connect this to all of their data sources, or now they want simultaneous conversations with what could be 100,000 simultaneous users.
As I mentioned, we differentiate by offering intent classification. Beyond that, we’ll ask “Do you understand your data?” Because in customer support, the answer exists somewhere — maybe it’s with a human, maybe not — but can you find it? Can you index large amounts of data across multiple repositories, across multiple clouds, if your data is stored on different clouds?
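The "can you index large amounts of data" question comes down to building a searchable index over documents. The toy inverted index below is an assumed illustration only (the document names are invented, and Watson Discovery's internals are not described in this interview); it shows the basic mechanism of mapping each word to the documents that contain it.

```python
# Toy inverted index (illustrative assumption, not Watson Discovery):
# map every word to the set of documents that contain it.
from collections import defaultdict

def build_index(docs: dict) -> dict:
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

# Hypothetical repository contents, invented for this sketch.
docs = {
    "faq.pdf": "how to reset a password",
    "policy.txt": "refund policy details",
}
index = build_index(docs)
print(index["password"])  # {'faq.pdf'}
```

A real deployment would layer ranking, PDF parsing, and federation across clouds on top of this basic structure, which is exactly the gap Thomas describes between a demo chatbot and an enterprise assistant.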
Here’s another way we differentiate, by the way: Any competitor can do hyperparameter optimization, but nobody other than us can do feature engineering. With something called AutoAI, we can automate feature engineering that cuts down 80% of the data science work. You might build your model open source, you might deploy open source, but you can use AutoAI to get it into production.
VentureBeat: Is there a level of sophistication or size a customer needs before Watson can be successful?
Thomas: If you only want to serve basic tasks, like letting customers reset their passwords, you don’t need us. Any rules-based engine can do that. If you want to get to any level of interaction or decision-making or understanding [or] intent, then you’re going to need us. Most of the fireflies will serve, you know, 10 questions that they can teach the assistant to answer. But what happens when 10 questions becomes 500 questions? That’s when you need us.
VentureBeat: How about Amazon, Google, or Microsoft? Are they competitors at all?
Thomas: They only serve companies that are already working on their public cloud. But that’s a market that I don’t even really play in.
VentureBeat: What market do you play in?
Thomas: The customer that says “I’ve got a bunch of data on-premise. I’ve got data in AWS, in IBM cloud, in Azure, in Google. I need an engine that can federate all these different data sources.” That’s where IBM would play.
VentureBeat: I see, so you see yourself as the only independent player, not forcing a customer to use your particular cloud … How about Microsoft’s virtual agent and Arc announcements at Ignite a few weeks ago, where it said it would allow its Azure cloud products and management to be taken to multiple clouds?
Thomas: Today, anyone can go to any cloud and deploy Watson. While we have seen other announcements, we are not aware of anyone else’s ability to run AI from other companies on any cloud.
VentureBeat: You took over the AI business in January. We’ve seen the announcements lately, including advancing Watson Anywhere, and the customer wins. Where do you feel you’ve seen the most traction?
Thomas: The first big move we made was to announce Watson Anywhere, which we announced in February. And just to be clear, prior to that announcement the only place you could use Watson was on the IBM public cloud. So when we announced Watson Anywhere, that was our decision to say Watson should be wherever the data is, whether that’s on AWS, Azure, Google, Alibaba cloud, [or] on-premise. We’ve had massive momentum since that.
VentureBeat: What’s stopping these other folks — Amazon, Google, Microsoft — from doing the same?
Thomas: They have a little bit of a strategy tax with that. Their hybrid cloud strategy is “We’ll help you on-premise, as long as you only connect back to our public cloud.” So it’s a one-way street.
That’s very different from us saying whatever you do with IBM on-premise can run on AWS, Google, or Azure or IBM Cloud. Or you could federate across those. So we’re the only company saying we’re public cloud-independent. That was the whole point of what we did with Red Hat and how we’re using Red Hat OpenShift as … kind of the common denominator, across cloud. That’s unique to us.
VentureBeat: How big of a value proposition is that exactly? How hard is that to move from a cloud, once you’re on AWS?
Thomas: It’s literally impossible. So just think about it for a second: If you build something on AWS, you’re stitching together proprietary APIs. You don’t actually own anything. You’ve rented your entire application and data infrastructure. So it’s not like “Hey, there’s a cost, but we can move it.” There’s literally no way to move it, because those proprietary APIs aren’t on another cloud, which gets to our whole strategy, which is “Hey, you can do that same thing, but if you’re doing it using Red Hat, then it becomes really easy to move, because you can write it one time, you can build on the binaries that are provided through Red Hat (which are open by definition), and then you have full portability.” So it’s a pretty key point.
VentureBeat: When do you think this advantage will start showing up in IBM’s earning results?
Thomas: Last quarter, we said publicly that Red Hat’s growth accelerated from 14% to 20%. I don’t think that’s a coincidence.
VentureBeat: What areas of AI would you admit that Google, Amazon, Microsoft own?
Thomas: They all have home speakers, so they’re going to do better at voice than we will.
Anything social media-related, they’re probably going to do better. But the enterprise applications for voice and images are teeny, like non-existent.
So that doesn’t really bother me. In terms of languages and speech, we’ve largely partnered for that, but again that’s not really the interaction model that I see in enterprises. We’re prepared if we have to go there, but it’s not a focus area.
VentureBeat: Why is AI deployment so hard? While something like 90% of CIOs are aware of AI’s potential, only 4% of companies had deployed it last year, according to a Gartner CIO report.
Thomas: Gartner said that deployment number is up to 14% this year. Regardless, why is that? That was my first big thought process as I picked up Watson. I think it comes down to three things. One: Data — inaccessible data, data that’s not in a usable form, data spread across multiple clouds. Two: Skills are an inhibitor. Most companies don’t have the data scientists that they need to get something into production. Three: Trust — meaning companies have this fear factor around AI.
Until there’s a breakthrough in those three areas, AI adoption is going to be slow. And so our strategy has been focused around those three things. First, connect AI to the data. That was Watson Anywhere. Bring your AI to wherever your data is.
Second, on skills, I built a team of 100 or so data scientists whose only job is to go help clients get their first model into production. That’s been a huge success. Companies like Harley-Davidson use that team; Nedbank uses that team; Wunderman Thompson, which is part of WPP, uses that team. Clients just need a kickstart, something that gets them going. And then they’re self-sufficient. (Editor’s note: Read our piece about how this “elite” AI SWAT team works.)
Third, one of our biggest product investments has been on trust, where Watson has delivered the ability to do data provenance (knowing where your data comes from), manage the lifecycle of your models, manage bias in your models, and handle things like drift and anomaly detection — all the things that people worry about as they start to scale AI environments.
VentureBeat: Does it bother you that IBM is often considered a legacy player, at least when it comes to Silicon Valley’s investor and startup ecosystem?
Thomas: Someone came up to me recently and asked “How do you attract people to IBM?” My comment was simple. It’s like, that’s the easiest thing I do, because most people that work in AI want to have their code in the hands of as many people as possible. So what better place than IBM, where you’re going to have distribution in 180 countries. All the big companies in the world use our products. If you want your fingerprints on the world and AI, I don’t think there’s a better place to be.
I think we’ve got a pretty good position. We’re not into image recognition; that’s just not what we do. I’d say the core technology behind what we do is natural language processing [NLP]. For what I’ll call “enterprise AI,” NLP will determine the winners and losers, because language is how companies operate, whether it’s through text or through voice or interaction or conversation, whatever it is.
Most of our tech for that is coming out of IBM Research. Earlier this year, we showcased IBM Debater, which was a computer that would debate humans. Some of that core NLP technology we’re now bringing into some of the products that I mentioned, like Watson Assistant and Watson Discovery. Being able to reason and understand will be fundamental to your AI.
See a response from the “fireflies” here.
Also, Amazon responded to this article, saying there were several inaccuracies. According to an Amazon spokesman:
AWS services offer our customers intent classification and many other features through Amazon Lex, the same technology that powers Amazon Alexa. It is inaccurate to portray IBM as the only company able to do this or as the only “big player.”
To claim that no one besides IBM can automate feature engineering is not accurate. In fact, the recently launched Amazon SageMaker Autopilot goes further than automated model building (including data prep, feature engineering, and more) by offering developers unprecedented control and visibility into their models.
It’s inaccurate to say it’s impossible to move away from Amazon’s cloud. AWS APIs are similar to other common cloud APIs and come with much less lock-in for our customers than running apps on premises, like those sold by IBM. AWS APIs promote workload mobility for our customers. For example, which has more lock-in? Self-managed MySQL running on AWS or a financial app running on [IBM’s] System Z? Customers self-managing open source software on AWS have much less lock-in than with WebSphere or DB2, as another example.
Several AWS service APIs are available without “already working” on AWS.