One of the key problems for businesses trying to make the most of their data is that the overwhelming majority of such data isn’t set up for machine processing. That’s where AI can help, according to IBM North America chief marketing officer Rashmy Chatterjee.
She said 80 percent of company data is unstructured, including free-form documents, images, and voice recordings. Traditionally, it hasn’t been possible for computers to understand this data. AI systems can transcribe conversations, interpret the content of images, and divine information from massive text files.
At the moment, computers miss key context, like emotion, speakers’ accents, and other details that humans take for granted, but machine learning systems like IBM’s Watson APIs can help computers access that information.
“[Watson] takes huge amounts of unstructured data, understands it, and uses that data to lay out hypotheses,” Chatterjee said during a presentation at VentureBeat’s MB 2017 conference. (Disclosure: IBM was a sponsor of MB 2017.) “And it says look, these are five answers, or these are a hundred answers. Against each of the hypotheses, we give you a confidence probability.”
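The "hypotheses with confidence" pattern Chatterjee describes can be illustrated with a toy sketch: candidate answers are drawn from unstructured data, each gets a raw evidence score, and the scores are normalized into confidence probabilities. The candidate names, scores, and softmax normalization below are all hypothetical illustrations, not IBM's actual scoring method.

```python
import math

def rank_hypotheses(scores):
    """Toy illustration of answer ranking: convert raw evidence scores
    for candidate answers into confidence probabilities via a softmax.
    (Hypothetical scoring; not Watson's real internals.)"""
    exps = {answer: math.exp(s) for answer, s in scores.items()}
    total = sum(exps.values())
    # Sort candidates by confidence, highest first
    return sorted(((answer, e / total) for answer, e in exps.items()),
                  key=lambda pair: pair[1], reverse=True)

# Hypothetical candidate answers with raw evidence scores
candidates = {"Answer A": 2.0, "Answer B": 1.0, "Answer C": 0.5}
for answer, confidence in rank_hypotheses(candidates):
    print(f"{answer}: {confidence:.2f}")
```

The confidences sum to one, so a consumer of the results can set a threshold (say, 0.7) below which the system defers to a human rather than answering automatically.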
“Watson” is the umbrella term IBM uses to refer to its set of intelligent products and services, which include tools for helping process unstructured data, like audio, images, and text.
IBM is far from the only company interested in helping businesses leverage unstructured data with AI. Microsoft, Google, Amazon, and others all have tools for analyzing image, text, and audio data. These tools are a key part of each company’s strategy for the future as they prepare to capture the next generation of business workloads.
Chatterjee highlighted Macy’s as an example of an IBM customer using the company’s AI tools to personalize shopping. The Macy’s On Call feature lets customers ask what’s in stock and get other key details about a store without a human sales associate present.
It uses Watson’s natural language understanding capabilities to process user queries and provide answers. Right now, that feature is available as part of a pilot in 10 Macy’s stores.
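At its simplest, handling a shopper's free-form question means mapping it to an intent the system knows how to answer. The sketch below uses naive keyword matching with made-up intents and keywords purely for illustration; Watson's actual natural language understanding is far more sophisticated.

```python
# Toy intent matcher for a store-assistant query.
# Intents and keyword lists are hypothetical examples.
INTENTS = {
    "stock": ["in stock", "available", "carry"],
    "location": ["where", "find", "aisle", "floor"],
}

def classify(query):
    """Return the first intent whose keywords appear in the query."""
    q = query.lower()
    for intent, keywords in INTENTS.items():
        if any(k in q for k in keywords):
            return intent
    return "unknown"

print(classify("Where can I find shoes?"))   # location
print(classify("Do you carry this jacket?")) # stock
```

A production system would replace the keyword lists with a trained classifier and add entity extraction (which product, which size), but the input/output shape is the same: unstructured text in, structured intent out.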