With more than 12,500 patents, eight Nobel Prizes, and a 140-year history of field-testing crazy ideas, it should surprise no one that AT&T is an important player in artificial intelligence.
“AT&T is a backbone of the internet,” explains Nadia Morris, head of Innovation at the AT&T Connected Health Foundry. The company manages wireless, landline, and even private secure networks to power connectivity for both individuals and corporations. All these networks generate incredible volumes of data that is ripe for machine analysis.
AT&T has built AI and machine learning systems for decades, using algorithms to automate operations such as common call center procedures and the analysis and correction of network outages. On the entertainment side, AT&T’s DirecTV division leverages users’ rating histories, viewing behaviors, and other factors to anticipate the next films they’ll watch.
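The article doesn't spell out how DirecTV's recommender works, but the general idea behind rating-based recommendations can be sketched in a few lines. The toy example below, built on an entirely hypothetical ratings matrix, scores unseen titles by their similarity to titles a viewer has already rated:

```python
import numpy as np

# Hypothetical user x movie ratings (0 = unrated).
R = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [0, 1, 5, 4],
], dtype=float)

# Item-item cosine similarity between rating columns.
norms = np.linalg.norm(R, axis=0, keepdims=True)
sim = (R.T @ R) / (norms.T @ norms)

user = 0
scores = R[user] @ sim          # weight every title by similarity to rated ones
scores[R[user] > 0] = -np.inf   # never re-recommend what's already watched
print("recommend movie", int(np.argmax(scores)))
```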
AI and data visualization help optimize 5G rollouts
Modern AI algorithms have enabled the telecom company to tackle even more complex tasks, such as optimizing the rollout of its 5G network. Traditional cell towers are usually suboptimally placed near urban centers and form an imperfect grid, leading to gaps in coverage. They are also expensive to build and maintain, and they raise real estate and property ownership challenges.
Small cells are less expensive and more compact and can be installed on inner-city buildings on a much finer grid. Their role is to repeat the signal from the main cell towers and bring it closer to end users. By crunching mobile subscriber data, well-calibrated AI can help create spatial models that home in on ideal spots to build small cells and ensure maximum 5G signal strength for customers.
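AT&T hasn't published the specifics of its placement models, but one common approach to this kind of siting problem is to cluster subscriber locations and treat the cluster centers as candidate sites. The sketch below runs k-means on hypothetical location pings purely for illustration:

```python
# A minimal sketch of one way to propose small-cell sites: cluster
# subscriber locations and treat cluster centers as candidate sites.
# The data and approach here are illustrative, not AT&T's actual method.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
# Hypothetical (longitude, latitude) pings from mobile subscribers.
subscriber_pings = rng.normal(loc=[-74.0, 40.7], scale=0.05, size=(10_000, 2))

# One candidate site per cluster of demand; k would normally be tuned
# against coverage and budget constraints rather than fixed by hand.
kmeans = KMeans(n_clusters=25, n_init="auto", random_state=0).fit(subscriber_pings)
candidate_sites = kmeans.cluster_centers_

for lon, lat in candidate_sites[:3]:
    print(f"candidate small cell at ({lon:.4f}, {lat:.4f})")
```

In practice the number of sites would also be constrained by terrain and radio propagation models, but clustering demand is a reasonable first cut.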
Designing the right 5G infrastructure is critical, especially given the rapid rise of video. “Video is more than half of our mobile traffic,” explains Chris Volinsky, who leads big data research at AT&T Labs. “Video traffic grew over 75 percent and smartphones drove almost 75 percent of our data traffic in 2016 alone. We expect video traffic growth to outpace overall data growth in 2020.”
Infrastructure is an enormous investment, even with small cells, so accurately modeling trends and usage growth is key to success. Demographic shifts can cause previously underutilized areas to suddenly become hot traffic generators. While statistical models are useful for identifying trends in customer movement and throughput, AI and machine learning techniques project future demand from current data.
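As a hedged illustration of the projection step, a simple log-linear trend fit can extrapolate traffic growth from historical usage. The numbers below are invented, and a production forecast would fold in demographics, seasonality, and far richer models:

```python
# A hedged sketch of projecting traffic growth from historical usage.
import numpy as np

# Hypothetical monthly video traffic for one service area, in petabytes.
months = np.arange(24)
traffic = 10 * np.exp(0.03 * months) + np.random.default_rng(0).normal(0, 0.3, 24)

# Fit a log-linear trend (i.e., exponential growth) and project a year ahead.
slope, intercept = np.polyfit(months, np.log(traffic), 1)
future_months = np.arange(24, 36)
projection = np.exp(intercept + slope * future_months)
print(f"projected traffic a year out: {projection[-1]:.1f} PB")
```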
“We need to visualize billions of data points in a spatiotemporal fashion,” Volinsky elaborates. No tools existed previously to address AT&T’s unique data challenges, so the company built and open-sourced custom tools, such as Nanocubes, a data visualization tool that can map out millions of connections of individual mobile phones and connected devices to cell phone towers. The tool has been used outside the company to characterize sports fans in real time and to analyze crime rates and history.
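The snippet below is not the Nanocubes API (the real tool is a standalone engine with its own query interface); it is a toy illustration of the core idea, pre-aggregating events into spatiotemporal bins so a visualization can answer count queries without rescanning billions of raw records:

```python
# Not the Nanocubes API: a toy illustration of its core idea, rolling
# device-to-tower events up into (space, time) bins.
from collections import Counter

# Hypothetical raw events: (longitude, latitude, hour_of_day).
events = [(-74.01, 40.71, 9), (-74.02, 40.70, 9), (-73.99, 40.72, 17)]

def bin_key(lon, lat, hour, cell=0.02):
    """Quantize a point into a coarse spatial cell plus an hour bucket."""
    return (round(lon / cell), round(lat / cell), hour)

cube = Counter(bin_key(*event) for event in events)
# Drawing a 9am heatmap tile is now a constant-time lookup, not a scan.
print(cube)
```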
Algorithms and tools are not the bottleneck in terms of solving problems. Volinsky clarifies that “the challenge is in the data and the data pipeline.” Modern data-hungry AI approaches require a centralized data source, but gathering one across myriad networks with idiosyncratic standards is no trivial task. Each small cell collects cellular data differently. Some track 4G but not 3G. Some don’t get iPhone data. If variations are not taken into account, bias will appear in the data and the results.
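One way to keep those source-level differences from silently skewing results is to normalize every record to a common schema and record which fields each source never collects. The schema and field names in this sketch are hypothetical:

```python
# A sketch of harmonizing per-cell records that arrive with different
# fields; schema and field names are hypothetical.
EXPECTED = ("cell_id", "tech", "device_os", "bytes")

def normalize(record: dict) -> dict:
    row = {field: record.get(field) for field in EXPECTED}
    # Make each source's blind spots explicit instead of silently absent,
    # so downstream models can weight or correct for the gaps.
    row["missing"] = [f for f in EXPECTED if f not in record]
    return row

print(normalize({"cell_id": "sc-17", "tech": "4G", "bytes": 1024}))
# {'cell_id': 'sc-17', 'tech': '4G', 'device_os': None, 'bytes': 1024,
#  'missing': ['device_os']}
```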
“There is no world expert in data munging,” Volinsky bemoans. “To succeed, you have to figure out organizationally how to access data in different silos, technically how to integrate with it, and ensure the formats are in line.” Data scientists often discover that they can’t solve problems because the fundamentals of managing data are difficult and time-consuming. “This is not the stuff people learn in grad school,” Volinsky warns.
Older methods can be just as useful as new ones
Volinsky is convinced that AI is the most powerful addition to the toolbox AT&T’s research arm uses to develop the next generation of enterprise and consumer-facing solutions. At the same time, he cautions against treating deep learning as a magical black box that solves every problem. Instead, he advises prioritizing solid data infrastructure, subject matter expertise, and an ensemble of methods from the data science and machine learning toolboxes.
Volinsky would know best. His BellKor team won the coveted $1 million Netflix Prize in 2009. The key lesson from the three-year competition was the power of ensembles, which combine many different methods, from regression and support vector machines to singular value decomposition, restricted Boltzmann machines, and neural networks, to produce a single result. “Deep learning is a power tool in your toolbox, but you still need your old school tools to solve problems,” he emphasizes. “Deep learning evangelists say neural networks effectively incorporate all the other models, but I have not seen that work in practice.”
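A minimal stacking ensemble, in the spirit of what Volinsky describes but in no way the BellKor blend itself, might look like this in scikit-learn:

```python
# A minimal sketch of the ensemble idea, not the BellKor blend itself:
# stack a linear model, an SVM, and a small neural network behind a
# meta-learner that weighs their predictions.
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=20, noise=10.0, random_state=0)

ensemble = StackingRegressor(
    estimators=[
        ("ridge", Ridge()),
        ("svm", SVR()),
        ("mlp", MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
    ],
    final_estimator=Ridge(),
)
print(cross_val_score(ensemble, X, y, cv=3).mean())
```

The meta-learner figures out how much to trust each base model, which is why ensembles often beat any single method on its own.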
Economies of scale help startups and hospitals
In tandem with in-house projects, AT&T operates six innovation labs, called Foundries, all over the world. Each Foundry specializes in a different industry.
Morris works with ambitious startups such as AIRA, a smart wearables startup that uses human-assisted computer vision algorithms to enable the blind and vision-impaired to “visualize” their surroundings and navigate their immediate environment.
Using established manufacturing relationships, AT&T helps health care IoT and wearables companies like AIRA accelerate their hardware prototyping and production. Similar to the Labs, the Foundries also leverage custom-built open-source tools such as Flow Designer, a rapid prototyping tool that simplifies hardware design for software engineers.
Remember Morris’ earlier comment about how the internet runs on AT&T? Turns out this can be mission critical for startups like AIRA, which must ensure superior connectivity at all times to protect the safety of its patients. Since AT&T’s AI systems regulate network traffic, the company can intelligently detect AIRA devices on its network and dynamically allocate greater bandwidth to support live video streaming.
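AT&T hasn't disclosed how that allocation works under the hood, but the policy logic can be caricatured in a few lines. The device prefixes, class names, and rule below are all hypothetical:

```python
# A toy policy sketch, not AT&T's traffic management system.
# Device IDs, classes, and the rule itself are hypothetical.
PRIORITY_DEVICE_PREFIXES = ("aira-",)  # assumed device naming, for illustration

def qos_class(device_id: str, is_live_video: bool) -> str:
    """Map a network flow to a quality-of-service class."""
    if device_id.startswith(PRIORITY_DEVICE_PREFIXES) and is_live_video:
        return "assured-bandwidth"   # e.g., guaranteed bitrate for live video
    return "best-effort"

print(qos_class("aira-0042", is_live_video=True))   # assured-bandwidth
print(qos_class("phone-9001", is_live_video=True))  # best-effort
```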
AT&T’s control of networks also comes in handy for hospitals that hold sensitive patient data. Fearful of security lapses, many operate their own data centers rather than upload personal information to the cloud. But data center management is typically not a hospital’s core competency, leading to outdated technology and massive inefficiencies.
“Do you want to run a hospital or do you want to run a data center?” asks Morris. Regardless of the cloud provider a hospital chooses to use, AT&T runs private network connections to all of its servers. “This traffic will never traverse the public internet,” she says, which gives hospitals an extra layer of protection.
Migrating more hospitals to the cloud not only eases administrative pains, it also unblocks AI research. “Hospitals are smart, but they’re like islands,” Morris explains. Competition often incentivizes hospitals to hoard data that is critical to share for superior results. Pooling hospital data into “collaborative cloud communities” and applying de-identification protocols enables medical researchers to access disparate data sets with greater geographic diversity. Algorithms for essential patient services, such as vital sign monitoring, can be trained on aggregate data sets for more accurate benchmarks.
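One common de-identification step, sketched below with hypothetical field names, is to drop direct identifiers and replace the patient ID with a salted hash so records from different hospitals can be joined without exposing identities. Real HIPAA de-identification covers far more than this:

```python
# One common de-identification step, sketched; field names are
# hypothetical and real de-identification protocols go much further.
import hashlib

DIRECT_IDENTIFIERS = {"name", "address", "phone"}
SALT = b"per-project-secret"  # assumed shared secret for the research pool

def deidentify(record: dict) -> dict:
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Salted hash keeps the same patient linkable across data sets
    # without revealing the original medical record number.
    out["patient_id"] = hashlib.sha256(
        SALT + record["patient_id"].encode()
    ).hexdigest()[:16]
    return out

print(deidentify({
    "patient_id": "MRN-1234", "name": "Jane Doe",
    "heart_rate": 72, "hospital": "site-A",
}))
```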
Humans work with bots to ensure customer success
Lead inventive scientist Wen-Ling Hsu has been with AT&T for over 20 years. She was obsessed with creating amazing customer experiences from massive data sets even before the term “big data” was coined.
Hsu analyzes customer conversations, both phone calls into call centers and online chats with support agents. Machine learning allows her to build textual models, identify customer intent, and route customers to appropriate support agents faster.
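Her actual models aren't public, but a bare-bones intent classifier for routing can be sketched with TF-IDF features and logistic regression. The training examples below are invented:

```python
# A hedged sketch of intent classification for routing, not AT&T's model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "my bill is higher than last month", "why was I charged twice",
    "no signal since yesterday", "internet keeps dropping",
    "I want to add a line", "upgrade my plan",
]
intents = ["billing", "billing", "outage", "outage", "sales", "sales"]

router = make_pipeline(TfidfVectorizer(), LogisticRegression())
router.fit(texts, intents)
print(router.predict(["I was double charged on my bill"]))  # likely ['billing']
```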
With her extensive experience, Hsu learned that interpreting and using the intelligence gained from AI systems is “more of an art than a science.” What matters most is customer perception and seamless execution, so Hsu employs a combination of bots that directly interact with customers and those that stay in the background to assist human agents.
When asked to make a forecast for AI in 2017, Hsu responded, “Human judgment still plays a critical role in many tasks. Together, AI bots and human agents can learn from every customer interaction to personalize the customer experience.”
Mariya Yao is the Head of Research and Design for Topbots, a strategy and research firm for enterprise AI.
This article was originally published on Topbots.