While there are some big names in the technology world that are worried about a potential existential threat posed by artificial intelligence (AI), Matt Wood, VP of product at AWS, is not one of them.
Wood has long been a standard-bearer for machine learning (ML) at AWS and is a fixture at the company's events. For the past 13 years, he has been one of the leading voices at AWS on AI/ML, speaking about the technology and Amazon's research and service advances at nearly every AWS re:Invent.
AWS had been working on AI long before the current round of generative AI hype, with its SageMaker product suite leading the charge for the last six years. Make no mistake about it, though: AWS has joined the generative AI era like everyone else. Back on April 13, AWS announced Amazon Bedrock, a set of generative AI tools that can help organizations build, train, fine-tune and deploy large language models (LLMs).
There is no doubt that there is great power behind generative AI. It can be a disruptive force for enterprise and society alike. That great power has led some experts to warn that AI represents an “existential threat” to humanity. But in an interview with VentureBeat, Wood handily dismissed those fears, succinctly explaining how AI actually works and what AWS is doing with it.
“What we’ve got here is a mathematical parlor trick, which is capable of presenting, generating and synthesizing information in ways which will help humans make better decisions and to be able to operate more efficiently,” said Wood.
Wood will be speaking next week at VB Transform, our networking event for enterprise technology decision-makers focused on how to embrace LLMs and generative AI, taking place July 11 and 12 in San Francisco.
The transformative power of generative AI
Rather than representing an existential threat, Wood emphasized the powerful potential AI has for helping businesses of all sizes. It’s a power borne out by the large number of AWS customers that are already using the company’s AI/ML services.
“We’ve got over 100,000 customers today that use AWS for their ML efforts, and many of those have standardized on SageMaker to build, train and deploy their own models,” said Wood.
Generative AI takes AI/ML to a different level, and has generated a lot of excitement and interest among the AWS user base. With the advent of transformer models, Wood said, it’s now possible to take very complicated inputs in natural language and map them to complicated outputs for a variety of tasks, such as text generation, summarization and image creation.
“I have not seen this level of engagement and excitement from customers, probably since the very, very early days of cloud computing,” said Wood.
Beyond the ability to generate text and images, Wood sees many enterprise use cases for generative AI. At the foundation of all LLMs are numerical vector embeddings. He explained that embeddings enable an organization to use the numerical representations of information to drive better experiences across a number of use cases, including search and personalization.
“You can use those numerical representations to do things like semantic scoring and ranking,” said Wood. “So, if you’ve got a search engine or any sort of internal method that needs to collect and rank a set of things, LLMs can really make a difference in terms of how you summarize or personalize something.”
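The semantic scoring and ranking Wood describes can be sketched in a few lines. This is a toy illustration, not AWS code: the three-dimensional document vectors below are made-up placeholders, where a real system would obtain high-dimensional embeddings from an embedding model, but the ranking step — score each document against the query by cosine similarity and sort best-first — works the same way.

```python
import math

# Hypothetical toy embeddings; a real system would call an embedding
# model to produce high-dimensional vectors for each document.
DOCS = {
    "refund policy": [0.9, 0.1, 0.2],
    "shipping times": [0.2, 0.8, 0.1],
    "return an item": [0.85, 0.15, 0.3],
}

def cosine(a, b):
    # Cosine similarity: dot product divided by the vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank(query_vec, docs):
    # Score every document against the query and sort best-first.
    scored = [(cosine(query_vec, vec), name) for name, vec in docs.items()]
    return sorted(scored, reverse=True)

# Made-up query embedding, e.g. for "how do I get my money back".
query = [0.88, 0.12, 0.25]
for score, name in rank(query, DOCS):
    print(f"{score:.3f}  {name}")
```

Because similarity is computed on the embeddings rather than on keywords, the refund-related documents outrank "shipping times" even though the query shares no words with them.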
Bedrock is the AWS foundation for generative AI
The Amazon Bedrock service is an attempt to make it easier for AWS users to benefit from the power of multiple LLMs.
“We don’t believe that there’s going to be one model to rule them all,” Wood said. “So we wanted to be able to provide model selection.”
Beyond just providing model selection, Amazon Bedrock can also be used alongside LangChain, which enables organizations to use multiple LLMs at the same time. Wood said that with LangChain, users have the ability to chain and sequence prompts across multiple different models. For example, an organization might want to use Amazon's Titan for one task, an Anthropic model for another and an AI21 model for a third. On top of that, organizations can also use tuned models of their own based on specialized data.
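The chaining pattern Wood describes can be illustrated with a minimal sketch. The two model functions below are hypothetical stand-ins, not real Bedrock or LangChain calls; the point is the shape of the pipeline, where each step's output becomes the next step's input.

```python
def titan_extract(doc):
    # Stand-in for a first model that pulls key facts out of a document.
    return f"facts({doc})"

def claude_summarize(facts):
    # Stand-in for a second model that turns those facts into a summary.
    return f"summary({facts})"

def chain(initial_input, steps):
    # Feed each step's output into the next step's input, in sequence.
    out = initial_input
    for step in steps:
        out = step(out)
    return out

result = chain("annual report", [titan_extract, claude_summarize])
```

In a real deployment each stand-in would be a call to a Bedrock model endpoint, and a framework such as LangChain would handle the prompt templating and sequencing that this sketch hard-codes.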
“We’re definitely seeing [users] decomposing large tasks into smaller tasks and then routing those smaller tasks to specialized models, and that seems to be a very fruitful way to build more complex systems,” said Wood.
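The decompose-and-route pattern can be sketched as a dispatch table that maps a subtask's type to the model best suited for it. The handlers and task types below are hypothetical placeholders, not real Bedrock model identifiers; in practice each handler would invoke a different (possibly fine-tuned) model.

```python
def summarize(text):
    # Stand-in for a model specialized for summarization.
    return "summary of: " + text

def draft_email(text):
    # Stand-in for a model specialized for drafting communications.
    return "email about: " + text

# Hypothetical routing table: subtask type -> specialized model handler.
ROUTES = {
    "summarize": summarize,
    "email": draft_email,
}

def route(task_type, payload):
    handler = ROUTES.get(task_type)
    if handler is None:
        raise ValueError(f"no model registered for task {task_type!r}")
    return handler(payload)

# Decompose one large job into typed subtasks, then route each one.
subtasks = [
    ("summarize", "Q3 earnings call transcript"),
    ("email", "Q3 highlights for the sales team"),
]
results = [route(task_type, payload) for task_type, payload in subtasks]
```

The design choice here is that the routing logic stays trivial and inspectable: adding a new specialized model means registering one more entry in the table rather than rewriting the pipeline.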
As organizations move to adopt generative AI, Wood commented that a key challenge is ensuring that enterprises are approaching the technology in a way that enables them to actually innovate.
“Any large shift is 50% technology and 50% culture, so I really encourage customers to really think through both a technical piece where there’s a lot of focus at the moment, but also a lot of the cultural pieces around how you drive invention using technology,” he said.