

Over the last decade, artificial intelligence (AI) technologies have increasingly relied on neural networks for pattern recognition, machine learning (ML) and prediction. However, with ML models that consist of billions of parameters, training becomes more complicated because the model no longer fits on a single GPU.

Large language models (LLMs) such as GPT-3 and Gopher cost millions of dollars to train and require vast amounts of computing resources, making it challenging for cash- and resource-constrained organizations to enter the field. Running trained models such as BLOOM or Facebook’s OPT-175B requires a substantial number of GPUs and investment in specialized hardware. It is often difficult for smaller tech organizations to acquire expertise in data science and in parallel and distributed computing, even if they can secure the funds needed to train an LLM.

Artificial intelligence-as-a-service (AIaaS) offers a more cost-effective option than running and developing software solutions in-house. AIaaS makes AI technology more accessible by providing low-code tools and APIs that end users can integrate. According to a new report by Reports and Data, the global AIaaS market is forecast to grow at a rate of 45.6%, from $1.73 billion in 2019 to $34.1 billion in 2027.

Such technology enables small tech businesses to harness AI’s power through cost-effective, ready-to-use solutions with minimal effort. With AIaaS, you pay only for the tools you need and can upgrade to a higher plan as your business and data scale. Setting up AIaaS solutions takes weeks rather than months.


Assembly AI’s ready-to-integrate offering

California-based API startup Assembly AI provides customers with a single AI-powered API to convert audio or video to text. It’s designed to empower developers by aiding model development for transcribing, understanding and analyzing audio data. Offered as an AIaaS model, the APIs can perform tasks ranging from summarization and content moderation to topic detection.

“Our API platform focuses on providing developers and product teams easy access to train and deploy state-of-the-art, production-ready AI models that they can embed in their products to build exciting new features for applications,” Dylan Fox, founder and CEO of Assembly AI, told VentureBeat.

Assembly AI’s Audio Intelligence API provides an analysis of audio data, with features like sentiment analysis, summarization, entity detection and topic detection. In addition, through the service’s asynchronous transcription feature, users can generate a transcription of pre-recorded audio or video files within a few hundred milliseconds. The company’s API can also transcribe video files, automatically stripping the audio out of the video file.
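As an illustrative sketch of what such an API call looks like, the snippet below builds the HTTP request that submits a pre-recorded file for asynchronous transcription. The endpoint path, header name and JSON field follow Assembly AI’s public v2 API documentation at the time of writing, but should be treated as assumptions and checked against the current docs:

```python
import json
import urllib.request

API_BASE = "https://api.assemblyai.com/v2"  # assumed base URL from public docs

def build_transcript_request(audio_url: str, api_key: str) -> urllib.request.Request:
    """Build the POST request that submits an audio file URL for transcription."""
    payload = json.dumps({"audio_url": audio_url}).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/transcript",
        data=payload,
        headers={"authorization": api_key, "content-type": "application/json"},
        method="POST",
    )

# Because the endpoint is asynchronous, a client would then poll
# GET {API_BASE}/transcript/<id> until the response's "status" field
# reads "completed", and extract the "text" field for the transcript.
```

Sending the request (for example with `urllib.request.urlopen`) returns a transcript ID; the polling loop is where the asynchronous model shows up in client code.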

Several virtual meeting and video platforms currently use Assembly AI’s models, said Fox, to automate audio summarization and content moderation workflows.

“As a result, the number of developers building with our APIs has grown by more than 1,000% over the past 12 months,” he claimed.

In July, the company announced a $30 million series B funding round, just four months after its $28 million series A. Fox said the current investment will be used towards allocating more resources to train and develop accurate AI models that their end users can readily integrate.

Barriers to LLM adoption

For organizations embracing digital transformation to build connected experiences that satisfy growing customer expectations, tools that are flexible and efficient at integrating systems and unifying data are a must. Until recently, many small businesses were priced out of using AI-based LLMs, which require in-house systems development, staffing and maintenance costs, and hardware changes for different tasks.

Fox says that although LLMs can provide significant advantages for tasks such as speech recognition, summarization and audio embedding, the barrier to entry from a compute perspective is getting higher almost every day.

“State-of-the-art LLMs require hundreds of GPUs to run a five-billion parameter model successfully,” Fox explained. “Such an entry point makes it harder for SMBs and brand-new startups with lower resources to come in and provide the required accuracy.”
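A quick back-of-envelope calculation shows why models at this scale strain single machines. The byte counts below are common rules of thumb for deep learning memory budgets, not figures from Assembly AI:

```python
# Rough memory estimate for a 5-billion-parameter model (illustrative only).
params = 5_000_000_000

# Inference with 16-bit weights: 2 bytes per parameter.
inference_gb = params * 2 / 1e9   # 10 GB just to hold the weights

# Naive full-precision training with Adam: roughly 16 bytes per parameter
# (4 each for weights, gradients, and the optimizer's two moment estimates).
training_gb = params * 16 / 1e9   # 80 GB before counting any activations

print(f"inference: ~{inference_gb:.0f} GB, training: ~{training_gb:.0f} GB")
```

Even before activation memory, the training estimate exceeds the capacity of most single GPUs, which is what pushes teams toward multi-GPU clusters or, alternatively, hosted APIs.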

While working as a machine learning engineer at Cisco Systems in 2016, Fox was doing research engineering for NLP and NLU systems and looking for AI-as-a-service options to integrate into AI products built on speech recognition. He noticed that the available speech recognition vendors offered low accuracy: their models were built on old-school ML tech that required tremendous computational resources, which made them hard to integrate and run even for compatibility tests.

Integrating APIs offered through AIaaS could provide an alternative solution to small businesses, he said, eliminating the need for in-house computational infrastructures, especially in training and deploying state-of-the-art models. 

“I was interested in using state-of-the-art deep learning models to create more accurate speech recognition models,” he said. “So, my initial idea was, what if we can use the latest deep learning research to build accurate speech recognition models and then expose those models to developers through simple API structures?” 

Current API adoption and usage

Integrating APIs is a go-to solution for establishing the mechanisms by which an application or component interacts with others. In a survey by Cloud Elements, 83% of respondents said they consider API integration a critical part of their business strategy, driven by digital transformation initiatives and cloud application adoption.

With the explosion of cloud-based products and apps, enterprises are now recognizing the importance of API integration. According to a report, technology analysts expect API investments to increase by 37% in 2022.

APIs offer flexibility, allowing companies to create sophisticated pipelines for supervised and unsupervised machine learning tasks. As a result, APIs can help improve the end-user experience through automation and effective integration strategies, and drastically reduce operational costs and development time. 

These integrations have the potential to yield entirely new products that can become a core offering for an organization, creating new functionality between apps and enabling services that never existed before. As APIs become a crucial part of product development, business strategy and scalability, they need to be easy to integrate if those strategies are to succeed.

“We aim to provide businesses with a great alternative that is more developer friendly while being customer privacy-focused,” said Fox. “Our API platform has a catalog of state-of-the-art AI models for many different tasks beyond audio. The models are constantly trained and maintained to ensure easy integration.”

Fox added that with the recent investment, Assembly AI’s future focus is to develop more accurate speech recognition models by leveraging technologies such as Transformers, larger training compute clusters and datasets, provided to end users in the form of APIs.  

There is a growing need for flexible platforms that offer highly functional APIs that integrate seamlessly into the ecosystem of products and services used by their customers.

“APIs must evolve according to developers’ expectations, and API-based integration should essentially be customer-centric,” Fox said.

The AI-as-a-service revolution

Fox observed an industry trend of product teams trying to leverage AI in their products, especially for audio and video. Such integrations have become more attainable through services like AIaaS, allowing companies to leverage AI for use cases such as customer service, data analysis and automated audio and video production, according to Fox. 

“Developers can now more easily embed all the AI capabilities they need into their products to build these features,” he said. “AI-as-a-service also has simplified deployment issues, enabling us to deploy large models cost-effectively, with lower latencies.”

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.