UserTesting is expanding its platform today with a series of new AI-powered capabilities designed to help organizations gain better insights from research.

Much as the company name implies, UserTesting provides a platform that helps organizations with user research and testing of different products and campaigns. In 2022, UserTesting was acquired by Thoma Bravo for $1.3 billion in a deal that merged the company with UserZoom, which also developed user experience testing capabilities. With today's update, UserTesting is introducing a new Feedback Engine, based on capabilities from UserZoom and infused with generative AI to help better understand feedback from user surveys.

UserTesting is also expanding the gen AI-powered insights it first introduced in 2023, adding AI-powered surveys that provide a deeper level of trend and theme analysis from user testing operations.

"Before, we were doing AI around insight summaries, sentiment and the interactive path flows," Andy MacMillan, CEO of UserTesting, told VentureBeat. "What we're doing now is creating a theming concept, so across open-ended survey questions and responses that people give verbally we can pull out themes."

How gen AI enables UserTesting to identify themes in user surveys

UserTesting has already been providing its users with sentiment analysis capabilities. With the AI-powered theme generation capability, the company is looking to go a step further to more fully understand what testing results actually show.

The theme capabilities in UserTesting's AI-powered surveys are powered by large language models (LLMs) to analyze open-ended text responses from surveys and extract underlying themes. 

MacMillan explained that the models are trained on research data so they can understand the domain and identify themes, concepts, and sentiments that would be meaningful for researchers. 

Rather than just flagging keywords or classifying responses as positive or negative, the AI analyzes the full text and groups responses according to common topics or issues that people are discussing. It can then quantify how many responses fell into each theme, giving researchers a high-level view of the prevailing discussions before they dive into the details of individual comments. This provides a more nuanced understanding than generic keyword searches or classifications alone.
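The grouping-and-counting idea described above can be illustrated with a minimal sketch. To be clear, UserTesting's actual system uses LLMs trained on research data to infer themes; the hand-written keyword map and matching logic below are purely hypothetical stand-ins to show how responses might be bucketed into themes and how prevalence could be quantified.

```python
from collections import Counter

# Hypothetical theme definitions. A real system would infer themes with a
# trained model rather than rely on a hand-written keyword map like this.
THEME_KEYWORDS = {
    "checkout friction": ["checkout", "payment", "cart"],
    "navigation": ["menu", "find", "search"],
    "pricing": ["price", "expensive", "cost"],
}

def tag_themes(response: str) -> set:
    """Assign a response to every theme whose keywords it mentions."""
    text = response.lower()
    return {theme for theme, words in THEME_KEYWORDS.items()
            if any(w in text for w in words)}

def theme_prevalence(responses: list) -> Counter:
    """Count how many responses fall into each theme, giving a
    high-level view of the prevailing discussions."""
    counts = Counter()
    for r in responses:
        counts.update(tag_themes(r))
    return counts

# Toy open-ended survey responses for illustration.
responses = [
    "The checkout kept rejecting my payment method.",
    "I couldn't find the sizing chart from the menu.",
    "Way too expensive compared to competitors.",
    "Search results were irrelevant, hard to find anything.",
]
print(theme_prevalence(responses).most_common())
# → [('navigation', 2), ('checkout friction', 1), ('pricing', 1)]
```

Note that a single response can contribute to multiple themes, and that the second and fourth responses land in the same "navigation" bucket despite using different wording, which is the behavior the LLM-based approach achieves at a far more nuanced level.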

In its August 2023 update, UserTesting added AI-powered insights to its platform, which is a bit different from what the new theme capability brings.

MacMillan explained that insights are generated when analyzing a specific task or experience that participants went through, like completing a purchase on a website. The insights would highlight metrics like success rates, points of dropout, or whether people followed the intended path. Themes, in contrast, are extracted from open-ended responses, like survey comments, grouped by common concepts even when people don't use identical wording, and quantified by prevalence. Themes provide a higher-level view of what people are talking about, beyond just sentiment.

The difference between gen AI summarization and real AI insights

The ability to create a summary out of a body of text, audio or video is a core gen AI capability that an LLM enables.

MacMillan emphasized that what UserTesting is doing with insights, themes and the latest set of updates on its platform goes much deeper than gen AI summarization alone. He said that UserTesting trains its own models using customer experience research data.

As such, rather than just getting a summary of information, the UserTesting models provide the domain insights that an organization would otherwise need an experienced researcher to uncover. It's an approach that provides important context from the domain that generic summarization may lack.

"We've spent the last couple of years actually training a set of machine learning models around things like sentiment and intent and things like that, that are things that people look for when they do experience research," he said. "We have an incredible amount of content to train the engine on how to do that."