Join top executives in San Francisco on July 11-12, to hear how leaders are integrating and optimizing AI investments for success. Learn More
Open-source solutions firm Red Hat today unveiled Ansible Lightspeed, a generative AI service integrated with IBM Watson Code Assistant. The company’s latest offering seeks to drive broader adoption of Ansible automation within organizations, streamlining automation for beginners while freeing experienced automators from the tedium of writing low-level tasks by hand.
The service, which is set to be available in the near future, integrates with Watson Code Assistant and uses natural language processing (NLP) to let users rapidly construct automation code on top of IBM’s foundation models. According to the company, this integration offers a valuable solution for enterprises: it addresses the automation skills gap and enhances efficiency, thereby expediting the time-to-value of automation.
By allowing users to input simple English prompts, the service facilitates the translation of domain expertise into YAML code to create or modify Ansible Playbooks. Furthermore, users can actively contribute to the model’s training by providing valuable feedback and ensuring continuous enhancements.
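To make the prompt-to-playbook flow concrete, here is a minimal sketch of the kind of YAML an English-language task description might yield. The module parameters below are illustrative assumptions, not actual Lightspeed output:

```yaml
---
# Hypothetical Ansible Playbook sketch: the "name" fields act as the
# natural-language prompts; the module bodies represent the kind of
# YAML a service like Lightspeed might suggest (illustrative only).
- name: Install and start the nginx web server
  hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Ensure nginx is running and enabled at boot
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

The user supplies the descriptive task names; the service fills in the module invocations underneath, which the user can then review and refine.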
“Organizations looking to modernize have a key challenge: An automation skills gap,” Tom Anderson, VP and GM of Ansible, told VentureBeat. “Generative AI has the potential to make experienced automation talent more productive and expand the aperture of who can create usable automation content.”
How? By making it easier for automation domain experts to translate their expertise into working automation code, he said. “Users can use natural language prompts to get code recommendations for generating tasks, which are the building blocks of Ansible Playbooks,” said Anderson.
The company says this new tool empowers domain experts to effectively translate their process knowledge and objectives into code. Furthermore, it caters to users who have a deep understanding of what needs to be accomplished but lack the YAML proficiency to craft compliant and efficient playbooks independently.
Moreover, the tool harnesses the vast repository of Ansible subject matter expertise within the Ansible Lightspeed foundation model. This allows users to explore new automation domains.
Leveraging natural language models to streamline automation
Anderson told VentureBeat that the foundation model optimized for NLP is the cornerstone of the collaboration between Red Hat and IBM, distinguishing Ansible Lightspeed from other tools.
“The foundation model is trained with data from Ansible Galaxy, a huge open-source repository of Ansible content covering a plethora of use cases and vertical applications of Ansible technology,” said Anderson. “In addition to the data from Ansible Galaxy, the model has been (and continues to be) fine-tuned with additional Red Hat and IBM IT automation subject matter expertise.”
He said he believes that IT automation is a key driver of operational efficiency and frees teams up to focus on innovation. But standing up automated workflows can be complicated and time-consuming. Ansible Lightspeed can boost the efficiency of an organization’s automation efforts and improve ROI and time to value.
“Writing quality automation code takes time and resources,” said Anderson. “Ansible Lightspeed can help developers and operations teams produce better automation code much more quickly. Again, Ansible Lightspeed isn’t intended to be a silver bullet. But it is a true enhancement to the creation experience.”
He added that users can access the service directly in their code editor for a “real-time productivity boost” to their existing workflows. “How much time you’re saving depends on the complexity of the playbooks you’re developing, but when you’re trimming a task from 30 to 60 minutes to 5 or 10 minutes, multiple times a day, it adds up,” he said.
Leveraging an IBM Research LLM
According to the company, the tool was built on a large language model (LLM) developed by IBM Research. IBM contributed its expertise in LLMs, while Red Hat contributed its domain knowledge to train the model on publicly available Ansible automation content.
This collaborative effort also included post-recommendation training based on subject matter expertise, highlighting the combined strengths of Red Hat’s domain expertise and IBM’s proficiency in LLMs, foundation models and AI.
“This uses generative AI from IBM’s foundation model trained on a specific domain (Ansible), to help people create automation faster,” Anderson explained. “Existing subject matter experts will be much more efficient by filling out a lot of the repetitive pieces of code as they’re creating an automation playbook, which is ultimately YAML code. This accelerates their ability to generate that a lot faster. Ansible Lightspeed makes existing subject matter experts far more efficient by doing a lot of the work for them.”
Anderson added that IBM’s CIO team actively participated as early testers of Ansible Lightspeed with IBM Watson Code Assistant, resulting in notable productivity improvements.
Seamless accessibility tailored to IT environments
In addition, during the pilot phase, the preview version of Watson Code Assistant proved instrumental for IBM CIO teams, accurately generating approximately 60% of their code as they adopted the Ansible automation platform.
The tool is accessible through the Ansible VSCode extension, enabling users to interact with the AI directly within their code editor. Users can prompt the AI, evaluate its suggestions, and modify, accept or reject them before incorporating the generated code into an Ansible Playbook.
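A sketch of what that in-editor loop looks like, based on the workflow described above (the exact extension behavior and suggested parameters are assumptions):

```yaml
# Inside the Ansible VSCode extension (illustrative sketch):
# 1. The user types a descriptive task name and pauses.
- name: Create a backup directory with restricted permissions
  # 2. Lightspeed suggests the module body below; the user can accept,
  #    modify, or reject it before it lands in the playbook.
  ansible.builtin.file:
    path: /var/backups/app
    state: directory
    mode: "0750"
```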
In addition, Ansible Lightspeed operates within the user’s IT environment, acquiring knowledge and providing recommendations for variables and settings tailored to meet specific requirements.
Additionally, the tool boasts pre- and post-processing capabilities, ensuring all code recommendations align with recognized best practices in Ansible and automation. This feature enables users to confidently leverage generative AI, knowing that the suggestions adhere to established guidelines and standards.
“All generated code recommendations are backed by ‘content source matching,’” said Anderson. “That means that users can see the specific URL and path of where the code was pulled from, a description of the data source, the license under which the code is covered and the type of Ansible content it is. All Ansible Galaxy users will have the choice to opt out of having their code used as data to train the Ansible Lightspeed foundation model.”
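Based on Anderson’s description, an attribution record for a suggestion might carry fields along these lines. The field names and structure here are purely hypothetical, not the actual Lightspeed response format:

```yaml
# Hypothetical shape of a "content source matching" record; field names
# are assumptions drawn from the four attributes Anderson lists, not
# the real API.
suggestion_source:
  url: https://galaxy.ansible.com/example_namespace/example_collection
  data_source_description: Ansible Galaxy community collection
  license: GPL-3.0-or-later
  content_type: role
```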
The promising future of automation empowered by generative AI
Anderson said Red Hat recognizes the potential of foundational models to deliver significant business value.
Data scientists and developers can enhance accuracy by adapting these models to specific use cases, such as writing automation code. However, the initial training of these models demands considerable infrastructure and resources, including specialized tools and platforms, even before addressing serving, tuning and management.
These are challenges that Red Hat OpenShift AI can help address by providing a foundation that is already familiar to the enterprise IT organizations that need to manage AI infrastructure, while still meeting the needs of data scientists and app developers, said Anderson.
“We do see domain-specific AI being a key factor in future adoption — taking a model and shaping it to meet a specific need for an organization is incredibly valuable,” he said. “This helps create unique AI-enabled applications, and with a foundation like OpenShift AI, you can run it on a manageable, scalable platform that still fuels further innovation.”
He explained that the company aims to broaden the accessibility of AI to enterprises across the hybrid cloud.
“This could [include] organizations that want to use foundational models in-house but need a platform that they can use, or it could be a business that just wants to reap the benefits of an AI-driven application without managing any of the plumbing,” he added. “Red Hat’s goal is to support both of these paths via open standards-based approaches and bring customers choice — choice in tooling, choice in deployment method and choice in how they consume the final product.”