

Artificial intelligence (AI) everywhere has the potential to transform every business and improve the life of every person on the planet. Every day, we hear about AI breaking new ground, from detecting cancer and playing Minecraft to creating "sentient" chatbots and generating compelling art. The goal of AI is simple: to accelerate "data to insights." We have seen tremendous progress in the basic AI ingredients: the exponential growth of data, compute and algorithms.

Data, as measured by the total number of bytes, is in zettabytes (10²¹). Compute, as measured by hardware execution capacity in operations per second, is in petaflops (10¹⁵) to exaflops (10¹⁸). And algorithms, as measured by the number of parameters in a neural network, have exceeded a trillion (10¹²).
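Those orders of magnitude can be hard to hold in mind, so here is a small illustrative restatement of the figures above as powers of ten (nothing here is new data; the snippet only relates the cited scales to one another):

```python
# Orders of magnitude cited above, expressed as powers of ten.
ZETTABYTE = 10 ** 21   # bytes of data worldwide
PETAFLOP = 10 ** 15    # operations per second
EXAFLOP = 10 ** 18     # operations per second
TRILLION = 10 ** 12    # neural-network parameters

# An exaflop machine performs a thousand times more operations
# per second than a petaflop machine.
print(EXAFLOP // PETAFLOP)  # 1000
```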

However, research has found that 87% of AI concepts do not make it into deployment, for reasons including performance, infrastructure, and multi-vendor software and tooling. As data sets grow and systems become more complex, developers face new challenges with AI implementation and deployment. As a result, progress toward business objectives slows drastically as developers spend valuable time and resources resolving technical, process and organizational issues, working through failed projects and updating code, all of which adds cost.

AI is an end-to-end problem that requires end-to-end support. To truly experience AI everywhere, developers and data scientists working within the space need to bring compute, data and algorithms together. For those looking to broaden the use of AI within their organization, focusing on the principles of human productivity and computer performance is key.


The question is: What is the best way to do this?

Bridging the divide

AI software is the bridge from “data to insights” with the support of “compute and algorithms.” Still, this software bridge needs to be constructed for millions of data scientists and developers, whose AI applications are in turn used by billions of users. 

AI software can enhance human productivity to scale AI everywhere. To drive the proliferation of AI, it is important that the industry actualizes methods that make it easier for developers and data scientists to build on current AI solutions and algorithms, or pioneer new ones. It should not require a PhD in AI to apply AI widely. Therefore, it is equally important to ensure that the data and the infrastructure are readily accessible. 

Productivity can be achieved with the right data and AI platform and tooling, such as those that increase the performance of popular industry-standard AI frameworks or provide open tools that facilitate end-to-end AI workflows. These might include AI analytics toolkits, development and deployment toolkits, end-to-end distributed AI toolkits, reference toolkits, AutoML toolkits, domain-specific toolkits, low-code or no-code development environments, data labeling and augmentation tools, bias detection tools, and tools for transfer learning and federated learning.

All of these are open, standards-based, unified and secure, making it easier for developers and data scientists to engineer data and to build and deploy AI solutions. Some such tools can increase human productivity more than tenfold.

Accelerating AI software

There is no "one-size-fits-all" solution for the software used during each phase of the AI application lifecycle, because needs vary across verticals and use cases. As a result, leaders within the industry must collaborate on open-source tools.

For instance, Intel is partnering with Accenture to help enterprises innovate and accelerate their digital transformation journey with the introduction of open-source AI reference kits. These reference kits can reduce the time to solution from weeks to days, helping data scientists and developers train models faster and at a lower cost by overcoming the limitations of proprietary environments.

AI software can enhance computer performance through automatic software optimizations. The impact of software AI acceleration can be significant, from 10 to 100 times in many cases. Computer performance is often the primary requirement that IT teams work toward because of the resource- and compute-intensive nature of AI workloads, which leads to cost or compute-time constraints.

Hardware AI acceleration needs to be complemented with software AI acceleration because of the performance optimizations it enables. Without advanced software optimizations, the utilization of petaflops or exaflops could be very low, especially when new hardware is first released, leaving more than half of the hardware's execution capability idle.
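To make the utilization point concrete, here is a minimal sketch of how achieved throughput compares with peak hardware capability. The figures below are hypothetical, chosen only to illustrate the idea of idle capacity, not measured from any real system:

```python
def utilization(achieved_ops_per_s: float, peak_ops_per_s: float) -> float:
    """Fraction of peak hardware execution capability actually used."""
    return achieved_ops_per_s / peak_ops_per_s

# Hypothetical example: a 1-exaflop system sustaining 0.4 exaflops
# on an unoptimized workload leaves 60% of its capability idle.
peak = 1e18       # peak operations per second (1 exaflop)
achieved = 4e17   # sustained operations per second (0.4 exaflops)
print(f"{utilization(achieved, peak):.0%} utilized")  # 40% utilized
```

Software optimizations (vectorization, operator fusion, better memory layouts) raise the `achieved` figure toward `peak` without any hardware change.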

Software AI acceleration can help get more out of AI hardware by reducing training time, inference time, energy consumption, memory usage and cost, all while maintaining accuracy. This is key to easing the development and deployment of intelligent applications.

Getting to AI everywhere

Given the diversity of workloads in AI, a heterogeneous architecture strategy that provides greater choice to users works best for hardware, which ties directly to the performance of these models. CPUs with built-in AI acceleration, GPUs, custom AI accelerators and even FPGAs all have a role to play. In addition, AI software can provide a consistent user interface that allows users and developers to move from one hardware accelerator to another depending on the workload.
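A consistent interface over heterogeneous hardware can be sketched as a simple backend dispatch. The backend names and the function below are purely illustrative, not any particular vendor's API; real frameworks expose similar device-selection logic behind a uniform interface:

```python
# Hypothetical sketch: pick the most preferred accelerator that is
# actually present on the machine, falling back to the CPU.
PREFERRED_BACKENDS = ["custom_accelerator", "gpu", "fpga", "cpu"]

def select_backend(available: set[str]) -> str:
    """Return the most preferred backend present in `available`."""
    for backend in PREFERRED_BACKENDS:
        if backend in available:
            return backend
    raise RuntimeError("no supported backend found")

# The same application code runs regardless of which device is chosen.
print(select_backend({"cpu", "gpu"}))  # gpu
```

The design point is that workload code depends only on the uniform interface, so moving between accelerators changes the dispatch result, not the application.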

Across all industries, AI is growing. According to Gartner, worldwide AI software revenue alone is expected to reach $62.5 billion in 2022, an increase of 21.3% from 2021.

AI software is the bridge for AI everywhere, increasing human productivity and computer performance. To experience AI everywhere, developers and data scientists need to simplify processes related to AI systems, ensure productivity through software that features automation, and find solutions that can optimize performance of AI workloads in open ecosystems and secure cross-architecture environments. Only then can organizations bring AI everywhere to life.

Wei Li is the vice president and general manager of AI & Analytics at Intel.
