A clear majority of employees (87%) peg data quality issues as the reason their organizations failed to successfully implement AI and machine learning. That’s according to Alation’s latest quarterly State of Data Culture Report, produced in partnership with Wakefield Research, which also found that only 8% of data professionals believe AI is being used across their organizations.
For the report, Wakefield conducted a quantitative research study of 300 data and analytics leaders at enterprises with more than 2,500 employees in the U.S., U.K., Germany, Denmark, Sweden, and Norway. The enterprises were polled regarding their progress in establishing a culture of data-driven decision-making and the challenges they continue to face.
According to Alation, 87% of professionals say inherent biases in the data being used in their AI systems produce discriminatory results that create compliance risks for their organizations. Survey-takers pointed to the need for curation and governance, data literacy and understanding, and data from more varied sources.
A lack of executive buy-in was also cited as a top reason AI wasn’t being used effectively at organizations, with 55% of respondents citing this as more important than a lack of employees with the skills to create AI models. When it comes to data quality issues, data professionals said inconsistent standards across data collection, compliance and privacy issues, and a lack of democratization or access to data were the three most common blockers.
As Broadridge VP of innovation and growth Neha Singh noted in a recent piece, many firms try to develop AI solutions without having clean, centralized data pools or a strategy for actively managing them. Without this critical building block for training AI solutions, the reliability, validity, and business value of any AI solution is likely to be limited. McKinsey estimates that companies may be squandering as much as 70% of their data-cleansing efforts.
Among enterprises that have deployed AI, respondents cited better modeling skills among analysts, cataloging data for visibility and access, and the ability to crowdsource information as ways to combat bias in AI. Roughly a third (31%) say incomplete data is a top issue that causes AI projects to fail.
The findings agree with other surveys showing that, despite enthusiasm around AI, enterprises struggle to deploy AI-powered products. Business use of AI grew a whopping 270% over the past four years, according to Gartner, while Deloitte's October 2018 corporate report found that 62% of respondents had adopted some form of AI, up from 53% the year prior. But adoption doesn't always meet with success, as the roughly 25% of companies that have seen half their AI projects fail will tell you.
“To assess readiness for AI, one must first look at the larger role of data within organizations — a language some companies struggle to learn and command,” the report reads. “There remains a large gap between the haves and the have-nots; successful deployment of AI among the haves and failure or a stop-and-start implementation among the rest will only widen that gap. Companies should be asking themselves if they have the right plans in place to become a more data-driven organization, and what that actually looks like in practice.”