Data sprawl has become a real and costly problem inside organizations, and it’s hurting innovation. According to a new Hakkoda survey, 6% of business and IT leaders labeled their data organization and processes a “dumpster fire.”

For organizations spending millions, or even billions, on data and analytics, data sprawl is the unplanned result of layer after layer of individual initiatives in which data is copied, exported, or sent to various applications. It’s the natural — but costly — byproduct of organizations chasing opportunity. As organizations have seen the power of data analytics, initiatives have soared. Collecting data is easy, but turning data into intelligence is hard and expensive.

Costs are high across the data analytics lifecycle, with BI and reporting (54%) and data architecture (52%) costing the most. After decades of data initiatives, most companies have an ever-growing tangle of BI apps, data warehouses, and service providers: one in four respondents have more than 10 BI apps, 35% use five or more data warehouses, and nearly a quarter rely on ten or more service providers.

Graphic: The Hard and Soft Costs of Data Sprawl. Survey question: “What types of costs does your company struggle to capture in order to fully understand the cost of data analytics programs?”

- Cost impacts of manual processes: 50%
- Cost impact of inefficient or duplicate work: 49%
- Understanding the business impact of wrong data, or data used in the wrong way: 41%
- Project costs: 36%
- Software and infrastructure: 35%
- Increased costs caused by inadequate training: 30%
- Number and cost of analysts within the line of business: 28%
- Internal data-related headcount in IT: 20%
- Don’t face any barriers or costs: 3%
- Have never made an attempt to understand the full cost of data analytics: 6%

While talent is scarce — 97% reported problems finding data scientists, architects, analysts, and engineers — more than three in four executives surveyed (77%) report large numbers of staffers doing basic reporting and analytics, including a quarter (25%) who reported having more than 500 data analysts in their organization.

Freeing or finding great talent could have a big impact on innovation. The vast majority (94%) of respondents reported some barriers to innovation in their data programs. When asked what is preventing organizations from innovating or creating new offerings with data, the number one answer was insufficient internal expertise. Internal bureaucracy and the inability to tap into existing resources were close seconds.

In addition to the hard costs of hiring and retaining employees, as well as adding licenses, there’s a long tail of often-unmeasured costs that come with these manual processes, inefficiencies, duplicate work, poor training, and delayed work. More than 90% of data leaders report that it’s hard to capture the costs of their data analytics programs. Half struggle to capture the costs of manual processes. More than a third (36%) say they struggle to capture the project costs of adding external contractors and service providers. A full 6% don’t even bother.

To understand the cost, causes, and business impact of data sprawl in organizations, Hakkoda commissioned a survey of more than 300 business and IT leaders responsible for data and analytics initiatives at mid-to-large size companies.

Read the full report by Hakkoda.
