Marketing’s potential to deliver results relies on data quality, but data accuracy, consistency, and validity continue to be a challenge for many organizations. Inconsistent data quality is holding marketing teams back from converting leads into sales, accurately tracking campaign performance, and taking on the larger challenges of optimizing product mix and product/service revenue forecasts.
The latest analytics, account-based marketing (ABM), CRM, marketing automation, and lead-scoring tools all provide real-time data capture and analysis. How well these tools ensure consistent data quality directly impacts the quality of the AI and machine learning models they rely on.
Inconsistent data drives opportunities away
Marketing teams can’t deliver on their goals with bad data quality. For example, inaccurate prospect data clogs sales pipelines by slowing down efforts to turn marketing qualified leads (MQLs) into sales qualified leads (SQLs).
- Two-thirds of sales leads don’t close because of bad data quality, and up to 25% of a typical organization’s customer and prospect records have critical data errors jeopardizing deals, Forrester said in a recent research brief.
- Gartner’s 2020 Magic Quadrant for Data Quality Solutions says poor data quality costs the typical enterprise $12.9 million every year.
- Dun & Bradstreet’s study The Past, Present, and Future of Data says that 25% of businesses with over 500 employees have lost a customer due to incomplete or inaccurate information.
Problems with data quality increase the odds of failure for AI initiatives such as predictive audience offers and promotions, personalization, AI-enabled chatbots for advanced service, and automated service recovery. A quarter of organizations attempting to adopt AI report failure rates of up to 50%, IDC said recently. The leading causes of inconsistent data quality in marketing include problems with taxonomy and meta-tagging, a lack of data governance, and lost productivity.
No data consistency
The most common reason AI and ML fail in the marketing sector is that there’s little consistency to the data across campaigns and strategies. Every campaign, initiative, and program has its own meta-tags, taxonomies, and data structures. It’s common to find marketing departments with 26 or more systems supporting 18 or more taxonomies, each created at one point in a marketing department’s history to support a specific campaign. O’Reilly’s The State of Data Quality in 2020 survey found that over 60% of enterprises see their AI and machine learning projects fail due to too many data sources and inconsistent data. While the survey was at the organization level, it would not be a stretch to assume the failure rate is higher within marketing departments, as it’s common to create unique taxonomies, databases, and meta-tags for each campaign in each region.
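To make the problem concrete, here is a minimal sketch of what consolidating campaign-specific taxonomies into one schema can look like. All field and campaign names below are hypothetical examples, not taken from any real system:

```python
# Minimal sketch: normalizing lead records from campaigns that each use
# their own field names (taxonomies) into one unified schema.
# All field and campaign names here are hypothetical.

# Per-campaign mapping from local field names to the unified taxonomy.
FIELD_MAPPINGS = {
    "spring_webinar": {"lead_email": "email", "co_name": "company", "seg": "segment"},
    "emea_launch": {"Email": "email", "CompanyName": "company", "Segment": "segment"},
}

UNIFIED_FIELDS = ["email", "company", "segment"]

def normalize(record: dict, campaign: str) -> dict:
    """Rename a campaign-specific record into the unified schema."""
    mapping = FIELD_MAPPINGS[campaign]
    unified = {mapping[k]: v for k, v in record.items() if k in mapping}
    # Fill any missing unified fields with None so downstream models
    # always see the same structure.
    return {f: unified.get(f) for f in UNIFIED_FIELDS}

lead = {"lead_email": "ana@example.com", "co_name": "Acme", "seg": "smb"}
print(normalize(lead, "spring_webinar"))
# {'email': 'ana@example.com', 'company': 'Acme', 'segment': 'smb'}
```

The point of the sketch is that each new campaign adds one mapping entry rather than a new database and taxonomy, so every downstream model trains on one consistent structure.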
The larger, more globally based, and more fragmented a marketing department is, the harder it is to achieve data governance. The O’Reilly State of Data Quality Survey found that just 20% of enterprises publish information about data provenance or data lineage, which are essential tools for diagnosing and resolving data quality issues. Creating greater consistency across taxonomies, data structures, data field definitions, and meta-tags would give marketing data scientists a higher probability of succeeding with their ML models at scale.
Up to a third of a typical marketing team’s time is spent dealing with data quality issues, which has a direct impact on productivity, according to Forrester’s Why Marketers Can’t Ignore Data Quality study. Inaccurate data makes tactical decisions harder to get right, which can impact revenue. Forrester found that 21 cents of every media dollar was wasted over the previous 12 months (as of 2019) due to poor data quality. Taking the time to improve data quality and consistency in marketing would convert that lost productivity into revenue.
Start with change management and data governance
Too often, marketers and the IT teams supporting them rely on data scientists to fix inconsistent data. It’s time-consuming, tedious work and can consume 80% or more of a data scientist’s time. It is no surprise that data scientists rate cleaning up data as their least-liked activity.
Instead of asking data scientists to solve the marketing department’s data quality challenges, it would be far better to have the marketing department focus on creating a single, unified content data model. The department should consolidate diverse data requirements into a single, unified model with a taxonomy rigid enough to ensure consistency, yet adaptive enough to meet unique campaign needs. Change management makes the marketer’s job easier and more productive because there is a single, common enterprise taxonomy. Data governance is key to solving this problem, and marketing leaders have to be able to explain how improving metadata consistency and content data models fits within the context of each team member’s role. After that, the marketing organization should focus on standardizing across all taxonomies and the systems supporting them.
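In practice, governance over a unified content data model often comes down to automated validation: every record is checked against the shared schema, and error rates are tracked rather than discovered by data scientists mid-project. The sketch below illustrates the idea under assumed, hypothetical field names and rules:

```python
import re

# Minimal governance-style sketch: validate records against one unified
# content data model. Field names and validation rules are hypothetical.

SCHEMA = {
    "email": lambda v: isinstance(v, str)
        and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "company": lambda v: isinstance(v, str) and v.strip() != "",
    "segment": lambda v: v in {"smb", "mid-market", "enterprise"},
}

def validate(record: dict) -> list:
    """Return field-level problems; an empty list means the record conforms."""
    problems = []
    for field, rule in SCHEMA.items():
        if field not in record:
            problems.append(f"missing: {field}")
        elif not rule(record[field]):
            problems.append(f"invalid: {field}")
    return problems

records = [
    {"email": "ana@example.com", "company": "Acme", "segment": "smb"},
    {"email": "not-an-email", "company": "", "segment": "consumer"},
]
# Share the error rate as a governance metric instead of fixing data by hand.
error_rate = sum(1 for r in records if validate(r)) / len(records)
print(f"records with problems: {error_rate:.0%}")
# records with problems: 50%
```

A report like this gives marketing leaders a concrete number to put in front of each team, which is what makes the change-management conversation about data quality specific rather than abstract.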
The bottom line is that inconsistent data quality in marketing impacts the team by jeopardizing new sales cycles and creating confusion in customer relationships. The ability to get AI and ML pilots into production and provide insights valuable enough to change a company’s strategic direction depends on reliable data. Companies will find their marketing campaigns’ future contributions to growth are defined by how the team improves data quality today.