Data Quality Survey: Resolving Data Issues Takes 166% Longer Year Over Year

The second annual State of Data Quality Survey revealed striking statistics, such as data quality issues affecting 31% of revenue on average. Find out what else we uncovered.

By Lior Gavish · May 17, 2023 · News

The headline finding of the second annual State of Data Quality survey was, without a doubt, that data downtime nearly doubled year over year (1.89x).

The Wakefield Research data quality survey, commissioned by Monte Carlo, polled 200 data professionals in March 2023 and identified three factors contributing to this increase in data downtime:

  • A rise in monthly data incidents, from 59 in 2022 to 67 in 2023;
  • 68% of respondents reported an average time to detection for data incidents of four hours or more, up from 62% of respondents in 2022; and
  • A 166% increase in average time to resolution, rising to an average of 15 hours per incident across respondents. 

The increase in the time to resolve data quality incidents is particularly troubling and perhaps reflective of the increasing complexity of modern data architectures. 
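A quick back-of-envelope sketch shows what these resolution-time figures imply, under the assumption that a "166% increase" means the 2023 value is 2.66x the 2022 value (the survey does not publish the 2022 baseline directly):

```python
# Back-of-envelope check of the survey's resolution-time figures.
# Assumption: a "166% increase" means the new value is 2.66x the old one.
new_avg_resolution_hours = 15
increase = 1.66
old_avg_resolution_hours = new_avg_resolution_hours / (1 + increase)
print(f"Implied 2022 average: ~{old_avg_resolution_hours:.1f} hours per incident")

# Rough monthly hours spent resolving incidents, using the survey's
# reported incident counts (59 per month in 2022, 67 in 2023).
hours_2022 = 59 * old_avg_resolution_hours
hours_2023 = 67 * new_avg_resolution_hours
print(f"2022: ~{hours_2022:.0f} hours/month, 2023: ~{hours_2023:.0f} hours/month")
```

On these assumptions, teams went from roughly 330 to over 1,000 engineer-hours per month spent on incident resolution alone.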

Consequences of Bad Data

More than half of respondents reported that 25% or more of revenue was affected by data quality issues. The average percentage of impacted revenue jumped to 31%, up from 26% in 2022.

Additionally, an astounding 74% reported business stakeholders identify issues first, “all or most of the time,” up from 47% in 2022. 

These findings suggest data quality remains among the biggest problems facing data teams, with bad data having more severe repercussions on an organization’s revenue and data trust than in years prior. 

Data Quality Tradeoffs

The survey also suggests data teams are making a tradeoff between data downtime and the amount of time spent on data quality as their datasets grow. 

For instance, organizations with fewer tables reported spending less time on data quality than their peers with more tables, but their average time to detection and average time to resolution was comparatively higher. Conversely, organizations with more tables reported lower average time to detection and average time to resolution but spent a greater percentage of their team’s time to do so. 

  • Respondents that spent more than 50% of their time on data quality had more tables (average of 2,571) compared to respondents that spent less than 50% of their time on data quality (average of 208).
  • Respondents that took less than 4 hours to detect an issue had more tables (average 1,269) than those who took longer than 4 hours to detect an issue (average 346).
  • Respondents that took less than 4 hours to resolve an issue had more tables (average 1,172) than those who took longer than 4 hours to resolve an issue (average 330).

Data Quality Survey: Other Findings of Note

The survey revealed additional insights on the state of data quality management, including:

  • 50% of respondents reported data engineering is primarily responsible for data quality, compared to:
    • 22% for data analysts,
    • 9% for software engineering, 
    • 7% for data reliability engineering,
    • 6% for analytics engineering,
    • 5% for the data governance team, and 
    • 3% for non-technical business stakeholders. 
  • Respondents averaged 642 tables across their data lake, lakehouse, or warehouse environments. 
  • Respondents reported having an average of 24 dbt models, and 41% reported having 25 or more dbt models.
  • Respondents averaged 290 manually written tests across their data pipelines. 
  • The number one reason for launching a data quality initiative was that the data organization identified data quality as a need (28%), followed by a migration or modernization of the data platform or systems (23%).

It appears that data testing remains data engineers’ number one defense against data quality issues — and that’s clearly not cutting it. Incidents are falling through the cracks, stakeholders are the first to identify problems, and teams fall further behind. 

Leaning into more robust incident management processes and automated, ML-driven approaches like data observability is the future of data engineering at scale.
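The idea behind data observability can be sketched with a toy example: instead of enumerating hand-written checks, monitor pipeline metrics and flag statistical anomalies automatically. Real observability tools use much richer ML models; the numbers below are made up for illustration:

```python
# Toy illustration of automated data observability: flag a day whose row
# count deviates sharply from recent history using a simple z-score check.
# Real tools use richer ML models; this only sketches the idea.
from statistics import mean, stdev

# Hypothetical daily row counts for one table; the last value is anomalous.
daily_row_counts = [10_120, 10_340, 9_980, 10_205, 10_410, 10_150, 3_210]

history, latest = daily_row_counts[:-1], daily_row_counts[-1]
mu, sigma = mean(history), stdev(history)
z = (latest - mu) / sigma

if abs(z) > 3:
    print(f"ALERT: today's row count {latest} is {z:.1f} sigma "
          f"from the recent mean of {mu:.0f}")
```

Because the threshold is learned from the data's own history rather than written by hand, this style of check catches the "unknown unknowns" that manual tests miss.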


Published at DZone with permission of Lior Gavish. See the original article here.

