U.S. organizations say a third of their data is bad

GCN

Agencies are relying on data aggregation and analytics to enhance citizen services and understand social, scientific and financial trends. Given the meteoric rise of data aggregation and the growing reliance on its methods, data accuracy is paramount.

Many organizations struggle with data inaccuracy despite having an established data quality strategy. In a startling increase from last year, the 1,200 respondents to a global study believe 26 percent of their data is inaccurate, while U.S. respondents believe 32 percent of their data is inaccurate.

The Experian Data Quality study noted three common data quality errors: incomplete or missing data, outdated information and inaccurate data. Most organizations cited duplicate data as a contributor to overall inaccuracies, while human error is believed to be the biggest factor in data spoilage. A lack of automation – and a consequent dependence on manual data input – has also contributed to the problem, the study suggested.
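
For illustration only, a short Python sketch shows the kind of automated checks that can flag those error types in a record set. It is not Experian's product; the column names, the one-year staleness window and the sample records are all illustrative assumptions.

```python
# Hypothetical sketch of automated data quality checks (not Experian's product).
# Column names, the one-year staleness window and sample data are illustrative.
import pandas as pd

def audit(df: pd.DataFrame, updated_col: str = "last_updated",
          key_cols: tuple = ("name", "email"), max_age_days: int = 365) -> dict:
    """Count the common error types: missing, outdated and duplicate rows."""
    key_cols = list(key_cols)

    # Incomplete or missing data: any required field left empty
    missing = int(df[key_cols].isna().any(axis=1).sum())

    # Outdated information: records not refreshed within the allowed window
    cutoff = pd.Timestamp.now() - pd.Timedelta(days=max_age_days)
    outdated = int((pd.to_datetime(df[updated_col], errors="coerce") < cutoff).sum())

    # Duplicate data: repeated values across the identifying columns
    duplicates = int(df.duplicated(subset=key_cols, keep="first").sum())

    return {"missing": missing, "outdated": outdated, "duplicates": duplicates}

if __name__ == "__main__":
    records = pd.DataFrame({
        "name": ["Ann Lee", "Ann Lee", None],
        "email": ["ann@agency.gov", "ann@agency.gov", "bob@agency.gov"],
        "last_updated": ["2025-01-15", "2020-06-01", "2024-11-30"],
    })
    print(audit(records))  # counts of missing, outdated and duplicate rows
```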

One way to address these concerns is to adopt data audit software, Experian suggested, noting that only 24 percent of the study's respondents use such software. Organizations that do not deploy proactive software to detect errors not only waste resources and damage productivity, but they may also be unable to derive accurate insights from their data.

Besides auditing technology, organizations can use data profiling or matching and linking technology to detect errors.
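
As a rough sketch of what matching and linking does, the standard-library Python below pairs records that likely describe the same person despite small spelling differences. The thresholds, field names and scoring are illustrative assumptions, not any vendor's algorithm.

```python
# Hypothetical sketch of "matching and linking": pairing records that likely
# refer to the same entity despite typos or formatting differences.
# Thresholds and fields are illustrative assumptions.
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1] of how alike two normalized strings are."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def link_records(records, threshold=0.75):
    """Return index pairs of records that probably describe the same entity."""
    pairs = []
    for (i, r1), (j, r2) in combinations(enumerate(records), 2):
        # Link when both name and address are close enough; a real tool would
        # weight many fields and tune thresholds against labeled examples.
        if (similarity(r1["name"], r2["name"]) >= threshold and
                similarity(r1["address"], r2["address"]) >= threshold):
            pairs.append((i, j))
    return pairs

if __name__ == "__main__":
    citizens = [
        {"name": "Jonathan Smith", "address": "12 Main St."},
        {"name": "Jonathon Smith", "address": "12 Main Street"},
        {"name": "Maria Garcia",   "address": "98 Oak Ave."},
    ]
    print(link_records(citizens))  # [(0, 1)] with these illustrative thresholds
```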

To make improvements, 89 percent of U.S. organizations will seek to invest in some type of data management solution, Experian said, warning that without a coherent data management strategy, these types of errors will continue to increase.