Many major companies are routinely making important commercial decisions based on "remarkably inaccurate" data, industry watchers have warned.
According to analyst firm Gartner, more than a quarter of critical data within Fortune 1000 businesses will continue to be inaccurate or incomplete through to 2007.
"Most enterprises don't fathom the magnitude of the impact that data quality problems can have," said Ted Friedman, principal analyst for Gartner, in a statement.
"These problems cause wasted labour and lost productivity that directly affect profitability."
The analyst firm claimed that data quality problems are responsible for the failure of many costly business intelligence and customer relationship management projects.
These programmes fail, in large part, because the poor quality of the underlying data is not recognised or addressed.
Many enterprises simply look to technology they can buy to resolve data quality problems without first focusing on people and business processes, Friedman warned.
"Throwing technology at data quality issues usually doesn't solve the problem and won't yield positive long-term results," he explained.
Friedman advised enterprises to examine organisational approaches and methodologies to improve data quality.
"If the IT group is the only organisation that actively works and focuses on the issue, the business's ability to achieve data quality goals will be severely limited," he said.
"The greatest success in managing data quality comes from engaging both business users and the IT organisation."