We often hear that data is the fuel of modern business, but we think that food provides an even better analogy. When we go to fill our car up at the pumps, very few of us prefer a particular brand – we just want a full tank. But when it comes to what we eat, it’s not enough to have a full belly; we need the right sort of food that is both nourishing and tastes good.
It’s the same with data. Filling up on information
doesn’t necessarily make a business better; in fact, the wrong sort of data can
have a highly damaging effect on the health of the whole organization. That’s
because – in the era of the connected business – the effects of bad data aren’t
confined to the system in which it resides. Instead, they ripple out to a range
of other applications and processes that rely on that information.
Businesses may not realize
it, but bad data is a serious and costly issue. In 2016, IBM estimated that
poor quality data costs more than $3 trillion in the US alone. (By comparison, the size of
the entire big data industry in the same year, according to IDC, was a “paltry” $136 billion.)
This can only ever be an
estimate, though, because it’s difficult to put a price tag on the missed
opportunities, reputational damage and lost revenue that come from having the
wrong data – not to mention the hours knowledge workers lose searching for and
correcting errors in that data.
Other researchers provide
further evidence for the devastating impact of bad data. Gartner found that the
average cost to organizations is $15 million a year, while a report from the Royal Mail suggested that it causes a loss of six percent
of annual turnover. Why are businesses failing to address an issue with such a
direct impact on their bottom line – especially given today’s fixation on data-driven decision-making?
The domino effect of bad data
You would expect that
the figures listed above would provide plenty of food for thought, especially
as every line of business, from marketing to finance, customer service to
supply chain, now depends so completely on accurate data on which to base
its insights. Yet in our pursuit of data quantity, we seem to have forgotten
one of the oldest tenets of the information age: “Garbage In, Garbage Out.”
Too often, businesses lack a coherent data integration
strategy, which means that inaccurate or incomplete data causes a domino effect
through the organization.
Nothing highlights the
interconnected nature of modern business better than the issue of bad data. If
a department does a bad job of keeping data clean, up-to-date, and accurate, it
affects every other department that relies on that data. This means that the
effects are not limited to those who are responsible for managing records and
updating systems; instead, they spread throughout the organization. This
results in all manner of problems: from badly targeted marketing campaigns to
poor customer service outcomes to errors in payroll, resource allocation, and product planning.
Another grave consequence
of inaccurate data is that it can lead to people mistrusting the insights that
they gain, and even resenting the data creators who have allowed erroneous
information to creep into their systems.
A recipe for success
For all the hype around
data-driven insights, businesses are facing a data credibility problem, with
insights and performance metrics badly skewed by inaccurate information. So,
while no one discounts the importance of having large data sets from which to
draw insight, the more urgent challenge facing organizations is to improve the
quality and accuracy of the information that they hold.
Just as the food we eat
has a direct effect on our wellbeing, so the quality of its information has a
bearing on the health of a business. That’s why businesses need to treat data
as a delicacy, rather than just fuel. By focusing on data quality, they can
then ensure a positive domino effect throughout the organization, with departments
and workers able to trust the insight they derive from it.
To do this, every organization
must undertake a regular data quality audit that not only verifies the accuracy
of the information that is kept but also examines the internal processes and
workflows associated with gathering and storing information.
For example, the organization needs to have complete confidence that employees are capturing all relevant information in systems such as ERP systems, and that all data is entered accurately and kept up to date. This should include cross-referencing with information held in other systems, such as CRM, ensuring that the business can have faith in the data on which it bases its most important decisions.
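The cross-referencing step described above can be sketched in code. The example below is a minimal illustration, not taken from any particular ERP or CRM product: the record layouts, field names (`customer_id`, `email`), and the `cross_reference` helper are all assumptions made for the sake of the sketch. It simply compares the fields two systems share for each keyed record and flags missing or conflicting entries, the kind of check a data quality audit might automate.

```python
def cross_reference(erp_records, crm_records, key="customer_id"):
    """Flag records that are missing from, or disagree with, a second system.

    Both inputs are lists of dicts; `key` names the field that identifies
    the same real-world entity in both systems.
    """
    crm_by_key = {rec[key]: rec for rec in crm_records}
    issues = []
    for erp_rec in erp_records:
        crm_rec = crm_by_key.get(erp_rec[key])
        if crm_rec is None:
            # The record exists in one system but not the other.
            issues.append((erp_rec[key], "missing from CRM"))
            continue
        # Compare every field the two records share, except the key itself.
        for field in (erp_rec.keys() & crm_rec.keys()) - {key}:
            if erp_rec[field] != crm_rec[field]:
                issues.append((erp_rec[key], f"{field} mismatch"))
    return issues


erp = [{"customer_id": 1, "email": "a@example.com"},
       {"customer_id": 2, "email": "b@example.com"}]
crm = [{"customer_id": 1, "email": "a@example.com"}]

print(cross_reference(erp, crm))  # → [(2, 'missing from CRM')]
```

In practice such a check would run on a schedule against the live systems, and the list of flagged records would feed the audit workflow described above rather than being printed to the console.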
The recipe for success is simple: be as discriminating
with your data as you are with the food you put in your mouth, prioritizing
data quality to ensure you get accurate insights.
A Recipe for Better Data Management – katie, 2019-10-15