If you wanted to program a computer in the early 1980s, you often didn't have the option of coding at a keyboard. You had to create a series of cards, each punched with a pattern of holes. The holes had to be entirely accurate, in both pattern and placement. A typical card contained hundreds of potential positions, and just one mispunched hole would render the entire card invalid.
In the early days of office computing, mistakes were simply not an option. There was zero tolerance. Correcting errors, or repunching jammed cards that failed the machine's verification, could take several days per card.
Nowadays, we're so used to getting instant results that we've become far more error tolerant, and we've lost that perfectionist approach. We can add a record to a database in well under a minute, and we have ways to get around pesky validation errors when a record won't save.
If there's already a record for a person in the database, we can always add the word NEW to their name, rather than backtracking to find the duplicate.
What harm can it do, really?
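Quite a lot, as it turns out. The "NEW" fudge defeats exact-match duplicate checks, so the same person quietly becomes two records. A minimal Python sketch illustrates the point (the `normalise` function and its rules are illustrative assumptions, not a production matching algorithm):

```python
import re

def normalise(name: str) -> str:
    """Strip workaround markers like 'NEW', whitespace, punctuation
    and case, so near-duplicate records compare equal."""
    name = name.lower()
    name = re.sub(r"\bnew\b", "", name)  # drop the 'NEW' fudge
    name = re.sub(r"[^a-z]", "", name)   # keep letters only
    return name

# An exact-match check sees two different customers here;
# even this crude normalisation reveals them as one:
assert "John Smith NEW" != "john  smith"
assert normalise("John Smith NEW") == normalise("john  smith")
```

Real matching tools go much further (phonetic encoding, address comparison, fuzzy scoring), but the principle is the same: every workaround typed into a name field is a duplicate waiting to surface downstream.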
The Age of Automation
As we move towards an age of complete automation, fudged verification and tolerated misspellings are starting to hinder our success, and our profitability. Perfect data is rarely affordable, or achievable. But we are certainly becoming increasingly aware of mistakes, because they are stopping us from working as efficiently as we need to.
Punch cards aside, if you aren't putting the right effort into accuracy, there are three main cases to answer:
- The first is inconvenience: Imagine that your job requires you to make rapid decisions based on missing or muddled information. It's an uphill struggle, and it slows productivity to a crawl. For the call centre operative, the tech support engineer, or the salesperson planning their next move, inaccurate and duplicated data create frustration, cost and dissatisfaction.
- The second issue is waste: If we have duplicate records, how can we ensure budgets are being spent wisely? If customers are allowed to hold multiple loyalty accounts, how do we really know what their purchasing patterns are? Marketing teams, working to strict budgets, don't want to waste a penny. Yet 42 per cent of companies that responded to an Experian survey said data quality was draining the bank account and causing marketing problems.
- The third concern is poor decision making: This shows up at every level of the business. From the customer service team referring to someone by the wrong name, to boardroom decisions based on messy data, you simply cannot use poor data as a foundation for anything. The financial consequences can last for years, since every output is hindered by doubt.
Put all of this into the context of automated working and we have a recipe for disaster. One bad record in a good database will filter through into every other system. Every department will be inconvenienced. Everyone will waste time. When it comes to pulling reports together, you won't be able to trust a single one.
Morals and Ethics
There are other reasons to focus on data quality, quite apart from the need to ensure profits and reduce waste. Consider the market for wearables. We're already seeing these devices being used as evidence in court.
One example is a very serious rape case reported in Lancaster, Pennsylvania, in the US. When investigating the data recorded by the woman's Fitbit fitness tracker, detectives found that her movements did not support her story. The Engadget report notes that wearable data is never totally accurate, a useful reminder of the dangers of putting too much trust in faulty statistics.
There are also implications for the many organisations that access anonymous data. In a recent survey by KPMG, 78 per cent of respondents said they'd be happy to share wearable data with their GP. This could have a direct impact on healthcare outcomes for individuals, and we could reach a stage where devices like this are informing healthcare policy and planning.
If we're going to start using data in this way, we need to be absolutely sure it's correct. The rape case is an extreme example, and the data was presumably analysed in context. But it's a timely reminder that data quality can no longer be considered optional.
Inconvenience, Insolvency or Worse
It's very difficult to put an absolute figure on the cost of poor data, and businesses need to take a balanced approach when seeking data quality solutions. There's always a tipping point for data quality, where the investment makes a worthwhile difference without bankrupting the business.
But for the purposes of this article, we need to look at the cost of inaction, as well as the cost of change, including:
- Poor marketing ROI
- Mountains of returned catalogues
- Inability to integrate old systems
- Failure to act on market trends
- Lack of integration, leading to poor efficiency
- Inability to capitalise on the Internet of Things
- Poor data security
- DIY workarounds to try to overcome system failings
- High staff churn rates
- More agile competition in the market
So yes: transformation, automation and modernisation all cost money, and nobody likes to spend. Retraining requires investment, and improving the quality hit rate is as much about your staff as your systems.
But inaction makes data unfit for purpose.
So is data quality an optional extra for a modern business? We'd argue that it's not: it's an essential, core component. Businesses are going to have to adapt to survive. You can't deliver an exceptional experience to customers if you're not sure who they are. And you can't make a positive change to the quality of your data without changing your processes and your mindset.