Ok, I know, it’s obvious that when there is one mistake in a hundred, quality sounds pretty good. But 1% failures add up: at 10,000 records, that’s 100 failures, and even small or medium systems can process a million transactions a day. How much that matters depends on how important data accuracy is to the business. You’d be surprised, though, how many failures are dealt with manually without ever fixing their root causes. Let’s consider what this 1% might affect:
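To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch (Python, purely illustrative):

```python
# Back-of-the-envelope: absolute failure counts at a given error rate.
def failures(records: int, error_rate: float) -> int:
    """Expected number of bad records at a given error rate."""
    return round(records * error_rate)

for volume in (10_000, 1_000_000):
    print(f"{volume:>9,} records at 1% -> {failures(volume, 0.01):,} failures")

# Output:
#    10,000 records at 1% -> 100 failures
# 1,000,000 records at 1% -> 10,000 failures
```

At a million transactions a day, a "good" 99% accuracy rate means ten thousand bad records every single day.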
Pharma: good clinical practice
Clinical trial documentation tracks test results by patient demographics. Any data quality issue ripples through the trial results: the reported safety and efficacy of a treatment can be skewed by far less than 1% bad data. And if those issues go unaddressed, an audit for 21 CFR Part 11 compliance will be disastrous.
Healthcare: patient safety
If registration systems are not 100% accurate, you have a patient safety issue. A patient’s allergy to a specific medication, for example, is not information that should be 99% accurate. I have experienced this first hand, and I can tell you it is not fun. Try requesting the records from your last hospital visit and see if everything is there. Even a 0.001% failure rate is not acceptable.
Finance: entities of entities
At a hedge fund, when you follow account numbers across entities and funds of funds, accuracy has to be 100%. Documentation metadata has to be validated against source data at every possible opportunity, as in the sketch below; otherwise, the SOX fines add up.
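As a rough illustration of what "validate at every opportunity" can mean in practice, here is a minimal reconciliation sketch. The field names (account_number, fund_id) are hypothetical, not taken from any real system; the point is that every discrepancy is reported, not sampled:

```python
# Minimal reconciliation sketch: compare document metadata against the
# source of record and report every mismatch rather than a sample.
def reconcile(metadata: dict[str, str], source: dict[str, str],
              fields: tuple[str, ...] = ("account_number", "fund_id")) -> list[str]:
    """Return a human-readable list of fields that disagree."""
    return [
        f"{field}: metadata={metadata.get(field)!r} source={source.get(field)!r}"
        for field in fields
        if metadata.get(field) != source.get(field)
    ]

# Hypothetical records: the fund_id in the document metadata has drifted
# from the source of record.
mismatches = reconcile(
    {"account_number": "ACC-1001", "fund_id": "FOF-7"},
    {"account_number": "ACC-1001", "fund_id": "FOF-8"},
)
for m in mismatches:
    print("MISMATCH", m)
```

Running a check like this at every hand-off, rather than trusting upstream data, is what keeps a 1% error rate from silently compounding across entities.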
If anyone at your company considers 99% information accuracy a viable goal, ask them whether they would mind 1% less pay in their paycheck.