There’s so much hype about Data Quality Management as more and more companies are realising the critical role of data as the ‘new oil’ in this Digital Era.
But first, you’ll have to know and understand what you’re trying to manage.
So, what is data quality, really?
Data quality refers to how well the data describes the objects, events, or ideas it represents. Essentially, how well it meets the expectations of the users who will consume it, whatever job function they're in.
If you think that this definition isn’t practical because ‘how well’ isn’t exactly quantifiable, well, think again!
Your data can be measured against several criteria to determine its ‘wellness’, hence, its quality. And what are the criteria used? They vary depending on your business requirements and your end-users.
We recommend measuring against these criteria—Accuracy, Validity, Uniqueness, Completeness, Consistency, Timeliness, Integrity, and Conformity. These criteria should also be set up as rules in your Data Quality Management system to maintain high-quality data at all times.
Let's dive deep into the definition and real-life examples of each criterion so you'll have a clearer understanding and better appreciation of what each represents.
Accuracy

Closeness of data to the true values, checked against external data sources and visual data governance.
One way to uphold supply chain data accuracy is by cross-checking supplier information such as credit ratings against Dun & Bradstreet’s database, an external data source. It stores a comprehensive list of companies’ business data and analytical insights worldwide. This allows you to mitigate external risks coming from your suppliers.
In the asset maintenance space, you can ask your Engineering and Maintenance teams to review equipment criticality data as they would have the right expertise to determine the accuracy. This is visual data governance in practice.
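The cross-checking idea above can be sketched as a simple comparison between locally held supplier ratings and an external reference. This is a minimal illustration, not an actual Dun & Bradstreet integration; the supplier names, ratings, and the `EXTERNAL_REFERENCE` stub are all hypothetical.

```python
# Accuracy check sketch: compare local supplier credit ratings against an
# external reference source. EXTERNAL_REFERENCE is a stand-in dict for an
# external database lookup; all names and ratings are illustrative.

EXTERNAL_REFERENCE = {
    "ACME Corp": "AA",
    "Globex Ltd": "BB",
}

def find_inaccurate_ratings(local_records):
    """Return supplier names whose local rating disagrees with the reference."""
    mismatches = []
    for name, local_rating in local_records.items():
        reference = EXTERNAL_REFERENCE.get(name)
        if reference is not None and reference != local_rating:
            mismatches.append(name)
    return mismatches

local = {"ACME Corp": "AA", "Globex Ltd": "CC"}
print(find_inaccurate_ratings(local))  # ['Globex Ltd']
```

In practice, the lookup would call the external provider's API rather than a local dict, but the shape of the check stays the same: flag every record that disagrees with the trusted source.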
Validity

The degree to which your data conforms to the defined business rules of the domain (reference table, range, etc.).
For your finance and project data, only one company code should be assigned to one cost centre. This is a generally accepted cost centre accounting rule. Anything that deviates from it is invalid.
As for asset management, every maintenance plan created needs to have a task list. In the absence of a task list that prescribes the steps for inspection, repairs, and preventive maintenance, the maintenance plan would not be valid as it will not serve its original purpose.
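The cost centre rule can be expressed as a validity check: scan the postings and flag any cost centre that appears with more than one company code. The field names and codes here are illustrative, not taken from a specific ERP schema.

```python
# Validity rule sketch: each cost centre may be assigned to exactly one
# company code. Cost centre and company code values are illustrative.

def invalid_cost_centres(postings):
    """postings: iterable of (cost_centre, company_code) pairs.
    Returns cost centres assigned to more than one company code."""
    seen = {}
    for cost_centre, company_code in postings:
        seen.setdefault(cost_centre, set()).add(company_code)
    return [cc for cc, codes in seen.items() if len(codes) > 1]

postings = [("CC100", "1000"), ("CC100", "2000"), ("CC200", "1000")]
print(invalid_cost_centres(postings))  # ['CC100']
```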
Uniqueness

A 'Golden Record' view of data, or a single version of the truth for data, accessible across the enterprise landscape.
This can involve identifying duplicate customer records so that your customer service team doesn't contact the same customer multiple times. Records that share the same identifying key fields (e.g., Customer Name and Address) should be merged and the redundant copies eliminated, leaving a single unique record.
In the same breath, you’d need to maintain a golden record for spares within your inventory management system. Without it, people across different warehouses will buy the same type of spares (not knowing that they’re the same), causing uncontrolled accumulation of spares and increased inventory and holding costs.
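Duplicate detection on key fields can be sketched as below: normalise the identifying fields (here, deliberately simply, by lowercasing and collapsing whitespace) and keep the first record per key as the golden one. Real de-duplication would use fuzzier matching; this only illustrates the principle, and the field names are assumptions.

```python
# Uniqueness sketch: detect duplicates on key fields (Customer Name +
# Address) after simple normalisation, keeping the first record per key.

def normalise(value):
    # Lowercase and collapse runs of whitespace so trivial variants match.
    return " ".join(value.lower().split())

def deduplicate(records):
    """records: list of dicts with 'name' and 'address' keys."""
    golden = {}
    for record in records:
        key = (normalise(record["name"]), normalise(record["address"]))
        golden.setdefault(key, record)  # first record wins as the golden one
    return list(golden.values())

customers = [
    {"name": "Jane Doe", "address": "1 Main St"},
    {"name": "jane doe", "address": "1  Main St"},  # trivial duplicate
    {"name": "John Roe", "address": "2 High St"},
]
print(len(deduplicate(customers)))  # 2
```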
Completeness

The extent to which all required data is available.
Full name, Social Security number, and bank account number are among the required fields in employee data. They should be mandatory inputs, as your employees expect prompt disbursement into their bank accounts. Failing to guarantee the completeness of this data will trigger dissatisfaction and distrust among your people, leaving you to manage a situation that shouldn't have arisen in the first place.
Another example is ensuring that the model number and serial number are filled in for each piece of equipment. Without this information, there's no way to track the equipment, let alone determine its condition. In the worst case, a piece of equipment wears out badly, causing an unplanned breakdown.
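A completeness rule reduces to checking that every mandatory field is present and non-empty. The field names below are illustrative, not a fixed HR schema.

```python
# Completeness rule sketch: flag employee records missing any mandatory
# field. Field names are illustrative.

MANDATORY = ("full_name", "ssn", "bank_account")

def incomplete_records(employees):
    """Return records where any mandatory field is missing or empty."""
    return [
        emp for emp in employees
        if any(not emp.get(field) for field in MANDATORY)
    ]

employees = [
    {"full_name": "Ana Silva", "ssn": "123-45-6789", "bank_account": "998877"},
    {"full_name": "Ben Ochoa", "ssn": "", "bank_account": "112233"},  # missing SSN
]
print(len(incomplete_records(employees)))  # 1
```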
Consistency

How consistent the data appears within data sets, across different data sets, or with other data sources.
One example is handling 'non-applicable' entries, which people enter in different ways, e.g., "N/A", "Not Applicable", "NA". These should be made consistent, especially within the same dataset, to avoid confusion: people might assume the variants have separate definitions when they actually mean the same thing.
Another example is checking that the record count is the same between the source and target systems (assuming no transformation process changes the records). This ensures consistency across different data systems.
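Both consistency checks above can be sketched in a few lines: a normaliser that maps every 'non-applicable' variant to a single token, and a count comparison between source and target. The variant list is illustrative.

```python
# Consistency sketch: (1) normalise 'non-applicable' variants to one
# token; (2) compare record counts between source and target systems.

NA_VARIANTS = {"n/a", "not applicable", "na"}  # illustrative variant list

def normalise_na(value):
    """Map any recognised 'non-applicable' spelling to the token 'N/A'."""
    return "N/A" if value.strip().lower() in NA_VARIANTS else value

def counts_match(source_rows, target_rows):
    """True when source and target hold the same number of records."""
    return len(source_rows) == len(target_rows)

print(normalise_na("Not Applicable"))       # N/A
print(counts_match([1, 2, 3], [1, 2, 3]))   # True
```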
Timeliness

How up-to-date the data is according to your business needs.
For month-end closing, it’s expected that all Finance-related data is made available and updated for further processing to avoid any discrepancies in financial reporting.
In the supply chain space, certificates and licences provided by your suppliers should be current. You should be able to flag those that have passed their expiration dates, as these pose unwanted third-party risks.
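Flagging expired certificates is a straightforward timeliness rule: compare each expiry date with today's date. Supplier names, field names, and dates below are illustrative.

```python
# Timeliness sketch: flag supplier certificates past their expiration
# date. Dates are ISO-formatted strings; field names are illustrative.

from datetime import date

def expired_certificates(certs, today=None):
    """Return suppliers whose certificate expiry date is before `today`."""
    today = today or date.today()
    return [
        c["supplier"] for c in certs
        if date.fromisoformat(c["expires"]) < today
    ]

certs = [
    {"supplier": "ACME Corp", "expires": "2030-01-01"},
    {"supplier": "Globex Ltd", "expires": "2020-06-30"},
]
print(expired_certificates(certs, today=date(2024, 1, 1)))  # ['Globex Ltd']
```

Passing `today` explicitly keeps the check deterministic and testable; production code would default to the current date.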
Integrity

Consistency in the relationship between entities and attributes, including parent-child relationships and orphan records.
Within a functional location hierarchy, spares must always be linked to their parent functional location or equipment, and the attributes on both sides should reflect the same information.

The same discipline applies to orphan records: when you delete a functional location from the system, no orphan items should be left behind; they should be removed too.
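An orphan check is essentially a referential integrity test: every spare's parent reference must resolve to an existing functional location. The identifiers below are made up for illustration.

```python
# Integrity sketch: every spare must reference an existing parent
# functional location; anything else is an orphan record. IDs illustrative.

def orphan_spares(spares, functional_locations):
    """Return IDs of spares whose parent functional location doesn't exist."""
    known = set(functional_locations)
    return [s["id"] for s in spares if s["parent"] not in known]

locations = {"FL-100", "FL-200"}
spares = [
    {"id": "SP-1", "parent": "FL-100"},
    {"id": "SP-2", "parent": "FL-999"},  # parent was deleted -> orphan
]
print(orphan_spares(spares, locations))  # ['SP-2']
```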
Conformity

Attributes are defined according to standards and compliance requirements.
You put in place a process to organise and categorise your asset data according to the ISO 14224 standard. This ensures standardisation and enables better identification of your assets.
Another instance, in the customer space, is implementing data encryption to secure sensitive customer data so that you comply with the GDPR.
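A basic conformity check validates that attribute values come from an approved taxonomy. The category list below is purely illustrative; it is not the actual ISO 14224 taxonomy.

```python
# Conformity sketch: asset categories must come from an approved taxonomy.
# The category list is illustrative, NOT the real ISO 14224 taxonomy.

APPROVED_CATEGORIES = {"rotating", "static", "electrical", "instrumentation"}

def non_conforming_assets(assets):
    """Return tags of assets whose category is outside the approved taxonomy."""
    return [a["tag"] for a in assets if a["category"] not in APPROVED_CATEGORIES]

assets = [
    {"tag": "P-101", "category": "rotating"},
    {"tag": "V-201", "category": "vessel"},  # not in the approved list
]
print(non_conforming_assets(assets))  # ['V-201']
```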
Data Quality Management with MDO
Now that you’ve understood the data quality criteria and how to apply them, you’ll have a better footing in managing data quality. Of course, you’ll need a technology solution to help you with the setup and automation of data quality procedures and rules.
MDO is a multi-domain MDM platform that’s optimised for Data Quality Management.
Its data quality capabilities include:
- Support for a wide range of business rule types and definitions, e.g., missing data, classification, metadata, duplicate detection, and lookup
- De-duplication and consolidation of data from multiple systems to obtain a ‘Golden Record’ view
- Automation of large-scale data cleansing
- Workflow-based remediation, triggered via authorised approving parties
- Master data standardisation and data quality parameters based on industry/corporate standards
- Validation checks that take into account compliance rules and external sources
- Dashboards and analytics tools to monitor data health and data cleansing status
- Pre-defined integration adaptors for SAP and other systems
With MDO, you can operationalise data quality faster and gain more confidence in your data, knowing that you’ll always have high-quality data at your fingertips.
Written by: Shigim Yusof