Here is the second article in our series from one of our clients, Zebra Technologies.
That’s why we need standards: to help us tell good data from bad. Just as in manufacturing, it’s difficult to claim a part has quality problems unless specifications spell out the standard the part must meet. The same is true for our master data. We need to create the standards, communicate them, measure against them, and ultimately manage the data to meet them. Setting the standards is critical and must include input from everyone who uses the data, so that every area’s needs are represented.
For example, consider item description data. What’s a “good” description? That depends on the purpose you want to use it for. A description can simply be someone’s attempt to label the item, or it can follow standards that make it usable for many different purposes.
At Zebra, an item description could be used to help validate the correct classification of items, such as the type of printer: mobile, desktop, card, and others. For parts, that description might be used to help validate the correct commodity grouping, hazard class assignment, and so on. But to use descriptions for these multiple purposes, we need to create and maintain description standards.
The Data Governance Office, with the help of IT, has implemented an exciting new tool that profiles our item master data, so we can identify standards as well as concerns. But the real excitement is that this new Data Quality tool can also prevent data errors. We have been working with a couple of groups to define validation steps that prevent data errors in Agile, which is the source of much of our item master data. So if you use Agile, be on the lookout for further communications to the Agile user community as these validation rules are implemented.
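To make the idea of a description standard concrete, here is a minimal sketch of what a validation rule might look like. Everything in it is illustrative: the “noun, modifier” format, the approved-noun list, and the length limit are hypothetical examples, not Zebra’s actual standards, and real checks would be configured inside the Data Quality tool or Agile rather than written by hand.

```python
# Hypothetical description standard: "NOUN, MODIFIER[, DETAIL...]"
# e.g. "PRINTER, DESKTOP" -- all names and limits below are illustrative only.
APPROVED_NOUNS = {"PRINTER", "SCANNER", "CARD", "LABEL"}
MAX_LENGTH = 40

def validate_description(desc: str) -> list[str]:
    """Return a list of rule violations; an empty list means the description passes."""
    errors = []
    if len(desc) > MAX_LENGTH:
        errors.append(f"exceeds {MAX_LENGTH} characters")
    if desc != desc.upper():
        errors.append("description must be upper case")
    parts = [p.strip() for p in desc.split(",")]
    if parts[0] not in APPROVED_NOUNS:
        errors.append(f"leading noun '{parts[0]}' is not on the approved list")
    if len(parts) < 2:
        errors.append("missing modifier after the noun")
    return errors
```

A rule like this, applied at data entry, is what turns a standard from a document into a safeguard: a bad value is rejected with a specific reason before it ever lands in the item master.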
In our next Zebra data governance article, we’ll discuss the important role of the Data Steward.