Data Quality Assurance
The approach you take to Data Quality can either make it easy or make it difficult.
The current approaches not only make it difficult, they make it impossible. The major problem is that they are centered on 'Data Quality' applications designed to find and attempt to repair data errors, as opposed to preventing the data errors from occurring in the first place.
In Quality Management terms, they are based on Quality Control as opposed to Quality Assurance. Data Quality Assurance is about creating correct data first time, every time. This is based on the simple, yet profound, truth that:
The only data required to be known and held in any enterprise is that required to support the execution of the Business Functions of that enterprise. Nothing more, nothing less.
Starting with this truth, the first step in Data Quality Assurance is identifying and modelling the Business Functions of the enterprise and then identifying and modelling the data needed to support these Business Functions.
The Logical Data Model (LDM) is the only effective means of doing this and is therefore a key tool in Data Quality Assurance.
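The contrast between Quality Control (repairing errors after the fact) and Quality Assurance (creating correct data first time) can be sketched in code. The following is a minimal illustration, not a prescribed implementation: the `OrderLine` class and its rules are hypothetical stand-ins for data captured by some Business Function, with the constraints derived from the Logical Data Model enforced at the moment of creation.

```python
from dataclasses import dataclass

# Quality Assurance sketch: the data required by a (hypothetical)
# Business Function is validated when it is created, so incorrect
# data never enters the enterprise's stores in the first place.

@dataclass(frozen=True)
class OrderLine:
    product_code: str
    quantity: int

    def __post_init__(self) -> None:
        # Constraints taken from the Logical Data Model are enforced
        # here at entry, rather than "cleansed" later by a Data
        # Quality application.
        if not self.product_code:
            raise ValueError("product_code is required")
        if self.quantity <= 0:
            raise ValueError("quantity must be positive")
```

With this approach an attempt to record `OrderLine("", 0)` fails immediately at the point of capture, which is the essence of Quality Assurance as opposed to Quality Control.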
Master Data Management (MDM)
Master Data Management starts with asking the Six Multi-Dimensional Questions to identify the Products, Parties, Locations and Assets required by the enterprise, and then building a Logical Data Model (LDM) to show the structure of these.
The LDM is an essential tool for MDM. It eliminates phantom entities such as Customer, Supplier, Address, etc. and enables the true Master Data Entities to be identified, modelled and managed.
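The elimination of phantom entities can be made concrete with a small sketch. The example below is illustrative only: it assumes that 'Customer' and 'Supplier' are modelled not as entities in their own right but as roles played by the single master entity Party, which is one common way an LDM resolves such phantoms.

```python
from dataclasses import dataclass, field
from enum import Enum

# Sketch: 'Customer' and 'Supplier' are roles of the master entity
# Party, not separate entities. One Party may play both roles, so the
# same real-world organisation is held once, not duplicated across a
# 'Customer' table and a 'Supplier' table.

class Role(Enum):
    CUSTOMER = "customer"
    SUPPLIER = "supplier"

@dataclass
class Party:
    party_id: int
    name: str
    roles: set = field(default_factory=set)

# The same Party acts as both a customer and a supplier.
acme = Party(1, "Acme Ltd")
acme.roles.add(Role.CUSTOMER)
acme.roles.add(Role.SUPPLIER)
```

Because Party is the single Master Data Entity, questions such as "is this supplier also one of our customers?" become a simple check of its roles rather than a cross-system matching exercise.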