
DMBoK Figure 91 Context Diagram: Data Quality

12/31/2024 7:00 AM | Anonymous member (Administrator)


Data Quality can be defined as the degree to which the dimensions of Data Quality meet requirements. This implies that requirements should be formulated for each relevant dimension. A much shorter definition of Data Quality is ‘fit for purpose.’

Data that meets the requirements is of sufficient quality; data that does not meet the requirements is of insufficient quality. To keep it simple, we speak of high-quality and low-quality (or poor-quality) data, respectively.
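As a concrete illustration of measuring dimensions against requirements, the sketch below computes completeness and validity scores for a hypothetical email field and compares each score with a stated requirement. The records, the regular expression, and the threshold values are assumptions made for this example, not part of the DMBoK.

```python
import re

# Hypothetical customer records; None marks a missing email (illustrative data).
records = [
    {"id": 1, "email": "ann@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "not-an-email"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# Assumed per-dimension requirements; real thresholds come from data consumers.
requirements = {"completeness": 0.95, "validity": 0.90}

def completeness(rows, field):
    """Share of rows in which the field is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def validity(rows, field, pattern):
    """Share of populated values that match the expected format."""
    populated = [r[field] for r in rows if r[field] is not None]
    if not populated:
        return 0.0
    return sum(bool(pattern.match(v)) for v in populated) / len(populated)

scores = {
    "completeness": completeness(records, "email"),
    "validity": validity(records, "email", EMAIL_RE),
}

# Data that meets the requirement for a dimension is "sufficient" on that dimension.
for dimension, score in scores.items():
    verdict = "sufficient" if score >= requirements[dimension] else "insufficient"
    print(f"{dimension}: {score:.2f} (requirement {requirements[dimension]:.2f}) -> {verdict}")
```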

Effective Data Management involves a set of interrelated processes enabling an organization to use its data to achieve strategic goals. An underlying assertion is that the data itself is of high quality. Data Quality Management is the planning, implementation, and control of activities that apply quality management techniques to data in order to assure it is fit for consumption and meets the needs of data consumers.

High quality data is context driven. This means that the same data may be viewed as high quality by some areas of an organization while simultaneously being viewed as low quality by other areas. Many organizations fail to engage with this question of context, that is, with the idea that high Data Quality means data that is fit for its purpose.

If we understand organizations as data manufacturing machines, we can assert (from our experience in manufacturing) that organizations that formally manage the quality of their data will be more effective and more efficient, and will deliver a better experience, than those that leave Data Quality to chance. However, no organization has perfect business processes, technical processes, or data management practices. In reality, all organizations experience problems related to their Data Quality. Many factors undermine Data Quality: a lack of understanding of its effects on organizational success, leadership that does not value Data Quality, poor planning, ‘siloed’ system design, inconsistent development processes, incomplete documentation, a lack of standards, and a lack of Data Governance.

As is the case with Data Governance and with Data Management as a whole, Data Quality Management is a function, not a program or project. This is because projects and even programs have starts, middles, and ends, whereas a Data Quality Function is, or should be, a continuing, business-as-usual set of activities. It will include projects and programs (to address specific Data Quality improvements) as well as operational work, along with a commitment to communications and training. Most importantly, the long-term success of a Data Quality improvement program depends on getting an organization to change its culture and adopt a quality mindset. As stated in The Leader’s Data Manifesto, “fundamental, lasting change requires committed leadership and involvement from people at all levels in an organization.” People who use data to do their jobs – which in most organizations is a very large percentage of employees – need to drive change, and one of the most critical changes to focus on is how their organizations manage and improve the quality of their data.

Formal Data Quality Management is similar to continuous quality management for other manufactured products. It includes managing data through its lifecycle by setting standards, building quality into the processes that create, transform, and store data, and measuring data against those standards. Managing data to this level usually requires a Data Quality Function team, which is responsible for engaging both business and technical data management professionals and for driving the work of applying quality management techniques to data so that it is fit for consumption for a variety of purposes. The team will likely be involved in a series of projects through which it can establish processes and best practices while addressing high-priority data issues.
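To show what ‘building quality into the processes that create, transform, and store data’ can look like in practice, the sketch below embeds a set of standards as a quality gate in a simple ingest step: records that meet the standards pass through, and nonconforming records are quarantined for remediation. The rules and field names are illustrative assumptions, not a prescribed DMBoK design.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Standard:
    """A named Data Quality rule that a record must satisfy."""
    name: str
    check: Callable[[dict], bool]

# Illustrative standards; a real team would derive these from documented requirements.
standards = [
    Standard("customer_id is present", lambda r: bool(r.get("customer_id"))),
    Standard("country is a 2-letter code",
             lambda r: isinstance(r.get("country"), str) and len(r["country"]) == 2),
]

def ingest(records: List[dict]) -> Tuple[List[dict], List[dict]]:
    """Quality gate built into the ingest step: conforming records pass through,
    nonconforming records are quarantined along with the standards they failed."""
    accepted, quarantined = [], []
    for record in records:
        failures = [s.name for s in standards if not s.check(record)]
        if failures:
            quarantined.append({"record": record, "failed": failures})
        else:
            accepted.append(record)
    return accepted, quarantined

accepted, quarantined = ingest([
    {"customer_id": "C-100", "country": "US"},
    {"customer_id": "", "country": "USA"},
])
print(f"accepted: {len(accepted)}, quarantined: {len(quarantined)}")
for item in quarantined:
    print(item["record"], "failed:", item["failed"])
```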

