Originally Published: November 2018; Revised: March 2020
Best Practice Scribes
Gareth Isaac, Director and Principal Consultant, Ortecha, a DCAM Authorized Partner
Mark McQueen, EDM Council, Senior Advisor-DCAM
Overview
Process Design Framework
The EDM Council industry-standard process design uses the 6-level model defined below. Practically, an industry standard can only design down to Level 3; beyond Level 3, a design becomes organization- and role-specific.
- Level 0: Value Chain – Components
- Level 1: Process Groupings – Process Groups based on Component
- Level 2: Core Processes – Activities and Tasks based on Process Group
- Level 3: Business Process Flow – Processes and Sub-processes based on Functional Role
- Level 4: Operational Process Flow – Process Documentation based on Role
- Level 5: Detailed Process Flow – Procedures (step-by-step documentation) based on Role
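The six levels and the Level 3 boundary above can be sketched in code. This is an illustrative sketch only; the enum names and helper function are assumptions for clarity, not part of the standard.

```python
from enum import IntEnum

class ProcessLevel(IntEnum):
    """The six levels of the EDM Council process design model."""
    VALUE_CHAIN = 0               # Components
    PROCESS_GROUPINGS = 1         # Process Groups based on Component
    CORE_PROCESSES = 2            # Activities and Tasks based on Process Group
    BUSINESS_PROCESS_FLOW = 3     # Processes and Sub-processes based on Functional Role
    OPERATIONAL_PROCESS_FLOW = 4  # Process Documentation based on Role
    DETAILED_PROCESS_FLOW = 5     # Procedures (step-by-step) based on Role

def is_industry_standard_scope(level: ProcessLevel) -> bool:
    """An industry standard can only design down to Level 3;
    Levels 4 and 5 are organization- and role-specific."""
    return level <= ProcessLevel.BUSINESS_PROCESS_FLOW
```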
Stakeholders
Data Management Practitioners (Reference: Data Management Functional Construct)
- Data Officer
- Data Governance Executive
- Executive Data Steward
- Data Architect
- Data Domain Manager
- Business Data Steward
- Technical Data Steward
- Business Process Subject Matter Expert
- Data Custodian
Scope
The driver for the design of the Level 2 Data Domain Management process was the development of a best practice tied to the identification of Critical Data Elements (CDEs). The process of prioritizing data based on criticality was determined to be contained within the Data Domain Management process. (See: Prioritizing Data Based on Criticality: Critical Data Elements (CDEs) in Context)
The following process design presents a complete Level 2 Data Domain Management process. Only four of the Level 3 processes were developed, as these are the processes required to represent the activities of prioritizing data.
DCAM Component Alignment
The Level 2 process is presented at the Component level with alignment back to DCAM Framework Capabilities and Sub-capabilities.
The table below represents the DCAM Components that are required to execute the Data Domain Management process.
| Component | Capability Requirement |
| --- | --- |
| 7.0 Data Control Environment | ● The process of data domain management is in the Data Control Environment Component. ● The process of prioritizing data based on criticality resides within the data domain management process. |
| 3.0 Business & Data Architecture | ● The process of prioritizing data based on criticality is dependent on defining requirements for data, identifying data, defining data, and profiling data. |
| 5.0 Data Quality Management | ● The process of prioritizing data based on criticality does not require the Data Quality Management Component. Data Quality Management is part of the overall Data Domain Management and will be integral to the process of managing the implications of criticality. |
| 6.0 Data Governance | ● The process of prioritizing data based on criticality leverages the Data Governance Component for approving the metadata and the criticality designation. |
Level 2 1.0: Data Domain Management Process
Summary
The Data Domain Management Process is where the DCAM Framework Components of Data Governance, Data Architecture, and Data Quality are brought together to execute on a specific set of data in the Data Control Environment.
Process Flow

Process Details
The tasks outlined in orange below support the process of prioritizing data based on criticality as defined in the best practice article. However, prioritizing data based on criticality and managing that criticality depend on the complete Data Domain Management process.
| Task ID | Functional Role | Task Description |
| --- | --- | --- |
| 1.1 | Data Control Environment | Define Requirements for Data ● The business process consuming the data (Data Consumer) develops requirements for data as an input to their process. ● The requirements include proposed criticality with a description of material impact to business process as a result of poor quality data. |
| 1.2 | Data Control Environment | Validate Data in Scope ● The business process producing the data (Data Producer) determines whether the requested data is within the scope of their domain. |
| D1 | Data Control Environment | Data in scope? ● If yes, move to Task 1.3. ● If no, refer the Data Consumer to the domain that may be in scope, if known; communicate the scope outcome and any recommendation to the Data Consumer. Process stops. |
| 1.3 | Data Control Environment | Source Data ● Go through appropriate steps to analyze the requirements, identify the data, locate the data, source the data (access data for analysis and preparation for provisioning) and record basic metadata. |
| 1.4 | Data Control Environment | Negotiate Criticality ● Data Producer reviews Data Consumer’s proposed critical designation based on analysis of material impact to business process of poor quality data. ● Agreement of Critical Elements is reached or disagreement is escalated. |
| D2 | Data Control Environment | Agreement on criticality? ● If yes, move to D3 (is the data critical?). ● If no, escalate the criticality decision. |
| D3 | Data Control Environment | Is data critical? ● If no, move to D4, is standard data required. ● If yes, move to Task 1.5 to design metadata. |
| D4 | Data Control Environment | Standard data required? ● Does the Data Consumer require standardized data? ● If no, move to Task 1.10a to define the DSA / SLA, then to Task 1.15a to provision non-standard data with appropriate controls on use. ● If yes, move to Task 1.5 to design metadata. |
| 1.10a | Data Control Environment | Define Data Sharing Agreement (DSA) & Service Level Agreement (SLA) ● Establish restrictions on the use of non-standard data and parameters of provisioning. |
| Tool | Data Sharing Agreement; Service Level Agreement | |
| 1.15a | Data Control Environment | Provision Data ● Provision the non-standard data with appropriate controls on use (proportionate to the criticality of the data). |
| 1.5 | Data Architecture | Design Metadata ● Design metadata (data about the data) that is required for the appropriate level of control on the data. ● This task works in conjunction with the parameters of the Enterprise Data & Data Management Policy and the proposed Data Sharing Agreement (DSA) which defines the required level of control on the data. |
| 1.6 | Data Architecture | Record Metadata ● Record metadata about the data in the appropriate repository(ies). |
| 1.7 | Data Architecture | Monitor Metadata Quality ● Monitor the metadata quality for accuracy, completeness, and timeliness. ● Once metadata quality is deemed adequate, submit to the appropriate data governance body for approval. |
| 1.8a | Data Governance | Make Data or Data Management Decisions ● Data governance body reviews metadata quality report gaps to validate alignment with Enterprise Data & Data Management Policy. |
| D5 | Data Governance | Metadata approved? ● If no, return to Task 1.5 to close identified gaps. ● If yes, move to Task 1.9 to standardize the data. |
| 1.9 | Data Control Environment | Standardize Data ● Apply logic to transform the data into the standardized form. |
| 1.10b | Data Control Environment | Define Data Sharing Agreement (DSA) / Service Level Agreement (SLA) ● Data Producer and Data Consumer agree to all parameters in the DSA. ● The Producing and Consuming Technology Managers agree to all parameters in the SLA, in accordance with the DSA parameters. |
| 1.11 | Data Quality Management | DQ Rule Development ● Define the range of quality rules to run against each element to validate the fit-for-purpose of the data. |
| 1.12 | Data Quality Management | Evaluate / Monitor Fit-for-Purpose ● Execute the defined rules to generate defect reporting. ● Evaluate completeness of rule set. |
| D6 | Data Quality Management | Fit-for-purpose achieved? ● If no, move to Task 1.13 to remediate quality defects. ● If yes, move to Task 1.8b to make data or data management decision. |
| 1.13 | Data Quality Management | Remediate Quality Defects ● Complete all processes to remediate quality defects. |
| 1.8b | Data Governance | Make Data or Data Management Decisions ● Data governance body reviews metadata and data quality to approve data are fit-for-purpose and to validate alignment with Enterprise Data & Data Management Policy and DSA / SLA. |
| D7 | Data Governance | Fit-for-purpose approved? ● If yes, move to D8 to approve the DSA / SLA. ● If no, return to 1.11 to close identified gaps. |
| D8 | Data Governance | DSA / SLA approved? ● If yes, move to 1.14 to process data for provisioning. ● If no, return to 1.10b to close identified gaps. |
| 1.14 | Data Control Environment | Process Data for Provisioning ● Run all period close activities to prepare data for provisioning. |
| 1.15b | Data Control Environment | Provision Data ● Execute provisioning routine defined in the SLA. |
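The D2–D4 gating in the table above can be summarized as a small routing function. This is a hedged sketch: the function name and return labels are illustrative, not part of the DCAM standard, and the task labels are taken from the table.

```python
def route_after_criticality_decision(agreed: bool, is_critical: bool,
                                     standard_required: bool) -> str:
    """Illustrative routing for decision gates D2-D4 in the Level 2 table:
    disagreement escalates; critical or standardized data proceed to
    metadata design (1.5); otherwise non-standard data are provisioned
    with controls on use (1.10a / 1.15a)."""
    if not agreed:                        # D2: no agreement on criticality
        return "Escalate criticality decision"
    if is_critical or standard_required:  # D3 / D4
        return "1.5 Design Metadata"
    return "1.10a Define DSA / SLA, then 1.15a Provision Data"
```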
Level 3 Stakeholder Functional Roles and Responsibilities
The Level 3 processes are presented at the Functional Role level with alignment back to DCAM Framework Capabilities and Sub-capabilities.
The chart below details Data Domain Management functional roles and responsibilities aligned to the Data Management Functional Construct. These functional roles apply to all of the Level 3 processes detailed in the best practice article.
| Functional Role | Detailed Description |
| --- | --- |
| Business Data Management – Producer | A process, application or stakeholder that provisions data to one or more Data Consumers |
| Business Data Management – Consumer | A process, application or stakeholder that receives or uses data from a Data Producer. |
| Data Architecture | The function that defines and implements the data content strategy for a given subset of data. |
| Technology Delivery – Producer | The function that designs, builds, and runs the technical infrastructure supporting the Data Producer. |
| Technology Delivery – Consumer | The function that designs, builds, and runs the technical infrastructure supporting the Data Consumer. |
Level 3 1.1: Define Requirements for Data
Summary
Within the Data Domain Management process, the activity of defining requirements for data is completed by the Data Consumer.
Process Flow

Process Details
| Task ID | Functional Role | Detailed Description |
| --- | --- | --- |
| 1.1.1 | Business DM – Data Consumer | Identify Target Business Element (BE) ● Based on the needs of the business process define the requirements for data |
| 1.1.2 | Business DM – Data Consumer | Initiate Business Element Request Form ● Record all known requirements in the form |
| Tool | Business Element Request Form ● The form includes a standard set of required and optional (if known) attributes necessary to accurately communicate the request to the Data Producer | |
| 1.1.3 | Business DM – Data Consumer | Review Enterprise Data Inventory ● Search the repository for a Business Element record that aligns to the defined requirements for data |
| D1 | Business DM – Data Consumer | Is the Business Element in Inventory? ● If yes, move to 1.1.5 to complete the BE request form. ● If no, move to 1.1.4 to identify the prospective data domain. |
| 1.1.4 | Business DM – Data Consumer | Identify Prospective Data Domain ● Based on the Data Consumer's understanding of the data, select the most likely data domain |
| 1.1.5 | Business DM – Data Consumer | Complete Business Element Request Form ● If found in inventory, cite the BE ID and identify any requirement discrepancies in the Enterprise Data Inventory ● If not found in inventory, complete all required items and those optional items that are known |
| Com1 | Business DM – Data Consumer | Deliver Data Consumer BE Request ● The Data Consumer delivers the request to the Data Producer of the prospective data domain |
Considerations
- When the target data are consumed by more than one domain some organizations support the definition of requirements for data through a centralized center of excellence.
Level 3 1.2 Validate Data in Scope
Summary
Within the Data Domain Management process, the activity of validating that the data requested by the Data Consumer are in scope is completed by the Data Producer.
Process Flow

Process Details
| Task ID | Functional Role | Detailed Description |
| --- | --- | --- |
| Com1 | Business DM – Data Producer | Receive Data Consumer BE Request ● The Data Consumer delivers the request to the Data Producer of the prospective data domain |
| 1.2.1 | Business DM – Data Producer | Validate Data Owner ● Using the request form information, investigate whether the data are owned by the receiving Data Producer |
| D1 | Business DM – Data Producer | Is Data Owner correct? ● If no, move to D2 to decide if the non-owned data are in scope ● If yes, move to 1.2.2 to align the DE to the requested BE |
| D2 | Business DM – Data Producer | Is non-owned data in scope (the upstream Data Owner, via DSA, approves pass-through distribution of the non-owned data)? ● If yes, move to 1.2.2 to align the DE to the requested BE. ● If no, move to Com2 to communicate that the data are out of scope. |
| Com2 | Business DM – Data Producer | Communicate Out-of-Scope ● Data Producer communicates to the Data Consumer that the requested BE is not in scope for the Producer's domain (the Data Producer should share any suspected data domains, if known) |
| 1.2.2 | Business DM – Data Producer | Align DE to BE ● Based on the requirements for the BE identify all the DEs that align with the requirements |
| D3 | Business DM – Data Producer | Is data already sourced (available in an Authoritative Provisioning Point)? ● If yes, move to Process 1.4 Negotiate Criticality. ● If no, move to 1.2.3 to identify the DE System of Record(s) (SORs). |
| 1.2.3 | Business DM – Data Producer | Identify SOR ● Based on the DE to BE alignment identify the SOR(s) |
| Com3 | Business DM – Data Producer | Data Sourcing Request ● Request the DE(s) to be sourced by the Technology Delivery Producer |
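The scope-validation gates above (D1–D3) reduce to a short routing sketch. The function name and return labels below are illustrative assumptions; the alignment step (1.2.2) is folded into the routing for brevity.

```python
def scope_routing(owned: bool, pass_through_approved: bool,
                  already_sourced: bool) -> str:
    """Illustrative routing for gates D1-D3 of Process 1.2: data neither
    owned nor approved for pass-through is out of scope; data already
    sourced moves on to criticality negotiation; otherwise the System
    of Record must be identified."""
    if not owned and not pass_through_approved:  # D1 / D2
        return "Com2: communicate out-of-scope"
    if already_sourced:                          # D3
        return "1.4 Negotiate Criticality"
    return "1.2.3 Identify SOR"
```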
Level 3 1.4 Negotiate Criticality
- Data Producer reviews Data Consumer’s proposed critical designation based on analysis of the material impact to the business process of poor quality data.
- Critical or Non-critical agreement is reached or disagreement is escalated.
Summary
Within the Data Domain Management process the activity of negotiating criticality is completed by the Data Producer with the Data Consumer.
Process Flow

Process Details
| Task ID | Functional Role | Detailed Description |
| --- | --- | --- |
| Com1 | | Review Data Consumer BE Request Form |
| 1.4.1 | Business DM – Data Producer | Review Criticality Analysis ● Data Producer reviews Data Consumer’s proposed critical designation based on analysis of the material impact on the business process of poor quality data. |
| 1.4.2 | Business DM – Data Producer | Discuss Criticality ● Data Producer discusses with Data Consumer rationale for material impact on the Consumer’s business process of poor quality data |
| D1 | Business DM – Data Producer | Agreement on criticality? ● If no, move to 1.4.3 to escalate the disagreement on criticality. ● If yes, move to D2 to determine whether the data are critical. |
| 1.4.3 | Business DM – Data Producer | Escalate Disagreement ● Escalate disagreement on criticality to appropriate Governance body for resolution |
| 1.4.5a | Data Governance | Make Data or Data Management Decision ● Appropriate Governance body reviews escalated disagreement and resolves critical designation |
| D2 | | Is data critical? ● If yes, move to 1.4.4 to obtain approval of the critical designation. ● If no, move to D3 to determine if standardized data are required. |
| 1.4.4 | Business DM – Data Producer | Approval of Critical Designation ● The Data Producer governance body must approve criticality designation |
| 1.4.5b | Data Governance | Make Data or Data Management Decision ● The appropriate governance body decides on approval of the critical designation |
| D3 | | Critical designation approved? ● If yes, move to 1.5 Design Metadata. ● If no, move to D4 Standard Data Required. |
| D4 | | Standard data required (a standard value is required across the data set)? ● If yes, move to 1.5 Design Metadata. ● If no, move to 1.10 Define DSA / SLA. |
Level 3 1.10 Complete DSA/SLA
Summary
Within the Data Domain Management process, the activity of completing the Data Sharing Agreement is conducted by the Business Data Management-Data Producer with the Business Data Management-Data Consumer. Similarly, completing the Service Level Agreement is conducted by the Technology Delivery-Data Producer with the Technology Delivery-Data Consumer.
Considerations
- The Data Sharing Agreement is a Domain-to-Domain agreement capturing the business parameters defining the consumer requirements for data and the producer constraints on the use of the data.
- There is the possibility of business parameters at three levels:
- Domain
- Application
- Element – Data Sets
- Evaluate the maintenance of the DSA:
- Frequency of review
- Controls and compliance
- The Service Level Agreement is an application-to-application agreement capturing the technical parameters defining the consumer technical requirements for data and the producer technical constraints on the availability of the data.
- What are the implications to data that is not designated as critical? It may be necessary to establish minimum requirements for:
- Metadata
- Use control
- Quality review and threshold
Process Flow

Process Details
| Task ID | Functional Role | Detailed Description |
| --- | --- | --- |
| Com1 | | Review Data Consumer BE Request Form |
| 1.10.1 | Business DM – Data Producer | Confirm Use & Define Use Constraint ● Review the consumer defined use in the BE Request Form ● Define the use constraints on the data |
| 1.10.2 | Business DM-Data Consumer | Propose Quality Measures for Data Elements ● Based on consumer business process, define measurements for data quality |
| 1.10.3 | Business DM – Data Producer | Confirm Quality Measures ● Evaluate proposed measurements against current measurements and confirm agreed upon measures |
| 1.10.4 | Business DM – Data Producer | Propose Quality Threshold ● Evaluating current data quality and cost to enhance data quality, propose the threshold for quality |
| 1.10.5 | Business DM-Data Consumer | Confirm Quality Threshold ● Based on the cost of poor quality to the consumer business process, confirm an acceptable threshold for quality |
| 1.10.6 | Business DM – Data Producer | Create DSA Document ● Create or modify existing DSA document to include all data shared between the producer and consumer data domains |
| 1.10.7 | Business DM-Data Consumer | Confirm DSA Document ● Review and confirm the completeness of the DSA document |
| 1.10.8 | Technology Delivery-Data Consumer | Propose SLA Requirements ● Based on the requirements of the consumer business process and the constraints of the technical infrastructure, define the application-to-application parameters for data consumption |
| 1.10.9 | Technology Delivery-Data Producer | Confirm SLA Terms ● Review and confirm the ability to perform according to the defined parameters |
| 1.10.10 | Technology Delivery-Data Producer | Create SLA Document ● Create or modify existing SLA document to include data elements and the parameters for consumption |
| 1.10.11 | Technology Delivery-Data Producer | Review SLA Document ● Review and confirm the completeness of the SLA document |
| D1 | Technology Delivery-Data Consumer | SLA Approved? ● If yes, move to D2 for Technology Delivery-Data Producer approval of the SLA ● If no, move to 1.10.9 to confirm the SLA terms |
| D2 | Technology Delivery-Data Producer | SLA Approved? ● If yes, move to D3 for Business DM-Data Producer approval of the DSA and SLA ● If no, move to 1.10.9 to confirm the SLA terms |
| D3 | Business DM-Data Producer | DSA /SLA Approved? ● If yes, move to D4 for Business DM-Data Consumer approval of the DSA and SLA ● If no, move to 1.10.1 to confirm use and define use constraints |
| D4 | Business DM-Data Consumer | DSA /SLA Approved? ● If yes, move to 1.10.12 to obtain appropriate governance body approval of the DSA and SLA ● If no, move to 1.10.1 to confirm use and define use constraints |
| 1.10.12 | Data Governance | Make Data or Data Management Decision ● The appropriate governance body decides on approval of the DSA and SLA |
| D5 | Data Governance | DSA / SLA Approved? ● If yes, move to 1.10.13 to record documentation parameters in the metadata repository ● If no (DSA not approved), move to 1.10.1 to confirm use and define use constraints ● If no (SLA not approved), move to 1.10.9 to confirm SLA terms |
| 1.10.13 | Business DM-Data Producer | Record in Metadata Repository ● Record DSA and SLA documentation parameters in the metadata repository (critical designation, use constraints, consuming domain, quality threshold, DSA and SLA agreement IDs, etc.) |
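Task 1.10.13 lists the parameters recorded in the metadata repository. One possible record shape is sketched below; the class and field names are illustrative assumptions, not a standard schema.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DataSharingRecord:
    """Illustrative metadata-repository entry for Task 1.10.13.
    Field names are assumptions, not part of the DCAM standard."""
    business_element_id: str
    critical_designation: bool
    use_constraints: List[str]
    consuming_domain: str
    quality_threshold: float   # e.g. minimum % of records passing DQ rules
    dsa_agreement_id: str
    sla_agreement_id: str

# Hypothetical example entry for a critical element shared with one domain.
record = DataSharingRecord(
    business_element_id="BE-001",
    critical_designation=True,
    use_constraints=["internal use only"],
    consuming_domain="Finance",
    quality_threshold=99.5,
    dsa_agreement_id="DSA-42",
    sla_agreement_id="SLA-42",
)
```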
Appendix
About the Work Group
The Data Domain Management process was developed within the Work Group chartered to define the process for prioritizing data based on criticality. The best practice developed by the Work Group was originally published in November 2018 and is titled: Prioritizing Data Based on Criticality: Critical Data Elements (CDEs) in Context.
The project objective was to create an agreed-upon understanding of the purpose and definition of a CDE. Then, based on that purpose and definition, develop a best practice process, procedures, and tools for the identification of CDEs and for managing the implications of criticality. The execution of the process, procedures, and tools will be aligned with the DCAM Framework and the Data Management capabilities it defines.
Work Group Members – organization affiliation as of November 2018
Arzaga, Raymund – Scotiabank
Atkin, Mike – EDMC
Bala, Sathya – Deutsche Bank
Bersie, Bret* – US Bank
Bland, Karen* – Moody’s Corporation
Brophy, Doris – Societe Generale
Deligiannis, Greg – S&P Global Ratings
Dewsbury, Jeff – DTCC
Dimitrion, Genevey – State Street
Doyle, Martin* – DQ Global
Farenci, Susan – MUFG Union Bank
Finnen, Michael – Mitsubishi UFJ Financial Group
Fruhstuck, Mary – BNY Mellon | Pershing
Giardin, Christopher – IBM Hybrid Cloud
Gordon, Andrew – Deutsche Bank
Hawkins, Matthew* – Goldman Sachs
Isaac, Gareth* – Ortecha
Jeffries, Denise
Keslick, Rob – BMO
Klaentschi, Kathryn
Lawson, Andrew – Brickendon
Liu, Irene – PWC
McAdams, Curtis
McQueen, Mark* – EDMC / Ortecha
Nham, Annie – Macquarie Group Limited
Pandya, Hiten* – HSBC Bank
Robeen, Erica – Mastercard
Rolles, Daniel – EXL Service Holdings, Inc.
* Data Architecture Subgroup Member
About the Authors
Gareth Isaac is a Partner in Ortecha, an EDM Council DCAM Authorized Partner data consultancy. He is a professional data practitioner who works with stakeholders – both leadership and subject matter experts – to understand the complex challenges involved in improving processes and data throughout the end-to-end information lifecycle. Gareth has worked with multiple GSIBs over the years to help improve their data management practices, specializing in data lineage, control frameworks, and governance functions.
gareth.isaac@ortecha.com
+44 20 3239 3823
Mark McQueen is the Senior Advisor – DCAM to the EDM Council. He joined the Council in 2016 and now leads the Best Practice Program to develop Data Management industry-standard processes for executing the DCAM Framework. Mark has over 20 years' experience with a Fortune 25 GSIB, where he was the business Data Management Executive for the Wholesale Bank. In addition to Best Practice Program facilitation, he provides training and EDMC Advisory Services related to the adoption and execution of the DCAM Framework in member organizations.
Mark is a DCAM Certified Trainer, Six Sigma Black Belt Certified, and Strategic Foresight accredited – University of Houston.
Mark is a Partner in Ortecha, an EDM Council DCAM Authorized Partner data consultancy.
mmcqueen@edmcouncil.org
+1 615.308.6465
Revision History
| Date | Author | Description |
| --- | --- | --- |
| November 2018 | Gareth Isaac; Mark McQueen | Initial Publication |
| March 2020 | Mark McQueen | Knowledge Portal Release; Broken into a Separate Article from Prioritizing Data Based on Criticality: CDEs in Context |