Exam4Training

DAMA CDMP-RMD Reference And Master Data Management Exam Online Training

Question #1

The following is a technique that you can find useful when implementing your Reference and Master Data program:

  • A . Business key cross references
  • B . Root Cause Analysis
  • C . Process Management
  • D . None of the answers is correct
  • E . Extract Transformation Load (ETL)

Correct Answer: A

Explanation:

When implementing a Reference and Master Data Management (RMDM) program, it is crucial to utilize techniques that ensure consistency, accuracy, and reliability of data across various systems.

Business key cross-references is one such technique. This technique involves creating a mapping between different identifiers (keys) used across systems to represent the same business entity. This mapping ensures that data can be accurately and consistently referenced, integrated, and analyzed across different systems.
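As a minimal sketch (system names, keys, and identifiers here are invented for illustration), a business key cross-reference can be modeled as a lookup from each system's local key to a shared master identifier:

```python
# Hypothetical business key cross-reference: each source system uses its
# own key for the same customer, and the cross-reference maps every
# (system, local_key) pair to a single enterprise-wide master identifier.

xref = {
    ("CRM", "C-1001"): "MDM-42",
    ("ERP", "88673"): "MDM-42",
    ("BILLING", "ACCT-7"): "MDM-42",
    ("CRM", "C-2002"): "MDM-43",
}

def master_id(system, local_key):
    """Resolve a system-specific key to its master identifier, if known."""
    return xref.get((system, local_key))

# Records from different systems resolve to the same business entity:
assert master_id("CRM", "C-1001") == master_id("ERP", "88673")
```

With such a mapping in place, data about the same entity can be joined and analyzed consistently even though each system keeps its own identifiers.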

Reference: DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 11: Reference and Master Data Management.

"Master Data Management and Data Governance" by Alex Berson and Larry Dubov, which emphasizes the importance of business key cross-referencing in MDM.

Question #2

Which of the following is NOT a characteristic of a deterministic matching algorithm?

  • A . Is better suited when there is no great consequence to an error in matching
  • B . Is not highly dependent on the quality of the data being matched
  • C . Has a discrete all or nothing outcome
  • D . Matches exact character to character of one or more fields
  • E . All identifiers being matched have equal weight

Correct Answer: B

Explanation:

Deterministic matching algorithms rely on exact matches between data fields to determine if records are the same. These algorithms require high-quality data because any discrepancy, such as typographical errors or variations in data entry, can prevent a match.

Characteristics of deterministic matching:

It has a discrete all or nothing outcome (C).

It matches exact character to character of one or more fields (D).

All identifiers being matched have equal weight (E).

Since deterministic matching is highly dependent on the quality of the data being matched, statement B does not describe it, making B the correct answer.

Reference: DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 11: Reference and Master Data Management.

"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.

Question #3

Within the Corporate Information Factory, what data is used to understand transactions?

  • A . Master Data and Unstructured Data
  • B . Internal Data, Physical Schemas
  • C . Master Data, Reference Data, and External Data
  • D . Reference Data and Vendor Data
  • E . Security Data and Master Data

Correct Answer: C

Explanation:

In the context of the Corporate Information Factory, understanding transactions involves integrating various types of data to get a comprehensive view. Master Data (core business entities), Reference Data (standardized information), and External Data (information sourced from outside the organization) are essential for providing context and enriching transactional data.

Reference: DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 3: Data Architecture and Chapter 11: Reference and Master Data Management.

"Building the Data Warehouse" by W.H. Inmon, which introduces the Corporate Information Factory concept.

Question #4

For MDM, what is meant by a classification scheme?

  • A . Codes that represent a controlled set of values
  • B . A vocabulary view covering a limited range of topics
  • C . Descriptive language used to control objects
  • D . A way of classifying unstructured data

Correct Answer: A

Explanation:

In Master Data Management (MDM), a classification scheme refers to a structured way of organizing data by using codes that represent a controlled set of values. These codes help in categorizing and standardizing data, making it easier to manage, search, and analyze.

Reference: DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 11: Reference and Master Data Management.

"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.

Question #5

Information Governance is a concept that covers the ‘what’, ‘how’, and ‘why’ pertaining to the data assets of an organization.

The ‘what’, ‘how’, and ‘why’ are respectively handled by the following functional areas:

  • A . Data Management, Information Technology, and Compliance
  • B . Customer Experience, Information Security, and Data Governance
  • C . Data Governance, Information Technology, and Customer Experience
  • D . Data Governance, Information Security, and Compliance
  • E . Data Management, Information Security, and Customer Experience

Correct Answer: D

Explanation:

Information Governance involves managing and controlling the data assets of an organization, addressing the ‘what’, ‘how’, and ‘why’.

‘What’ pertains to Data Governance, which defines policies and procedures for data management.

‘How’ relates to Information Security, ensuring that data is protected and secure.

‘Why’ is about Compliance, ensuring that data management practices meet legal and regulatory requirements.

Reference: DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 1: Data Governance.

"Information Governance: Concepts, Strategies, and Best Practices" by Robert F. Smallwood.

Question #6

What is a registry as it applies to Master Data?

  • A . An index that points to Master Data in the various systems of record
  • B . Any data available during record creation
  • C . Reconciled versions of an organization’s systems
  • D . A starting point for matching and linking new records
  • E . A system to identify how data is used for transactions and analytics

Correct Answer: A

Explanation:

A registry in the context of Master Data Management (MDM) is a centralized index that maintains pointers to master data located in various systems of record. This type of architecture is commonly referred to as a "registry" model and allows organizations to create a unified view of their master data without consolidating the actual data into a single repository. The registry acts as a directory, providing metadata and linkage information to the actual data sources.

Reference: DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management.

"Master Data Management: Creating a Single Source of Truth" by David Loshin.

Question #7

The concept of tracking the number of MDM subject areas and source system attributes is referred to as:

  • A . Publish and Subscribe
  • B . Hub and Spoke
  • C . Mapping and Integration
  • D . Subject Area and Attribute Scope and Coverage

Correct Answer: D

Explanation:

Tracking the number of MDM subject areas and source system attributes refers to defining the scope and coverage of the subject areas and attributes involved in an MDM initiative. This process includes identifying all the data entities (subject areas) and the specific attributes (data elements) within those entities that need to be managed across the organization. By establishing a clear scope and coverage, organizations can ensure that all relevant data is accounted for and appropriately managed.

Reference: DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management.

"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.

Question #8

All of the following methods are a means to protect and secure master data in a production environment except for which of the following?

  • A . Encryption ciphers
  • B . Static masking
  • C . Trust Model Technologies
  • D . Usage Agreements
  • E . Dynamic Masking

Correct Answer: D

Explanation:

Protecting and securing master data in a production environment can be achieved through various methods. Encryption ciphers, static masking, trust model technologies, and dynamic masking are all techniques used to safeguard data. However, usage agreements, while important for data governance and legal compliance, are not a technical method for securing data in the same way that the other options are. Usage agreements define the terms under which data can be accessed and used, but they do not directly protect the data itself.
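As a minimal sketch contrasting two of the masking techniques named above (field names and role labels are invented for illustration): static masking persistently rewrites the stored value, while dynamic masking leaves the stored value intact and masks it only at read time based on who is asking.

```python
# Hypothetical contrast between static and dynamic masking of a record.

def static_mask(record):
    """Persistently replace a sensitive field (e.g. when building a test copy)."""
    return {**record, "ssn": "XXX-XX-" + record["ssn"][-4:]}

def dynamic_mask(record, role):
    """Mask on read: only privileged roles see the real value."""
    if role == "data_steward":
        return record
    return {**record, "ssn": "XXX-XX-" + record["ssn"][-4:]}

rec = {"name": "Ada", "ssn": "123-45-6789"}

assert static_mask(rec)["ssn"] == "XXX-XX-6789"          # stored copy is altered
assert dynamic_mask(rec, "analyst")["ssn"] == "XXX-XX-6789"
assert dynamic_mask(rec, "data_steward")["ssn"] == "123-45-6789"
```

A usage agreement, by contrast, is a document governing access; nothing in the data layer enforces it, which is why it is not a technical control like the functions above.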

Reference: DAMA-DMBOK2 Guide: Chapter 11 – Data Security Management.

"Data Masking: A Key Component of a Secure Data Management Strategy" by Anjali Kaushik.

Question #9

Managing master data elements can be performed at which of the following points?

  • A . Third-party Provider (e.g. D&B)
  • B . Enterprise
  • C . Application Suite (e.g. ERP)
  • D . All Answers are correct

Correct Answer: D

Explanation:

Managing master data elements can be performed at multiple levels within an organization. This includes third-party providers such as Dun & Bradstreet (D&B) which can supply enriched and standardized master data. At the enterprise level, organizations manage master data centrally to ensure consistency and quality across all systems and processes. Within application suites such as ERP (Enterprise Resource Planning) systems, master data management ensures that data is consistent and accurate within and across different applications. Therefore, master data elements can be managed at all these points.

Reference: DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management.

"The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling" by Ralph Kimball.

Question #10

Which of the following is a method of deterministic matching?

  • A . Sorted Neighborhood
  • B . Regional Frequency
  • C . Editing Distance
  • D . Phonetic
  • E . Exact string match

Correct Answer: E

Explanation:

Deterministic matching is a method of record linkage that relies on exact matching criteria. This means that records are considered a match if certain key fields (e.g., name, Social Security Number) have exactly the same values. Exact string match is a straightforward example of deterministic matching, where the strings in specific fields must be identical for a match to be declared. Other methods like sorted neighborhood, regional frequency, editing distance, and phonetic matching are probabilistic or heuristic approaches that allow for some degree of variation or error in the data.
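As a minimal sketch of deterministic matching (field names and records are invented for illustration): normalize the key fields, then require character-for-character equality. Any typo beyond simple normalization prevents a match, which is why this approach depends on high-quality data and has an all-or-nothing outcome.

```python
# Hypothetical deterministic (exact string) match over a set of key fields.

def exact_match(a, b, fields=("last_name", "dob")):
    """Records match only if every key field is identical after normalization."""
    return all(a[f].strip().lower() == b[f].strip().lower() for f in fields)

r1 = {"last_name": "Smith", "dob": "1980-01-02"}
r2 = {"last_name": "smith ", "dob": "1980-01-02"}
r3 = {"last_name": "Smyth", "dob": "1980-01-02"}

assert exact_match(r1, r2)       # normalization handles casing and whitespace
assert not exact_match(r1, r3)   # one character off: no match (all or nothing)
```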

Reference: DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management.

"Entity Resolution and Information Quality" by John R. Talburt.

Question #11

Master Data is similar to a physical product produced and sold by a company except for which of the following characteristics?

  • A . Unavailability may impact the business
  • B . Must fit the consumers’ required use
  • C . Need for information about its characteristics
  • D . Depletes when pulled from inventory
  • E . Has a useful life span

Correct Answer: D

Explanation:

Master Data, similar to a physical product, must meet certain requirements such as fitting consumers’ needs, needing information about its characteristics, impacting business when unavailable, and having a useful lifespan. However, unlike physical products, Master Data does not deplete when pulled from inventory. Master Data remains available for use even after being accessed multiple times, as it is digital information that can be replicated and shared without loss.

Reference: DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 11: Reference and Master Data Management.

"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.

Question #12

Which of the following is a characteristic of a probabilistic matching algorithm?

  • A . A score is assigned based on weight and degree of match
  • B . Each variable to be matched is assigned a weight based on its discriminating power
  • C . Individual attribute matching scores are used to create a match probability percentage
  • D . All answers are correct
  • E . Following the matching process there are typically records requiring manual review and decisioning

Correct Answer: D

Explanation:

Probabilistic matching algorithms assign a score based on the weight and degree of match, assign weights to variables based on their discriminating power, and use individual attribute matching scores to create a match probability percentage. Additionally, after the matching process, some records typically require manual review and decisioning to ensure accuracy. Therefore, all provided characteristics describe the nature of probabilistic matching algorithms accurately.
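The characteristics above can be sketched in a few lines (the weights, thresholds, and fields are invented for illustration): each attribute carries a weight reflecting its discriminating power, per-attribute scores combine into an overall score, and scores in a middle band are routed to manual review.

```python
# Hypothetical probabilistic match: weighted attribute agreement plus
# decision thresholds, with a gray band that goes to manual review.

WEIGHTS = {"ssn": 0.6, "last_name": 0.25, "zip": 0.15}  # illustrative weights

def match_score(a, b):
    """Sum the weights of the attributes on which the two records agree."""
    return sum(w for f, w in WEIGHTS.items() if a.get(f) == b.get(f))

def decision(score, auto_match=0.9, auto_reject=0.4):
    if score >= auto_match:
        return "match"
    if score <= auto_reject:
        return "non-match"
    return "manual review"

a = {"ssn": "123", "last_name": "Smith", "zip": "02139"}
b = {"ssn": "123", "last_name": "Smyth", "zip": "02139"}

# SSN and ZIP agree (0.6 + 0.15 = 0.75): strong but not conclusive,
# so the pair is queued for a data steward to review.
assert decision(match_score(a, b)) == "manual review"
```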

Reference: DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 11: Reference and Master Data Management.

"Master Data Management and Data Governance" by Alex Berson and Larry Dubov

Question #13

The ISO definition of Master Data quality is which of the following?

  • A . Data meets the objective dimensions but not the subjective dimensions
  • B . Data meets all common requirements of all data users
  • C . Data is compliant to all international, country, and industry standards
  • D . The degree to which the data’s characteristics fulfill individual users’ requirements
  • E . Identifies the company that created and owns the Master Data

Correct Answer: D

Explanation:

The ISO definition of Master Data quality focuses on the degree to which the data’s characteristics meet the requirements of individual users. This implies that quality is subjective and depends on whether the data is suitable and adequate for its intended purpose, fulfilling the specific needs of its users.

Reference: ISO 8000-8:2015 – Data quality – Part 8: Information and data quality: Concepts and measuring.

DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 13: Data Quality Management.

Question #14

Where is the most time/energy typically spent for any MDM effort?

  • A . Subscribing content from the MDM environment
  • B . Designing the Enterprise Data Model
  • C . Vetting of business entities and data attributes by Data Governance process
  • D . Publishing content to the MDM environment
  • E . Securing funding for the MDM effort

Correct Answer: C

Explanation:

In any Master Data Management (MDM) effort, the most time and energy are typically spent on vetting business entities and data attributes through the Data Governance process. This step ensures that the data is accurate, consistent, and adheres to defined standards and policies. It involves significant collaboration and decision-making among stakeholders to validate and approve the data elements to be managed.

Reference: DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 11: Reference and Master Data Management.

"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.

Question #15

Can Reference data be used for financial trading?

  • A . No, because customer data is not considered reference data
  • B . No, reference data is static; financial trading data is dynamic
  • C . No, since financial trades change every second they cannot use reference data
  • D . Yes, but only less than 10% can be used
  • E . Yes, an estimated 70% of data being used in financial transactions is reference data

Correct Answer: E

Explanation:

Reference data plays a crucial role in financial trading. It includes data such as financial instrument identifiers, market data, currency codes, and regulatory classifications. Despite the dynamic nature of financial trades, reference data provides the necessary static information to execute and settle transactions. Industry estimates suggest that approximately 70% of the data used in financial transactions is reference data, underscoring its importance in the financial sector.

Reference: DAMA-DMBOK: Data Management Body of Knowledge (2nd Edition), Chapter 11: Reference and Master Data Management.

"The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling" by Ralph Kimball and Margy Ross.

Industry publications and whitepapers on reference data management in financial services.

Question #16

ISO 8000 is a Master Data international standard for what purpose?

  • A . Provides a standard format for defining a model for a data dictionary
  • B . Provides guidance only to the buy side of the supply chain
  • C . To replace the ISO 9000 standard
  • D . Define and measure data quality
  • E . Defines a format to exchange data between parties

Correct Answer: D

Explanation:

ISO 8000 is an international standard focused on data quality and information exchange. Its primary purpose is to define and measure the quality of data, ensuring that it meets the requirements for completeness, accuracy, and consistency. The standard provides guidelines for data quality management, including requirements for data governance, data quality metrics, and procedures for improving data quality over time. ISO 8000 is not meant to replace ISO 9000, which is focused on quality management systems, but to complement it by addressing data quality specifically.

Reference: ISO 8000: Overview and Benefits of ISO 8000, International Organization for Standardization (ISO).

DAMA-DMBOK2 Guide: Chapter 12 – Data Quality Management.

Question #17

An organization chart where a high level manager has department managers with staff and non-managers without staff as direct reports would best be maintained in which of the following?

  • A . A fixed level hierarchy
  • B . A ragged hierarchy
  • C . A reference file
  • D . A taxonomy
  • E . A data dictionary

Correct Answer: B

Explanation:

A ragged hierarchy is an organizational structure where different branches of the hierarchy can have varying levels of depth. This means that not all branches have the same number of levels. In the given scenario, where a high-level manager has department managers with staff and non-managers without staff as direct reports, the hierarchy does not have a uniform depth across all branches. This kind of structure is best represented and maintained as a ragged hierarchy, which allows for flexibility in representing varying levels of managerial relationships and reporting structures.
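The scenario above can be sketched as a small tree (the role names are invented for illustration): because the specialist reports directly to the VP while the analysts sit one level deeper, leaves occur at different depths, which is exactly what makes the hierarchy ragged.

```python
# Hypothetical ragged hierarchy: branches have different depths.

org = {
    "VP": {                      # high-level manager
        "Dept Manager A": {      # manager with staff
            "Analyst 1": {},
            "Analyst 2": {},
        },
        "Specialist": {},        # non-manager reporting directly to the VP
    }
}

def depths(tree, level=1):
    """Yield the level of every leaf node in the hierarchy."""
    for name, children in tree.items():
        if not children:
            yield level
        else:
            yield from depths(children, level + 1)

# Leaves at level 3 (analysts) and level 2 (specialist): the depth varies.
assert sorted(set(depths(org))) == [2, 3]
```

A fixed-level hierarchy, by contrast, would force every leaf to the same level, which cannot represent this reporting structure.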

Reference: DAMA-DMBOK2 Guide: Chapter 7 – Data Architecture Management.

"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.

Question #18

Matching, or candidate identification, is the process called similarity analysis. One approach, called deterministic, relies on:

  • A . Statistical techniques for assessing the probability that any pair of records represents the same entity
  • B . Taking data samples and looking at results for a subset of the records
  • C . Being able to determine the similarity between two data models
  • D . Algorithms for parsing and standardization and on defined patterns and rules for determining similarity
  • E . Finding two references that are linked with a single entity

Correct Answer: D

Explanation:

Deterministic matching, also known as exact matching, relies on predefined rules and algorithms to parse and standardize data, ensuring that records are compared based on exact or standardized values. This approach uses defined patterns and rules to determine whether two records represent the same entity by matching key attributes exactly. Deterministic matching is precise and unambiguous, making it a common approach for high-certainty matching tasks, although it can be less flexible than probabilistic methods that allow for variations in data.

Reference: DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management.

"Entity Resolution and Information Quality" by John R. Talburt.

Question #19

What statement is NOT correct as a key point of an MDM program?

  • A . Must continually prove and promote its accomplishments and benefits
  • B . Program funding requirements typically grow over time as the data inventory grows
  • C . Has an indefinite life span
  • D . Should be in scope for Big Data and IoT initiatives
  • E . Can be effectively created and managed long-term using the same methodology

Correct Answer: E

Explanation:

A key point of a Master Data Management (MDM) program is that it must adapt and evolve over time. The statement that an MDM program "can be effectively created and managed long-term using the same methodology" is not correct. MDM programs must continually evolve to address new data sources, changing business requirements, and advancements in technology. As data inventory grows and the data landscape changes, MDM methodologies and strategies need to be reassessed and updated to remain effective. This adaptability is crucial for maintaining data quality and relevance.

Reference: DAMA-DMBOK2 Guide: Chapter 10 – Master and Reference Data Management.

"Master Data Management and Data Governance" by Alex Berson and Larry Dubov.

Question #20

Taxonomic Reference Data enable which of the following?

  • A . Content classification and multi-level navigation to support Business Intelligence
  • B . The use of canonical models
  • C . Having data models physically instantiated in multiple platforms
  • D . Key processing steps for MDMs
  • E . Source systems named differently than target systems

Correct Answer: A

Explanation:

Taxonomic reference data involves categorizing and organizing data to enable structured access and retrieval. It facilitates content classification, allowing for efficient multi-level navigation, which is essential for Business Intelligence (BI) activities. By organizing data into a taxonomy, users can easily locate and analyze information, supporting better decision-making processes in BI.

Reference: DMBOK (Data Management Body of Knowledge), 2nd Edition, Chapter 11: Reference & Master Data Management.

DAMA-DMBOK Functional Framework, Function: Reference & Master Data Management.

Question #21

The ______ development lifecycle is the best approach to follow for Reference & Master Data efforts.

  • A . System
  • B . Agile
  • C . Project
  • D . Data-centric
  • E . Software

Correct Answer: D

Explanation:

The data-centric development lifecycle is best suited for Reference & Master Data efforts because it prioritizes data integrity, quality, and governance throughout the entire development process. This approach ensures that reference and master data are consistently managed, maintained, and leveraged across various systems and applications.

Reference: DMBOK (Data Management Body of Knowledge), 2nd Edition, Chapter 11: Reference & Master Data Management.

DAMA International Guide to the Data Management Body of Knowledge (DAMA-DMBOK Guide), 2nd Edition.

Question #22

The most difficult MDM style to implement data governance is which of the following?

  • A . Linkage style
  • B . Centralized style
  • C . Coexistence style
  • D . Consolidation style
  • E . Registry style

Correct Answer: E

Explanation:

The registry style is the most difficult MDM style to implement data governance due to its reliance on maintaining a central registry of master data without consolidating data physically. This method makes it challenging to ensure consistent governance across disparate systems since data remains distributed and only loosely connected via the registry.

Reference: DMBOK (Data Management Body of Knowledge), 2nd Edition, Chapter 11: Reference & Master Data Management.

Master Data Management: Creating a Single Source of Truth by David Loshin.

Question #23

The easiest MDM style to implement data governance based on controls that can be placed on persistent data is:

  • A . Registry style
  • B . Consolidation style
  • C . Agile Style
  • D . Centralized style
  • E . Multi-hub

Correct Answer: D

Explanation:

The centralized style is the easiest MDM style to implement data governance because it consolidates all master data into a single central repository. This centralization simplifies the application of data governance controls, ensuring consistent data quality, standards, and policies are applied across the organization.

Reference: DMBOK (Data Management Body of Knowledge), 2nd Edition, Chapter 11: Reference & Master Data Management.

Master Data Management and Data Governance by Alex Berson and Larry Dubov.

Question #24

What type of interactive system model is most often used for Master Data Management?

  • A . Hub-and-Spoke
  • B . Application-coupling
  • C . Publish-Subscribe
  • D . Synchronized interface
  • E . Point-to-point

Correct Answer: A

Explanation:

The hub-and-spoke model is most often used for Master Data Management because it provides a central hub where master data is maintained, while the spokes represent different systems or applications that interact with the hub. This model allows for efficient management, synchronization, and distribution of master data across the enterprise, ensuring consistency and quality.

Reference: DMBOK (Data Management Body of Knowledge), 2nd Edition, Chapter 11: Reference & Master Data Management.

The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling by Ralph Kimball and Margy Ross.

Question #25

When two records are not matched when they should have been matched, this condition is referred to as:

  • A . False Positive
  • B . A True Positive
  • C . A False Negative
  • D . A True Negative
  • E . An anomaly

Correct Answer: C

Explanation:

Definitions and Context:

False Positive: This occurs when a match is incorrectly identified, meaning records are deemed to match when they should not.

True Positive: This is a correct identification of a match, meaning records that should match are correctly identified as matching.

False Negative: This occurs when a match is not identified when it should have been, meaning records that should match are not matched.

True Negative: This is a correct identification of no match, meaning records that should not match are correctly identified as not matching.

Anomaly: This is a generic term that could refer to any deviation from the norm and does not specifically address the context of matching records.

The question asks about a scenario where two records should have matched but did not. This is the classic definition of a False Negative.

In data matching processes, this is a critical error because it means that the system failed to recognize a true match, which can lead to fragmented and inconsistent data.
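The four outcomes defined above can be made concrete with a small worked example (the record pairs are invented for illustration), comparing a matcher's verdicts against the known truth:

```python
# Hypothetical classification of match outcomes against known truth.

pairs = [
    # (truth: same entity?, matcher said match?)
    (True,  True),   # true positive
    (True,  False),  # false negative: should have matched but did not
    (False, True),   # false positive
    (False, False),  # true negative
]

def classify(truth, predicted):
    if truth and predicted:
        return "TP"
    if truth and not predicted:
        return "FN"
    if not truth and predicted:
        return "FP"
    return "TN"

assert [classify(t, p) for t, p in pairs] == ["TP", "FN", "FP", "TN"]
```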

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition, Chapter 11: Master and Reference Data Management.

ISO 8000-2:2012, Data Quality – Part 2: Vocabulary.

Question #26

Bringing order to your Master Data would solve what?

  • A . 20-40% of the need to buy new servers
  • B . Distributing data across the enterprise
  • C . The need for a metadata repository
  • D . 60-80% of the most critical data quality problems
  • E . Provide a place to store technical data elements

Correct Answer: D

Explanation:

Definitions and Context:

Master Data Management (MDM): MDM involves the processes and technologies for ensuring the uniformity, accuracy, stewardship, semantic consistency, and accountability of an organization’s official shared master data assets.

Data Quality Problems: These include issues such as duplicates, incomplete records, inaccurate data, and data inconsistencies.

Bringing order to your master data, through processes like MDM, aims to resolve data quality issues by standardizing, cleaning, and governing data across the organization.

Effective MDM practices can address and mitigate a significant proportion of data quality problems, as much as 60-80%, because master data is foundational and pervasive across various systems and business processes.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition, Chapter 11: Master and Reference Data Management.

Gartner Research, "The Impact of Master Data Management on Data Quality."

Question #27

Reference Data Dictionaries are authoritative listings of:

  • A . Master Data entities
  • B . External sources of data
  • C . Master Data sources
  • D . Master Data systems of record
  • E . Semantic rules

Correct Answer: B

Explanation:

Definitions and Context:

Reference Data Dictionaries: These are authoritative resources that provide standardized definitions and classifications for data elements.

External Sources of Data: These are data sources that come from outside the organization and are used for various analytical and operational purposes.

Reference Data Dictionaries often contain listings and definitions for data that are used across different organizations and systems, ensuring consistency and interoperability.

They typically include external data sources, which need to be standardized and understood in the context of the organization’s own data environment.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition, Chapter 11: Master and Reference Data Management.

ISO/IEC 11179-3:2013, Information technology – Metadata registries (MDR) – Part 3: Registry metamodel and basic attributes.

Question #28

When establishing an MDM, what is the benefit of doing data profiling?

  • A . Analyze data values and see how closely they correspond to a defined set of valid values
  • B . Develop data migration approach
  • C . Manage the design of the data warehouse
  • D . Develop batch data flows for a scheduler
  • E . Develop services to access, transform and deliver data

Correct Answer: A

Explanation:

Definitions and Context:

Data Profiling: This is the process of examining data from existing data sources and collecting statistics or informative summaries about that data.

Master Data Management (MDM): Establishing MDM involves processes and technologies for managing the non-transactional data entities of an organization.

Data profiling helps to understand the data’s characteristics and quality by analyzing data values and comparing them to defined valid values.

This process is crucial in establishing a Master Data Management (MDM) system as it ensures the data adheres to the defined standards and is clean, accurate, and ready for integration into the MDM system.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition, Chapter 11: Master and Reference Data Management.

Kimball, R. & Caserta, J. (2004). The Data Warehouse ETL Toolkit: Practical Techniques for Extracting, Cleaning, Conforming, and Delivering Data.

Question #29

A key capability to quickly onboard new data suppliers and subscribers to a MDM solution is which of the following?

  • A . Data format and transfer flexibility
  • B . Source system conformance to a single standard data input format
  • C . Requiring only delta loads of changed data attributes
  • D . Encrypting all personal information
  • E . Subscriber conformance to a single standard data output format

Correct Answer: A

Explanation:

Definitions and Context:

MDM Solution: This involves tools and processes to manage master data within an organization to ensure a single source of truth.

Onboarding Data Suppliers and Subscribers: This process involves integrating new data sources (suppliers) and distributing data to various applications or users (subscribers).

A key capability for onboarding is the flexibility in data format and transfer methods because different data suppliers may use various formats and protocols.

Ensuring flexibility allows the MDM system to easily adapt to different data sources and meet the needs of diverse data consumers, thereby facilitating quick and efficient onboarding.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition, Chapter 11: Master and Reference Data Management.

The Open Group, "TOGAF Series Guide: The Data Management Capability Assessment Model (DCAM)".
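Format and transfer flexibility can be illustrated with a small sketch. This is a hypothetical example (the feed data and the `ingest` function are invented for illustration): two suppliers deliver the same kind of records in different formats, and a thin normalization layer lets both be onboarded without forcing a single standard input format.

```python
# Illustrative sketch: normalize supplier feeds in different formats
# (CSV and JSON) into one internal record shape.
import csv
import io
import json

def ingest(payload: str, fmt: str) -> list[dict]:
    """Parse a supplier feed into a list of dict records."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

csv_feed = "id,name\n1,Acme\n2,Globex\n"
json_feed = '[{"id": "3", "name": "Initech"}]'

# Both suppliers land in the same internal shape.
records = ingest(csv_feed, "csv") + ingest(json_feed, "json")
print([r["name"] for r in records])
```

Adding a new supplier then means adding one parser branch, not re-engineering every source to a single mandated input format.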

Question #30

Master Data Management resolves uncertainty by clearly stating that:

  • A . To have master data you must focus resources properly
  • B . Some entities (master entities) are more important than others
  • C . Only those entities in the Enterprise Data Model are considered Master Data.
  • D . All entities are equal across an enterprise and need to be managed
  • E . Data elements must be stored in a repository before they are considered master data

Reveal Solution Hide Solution

Correct Answer: B
B

Explanation:

Master Data Management (MDM) aims to establish a single, reliable source of key business data (master data). The correct answer here is B, which states that "Some entities (master entities) are more important than others."

Definition of Master Data: Master data refers to the critical data that is essential for operations in a business, such as customer, product, and supplier information.

Significance in MDM: MDM focuses on identifying and managing these key entities because they are vital for business processes and decision-making. This is why these entities are considered more important than others.

Resolution of Uncertainty: By emphasizing the importance of master entities, MDM reduces ambiguity around which data should be prioritized and managed meticulously, ensuring consistency and accuracy across the enterprise.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition.

CDMP Study Guide

Question #31

MDM Harmonization ensures that the data changes of one application:

  • A . Are synchronized with all other applications that depend on that data
  • B . Are recorded in the repository or data dictionary
  • C . Agree with the overall MDM architecture
  • D . Include changes to the configuration of the database as well as the data
  • E . Have a data steward review the data for quality

Reveal Solution Hide Solution

Correct Answer: A
A

Explanation:

Master Data Management (MDM) Harmonization ensures that the data changes of one application are synchronized with all other applications that depend on that data.

MDM Harmonization Definition: This process involves aligning and reconciling data from different sources to ensure consistency and accuracy across the enterprise.

Synchronization: Ensuring that changes in one application are reflected across all dependent applications prevents data inconsistencies and maintains data integrity.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition.

CDMP Study Guide
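The synchronization idea can be shown with a tiny publish/subscribe sketch. This is a hypothetical design, not a prescribed MDM architecture: the `Hub` class and the `crm`/`billing` stores are invented for illustration of how a change published once reaches every dependent application.

```python
# Illustrative sketch: a minimal hub that propagates a master-data change
# made in one application to every subscribed application.
class Hub:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, store: dict):
        """Register a dependent application's local store."""
        self.subscribers.append(store)

    def publish(self, key: str, value: dict):
        """Push a changed master record to all subscribers."""
        for store in self.subscribers:
            store[key] = dict(value)  # each dependent app receives a copy

hub = Hub()
crm, billing = {}, {}
hub.subscribe(crm)
hub.subscribe(billing)

# A change made once is synchronized everywhere.
hub.publish("cust-42", {"name": "Acme Corp", "tier": "gold"})
print(crm == billing)
```

Real implementations use message queues or change-data-capture rather than in-memory dicts, but the harmonization principle is the same: one change, consistently reflected in all dependent systems.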

Question #32

MDM matching algorithms benefit from all of the following data characteristics except for which of the following?

  • A . Distinctiveness across the population of data
  • B . Low number of common data points
  • C . High level of comparability of the data elements
  • D . Structural heterogeneity of data elements
  • E . High validity of the data

Reveal Solution Hide Solution

Correct Answer: D
D

Explanation:

MDM matching algorithms benefit from various data characteristics but do not benefit from "Structural heterogeneity of data elements."

Matching Algorithms: These are used in MDM to identify and link data records that refer to the same entity across different systems.

Data Characteristics:

Distinctiveness: Helps in accurately matching records.

Common Data Points: Aids in the comparison process.

Comparability: Facilitates effective matching.

Validity: Ensures the data is accurate and reliable.

Structural Heterogeneity: Different structures can complicate the matching process, making it harder to align data.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition.

CDMP Study Guide
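Why structural heterogeneity hurts matching can be seen in a short sketch. The records here are hypothetical: both describe the same person, but one stores a single `name` field while the other splits `first`/`last`. A naive field-by-field comparison fails until the structures are normalized into a comparable shape.

```python
# Illustrative sketch: two records for the same person with
# structurally heterogeneous name fields.
rec_a = {"name": "Jane Smith", "zip": "30301"}
rec_b = {"first": "Jane", "last": "Smith", "zip": "30301"}

def normalize(rec: dict) -> dict:
    """Align heterogeneous structures into one comparable shape."""
    out = {"zip": rec.get("zip", "")}
    if "name" in rec:
        out["name"] = rec["name"].lower()
    else:
        out["name"] = f"{rec.get('first', '')} {rec.get('last', '')}".lower()
    return out

# Field-by-field comparison of the raw records fails: the shapes differ.
naive_match = all(rec_a.get(k) == rec_b.get(k) for k in rec_a)

# After normalization, the distinctive, comparable elements match.
normalized_match = normalize(rec_a) == normalize(rec_b)
print(naive_match, normalized_match)
```

This is the point behind answer D: distinctiveness, comparability, and validity help matching, whereas heterogeneous structures add work before any matching can even begin.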

Question #33

The following are examples of entities for which you need to manage master data:

  • A . Employee Assignment, Employer, Transaction
  • B . Party, Account Balance, Order
  • C . Customer, Transaction, Product
  • D . Customer, Product, Employee
  • E . Product, Order, Inventory

Reveal Solution Hide Solution

Correct Answer: D
D

Explanation:

Entities such as Customer, Product, and Employee are typical examples of master data that need to be managed.

Master Data Entities: These are the key data objects around which business transactions are conducted.

Examples:

Customer: Central to sales and service operations.

Product: Essential for inventory and sales management.

Employee: Critical for HR and payroll systems.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition.

CDMP Study Guide

Question #34

_____ are a primary supplier of master data content to an MDM program.

  • A . Configuration management database (CMDB)
  • B . Systems of record
  • C . Business intelligence applications
  • D . Data catalog
  • E . Point of sale systems

Reveal Solution Hide Solution

Correct Answer: B
B

Explanation:

Systems of record are primary suppliers of master data content to an MDM program.

Systems of Record: These are authoritative data sources that provide consistent and reliable master data.

Role in MDM: They supply accurate and up-to-date master data, ensuring that the MDM system has a solid foundation of information.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition.

CDMP Study Guide

Question #35

What data architecture component is typically used for ingesting raw source system data?

  • A . Integration Zone
  • B . Landing Zone
  • C . Curation Zone
  • D . Presentation Zone
  • E . Analytics Zone

Reveal Solution Hide Solution

Correct Answer: B
B

Explanation:

The Landing Zone is typically used for ingesting raw source system data.

Data Architecture Component: The landing zone is where raw data from various source systems is initially collected and stored.

Function: It acts as a staging area before the data undergoes further processing, transformation, or analysis.

Reference: DAMA-DMBOK: Data Management Body of Knowledge, 2nd Edition.

CDMP Study Guide
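The landing-zone idea can be sketched in a few lines. This is a hypothetical layout, not a prescribed standard: raw extracts are written unmodified to a source- and date-partitioned path, and any cleansing or transformation happens later in downstream zones.

```python
# Illustrative sketch: land a raw source extract as-is in a
# date-partitioned landing zone before any transformation.
from datetime import date
from pathlib import Path
import tempfile

def land(raw_bytes: bytes, source: str, base: Path) -> Path:
    """Store raw data untouched; only the path carries metadata."""
    target = base / "landing" / source / date.today().isoformat() / "extract.csv"
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_bytes(raw_bytes)
    return target

base = Path(tempfile.mkdtemp())
path = land(b"id,name\n1,Acme\n", "crm", base)
print(path.read_bytes() == b"id,name\n1,Acme\n")  # stored byte-for-byte
```

Keeping the landing zone raw preserves an auditable copy of exactly what the source system sent, which is why ingestion targets this zone rather than the curation or presentation zones.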
