Last Updated: May 25, 2021

Challenge

This Best Practice addresses issues related to lookup data standardization in the Master Data Management (MDM) hub and provides guidance on how to approach standardization in various scenarios.

Description

Lookup tables hold the enterprise-standard code values used in master data. During data ingestion, incoming data is validated against this standard lookup data, and source-specific lookup codes/values are converted into enterprise-standard lookup codes/values in MDM. Identifying lookup table requirements and populating valid enterprise-standard lookup data is an important step in MDM domain development. In the MDM data model, the lookup tables are populated first, before any domain base objects.
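As a concrete illustration of that conversion, the Python sketch below maps source-specific gender codes to an enterprise-standard code and validates the result against the standard lookup data. The code values, mapping, and function name are hypothetical examples, not part of the MDM schema or any Informatica API.

```python
# Hypothetical enterprise-standard lookup data, as it might sit in an MDM lookup table.
ENTERPRISE_GENDER_CODES = {"M": "Male", "F": "Female"}

# Hypothetical source-specific codes mapped to the enterprise-standard code.
SOURCE_TO_ENTERPRISE = {
    "M": "M", "MALE": "M", "1": "M",
    "F": "F", "FEMALE": "F", "2": "F",
}

def standardize_gender(source_code: str) -> str:
    """Convert a source-specific gender code to the enterprise-standard code,
    then validate it against the standard lookup data."""
    code = SOURCE_TO_ENTERPRISE.get(source_code.strip().upper())
    if code is None or code not in ENTERPRISE_GENDER_CODES:
        raise ValueError(f"Unknown gender code from source: {source_code!r}")
    return code

print(standardize_gender("female"))  # -> "F"
```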

Informatica accelerators such as Customer 360 and Supplier 360 come with pre-defined lookup tables populated with industry-standard lookup data. Organizations can create additional lookup tables based on their specific MDM data requirements. Lookup table data should be cleansed, de-duplicated, and standardized, which can be done in one of two ways:

  • MDM cleanse functions and match/merge
  • IDQ Reference tables

IDQ reference tables are created and managed with Informatica's Data Quality platform. Reference table data is loaded into the MDM lookup tables through the MDM stage tables. New reference data should originate in, or be loaded into, the IDQ reference tables and then be pushed into the corresponding MDM lookup table. After the initial load, reference data in most organizations changes relatively infrequently, and this is the situation in which the IDQ approach works best.

If the same IDQ platform is used for other data management projects, those projects can reference the IDQ reference tables for the enterprise-standard lookup data, which is a mirror copy of the MDM lookup tables.

Advantages

  • IDQ reference table data can be managed from the IDQ Developer tool and the web-based Analyst tool.
  • The initial reference table data can be created from the unique values found in IDQ data profile results for the source.
  • Incoming source data flowing through IDQ mappings is validated against the reference tables and standardized before it is loaded into MDM base object (BO) tables (KB#566316).

Disadvantages

  • Real-time loading of source lookup data into MDM is not recommended, because the lookup data must first be added to the IDQ reference tables.
  • The MDM UI cannot be used to add or modify lookup data; architecturally, lookup data is managed in the IDQ reference tables, so changing lookup data directly in MDM through the UI is not recommended.

Managing Lookup Data in MDM

For the initial load, lookup data is consolidated from spreadsheets or IDQ profiling results. IDQ mappings load this consolidated lookup data into the MDM lookup tables through the MDM stage tables. New or updated lookup data is then loaded through IDQ or ingested directly through the MDM UI or the Business Entity Services (BES) APIs. De-duplication (match/merge) can be run on the lookup base object tables.
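Where lookup data is inserted in real time through the BES APIs, the call can be scripted against the BES REST interface. The sketch below is a minimal illustration only: the endpoint URL, business entity name, field names, and credentials are all assumptions, not the documented API, so consult the BES documentation for the actual URL pattern and payload structure in your hub.

```python
import requests

# Hypothetical BES REST endpoint for a lookup business entity.
# Host, ORS name, entity name, and payload shape are assumptions,
# not the documented API; check your hub's BES documentation.
BES_URL = "https://mdm-host:8080/cmx/cs/orcl-MDM_ORS/LookupGender"

new_code = {
    "lookupCode": "M",      # enterprise-standard code (assumed field name)
    "lookupValue": "Male",  # display value (assumed field name)
}

response = requests.post(
    BES_URL,
    json=new_code,
    auth=("mdm_user", "mdm_password"),  # basic auth shown only as an example
    timeout=30,
)
response.raise_for_status()
print("Lookup record created:", response.json())
```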

Advantages

  • Leverage MDM cleanse functions and match/merge to standardize and consolidate lookup data.
  • The MDM UI can be provisioned to add/update lookup data directly in MDM.
  • Lookup data can be inserted in real time through the BES APIs.

Disadvantages

  • An MDM UI has to be developed and managed for each lookup table, even though lookup data changes relatively slowly.

One vs Multiple Lookup Tables

In the MDM data model, lookup tables can be designed in two different ways.

  • One lookup table populated with multiple types of lookup data.
  • Separate lookup tables for each type of lookup data.

The single lookup table design has an additional column that defines the type of lookup data, for example:

RowID | Lookup Type | Code  | Value
------|-------------|-------|--------------------------
1     | Gender      | M     | Male
2     | Gender      | F     | Female
3     | Country     | USA   | United States of America
4     | Country     | UK    | United Kingdom
5     | State       | US-CA | USA-California
6     | State       | US-AL | USA-Alabama

Separate lookup tables do not need a type column, since each table itself represents one type of lookup data. The two designs compare as follows:

Single Lookup Table | Multiple Lookup Tables
--------------------|------------------------
A single UI can handle inserts and updates for all types of lookup data. | Multiple UIs have to be developed and provisioned, one for each lookup table.
A single BES API can handle real-time inserts and updates. | Multiple BES APIs have to be managed for real-time inserts and updates.
A filter on the lookup type is required to extract one type of lookup data. | No filter is required, since each table holds only one type of lookup data.
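To make the last row of the comparison concrete, the following Python sketch (with illustrative column names, not the MDM schema) shows that the single-table design needs a filter on the lookup type to extract one type of lookup data, while a dedicated table can be read as-is.

```python
# Illustrative rows from a single shared lookup table (column names assumed).
single_lookup_table = [
    {"rowid": 1, "lookup_type": "Gender",  "code": "M",     "value": "Male"},
    {"rowid": 2, "lookup_type": "Gender",  "code": "F",     "value": "Female"},
    {"rowid": 3, "lookup_type": "Country", "code": "USA",   "value": "United States of America"},
    {"rowid": 5, "lookup_type": "State",   "code": "US-CA", "value": "USA-California"},
]

# Single-table design: a filter on lookup_type is needed to pull one type.
gender_codes = {r["code"]: r["value"]
                for r in single_lookup_table
                if r["lookup_type"] == "Gender"}

# Separate-table design: the dedicated table already holds only one type,
# so it can be consumed without any filter.
gender_lookup_table = [
    {"rowid": 1, "code": "M", "value": "Male"},
    {"rowid": 2, "code": "F", "value": "Female"},
]
gender_codes_alt = {r["code"]: r["value"] for r in gender_lookup_table}

assert gender_codes == gender_codes_alt
```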
