Last Updated Date: May 25, 2021

Challenge

Data Exchange (DX) event modeling is a powerful means of capturing every activity in the system and giving its users all of the information they need. An incorrect or inefficient event model can easily lead to a variety of issues, the most significant being degraded system performance. This document describes best practices to take into consideration during event modeling.

Description

Events represent activity in the system; for example, the activity could be related to file processing or to the system itself. DX events are identified by an “event id,” which is also referred to as the event handle. Every event is attached to a profile, which must be specified when the event is created and cannot be changed afterward.

Parent and Child Events

A DX event has multiple parts that capture data about the activity being tracked: the subject, status, attributes, reconciliation status, and the event logs attached to every event. To create an optimal event model, it is important to understand how DX allows event information to be manipulated and displayed. For example, events can be filtered on attributes but not on event logs. If the operator needs to retrieve an event by a transaction ID or amount, that information should be stored as an attribute of the event associated with processing the document or message.
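
The following sketch is illustrative only: it models the parts of an event described above as a plain data structure, using made-up field and attribute names rather than the actual DX API, to show why values used for retrieval belong in attributes rather than in event logs.

    # Illustrative model only, not the DX API. Field names mirror the event
    # parts described above; attribute names such as "TransactionID" are
    # hypothetical examples.
    from dataclasses import dataclass, field

    @dataclass
    class Event:
        event_id: int                                     # the "event handle"
        profile: str                                      # fixed at creation time
        subject: str = ""
        status: str = "New"
        attributes: dict = field(default_factory=dict)    # filterable and searchable
        logs: list = field(default_factory=list)          # free-form, not searchable

    events = [
        Event(1, "ACME_CLAIMS", subject="claims_20210524.dat",
              attributes={"TransactionID": "TX-1001", "Amount": "1250.00"}),
        Event(2, "ACME_CLAIMS", subject="claims_20210525.dat",
              attributes={"TransactionID": "TX-1002", "Amount": "98.50"}),
    ]

    # Operators can filter on attributes, so the transaction ID must live there,
    # not buried inside a log entry.
    match = [e for e in events if e.attributes.get("TransactionID") == "TX-1002"]
    print(match[0].subject)   # claims_20210525.dat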

DX also supports a hierarchical event structure that allows child events to be created underneath any event (called the parent event). The topmost parent event is also called the root event. From a repository perspective, a parent event is like any other event, with a subject, status, and so on; the difference is that child events are attached to it. From the DX console user interface (UI) perspective, however, only root events are displayed in the event search; child events are not displayed unless the option to display them is turned on, which UI users may prefer not to do.

Event Model

The event model refers to the definition of events in the system for the various flows, along with the profiles, statuses, attributes, event logs, event types, reconciliation, and hierarchy that make up each event definition.

The most logical approach is to associate one event with each file or message that is received, so that all information associated with the file or message is self-contained within the event. Combined with an appropriate retrieval mechanism for the operator, this design is simple and efficient to visualize and troubleshoot. The retrieval mechanism relies on populating the event subject or an event attribute with a unique identifier of the file, such as a transaction ID. It is recommended to talk to the operators and business users in order to understand what will drive retrieval of the file. For example, using entities such as control numbers in X12-based EDI or HIPAA may not be desirable, since control numbers pertain only to the transmission aspect of the messages, and business users may be more interested in starting from the transaction identifier or claim identifier.

There are situations when more than one event needs to be created for processing a file. The most obvious example is when an inbound file is very large and requires splitting for a variety of reasons; furthermore, businesses may have regulatory mandates that call for a mechanism to track the entire inbound file. In these situations, a root event can be created for each large file that is received, and a child event can be created under the root event for each of the splits. All actions and information pertaining to the entire file can be captured at the root event level. Likewise, any actions and information pertaining to the individual split can be captured under the appropriate child event.
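A minimal sketch of this pattern, again using illustrative names rather than the DX API: one root event carries file-level information, one child event per split carries split-level information, and a default search surfaces only the root.

    # Illustrative sketch only, not the DX API: one root event per large inbound
    # file and one child event per split, with information captured at the level
    # it belongs to. File names and attribute names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Event:
        event_id: int
        subject: str
        parent_id: int | None = None                      # None means a root event
        attributes: dict = field(default_factory=dict)

    def model_large_file(file_name, split_record_counts, next_id):
        """Create a root event for the whole file and a child event per split."""
        root = Event(next_id, file_name,
                     attributes={"SplitCount": len(split_record_counts)})   # file-level information
        children = [
            Event(next_id + i + 1, f"{file_name} / split {i + 1}",
                  parent_id=root.event_id,
                  attributes={"RecordCount": count})                        # split-level information
            for i, count in enumerate(split_record_counts)
        ]
        return root, children

    root, children = model_large_file("claims_20210525.edi", [5000, 5000, 2500], next_id=100)
    print(root.attributes)                  # {'SplitCount': 3}
    print([c.subject for c in children])    # ['claims_20210525.edi / split 1', ...]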

Operator Visibility

An important aspect of event modeling is to ensure that operators can quickly and easily access the event they are looking for with minimal UI operations. This means that each event needs to be retrievable through a primary identification mechanism, such as a transaction ID. Any value that will be used for the search needs to be defined as an attribute of the event and populated as part of the process flow.

Uniform Event Model

Data Exchange implementations may handle multiple document types, each with different attributes, statuses, logging information, and so on. However, it is recommended that a uniform event model be used across the different document types as much as possible. The resulting consistency offers clear advantages to two different groups: operators, for whom troubleshooting becomes easier, and developers, who can use one mapping blueprint for all of the document types.
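
One way to picture a uniform model is a shared set of searchable attribute names used for every document type; the names below are hypothetical examples, not DX-defined attributes.

    # Illustrative sketch only: one shared set of attribute names for every
    # document type, so operators search the same fields and developers reuse
    # one blueprint. The attribute names are hypothetical examples.
    COMMON_ATTRIBUTES = ("TransactionID", "PartnerName", "DocumentType", "Direction")

    def build_attributes(document_type, values):
        """Populate the same attribute names regardless of document type."""
        attrs = dict.fromkeys(COMMON_ATTRIBUTES, "")
        attrs.update({k: v for k, v in values.items() if k in COMMON_ATTRIBUTES})
        attrs["DocumentType"] = document_type
        return attrs

    claim = build_attributes("837", {"TransactionID": "TX-1001", "PartnerName": "ACME", "Direction": "IN"})
    remit = build_attributes("835", {"TransactionID": "TX-2044", "PartnerName": "ACME", "Direction": "OUT"})
    print(claim.keys() == remit.keys())   # True: the same searchable fields for both types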

Tying Events Together

It is recommended that all related activity pertaining to a single inbound file be connected if it cannot be adequately represented using a parent-child relationship. Consider this example: DX receives hundreds of messages every hour from various trading partners and needs to respond with an acknowledgement for each message individually. DX also needs to send a batch file with some of the data elements from these messages to a backend system every three hours. This flow can be modeled as follows:

  • One event will be created for each inbound message with the event type called “Transaction.” This event will follow the process flow from reception of the message to sending the acknowledgement back.
  • One event with the event type “Batch” will be created each time the scheduled batch file process runs. This event will capture the batch file itself, along with any statistics about the batch file and the process, such as the number of records or transactions and the transaction totals in the file.

Though these events are beneficial on their own, if the operator is asked to identify what was sent in the batch file for a specific transaction, the information will not be easy to locate. However, if the batch file event is associated with each transaction event by storing the batch event ID as an attribute of the transaction event (as part of the batch file process), the problem is solved.
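
The sketch below illustrates that linkage with made-up names (it is not the DX API): when the scheduled batch process runs, it writes its own event ID into an attribute of every transaction event it includes, so an operator can go from any transaction straight to the batch file that carried it.

    # Illustrative sketch only, not the DX API: the batch process stores the
    # batch event ID as an attribute of each included transaction event.
    # Event types and attribute names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Event:
        event_id: int
        event_type: str                                   # "Transaction" or "Batch"
        attributes: dict = field(default_factory=dict)

    transactions = [Event(1, "Transaction", {"TransactionID": "TX-1001"}),
                    Event(2, "Transaction", {"TransactionID": "TX-1002"})]

    def run_batch(batch_event, included):
        """Link each included transaction event back to the batch event."""
        for event in included:
            event.attributes["BatchEventID"] = batch_event.event_id

    batch = Event(500, "Batch", {"RecordCount": len(transactions)})
    run_batch(batch, transactions)

    # Operator question: which batch file carried TX-1002?
    tx = next(e for e in transactions if e.attributes["TransactionID"] == "TX-1002")
    print(tx.attributes["BatchEventID"])   # 500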

Reconciliation and Multiple Reconciliation Tokens

Reconciliation in DX is used for two primary purposes: to connect a response to a request so that the event representing the request is taken to its logical conclusion, and to implement an SLA mechanism for a process in which reconciliation is initiated as the first step and completed as the last step of the process flow. There may be requirements where multiple reconciliation tokens need to be set up for a single event. It is recommended that these be converted to child events, each with its own token, since having multiple open reconciliation tokens on one event is detrimental to the DX server process.
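
The following sketch, with hypothetical names and outside the DX API, shows the recommended shape: instead of one event holding several open reconciliation tokens, each expected response gets its own child event carrying a single token.

    # Illustrative sketch only, not the DX API: one child event per expected
    # response, each holding a single reconciliation token, instead of several
    # open tokens on the parent event. Token values are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Event:
        event_id: int
        subject: str
        parent_id: int | None = None
        reconciliation_token: str | None = None           # at most one open token per event
        status: str = "Open"

    def expect_responses(parent, tokens, next_id):
        """Create one child event per expected response."""
        return [Event(next_id + i, f"awaiting {token}", parent_id=parent.event_id,
                      reconciliation_token=token)
                for i, token in enumerate(tokens)]

    def reconcile(children, token):
        """Close the single child event whose token matches the incoming response."""
        for child in children:
            if child.reconciliation_token == token:
                child.status = "Complete"

    parent = Event(1, "outbound request bundle")
    children = expect_responses(parent, ["ACK-997", "RESP-271"], next_id=2)
    reconcile(children, "ACK-997")
    print([(c.subject, c.status) for c in children])
    # [('awaiting ACK-997', 'Complete'), ('awaiting RESP-271', 'Open')]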

Too Many Events and Repository Size

The DX event model needs to present as much information as possible to operators and business users, and it is flexible enough to accommodate a great deal of customization. At the same time, it is important not to stretch it too far. Creating an event consumes CPU and memory on the DX server process as well as the database process, and each event requires a minimum of 1 KB of repository space. The following are some examples of unwanted events:

  • Additional events created on the same file to present a different view to the operator.
  • Additional events created in lieu of using appropriate event statuses.
  • An event for an EDI ST segment plus a separate child event for the single transaction it contains. If an ST segment will contain only one transaction, all of the transaction-related information can be captured effectively underneath the ST event itself.

Creating one event for each received or generated file is reasonable. If the event model calls for more than one event per file, examine the model and ensure that the additional events are required and add value.
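
As a rough, hypothetical sizing exercise (the daily file volume and events-per-file figures below are invented; only the 1 KB minimum per event comes from this document), extra events per file compound quickly into repository growth:

    # Hypothetical sizing exercise: the volumes are invented examples; the 1 KB
    # minimum per event is the figure quoted above.
    FILES_PER_DAY = 100_000
    MIN_EVENT_SIZE_KB = 1

    for events_per_file in (1, 3, 5):
        daily_kb = FILES_PER_DAY * events_per_file * MIN_EVENT_SIZE_KB
        print(f"{events_per_file} event(s) per file -> at least {daily_kb / 1024:.0f} MB of repository growth per day")
    # 1 event(s) per file -> at least 98 MB of repository growth per day
    # 3 event(s) per file -> at least 293 MB of repository growth per day
    # 5 event(s) per file -> at least 488 MB of repository growth per day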
