Informatica’s CDI supports high-performance, scalable analytics with advanced transformations; enterprise-grade asset management; and sophisticated data integration capabilities such as mass ingestion, advanced pushdown optimization, and advanced workload orchestration. Improve and simplify your data integration processes with comprehensive, easy-to-use capabilities and designers.
The Advanced level will help you develop expertise in Informatica Cloud Data Integration. It comprises videos, documents, and articles that take you through pushdown optimization with Azure, mapping parameters, advanced transformations, and advanced connectors such as NetSuite, Workday, Hadoop, SAP, and more.
After you successfully complete all three levels of the Informatica CDI product learning path, you will earn an Informatica Badge for Cloud Data Integration. So continue your product learning and earn your badge!
This module covered Pushdown Optimization, Mapping Parameters, Macros, Dynamic Linking, and advanced transformations (Aggregator and Normalizer transformations).
This module also discussed caching, how to read log files for cache information to determine disk space requirements, and how to import and export assets with the CLI/REST API.
On an advanced level, the section walked you through NetSuite Connector, Workday Connector, Hadoop connection, and SAP connection setup.
You have successfully completed all three levels of the IICS product learning path. You have now earned your Badge for IICS!
Click here to learn how to perform full pushdown optimization with the Azure DataWarehouseV2 ODBC connector in CDI.
You can use pushdown optimization to push transformation logic to source databases or target databases for execution. Use pushdown optimization when using database resources can improve task performance.
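The idea behind pushdown optimization can be sketched outside Informatica: instead of pulling every row into the integration layer and transforming it there, the transformation logic is translated into SQL and executed by the database, so only the results move. A minimal, illustrative Python/SQLite sketch of that trade-off (not Informatica's actual engine; table and column names are invented):

```python
import sqlite3

# Toy "source database" in memory.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 100.0), ("east", 50.0), ("west", 75.0)])

# Without pushdown: fetch every row and aggregate in the integration layer.
rows = conn.execute("SELECT region, amount FROM orders").fetchall()
totals = {}
for region, amount in rows:
    totals[region] = totals.get(region, 0.0) + amount

# With pushdown: the same aggregation is expressed as SQL and runs
# inside the database, so only the small result set is transferred.
pushed = dict(conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"))

assert totals == pushed  # identical results, far less data movement
```

The results are identical; the difference is where the work happens and how much data crosses the wire, which is why pushdown helps when the database can do the work efficiently.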
This video explains parameterization and its uses with the help of an example, covers advanced parameterization and REST utilities and tools, and discusses the advantages of the REST API and parameterization.
Click here to understand more about parameters and their types, which include input parameters, In-Out parameters, and parameter files.
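A parameter file supplies parameter values at run time as simple name/value pairs, optionally grouped into sections. The fragment below is a rough illustration only (the parameter names are invented; check the Informatica documentation for the exact directives and section headings your version supports):

```
[Global]
$$src_connection=MySQL_Dev
$$start_date=2023-01-01
```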
Click here to understand more about In-Out parameters, which act as placeholders for values such as a counter or task stage. Data Integration evaluates the parameter at run time based on your configuration.
Click here to know how to use In-Out parameters in the Cloud Mapping Designer.
Click here to know how to use In-Out parameters in Informatica Cloud.
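An In-Out parameter behaves like a variable whose value persists between task runs, for example a high-water mark that drives incremental loads. A hypothetical Python sketch of that pattern (the JSON file here stands in for Data Integration's parameter storage, and the names are invented for illustration):

```python
import json
import os

STATE_FILE = "inout_params.json"  # stand-in for the parameter store

def load_params():
    """Read the persisted parameter, falling back to a default value."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {"last_max_id": 0}  # initial value, like a parameter default

def save_params(params):
    with open(STATE_FILE, "w") as f:
        json.dump(params, f)

def run_task(source_rows):
    """Process only rows newer than the persisted high-water mark."""
    params = load_params()
    new_rows = [r for r in source_rows if r["id"] > params["last_max_id"]]
    if new_rows:
        # Update the parameter at run time, so the next run starts here.
        params["last_max_id"] = max(r["id"] for r in new_rows)
    save_params(params)
    return new_rows

rows = [{"id": 1}, {"id": 2}, {"id": 3}]
first = run_task(rows)    # picks up all three rows
second = run_task(rows)   # nothing new since the last run
os.remove(STATE_FILE)     # clean up the demo state file
```

Running the same task twice processes each row only once, which is the behavior an In-Out parameter gives an incremental mapping.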
Watch this video that gives an overview of Dynamic Linking. It also discusses how to use the Create a New Runtime option.
An Expression Macro lets you create repetitive or complex expressions in mappings without writing each expression by hand.
Watch this video that will take you through the different types of Expression Macros.
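The core idea of an expression macro is expanding one expression template across many fields. A rough Python analogy of that expansion (the `%field%` placeholder mimics, but is not, Informatica's macro syntax, and the field names are invented):

```python
# Fields the macro should expand over, and the expression template.
fields = ["first_name", "last_name", "city"]
template = "LTRIM(RTRIM(%field%))"  # placeholder syntax is illustrative only

# Expanding the macro yields one concrete expression per field --
# exactly what you would otherwise have to write by hand.
expanded = {f: template.replace("%field%", f) for f in fields}

for field, expr in expanded.items():
    print(f"out_{field} = {expr}")
```

Adding a field to the list adds an expression automatically, which is why macros pay off on wide records with repetitive cleansing logic.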
This video will help you understand what an Aggregator Transformation is and how it is created. It will also show how it applies aggregate functions such as sum and average to groups of data. The video also discusses Normalizer transformations and how to create a mapping using a Normalizer transformation.
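As a rough analogy, an Aggregator groups rows and applies functions like SUM or AVG, while a Normalizer turns repeating columns in one row into multiple output rows. A plain-Python sketch of both behaviors (the field names are made up for illustration):

```python
from collections import defaultdict

sales = [
    {"region": "east", "q1": 10, "q2": 20},
    {"region": "east", "q1": 5,  "q2": 5},
    {"region": "west", "q1": 7,  "q2": 3},
]

# Aggregator analogy: group by region and sum a field,
# like SUM(q1) with region as the group-by port.
totals = defaultdict(int)
for row in sales:
    totals[row["region"]] += row["q1"]

# Normalizer analogy: flatten the repeating q1/q2 columns
# into one output row per quarter.
normalized = [
    {"region": row["region"], "quarter": q, "amount": row[q]}
    for row in sales
    for q in ("q1", "q2")
]
```

The Aggregator reduces many rows to one per group; the Normalizer does roughly the opposite, producing several rows from each input row.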
Click here to understand more about the Java transformation and how to extend Data Integration functionality with it.
Click here to understand more about Union transformation, which is an active transformation that you use to merge data from two pipelines into a single pipeline.
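In spirit, a Union transformation behaves like a SQL UNION ALL: pipelines whose fields have been mapped to a common layout are merged into a single downstream flow. A small Python sketch of that merge (the data and field names are invented for illustration):

```python
# Two upstream "pipelines" already mapped to the same field layout.
crm_customers = [{"id": 1, "name": "Ada"}]
erp_customers = [{"id": 2, "name": "Grace"}]

# Union: concatenate the pipelines into one flow.
# Like UNION ALL, duplicates are kept rather than removed.
merged = crm_customers + erp_customers
```

Downstream transformations then see one pipeline, regardless of how many sources fed it.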
How to Read Log Files for Cache Information for Disk Space Requirement
Click here to know how to change the cache directory for IICS Data Integration cache files (for Joiner, Sorter, Aggregator, and Lookup transformations).
Click here to view the Data Integration SAP Connector Guide, which contains information on how to set up and use SAP Connector. The guide also explains how organization administrators and business users can use the SAP Connector to read from and write data to SAP.
Workday V2 Connector Guide
Click here to go through the Data Integration Workday Connector Guide, which contains information on how to set up and use Workday Connector. The guide explains how organization administrators and business users can use Workday Connector to perform operations in multiple Workday modules.
Click here to know how to increase the performance of NetSuite V1 connector tasks by increasing the concurrent threads in IICS.
Click here to view the Data Integration NetSuite Connector Guide, which contains information about how to set up and use NetSuite Connector. The guide explains how organization administrators and business users can use NetSuite connections to securely read data from or write data to NetSuite.
REST V2 Connector and Swagger
The Data Integration REST V2 Connector Guide contains information on how to set up and use REST V2 Connector. The guide also explains how organization administrators and business users can use REST V2 Connector to read data from and write data to a web service that supports REST API.
Click here to view the Data Integration Hadoop Files V2 Connector Guide, which contains information on how to set up and use Hadoop Files V2 Connector. This guide also explains how organization administrators and business users can use Hadoop Files V2 Connector to securely read data from or write data to complex files on the local system or in HDFS.
Learn how to track and monitor the status of all ETL tasks and secure agents in the environment in IICS.
Watch the video to learn how to:
- Monitor user login activity and updates to projects and folders
- Monitor bundle usage in the organization
This video discusses how to create and publish a bundle in Marketplace.
Learn how to copy a pre-existing bundle to development projects or folders to improve productivity and quality of data integration projects.
Learn how to create and run a mass ingestion task to transfer high-volume data to the target system.
Learn how to use Mapping Designer to configure source-to-target mappings and add Expression and Sorter transformations for data cleansing.
This webinar is intended for IICS customers who are using Cloud Data Quality or want to learn more about it.
At the end of this session, customers will understand the dimensions of Data Quality, how to create and manage rule specifications and dictionaries in IICS, and how to use Data Profiling to create and run data profiling tasks.
This session is intended for IICS customers who want to optimize performance of tasks using Partitions and Pushdown optimization.
After this session, customers will be able to create partitions and use pushdown optimization to improve the performance of their tasks.
This session is intended for IICS customers who want to ingest data from on-premises to their cloud environments in a highly scalable manner.
At the end of this session, customers will be able to use mass ingestion tasks to transfer a large number of files of any file type between on-premises and cloud repositories and use the Data Ingestion service to create, run, monitor, and manage tasks and streaming ingestion tasks.