Informatica’s Cloud Data Integration supports high-performance, scalable analytics with advanced transformations; enterprise-grade asset management; and sophisticated data integration capabilities such as mass ingestion, advanced pushdown optimization, and advanced workload orchestration. Improve and simplify your data integration processes with comprehensive, easy-to-use capabilities and designers.
After you complete the Beginner level, the Intermediate level explores How-Tos, whitepapers, and videos covering product assets, components, architecture, mappings, administration, connectors, and much more.
After you successfully complete all three levels of Informatica Cloud Data Integration product learning, you will earn an Informatica Badge for Cloud Data Integration. So continue on your product learning path and earn your badge!
This module covered Java heap increase, proxy setup, certificate import, maxDTMProcesses, Data Integration assets, components, REST APIs, the RunAJobCli utility, and partitioning.
The section also discussed advanced runtime options, the unconnected lookup, the Hierarchy Parser and Hierarchy Builder, Intelligent Structure Discovery, commonly used transformations such as Filter, Expression, Joiner, and Lookup, Salesforce OAuth, externalized connections, and object-level permissions.
You also learned about connectors and related topics: PK Chunking with Salesforce, the Bulk API, Microsoft Dynamics CRM, Amazon Redshift, Web Service Consumer for SOAP, Azure Data Warehouse, ODBC setup on Linux, Azure Data Lake, and much more.
Now move on to the next level and get to know more about the product.
Increase Java Heap Size
In CDI, when you process large data volumes with Java-based connectors, you might encounter errors that can be resolved by increasing the Java heap space of the Secure Agent JVM. Click HOW TO: Increase Java heap size on IICS to learn how to allocate more memory to the JVM for large data processing with certain connectors.
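As a rough illustration only (the property name, value, and navigation path below are assumptions that vary by agent version; the linked article is authoritative), raising the heap typically means adding a `-Xmx` JVM option to the Data Integration Server settings of the Secure Agent:

```text
# Administrator > Runtime Environments > <your agent> > Edit
# System Configuration Details > Data Integration Server > DTM
# Add or edit a JVM option (name and value are illustrative):
JVMOption1 = -Xmx2048m   # allow the DTM JVM up to 2 GB of heap
```

Restart the Secure Agent after the change so the new heap setting takes effect.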
When you install the Informatica Cloud Secure Agent behind a proxy, configure the proxy settings in the Secure Agent before registering it as a new runtime environment. Otherwise, the Process Server will not start: it has no proxy .ini file of its own and depends on the agent for the proxy setting details.
Click HOW TO: Setup Secure Agent when proxy is used to learn more.
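As a loose illustration (the file location and key names here are assumptions; follow the linked article for the authoritative steps), the agent's proxy details end up in a proxy configuration file similar to:

```ini
; Illustrative proxy configuration entries for the Secure Agent.
; Key names and file path are assumptions; confirm them against the
; documentation for your agent version.
InfaAgent.ProxyHost=proxy.example.com
InfaAgent.ProxyPort=8080
InfaAgent.ProxyUser=proxyuser
InfaAgent.ProxyPassword=<encrypted value>
```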
Click HOW TO: Import certificates into Informatica Cloud Secure Agent JRE to learn how to import certificates into Informatica Cloud Secure Agent JRE.
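In outline, the import uses the standard JDK `keytool` command against the agent JRE's `cacerts` trust store; the paths and alias below are placeholders:

```shell
# Import a CA certificate into the Secure Agent JRE trust store.
# Paths and alias are illustrative; "changeit" is the conventional
# default password for a Java cacerts keystore.
cd <Secure_Agent_installation>/jdk/jre/lib/security
keytool -importcert -alias mycorp-ca -file /tmp/mycorp-ca.crt \
        -keystore cacerts -storepass changeit -noprompt
# Restart the Secure Agent so the JVM picks up the new certificate.
```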
ODBC Setup for Linux
Click here to learn how to create a Microsoft SQL Server DSN on Linux for the IICS ODBC connection, and the setup required to create a SQL Server DSN.
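For orientation, a Linux ODBC DSN is defined in `odbc.ini`; the driver path below assumes the Microsoft ODBC Driver 17 for SQL Server and will differ if you use another driver:

```ini
; Illustrative odbc.ini entry for a SQL Server DSN on Linux.
; Driver library path and server details are placeholders.
[SQLServerDSN]
Driver=/opt/microsoft/msodbcsql17/lib64/libmsodbcsql-17.10.so.1.1
Server=sqlserver.example.com,1433
Database=mydb
```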
The Partitioning license enables you to use partitions to optimize performance for mapping tasks.
If a mapping task processes large data sets or includes transformations that perform complicated calculations, the task can take a long time to process. When you use multiple partitions, the mapping task divides data into partitions and processes the partitions concurrently, which can optimize performance.
Click here to understand more about partitioning and its rules and guidelines with examples for transformations.
Click here to understand more about partitioning and its rules and guidelines with examples for mappings.
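As a conceptual sketch only (this is plain Python, not Informatica's partitioning engine), the idea behind partitioned execution can be illustrated like this: split the rows into partitions, run the same transformation on each partition concurrently, and combine the outputs.

```python
from concurrent.futures import ThreadPoolExecutor

def transform(rows):
    # Stand-in for an expensive transformation applied to one partition.
    return [r * 2 for r in rows]

def run_partitioned(data, partitions=4):
    # Divide the rows into roughly equal partitions...
    chunks = [data[i::partitions] for i in range(partitions)]
    # ...process the partitions concurrently...
    with ThreadPoolExecutor(max_workers=partitions) as pool:
        results = pool.map(transform, chunks)
    # ...and combine the partition outputs. Row order can differ from
    # single-partition processing, which is one reason partitioning
    # comes with per-transformation rules and guidelines.
    return [row for part in results for row in part]

print(run_partitioned(list(range(8))))
```

The speedup comes from the concurrent map step; the trade-off, as in the product, is that some transformations constrain how data may be split across partitions.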
Advanced Runtime Options
On the Schedule page of the Synchronization task wizard, you can specify whether to run a synchronization task manually or schedule it to run at a specific time or interval. You can create a schedule or use an existing one. You can also configure email notifications and advanced options for the task on the Schedule page. Click here to learn more.
This video will help you understand the Hierarchy Parser transformation, which enables you to convert hierarchical input to relational output. The video covers important points to consider when creating a mapping with the Hierarchy Parser transformation.
Click here to learn more about the Hierarchy Parser transformation including examples.
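To make the hierarchical-to-relational idea concrete, here is a minimal Python sketch (not the product's implementation; the field names are invented) that flattens a nested order document into relational rows:

```python
import json

def parse_hierarchy(doc):
    """Flatten a hierarchical order document into relational rows,
    one row per line item, repeating the parent fields on each row."""
    rows = []
    for item in doc["items"]:
        rows.append({
            "order_id": doc["order_id"],
            "customer": doc["customer"],
            "sku": item["sku"],
            "qty": item["qty"],
        })
    return rows

doc = json.loads("""
{"order_id": 1, "customer": "Acme",
 "items": [{"sku": "A-1", "qty": 2}, {"sku": "B-9", "qty": 5}]}
""")
print(parse_hierarchy(doc))
```

The Hierarchy Parser transformation does this kind of flattening declaratively, driven by a hierarchical schema rather than hand-written code.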
Click here to see an example of how a Hierarchy Builder is created. If you want to convert relational data to hierarchical data and write it to a target file in a hierarchical format, you need to configure a hierarchical schema that uses a schema file to define the hierarchy of the output data. This example will help you define the schema hierarchy that you want to use.
Click HOW TO: Create a hierarchical schema in IICS to learn more.
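Conversely, the relational-to-hierarchical conversion that the Hierarchy Builder performs can be sketched in plain Python (again illustrative only; the field names are invented):

```python
from itertools import groupby
from operator import itemgetter

def build_hierarchy(rows):
    """Group flat relational rows into hierarchical documents,
    one parent per order_id with a nested list of line items."""
    rows = sorted(rows, key=itemgetter("order_id"))
    docs = []
    for order_id, group in groupby(rows, key=itemgetter("order_id")):
        group = list(group)
        docs.append({
            "order_id": order_id,
            "customer": group[0]["customer"],
            "items": [{"sku": r["sku"], "qty": r["qty"]} for r in group],
        })
    return docs

rows = [
    {"order_id": 1, "customer": "Acme", "sku": "A-1", "qty": 2},
    {"order_id": 1, "customer": "Acme", "sku": "B-9", "qty": 5},
]
print(build_hierarchy(rows))
```

In the product, the hierarchical schema file plays the role that the grouping logic plays here: it defines which relational fields become parent elements and which become nested children.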
Watch the video to learn how to import and export tasks in IICS.
Click here to learn how to export an asset in IICS step by step.
Watch the video to understand some Salesforce-specific options available when creating a task. This video will help you set the Salesforce target batch size, use the Salesforce Bulk API, and use Salesforce outbound messaging to trigger a synchronization task in real time.
This video will help you understand what PK Chunking is, when it can be used (with the help of an example), and what a PK Chunking header is.
Click here to learn how to enable PK Chunking for a Salesforce source object in Data Integration. The PK Chunking option is available in Synchronization tasks, Replication tasks, and Mapping tasks.
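As an illustration of the underlying mechanism, Salesforce's Bulk API accepts a `Sforce-Enable-PKChunking` request header when a job is created, which splits a large extract into chunks by primary key ranges. The instance name, API version, and body below are illustrative placeholders:

```http
POST /services/async/47.0/job HTTP/1.1
Host: yourInstance.salesforce.com
X-SFDC-Session: <session id>
Sforce-Enable-PKChunking: chunkSize=100000
Content-Type: application/json

{"operation": "query", "object": "Account", "contentType": "CSV"}
```

When you enable the PK Chunking option in a task, Data Integration sets this kind of header on your behalf.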
WS Consumer for SOAP
The Cloud Data Integration Web Service Consumer Connector Guide contains information about how to set up and use Web Service Consumer Connector. The guide explains how organization administrators and business users can use Web Service Consumer Connector to read data from and write data to a web service that supports SOAP API.
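For context, a SOAP request is an XML envelope posted to the service endpoint; the operation and namespace below are invented examples, and the real shape of the request comes from the target service's WSDL:

```xml
<!-- Minimal SOAP 1.1 request envelope. The "ex" namespace, the
     GetQuote operation, and its payload are illustrative only. -->
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:ex="http://example.com/stock">
  <soapenv:Header/>
  <soapenv:Body>
    <ex:GetQuote>
      <ex:symbol>INFA</ex:symbol>
    </ex:GetQuote>
  </soapenv:Body>
</soapenv:Envelope>
```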
Watch this demo video to learn how Informatica helps accelerate data migration from on-premises EDW to Azure SQL DW.
Click here to access the Data Integration Microsoft Azure Blob Storage V3 Connector Guide, which contains information on how to set up and use Microsoft Azure Blob Storage V3 Connector. The guide explains how organization administrators and business users can use Microsoft Azure Blob Storage V3 Connector to read data from and write data to Microsoft Azure Blob Storage.
Snowflake Cloud Data Warehouse
Click here to review the Cloud Data Integration Snowflake Cloud Data Warehouse V2 Connector Guide containing information on how to set up and use Snowflake Cloud Data Warehouse V2 Connector. The guide explains how organization administrators and business users can use Snowflake Cloud Data Warehouse V2 Connector to read data from or write data to Snowflake Cloud Data Warehouse.
The Data Integration File Processor Connector Guide contains information on how to set up and use File Processor Connector. The guide explains how organization administrators and business users can use File Processor Connector to transfer files.
We’ve written this workbook to help you accelerate your data-driven digital transformation and guide you in modernizing your data architecture with Microsoft Azure. We will show you how cloud data management can enable the next generation of agile analytics initiatives.
We’ll describe key cloud data management hurdles and how to overcome them to support common usage patterns for cloud data warehousing with Microsoft Azure.
We’ve written this workbook to guide you through the steps to modernize your data warehouse architecture with Amazon Redshift. We will show you how public cloud data management can enable the next generation of agile analytics initiatives. We’ll describe the key cloud data management challenges and how to overcome them in order to support three common usage patterns for cloud data warehousing with Amazon Redshift.
Click here to review the Data Integration Amazon S3 V2 Connector Guide that contains information about how to set up and use Amazon S3 V2 Connector. The guide explains how business users can use Amazon S3 V2 Connector to read or write Avro, JSON, ORC, and Parquet file formats in Amazon S3.
Click here to access the Cloud Data Integration Amazon Redshift V2 Connector Guide, which contains information about how to set up and use Amazon Redshift V2 Connector. The guide explains how business users can use Amazon Redshift V2 Connector to read data from and write data to Amazon Redshift.
Microsoft Dynamics CRM
The Cloud Data Integration Microsoft Dynamics CRM Connector Guide provides information about how to read data from and write data to Microsoft Dynamics CRM. This guide explains how organization administrators can configure the Microsoft Dynamics CRM Connector, and business users can use Microsoft Dynamics CRM Connector to create connections, develop mappings, and run synchronization and mapping tasks. This guide assumes that you have knowledge of Microsoft Dynamics CRM and Data Integration.
See how Informatica Cloud B2B Gateway removes partner onboarding complexity and facilitates collaboration with automation.
This is the second of a three-part demo of data and application integration using intelligent APIs. You reviewed the first part in the Beginner level.
This demo showcases how IT Operations can monitor services and APIs, ensure business processes move forward, and provide support in case of exceptions.
This is the third part of the demo of data and application integration using intelligent APIs. The demo showcases how a developer can use Informatica’s Cloud Application Integration service to build the processes that become APIs.
Here are a few resources to help you use the REST API with IICS.
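As a hedged sketch of calling the IICS v2 REST API from Python (the regional login URL varies by org, and you should confirm the request shape against the REST API reference for your version), a login request can be built with the standard library:

```python
import json
import urllib.request

# Regional login host; this is an assumption -- use the URL for your org.
LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

def build_login_request(username, password, url=LOGIN_URL):
    """Build the v2 login request. The body shape follows the v2 REST
    API convention of an "@type" discriminator; verify it against the
    REST API reference for your IICS version."""
    body = {"@type": "login", "username": username, "password": password}
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Accept": "application/json"},
        method="POST",
    )

# Uncomment to log in. The response carries icSessionId and serverUrl,
# which subsequent v2 API calls use (session id goes in a request header).
# with urllib.request.urlopen(build_login_request("user", "pass")) as resp:
#     session = json.load(resp)
#     print(session["serverUrl"])
```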
Here are a few resources to guide you on how to use the RunAJobCli utility with IICS.
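An invocation typically looks like the sketch below; the flag names and the `MTT` (mapping task) type code are assumptions to verify against the utility's own help output for your agent version:

```shell
# RunAJobCli ships with the Secure Agent; the directory and flags
# shown here are illustrative placeholders.
cd <Secure_Agent_installation>/apps/runAJobCli
./cli.sh runAJobCli -u myuser -p mypassword -t MTT -n MyMappingTask
```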
This video briefly explains what an intelligent structure model is used for. Intelligent Structure Discovery reads a sample file and derives a data model from it to build the intelligent structure. You can combine, collapse, flatten, and exclude nodes.
Click here to learn how to use Intelligent Structure Discovery to create structures. You can also learn more through the examples and the different models discussed in the link.
This video takes you through the new Mass Ingestion task included in Cloud Data Integration. This feature enables you to create mass ingestion tasks that transfer large volumes of data from on-premises flat files to cloud applications such as Amazon Web Services and Amazon Redshift using FTP, SFTP, and FTPS.
You can also use the Mass Ingestion REST API to run Mass Ingestion tasks automatically. This video shows you how to create and run a Mass Ingestion task.
Learn how to configure the Informatica agent runtime so that an application runs on multiple agents.
Automate tasks through job scheduling in IICS.
Learn how to verify and assign the appropriate licenses for each environment, for example, licenses for connectors and export/import.
Learn how to edit organization properties, such as maximum log entries and other organization details as necessary (Organizational Configuration and Setup).
Learn how to use Cloud REST API to automate the execution of task flows.
Informatica Cloud Integration Hub
Playback this technical webinar to learn more about the Informatica Cloud Integration Hub, which enables you to simplify and streamline complex, point-to-point data integrations and govern your integration architecture with a modern pub-sub data integration hub.
Informatica Cloud App/API Integration
Dive deep into the API and application integration capabilities offered by our API platform. You will learn: what APIs are; examples of their use; types of APIs and services; SOA and microservices architectural styles used to build API-based business services and applications; and how Informatica helps you achieve this with its iPaaS services.
This session is intended for IICS customers who want a wizard-based interface that saves time and money by eliminating the complexities of populating Salesforce sandboxes from Production instances, in a way that fully secures sensitive information.
After this session, customers can automate and schedule data pushes to Salesforce sandbox environments.
In this webinar, you will get an overview of how the GitHub version control feature works in IICS.
Watch this webinar to learn about the new capabilities of Cloud Data Integration service as part of Spring 2020 launch and understand how to leverage them in your integration flows.