Informatica Customer Experience Portal
Learning Path: Cloud Data Integration, Intermediate

Informatica’s Cloud Data Integration supports high-performance, scalable analytics with advanced transformations; enterprise-grade asset management; and sophisticated data integration capabilities such as mass ingestion, advanced pushdown optimization, and advanced workload orchestrations. Improve and simplify your data integration processes with comprehensive and easy-to-use capabilities and designers. 

Building on the Beginner level, the Intermediate level explores How-Tos, whitepapers, and videos covering product assets, components, architecture, mappings, administration, connectors, and much more.

After you successfully complete all three levels of Informatica Cloud Data Integration product learning, you will earn an Informatica Badge for Cloud Data Integration. So continue on your product learning path and earn your badge!


This module covered increasing the Java heap size, proxy configuration, certificate import, maxDTMProcesses, Data Integration assets, components, REST APIs, the RunAJobCli utility, and partitioning.

The section also discussed advanced runtime options, the unconnected Lookup, Hierarchy Parser, Hierarchy Builder, Intelligent Structure Discovery, commonly used transformations such as Filter, Expression, Joiner, and Lookup, Salesforce OAuth, externalized connections, and object-level permissions.

You also learned about connectors: PK Chunking with Salesforce, the Bulk API, Microsoft Dynamics CRM, Amazon Redshift, Web Service Consumer for SOAP, Azure Data Warehouse, ODBC setup on Linux, Azure Data Lake, and much more.

Now move on to the next level and get to know more about the product.  

Back to Learning Path
Next Level: Advanced


Increase Java Heap Size

In CDI, when you process large data volumes with Java-based connectors, you might encounter an error that can be resolved by increasing the Java heap space of the Secure Agent JVM. Click HOW TO: Increase Java heap size on IICS to learn how to allocate more memory to the JVM for large data processing with certain connectors.
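As an illustration of where the setting typically lives, the heap size is applied as a JVM option on the agent's Data Integration Server service. The menu path and property name below are assumptions from a typical agent setup; the linked HOW TO is authoritative:

```
Administrator > Runtime Environments > (your agent) > Edit Secure Agent
  Service: Data Integration Server
  Type:    DTM
  Name:    JVMOption1        (any free JVMOption slot)
  Value:   -Xmx2048m         (raises the maximum heap to 2 GB)
```

Restart the Secure Agent after changing the value so the new heap size takes effect.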

Proxy

When installing the Informatica Cloud Secure Agent, if proxy settings are used, it is important to configure them in the Secure Agent before registering it as a new runtime environment. Otherwise, the Process Server will not start: it has no .ini file of its own, so it depends on the agent for the proxy settings.

Click HOW TO: Setup Secure Agent when proxy is used to learn more.

Certificate Import

Click HOW TO: Import certificates into Informatica Cloud Secure Agent JRE to learn how to import certificates into Informatica Cloud Secure Agent JRE.

ODBC Setup for Linux

Click here to learn how to create an MS SQL Server DSN on Linux for the IICS ODBC connection, including the setup required to create the SQL Server DSN.
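For orientation, a SQL Server DSN on Linux usually ends up as an entry in the agent's odbc.ini file. The driver path and connection values below are hypothetical placeholders in DataDirect-style syntax; follow the linked article for the exact driver shipped with your agent:

```
[SQLServer_DSN]
Driver=/opt/infaagent/ODBC/lib/sqlserver.so
Description=MS SQL Server DSN for IICS
HostName=mssql.example.com
PortNumber=1433
Database=SALES
```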

Partitioning 

The Partitioning license enables you to use partitions to optimize the performance of mapping tasks.

If a mapping task processes large data sets or includes transformations that perform complicated calculations, the task can take a long time to process. When you use multiple partitions, the mapping task divides data into partitions and processes the partitions concurrently, which can optimize performance.

Click here to understand more about partitioning and its rules and guidelines with examples for transformations.

Click here to understand more about partitioning and its rules and guidelines with examples for mappings.
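To make the idea concrete, here is a minimal Python sketch (not Informatica's implementation) of key-range partitioning: the key range is split into contiguous partitions, which are then processed concurrently:

```python
from concurrent.futures import ThreadPoolExecutor

def split_key_range(lo, hi, partitions):
    """Divide a numeric key range into contiguous, non-overlapping partitions."""
    step = (hi - lo + partitions) // partitions  # ceiling division
    return [(start, min(start + step - 1, hi))
            for start in range(lo, hi + 1, step)]

def process(partition):
    start, end = partition
    # Placeholder for the real per-partition work (read rows, transform, write).
    return end - start + 1  # number of rows this partition handled

ranges = split_key_range(1, 100, 4)
with ThreadPoolExecutor(max_workers=4) as pool:
    counts = list(pool.map(process, ranges))

print(ranges)        # [(1, 25), (26, 50), (51, 75), (76, 100)]
print(sum(counts))   # 100
```

Each partition covers roughly the same number of rows, which is why partitioning helps most when the per-row work, such as complicated calculations, is expensive.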

Advanced Runtime Options

On the Schedule page of the Synchronization task wizard, you can choose to run a synchronization task manually or schedule it to run at a specific time or interval. You can create a schedule or use an existing one. You can also configure email notifications and advanced options for the task on the Schedule page. Click here to learn more.

Hierarchy Parser

This video will help you understand the Hierarchy Parser transformation, which converts hierarchical input into relational output. The video covers important points to consider while creating a mapping with the Hierarchy Parser transformation.

Click here to learn more about the Hierarchy Parser transformation including examples.
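As a toy analogue of what the Hierarchy Parser does (the field names here are invented for illustration), the snippet below flattens a hierarchical document into relational-style rows, one per repeating child element:

```python
# Toy analogue of a Hierarchy Parser transformation: hierarchical input
# in, flat relational-style rows out. Field names are invented.
order = {
    "order_id": 7,
    "customer": {"name": "Acme"},
    "items": [
        {"sku": "A1", "qty": 2},
        {"sku": "B2", "qty": 1},
    ],
}

def parse_hierarchy(doc):
    """Emit one flat row per repeating child element of the document."""
    return [
        {
            "order_id": doc["order_id"],
            "customer": doc["customer"]["name"],
            "sku": item["sku"],
            "qty": item["qty"],
        }
        for item in doc["items"]
    ]

for row in parse_hierarchy(order):
    print(row)
```

Parent fields are repeated on every child row, which is exactly the shape a relational target expects.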

Hierarchy Builder

If you want to convert relational data to hierarchical data and write it to a target file in a hierarchical format, you need to configure a hierarchical schema that uses a schema file to define the hierarchy of the output data. Click here to see an example of how a Hierarchy Builder is created and how to choose the schema hierarchy you want to use.

Click HOW TO: Create a hierarchical schema in IICS to learn more.
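Conversely, a Hierarchy Builder groups flat rows back into a hierarchy. This Python sketch (again with invented field names, not Informatica's implementation) shows the idea:

```python
# Toy analogue of a Hierarchy Builder transformation: flat rows in,
# one hierarchical document per parent key out. Field names are invented.
rows = [
    {"order_id": 7, "customer": "Acme", "sku": "A1", "qty": 2},
    {"order_id": 7, "customer": "Acme", "sku": "B2", "qty": 1},
]

def build_hierarchy(rows):
    """Nest repeating child rows under a single parent document per key."""
    docs = {}
    for row in rows:
        doc = docs.setdefault(row["order_id"], {
            "order_id": row["order_id"],
            "customer": {"name": row["customer"]},
            "items": [],
        })
        doc["items"].append({"sku": row["sku"], "qty": row["qty"]})
    return list(docs.values())

print(build_hierarchy(rows))
```

The hierarchical schema plays the role of the hard-coded document shape above: it tells the transformation which fields belong to the parent and which repeat as children.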

Watch the video to know how to import and export tasks in IICS.

Click here to learn how to export an asset in IICS step by step.

Watch the video to learn about some Salesforce-specific options available when creating a task. This video will help you set the Salesforce target batch size, use the Salesforce Bulk API, and use Salesforce outbound messaging to trigger a synchronization task in real time.

This video will help you understand what PK Chunking is, when it can be used (illustrated with an example), and what a PK Chunking header is.

Click here to learn how to enable PK Chunking for a Salesforce source object in Data Integration. The PK Chunking option is available in Synchronization tasks, Replication tasks, and Mapping tasks.
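The mechanics can be sketched in a few lines of Python. Note that real Salesforce Ids are 15/18-character alphanumeric strings, so the numeric Ids and the query template here are simplifications for illustration:

```python
# PK chunking replaces one large extract with many small queries, each
# bounded by a contiguous range of primary-key (Id) values.
def pk_chunks(first_id, last_id, chunk_size):
    """Yield (start, end) Id ranges that cover [first_id, last_id]."""
    start = first_id
    while start <= last_id:
        end = min(start + chunk_size - 1, last_id)
        yield start, end
        start = end + 1

# Hypothetical query template; real Salesforce Ids are alphanumeric.
queries = [
    f"SELECT Id, Name FROM Account WHERE Id >= {lo} AND Id <= {hi}"
    for lo, hi in pk_chunks(1, 250_000, 100_000)
]
print(len(queries))   # 3
```

Because each chunk query touches a bounded Id range, the chunks avoid timeouts on very large objects and can be extracted independently.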

WS Consumer for SOAP

The Cloud Data Integration Web Service Consumer Connector Guide contains information about how to set up and use the Web Service Consumer Connector. The guide explains how organization administrators and business users can use the connector to read data from and write data to a web service that supports the SOAP API.

Learn More

Watch this demo video to learn how Informatica helps accelerate data migration from on-premises EDW to Azure SQL DW.

Learn More

Click here to access the Data Integration Microsoft Azure Blob Storage V3 Connector Guide, which contains information about how to set up and use the Microsoft Azure Blob Storage V3 Connector. The guide explains how organization administrators and business users can use the connector to read data from and write data to Microsoft Azure Blob Storage.

Learn More

Snowflake Cloud Data Warehouse

Click here to review the Cloud Data Integration Snowflake Cloud Data Warehouse V2 Connector Guide containing information on how to set up and use Snowflake Cloud Data Warehouse V2 Connector. The guide explains how organization administrators and business users can use Snowflake Cloud Data Warehouse V2 Connector to read data from or write data to Snowflake Cloud Data Warehouse.

File Processor

The Data Integration File Processor Connector Guide contains information on how to set up and use File Processor Connector. The guide explains how organization administrators and business users can use File Processor Connector to transfer files.

Learn More

We’ve written this workbook to help you accelerate your data-driven digital transformation and guide you in modernizing your data architecture with Microsoft Azure. We will show you how cloud data management can enable the next generation of agile analytics initiatives. 

We’ll describe key cloud data management hurdles and how to overcome them to support common usage patterns for cloud data warehousing with Microsoft Azure.

Learn More

We’ve written this workbook to guide you through the steps to modernize your data warehouse architecture with Amazon Redshift. We will show you how public cloud data management can enable the next generation of agile analytics initiatives. We’ll describe the key cloud data management challenges and how to overcome them in order to support three common usage patterns.

Learn More

Amazon S3

Click here to review the Data Integration Amazon S3 V2 Connector Guide that contains information about how to set up and use Amazon S3 V2 Connector. The guide explains how business users can use Amazon S3 V2 Connector to read or write Avro, JSON, ORC, and Parquet file formats in Amazon S3.

Amazon Redshift

Click here to access the Cloud Data Integration Amazon Redshift V2 Connector Guide, which contains information about how to set up and use Amazon Redshift V2 Connector. The guide explains how business users can use Amazon Redshift V2 Connector to read data from and write data to Amazon Redshift.

Microsoft Dynamics CRM

The Cloud Data Integration Microsoft Dynamics CRM Connector Guide provides information about how to read data from and write data to Microsoft Dynamics CRM. This guide explains how organization administrators can configure the Microsoft Dynamics CRM Connector, and business users can use Microsoft Dynamics CRM Connector to create connections, develop mappings, and run synchronization and mapping tasks. This guide assumes that you have knowledge of Microsoft Dynamics CRM and Data Integration.

Learn More

See how Informatica Cloud B2B Gateway removes partner onboarding complexity and facilitates collaboration with automation.

Learn More

This is the second of a three-part demo of data and application integration using intelligent APIs. You reviewed the first part in the Beginner level.

This demo showcases how IT Operations can monitor services and APIs, ensure business processes move forward, and provide support in case of exceptions.

View Demo

This is the third part of the demo of data and application integration using intelligent APIs. The demo showcases how a developer can use Informatica’s Cloud Application Integration service to build the processes that become APIs.

View Demo

Here are a few resources to help you use REST API with IICS.

HOW TO: Create Mapping task in IICS using REST API

HOW TO: Fetch activity monitor details for IICS tasks using REST API

HOW TO: Run the Informatica Cloud task (IICS) with REST API using the new Global identifier Id of the object

Click here to learn how you can use the IICS REST API to interact with your Informatica Intelligent Cloud Services organization.
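The common pattern in these articles is a two-step flow: log in to obtain a session id, then call a job endpoint with it. The sketch below only builds the request payloads and makes no network call; the endpoint path and field names are assumptions based on the IICS v2 REST API, so verify them against the linked resources:

```python
import json

# Sketch of the two-step pattern: log in for an icSessionId, then submit
# a job request. Endpoint path and field names are assumptions based on
# the IICS v2 REST API; no network call is made here.
LOGIN_URL = "https://dm-us.informaticacloud.com/ma/api/v2/user/login"

def login_request(username, password):
    """Build the login body; the response would carry icSessionId."""
    return {"@type": "login", "username": username, "password": password}

def job_request(task_id, task_type, session_id):
    """Build headers and body to start a task via the v2 job resource."""
    headers = {"Content-Type": "application/json", "icSessionId": session_id}
    body = {"@type": "job", "taskId": task_id, "taskType": task_type}
    return headers, body

headers, body = job_request("0001A", "MTT", "abc123")
print(json.dumps(body))
```

A real client would POST the login body to LOGIN_URL, read icSessionId (and the serverUrl for subsequent calls) from the response, and pass both to the job request.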

Here are a few resources to guide you on how to use RunAJobCli Utility with IICS:

HOW TO: Run the Informatica Cloud task using Windows batch script/Linux shell script Using RunAJobCli Utility

HOW TO: Use RunAJobcli utility to trigger the task in Informatica Intelligent Cloud Services

HOW TO: Run taskflow in IICS from Runajob utility using third party schedulers?

This video briefly explains what Intelligent Structure Discovery is used for. It reads a sample file and derives a data model from it to build an intelligent structure. You can combine, collapse, flatten, and exclude nodes.

Click here to learn how to use Intelligent Structure Discovery to create structures. You can also learn more through the examples and different models associated with it discussed in this link.

This video takes you through the new Mass Ingestion task included in Cloud Data Integration. This feature enables you to create mass ingestion tasks that transfer large volumes of data from on-premises flat files to cloud applications such as Amazon Web Services and Amazon Redshift using FTP, SFTP, and FTPS.

You can also use the Mass Ingestion REST API to run Mass Ingestion tasks automatically. This video will show you how to create and run a Mass Ingestion task.

Learn More

Learn how to configure the Informatica agent runtime so that the application runs on multiple agents.

Automate tasks through job scheduling in IICS.

Learn how to ensure that appropriate licenses, for example for connectors and export/import, are assigned for each environment.

Learn how to edit organization properties, such as maximum log entries and other organization details as necessary (Organizational Configuration and Setup).

Learn how to use Cloud REST API to automate the execution of task flows.

Informatica Cloud Integration Hub

Playback this technical webinar to learn more about the Informatica Cloud Integration Hub, which enables you to simplify and streamline complex, point-to-point data integrations and govern your integration architecture with a modern pub-sub data integration hub.

Watch the Webinar

Informatica Cloud App/API Integration

Dive deep into the API and application integration capabilities offered by our API platform. You will learn what APIs are, see examples of their use, explore types of APIs and services, understand the SOA and microservices architectural styles used to build API-based business services and applications, and see how Informatica helps you achieve this with its iPaaS services.

Watch the Webinar

This session is intended for IICS customers who want a wizard-based interface that saves time and money by eliminating the complexities of populating Salesforce sandboxes from Production instances, in a way that completely secures sensitive information.

After this session, customers can automate and schedule the data push to Salesforce sandbox environments.

In this webinar, users get an overview of how the GitHub/version control feature works in IICS.

Watch this webinar to learn about the new capabilities of Cloud Data Integration service as part of Spring 2020 launch and understand how to leverage them in your integration flows.

This webinar is intended for customers who are using IICS and want to monitor their jobs and tasks using Operational Insights. At the end of this session, users will learn about new features like User Analytics and Alerting for CDI.

This webinar is intended for Cloud Data Integration developers and shows them how to integrate data to and from an Amazon S3 bucket using Informatica Intelligent Cloud Services - Cloud Data Integration.
