Informatica Conversion Report

This topic provides a comprehensive report on the Informatica ETL conversion.

The Summary section provides an overview of the ETL conversion, which includes:

  • Scripts Transformed: Displays the number of scripts successfully converted and the number that are not, along with the number of scripts successfully mapped to the target and the number that are not mapped.
  • Status: Status of the ETL Conversion stage.
  • Automation Level Indicator: Displays the automation conversion coverage.
  • Auto-Validation (Query): Displays the percentage of queries that have successfully undergone syntax validation.
  • User Feedback: Displays the auto-conversion percentage based on your feedback.

In this Topic:

  • Scripts
  • Package


Scripts

This topic shows a comprehensive report of the converted artifacts. It includes the name of each script, the mapping summary, and the file-wise auto-conversion percentage.

Browse through each script to get more insight into the transformations. For each transformation, you can see its status, type, auto-conversion percentage, and more. In addition, if you find that a successfully converted stage is inaccurate, you can provide feedback and mark it as incorrect.

Click Mark as incorrect to indicate that the converted stage is incorrect. You can also provide detailed comments in the Add Comment field to explain the specific issues or discrepancies you have observed. After providing your feedback, click Recalculate to recalculate the conversion automation based on your feedback. This ensures that the reported automation reflects how closely the converted stage aligns with the intended outcome.

After clicking Recalculate, you can see the updated conversion automation (User Feedback) in the Summary section.

Besides providing this information, this section allows you to take a variety of actions, such as:

  • Download All: Downloads all the graphs. Alternatively, click the preferred graph or node and select Download Graph.
  • Upload: Uploads a modified graph after you make the required changes. Alternatively, click the preferred graph or node and select Change Graph.
  • Regenerate: Updates and repackages the uploaded or changed graph. If the artifacts are updated successfully, the system displays a snackbar popup to notify you of the success.
  • Sort: Sorts the nodes. You can sort based on All Nodes, Success Nodes, or Failed Nodes.


Package

This topic provides a detailed report of the converted artifacts, including Python, SQL, and other files, which can be executed directly on the target platform.

To effectively transform your Informatica workloads to the target platform (AWS Glue Notebook, Delta Live Tables, Databricks, etc.), you need to enable the common model.

To learn how to enable the common model, refer to Common Model.

Informatica ETL to AWS Glue Job

This section offers a detailed report on the conversion process from Informatica ETL to AWS Glue Job, depending on the selected data interaction technique:

  • Glue: Redshift: The legacy Informatica ETL workloads are transformed to target-equivalent code that fetches input data from Amazon Redshift, processes it in Glue, and stores the processed or output data in Redshift. In the transformed artifacts, all the source queries are converted to Redshift, whereas the intermediate tables are converted to Spark. The image below shows the transformed Redshift-equivalent query.

The next image shows the transformed Spark-equivalent code of the filter component. This code creates views that can subsequently be used by other components within the mapping (see the sketch after this list).

  • Glue: Data Catalog: The legacy Informatica ETL workloads are transformed to target-equivalent code that accesses data through the data catalog, processes it in Glue, and stores the processed or output data in the data catalog.
  • Glue: External: The legacy Informatica ETL workloads are transformed to target-equivalent code that fetches input data from an external source such as Oracle, Netezza, or Teradata, processes that data in Glue, and then moves the processed or output data to an external target. For instance, input data is fetched from Oracle, processed in Glue, and the output is stored back in Oracle.
  • Redshift ETL Orchestrated via Glue: The legacy Informatica ETL workloads are transformed to target-equivalent code that accesses and processes data entirely in Amazon Redshift and uses Glue for job orchestration. In the transformed artifacts, both source input and intermediate or temporary tables are converted to Redshift. In the image below, you can see the filter component converted to Redshift-equivalent code.
  • Glue: Hybrid: This technique is a combination of the other data interaction techniques (Glue: Redshift, Glue: Data Catalog, and Glue: External); it fetches and processes data from multiple data sources. It can take input from data sources such as Redshift, Delta, Iceberg, RDS instances, or external sources, process it in Glue, and move the output or processed data into the respective data sources. That is, using this technique you can convert source tables that reside on multiple data sources.
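
To make the view-creation pattern concrete, here is a minimal, self-contained PySpark sketch of a filter component expressed as a temporary view. It is an illustration only, not LeapLogic's actual generated code; the table, view, and column names are assumptions, and an in-memory DataFrame stands in for the data the Glue job would fetch from Redshift.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("filter-view-sketch").getOrCreate()

    # Stand-in for the input data the Glue job would fetch from Redshift.
    spark.createDataFrame(
        [(1, "SHIPPED"), (2, "PENDING")], ["order_id", "order_status"]
    ).createOrReplaceTempView("src_orders")

    # The filter component becomes a temporary view that downstream
    # components in the mapping can reference by name.
    spark.sql("""
        CREATE OR REPLACE TEMPORARY VIEW flt_orders AS
        SELECT * FROM src_orders WHERE order_status = 'SHIPPED'
    """)

    spark.table("flt_orders").show()

Registering each component as a named view keeps the mapping's components loosely coupled: any downstream component can consume flt_orders without knowing how it was produced.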

Informatica ETL to Databricks Lakehouse (Delta Live Tables)

This section offers a detailed report on the conversion process from Informatica ETL to Delta Live Tables, depending on the selected data interaction technique (see the brief sketch after this list):

  • Databricks-Native: The legacy Informatica ETL workloads are transformed to target-equivalent code that fetches, processes, and stores data in the Databricks Lakehouse.
  • Databricks: Unity Catalog: The legacy Informatica ETL workloads are transformed to target-equivalent code that fetches, processes, and stores data within the Databricks Unity Catalog.
  • Databricks: External: The legacy Informatica ETL workloads are transformed to target-equivalent code that fetches input data from an external source such as Oracle, Netezza, or Teradata, processes that data in Databricks, and then moves the processed or output data to an external target. For instance, input data is fetched from Oracle, processed in Databricks, and the output is stored back in Oracle.
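
For orientation, the following is a minimal, hypothetical sketch of what target-equivalent Delta Live Tables code can look like. It is not LeapLogic's actual generated output; it assumes execution inside a Databricks DLT pipeline, and the table and column names are illustrative.

    import dlt  # available only inside a Databricks Delta Live Tables pipeline

    @dlt.table(name="orders_filtered", comment="Filter component as a DLT table")
    def orders_filtered():
        # Read an upstream table in the pipeline and apply the
        # converted filter logic.
        return dlt.read("orders_bronze").where("order_status = 'SHIPPED'")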

Let us look at the output of the Informatica ETL to Delta Live Tables conversion with DLT Meta (common model) enabled. DLT Meta creates a bronze table (a replica of the source file) within the Databricks Lakehouse. So, rather than reading data directly from a source such as a flat file, the pipeline fetches data from the bronze table.

Here, in addition to the output package, you can see the conf (containing the Onboarding.json file) and ddl folders, where:

  • ddl: Contains the schema information used to generate the bronze table from the flat file. These flat files are delimited files in which a field separator (comma, colon, etc.) delimits the information across the columns of the input file, so providing the schema structure is essential for creating a table.
  • Onboarding.json: Contains information about all the bronze tables, including the source schema path, output path, and other relevant details. The bronze tables are created in Databricks based on this information (see the illustrative sketch below).
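
As a rough illustration, a single bronze-table entry in Onboarding.json might carry details such as the following. The field names here are assumptions for illustration only; the authoritative schema is defined by DLT Meta.

    import json

    # Hypothetical sketch of one bronze-table onboarding entry; the field
    # names are illustrative, not the authoritative DLT Meta schema.
    onboarding_entry = {
        "bronze_table": "orders_bronze",
        "source_format": "delimited",
        "source_schema_path": "ddl/orders.ddl",  # schema from the ddl folder
        "bronze_table_path": "/mnt/lakehouse/bronze/orders",
    }
    print(json.dumps(onboarding_entry, indent=2))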

In addition, the TODO_Report file lists messages regarding all unsupported components or queries that cannot be auto-converted and require manual intervention.

In the Package section, you can also see the Transformation_Report.xlsx file, which contains a comprehensive report of the transformations, queries, and the associated deductible license quota. The license quota is deducted per unit when converting Informatica workloads to target equivalents, where a unit refers to a transformation.

The license deduction logic is based on file similarity and complexity.

File similarity – File similarity checks whether the source file is similar to an already executed file; a higher value indicates that you have already executed a similar file. If the file similarity value falls below the predefined threshold, the system deducts the license quota. If it exceeds the threshold, no license quota is deducted, indicating that the file has been processed before. For instance:

  • If the file similarity value is 70% (below the threshold) and the threshold is 80%, the system deducts the unit quota.
  • If the file similarity value is 92% (above the threshold) and the threshold is 80%, the system does not deduct the unit quota.

Complexity – The other criterion for license quota deduction is the complexity of each unit. Unit complexity is categorized as SIMPLE, MEDIUM, or COMPLEX. The complexity weightage for the deductible unit quota is as follows:

  • If the unit complexity is SIMPLE, the deductible unit quota is 1.
  • If the unit complexity is MEDIUM, the deductible unit quota is 3.
  • If the unit complexity is COMPLEX, the deductible unit quota is 5.

For instance, consider the example provided in the image below for the file Case1.xml. This workflow contains a total of 7 transformations (units), all of which are successfully transformed. Among these successfully transformed units:

  • 4 transformations are categorized as simple complexity with a weightage of 1. Consequently, the deductible unit quota for these units is 4.
  • 3 transformations are categorized as medium complexity with a weightage of 3. Consequently, the deductible unit quota for these units is 9.

So, the total deductible unit quota for the Case1.xml file is 13 (4 × 1 + 3 × 3).
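
To make the arithmetic concrete, here is a minimal Python sketch of the deduction rules described above. The function name and the 80% threshold value are illustrative, not LeapLogic internals; only the complexity weights and the threshold rule come from this documentation.

    # Complexity weightage per unit, as documented above.
    COMPLEXITY_WEIGHT = {"SIMPLE": 1, "MEDIUM": 3, "COMPLEX": 5}

    def deductible_units(unit_complexities, file_similarity, threshold=80):
        """Return the unit quota to deduct for one converted file."""
        # Similarity above the threshold means a similar file was already
        # processed, so no quota is deducted.
        if file_similarity > threshold:
            return 0
        return sum(COMPLEXITY_WEIGHT[c] for c in unit_complexities)

    # Case1.xml: 4 SIMPLE and 3 MEDIUM transformations, similarity below threshold.
    units = ["SIMPLE"] * 4 + ["MEDIUM"] * 3
    print(deductible_units(units, file_similarity=70))  # 4*1 + 3*3 = 13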

The Summary sheet (refer to the image above) in the Transformation_Report.xlsx file showcases detailed information about the deductible license quota.

  • File Name: Displays the name of the file.
  • Folder Name: Displays the name of the folder.
  • Workflow: Displays the name of the workflow.
  • Conversion Percentage: Displays the auto-conversion percentage of each workflow.
  • File Similarity: Indicates whether the file is similar to an already executed file (Yes or No). If it is, the Deducted Script Quota, Deducted Block Quota, and Deducted Unit Quota are zero and no quota is deducted.
  • Complexity: Displays the complexity of each workflow.
  • Total Transformation: Displays the number of total transformations.
  • Success Transformation: Displays the number of successfully transformed transformations.
  • Deductible Units: Displays the unit quota that needs to be deducted based on the complexity of transformed transformations.
  • Deducted Unit Quota: Displays the actual unit quota deducted.

The Query sheet provides information about all queries along with their conversion status, transformation engine, validation status, and more.

  • File Name: Displays the name of the file.
  • Folder_Name: Displays the name of the folder.
  • Workflow: Displays the name of the workflow.
  • Mappings: Displays the associated mappings.
  • Transformation name: Displays the name of the transformation.
  • Transformation type: Displays the type of transformation.
  • Query: Displays the source query.
  • Query Status: Displays the status of the query validation. SUCCESS indicates the query passed syntax validation, FAILED indicates a validation failure, and N/A indicates syntax validation was not applied.
  • Error message: Shows the reason for the query validation failure.
  • Conversion Status: Displays the conversion status of each query. SUCCESS signifies that the query has been successfully transformed to the target equivalent, whereas FAILED indicates that the query failed to transform.
  • Conversion Engine: Displays the conversion engine used to transform each query. If the LeapLogic Transformation Engine fails to convert a query to the target equivalent, the system converts the failed query based on the transformation model selected on the configuration page.
  • Query id: Displays the query id.

The Translation_Report sheet in the Transformation_Report.xlsx file provides a comprehensive report of the conversion. It includes information about workflows, mapplets, mappings, transformations, and more.

  • File Name: Displays the name of the file.
  • Folder_Name: Displays the name of the folder.
  • Workflow: Displays the name of the workflow.
  • Mapplet: Displays the name of the mapplet.
  • Mappings: Displays the associated mappings.
  • Transformation: Displays the associated transformation.
  • Transformation_Name: Displays the name of the transformation.
  • Used_By_Workflow: Displays the associated workflow.
  • Target_File_Name: Displays the name of the target file.
  • Function_Name: Displays the function associated with the transformation.
  • Status: Indicates the conversion status of each function. The value is SUCCESS if the function is converted successfully; otherwise, it is FAIL.

The Transformation_License_Report sheet provides a comprehensive report of the conversion along with the unit quota deduction.

  • File Name: Displays the name of the file.
  • Folder_Name: Displays the name of the folder.
  • Workflow: Displays the name of the workflow.
  • Mappings: Displays the associated mappings.
  • Type: Displays the type of each workflow.
  • Transformation_Name: Displays the associated transformation.
  • TRANSFORMATION_TYPE: Displays the type of the transformation.
  • Conversion Status: Indicates the conversion status of each transformation. The value is SUCCESS if the transformation is converted successfully; otherwise, it is FAIL.
  • Auto-Conversion (%): Displays the automation conversion percentage of each transformation.
  • Complexity: Displays the complexity of each transformation.
  • Deductible Units: Displays the unit quota that needs to be deducted based on the complexity of transformed transformations.
  • Deductible Query Units: Displays the EDW query units that need to be deducted based on the number of transformed queries.
  • Total Deductible Units: Displays the sum of the deductible unit quota and the EDW query units.
  • Deducted Unit: Displays the actual unit quota deducted.

When you try to convert more units than the available license quota allows, the units within the quota convert successfully. However, any additional units beyond the quota fail to transform and display an error message indicating “License quota insufficient.”

You can apply:

  • Compare: Performs a line-by-line comparison of the transformed queries against the source queries to identify the changes.
  • Edit: Lets you manually edit queries using the Notebook-based online editor. Furthermore, you can repackage the updated code with the Regenerate option. To do so, open the editor, make the necessary changes in the transformed queries, and then update the queries.

To learn more, contact our support team or write to: info@leaplogic.io
