Configuring Transformation Stage
In the Transformation stage, legacy code and business logic are transformed into their equivalents on the preferred target. Double-click the Transformation stage to access the configuration page.
Overview
In this section, you can customize the Transformation stage’s name and provide a suitable description as required. By default, Transformation appears in the Name field. Provide a relevant name and description that help you understand the purpose and scope of the Transformation stage.
Transform
The Transformation stage enables seamless transformation and operationalization of large-scale legacy workloads on the modern cloud platform.
LeapLogic’s intelligent grammar engine automatically converts legacy EDW workloads, including core business logic, into target-native equivalents. Once transformed, the target-compatible packaged code can be orchestrated and executed as production-ready jobs in the target environment.
You can configure the Transformation stage as a standalone pipeline by defining the source, target, input, and output types required for workload transformation. Alternatively, you can include it in an integrated pipeline (Migration stage followed by Transformation stage). In this case, the DML scripts, source type, and metadata details from the Migration stage are automatically mapped to the Transformation stage, minimizing manual configuration effort.
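For illustration only, the sketch below models the fields described above (name, description, source, input type, target, and output behavior) as a plain Python dictionary; it is not an actual LeapLogic API or configuration format, since the stage is configured through the UI.

```python
# Hypothetical illustration only; not a LeapLogic API or configuration format.
# It simply names the fields discussed in this topic.
transformation_stage = {
    "name": "Transformation",
    "description": "Convert Teradata BTEQ scripts to Databricks Notebooks",
    "source": "Teradata",                # legacy EDW source
    "input_type": "SQL/BTEQ/Procedure",  # type of artifact being transformed
    "target": "Databricks Notebook",     # target-native equivalent
    "output": {
        # Behavior when the transformation is not 100% (see the Output section):
        # "Stop" (default), "Continue", "Error", or "Pause".
        "on_incomplete": "Stop",
    },
}
```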
The Transformation stage also provides a notebook-based inline editor, enabling you to review and fine-tune queries for optimization. LeapLogic enables workload transformation across a broad range of EDW sources, input types, and cloud-native targets.
By automating workload conversion and optimization, the Transformation stage minimizes manual effort and accelerates the journey to a modern data platform.
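To build intuition for what such a conversion involves, the hand-written sketch below (not actual LeapLogic output; table and column names are invented) rewrites a Teradata query that uses the QUALIFY clause into standard Spark SQL with a ROW_NUMBER() subquery and runs it with PySpark.

```python
# Hand-written illustration of a legacy-to-target rewrite; not actual
# LeapLogic output. Table and column names are made up for the example.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("qualify-rewrite-example").getOrCreate()

# Legacy Teradata form (shown for comparison only): QUALIFY filters
# directly on a window function.
teradata_sql = """
SELECT customer_id, order_date, order_amount
FROM sales.orders
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) = 1
"""

# Equivalent Spark SQL: the window function moves into a subquery,
# and the QUALIFY predicate becomes an ordinary WHERE filter.
spark_sql = """
SELECT customer_id, order_date, order_amount
FROM (
    SELECT customer_id, order_date, order_amount,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) AS rn
    FROM sales.orders
) t
WHERE rn = 1
"""

latest_orders = spark.sql(spark_sql)  # assumes sales.orders exists in the metastore
```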
The table below lists the EDW sources along with the input types and supported targets for workload transformation.
| Source | Input Type | Target |
| --- | --- | --- |
| Teradata | SQL/BTEQ/Procedure | Hive, Spark, Amazon Redshift, Azure Synapse, Databricks Lakehouse, Databricks Notebook, Google BigQuery, Snowflake, AWS Glue |
| Teradata | KSH | Spark, Google BigQuery, Amazon Redshift, Hive |
| Teradata | MLOAD | Databricks Notebook |
| Teradata | TPT | AWS Glue Jobs, Spark, Databricks |
| Teradata | Triggers | Databricks Notebook |
| Teradata | Analytics queries | Kyvos semantic cube |
| Netezza | SQL/Procedure | Spark, Amazon Redshift, Azure Synapse, Databricks Lakehouse, Databricks Notebook, Snowflake, SQL Scripting (Databricks), Hive |
| Netezza | KSH | Hive, Spark |
| Oracle | SQL/Procedure | Spark, Amazon Redshift, Azure Synapse, Databricks Lakehouse, Databricks Notebook, GCP PostgreSQL, Google BigQuery, Snowflake, Hive, Amazon Aurora, SQL Scripting (Databricks) |
| SQL Server | SQL/Procedure | Spark, Amazon Redshift, Aurora PostgreSQL, Databricks Lakehouse, Databricks Notebook, Snowflake, SQL Scripting (Databricks), Hive |
| Vertica | SQL/Procedure | Spark, Hive, Databricks Lakehouse |
| Vertica | KSH/BASH/SH | Spark |
| DB2 | SQL/Procedure | AWS Glue Job (Delta and Iceberg), Amazon Redshift |
| Sybase | SQL | Databricks Lakehouse |
| Dremio | SQL | Athena |
| Greenplum | SQL/Procedure | Spark, Hive |
| Snowflake | DML/DDL | Databricks Notebook |
| Snowflake | JavaScript Procedures | Databricks Notebook |
| Generic ANSI SQL | SQL/Procedure | Amazon Redshift, Databricks Lakehouse, Snowflake, Spark |
| PostgreSQL | SQL/Procedure | Databricks Notebook |
| Redshift | SQL/Procedure | Databricks Notebook |
| Google BigQuery | SQL/Procedure | Amazon Redshift |
| EDW Unity Catalog | | Databricks Lakehouse |
To view the Transformation Stage report, visit Transformation Report.
Output
The output of this transformation is HQL/Spark SQL, Java/Python/Scala code, or cloud-compatible scripts such as SnowSQL. A detailed report is generated as output, and the required queries can be edited through an online notebook-based code editor.
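As a purely illustrative sketch (not actual generated output; table names and logic are invented), a transformed workload expressed as Python/Spark code might resemble the following:

```python
# Purely illustrative sketch of the kind of PySpark job a transformed
# workload can resemble; not actual LeapLogic output.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("transformed-workload-example").getOrCreate()

# Read the migrated source table (created earlier, e.g. by the Migration stage).
orders = spark.table("sales.orders")

# Business logic equivalent to the legacy aggregation script.
daily_revenue = (
    orders
    .where(F.col("order_status") == "COMPLETE")
    .groupBy("order_date")
    .agg(F.sum("order_amount").alias("total_revenue"))
)

# Persist the result as a managed table in the target environment.
daily_revenue.write.mode("overwrite").saveAsTable("analytics.daily_revenue")
```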
You can configure the output of this transformation to determine navigation to a further stage. By default, the output configuration is set to Stop if the transformation is not 100%; it can also be configured to Continue, Error, or Pause as required.
Next:
Configuring Validation Stage