Configuring On-premises to Cloud Transformation
In the Transformation stage, legacy code and business logic are transformed to the preferred target.
To configure the Transformation stage, follow these steps:
- In Source Type, select the required source data store, such as Netezza, Oracle, SQL Server, Teradata, Vertica, and more.
- In Input Type, select the type of input script, such as SQL/BTQ/Procedure or KSH.
- In Target Type, select the preferred target type, such as Spark, Snowflake, Amazon Redshift, and so on.
- In Output Type, select the output type (such as Python, RSQL) to generate artifacts in that format.
- In Input Artifacts, upload the files that you need to transform to the target.
- Click Data Configuration to configure the data.
- In Validation Type, select the validation type:
- None: Performs no validation.
- Cluster: Syntax validation is performed on queries transformed by the LeapLogic Core transformation engine.
- In LeapFusion, select the preferred Base and Helper models to convert queries that require augmented transformation and to perform syntax validation for accuracy and optimal performance. An illustrative source-to-target conversion example appears at the end of this LeapFusion description.
- Base Model: Select the preferred base model to convert queries that are not handled by the default LeapLogic Core transformation engine to the target equivalent. The Base Model supports both offline (LeapLogic) and online (Amazon Bedrock) modes of transformation. By default, the system enables the online mode (Amazon Bedrock) and disables the offline mode (LeapLogic). To enable the offline (LeapLogic) modes such as Embark, Energize, Intercept, and Velocity, you must first provide EC2 instance connection details and a prompt file (.txt) on the Add New Sources and Targets page (Governance > Intelligence Modernization > Custom/Source/Target > Add New Sources and Targets).
To view the detailed steps for providing EC2 instance connection details and the prompt file, click here.
The offline modes include:
- Energize, Velocity, and Embark: To convert small- to medium-sized SQLs.
- Intercept: To convert large-sized SQLs and procedural code.
- Helper Model: Select Helper models to perform syntax validation of the queries transformed by the Base Model and to suggest corrections if needed. By default, the selected Base model is also set as a Helper model. If needed, you can add multiple Helper models. When multiple Helper models are configured:
- The first Helper model (the same as the Base model) validates the queries transformed by the Base model. If it finds any incorrectly transformed queries, it suggests updated queries.
- The updated queries are then passed to the next Helper model, which validates them and suggests corrections if required.
- This process continues through all configured Helper models until the queries are validated successfully.
This validation process ensures higher accuracy, better performance, and more efficient transformation.
To access this intelligent modernization feature (LeapFusion), ensure that your account has the manager and llexpress_executor roles.
To view the detailed steps for assigning manager and llexpress_executor roles to your account, click here.
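The example below is purely illustrative of the kind of source-to-target query conversion described above. It assumes a Teradata source and a Snowflake target; the schema, table, column, and query text are hypothetical and are not produced by or taken from the product.

```sql
-- Hypothetical Teradata input query (illustrative only)
SEL cust_id,
    CAST(order_dt AS FORMAT 'YYYY-MM-DD') AS order_day
FROM sales.orders
QUALIFY ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY order_dt DESC) = 1;

-- One possible Snowflake-equivalent rewrite (illustrative only)
SELECT cust_id,
       TO_CHAR(order_dt, 'YYYY-MM-DD') AS order_day
FROM sales.orders
QUALIFY ROW_NUMBER() OVER (PARTITION BY cust_id ORDER BY order_dt DESC) = 1;
```

Constructs such as the SEL abbreviation and the FORMAT phrase in CAST are Teradata-specific and illustrate the kind of dialect-specific syntax that must be rewritten during transformation; the actual output depends on the selected target, models, and configuration.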
- In Source Configuration, select the source configuration as Live or Offline.
- If the selected Source Configuration is:
- Live: Upload the data source and select the Default Database.
- Offline: Upload the DDL files. It supports .sql and .zip file formats.
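For reference, a DDL file uploaded in Offline mode typically contains the definitions of the source objects referenced by the input scripts. A minimal illustrative example is shown below; the file, schema, table, and column names are hypothetical.

```sql
-- sample_ddl.sql (hypothetical offline DDL file)
CREATE TABLE sales.orders (
    order_id INTEGER NOT NULL,
    cust_id  INTEGER NOT NULL,
    order_dt DATE,
    amount   DECIMAL(12,2),
    PRIMARY KEY (order_id)
);
```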
- In Target Configuration, select the target configuration as Live.
- Upload the target data source.
- In File Format, select the storage type as ORC, Parquet, Text File, or Avro.
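For context, the selected storage type determines the file format used for the target tables. On a Hive or Spark target, for example, a Parquet table definition generally takes a form like the one below; the table and column names are hypothetical and are shown only to illustrate what this setting controls.

```sql
-- Illustrative Hive/Spark-style DDL for a Parquet table (hypothetical names)
CREATE TABLE sales.orders_tgt (
    order_id INT,
    cust_id  INT,
    order_dt DATE,
    amount   DECIMAL(12,2)
)
STORED AS PARQUET;
```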
- In Mapping as per Changed Data Model, upload files for mapping between source and target tables.
- In Triggers, upload the trigger statements for a more accurate conversion.
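As an illustration, a triggers file contains the source trigger definitions. A minimal hypothetical example in Oracle-style syntax is shown below; the trigger, table, and column names are invented for illustration.

```sql
-- sample_triggers.sql (hypothetical Oracle-style trigger definition)
CREATE OR REPLACE TRIGGER trg_orders_audit
AFTER INSERT ON sales.orders
FOR EACH ROW
BEGIN
    -- Record each new order in a hypothetical audit table
    INSERT INTO sales.orders_audit (order_id, changed_at)
    VALUES (:NEW.order_id, SYSDATE);
END;
/
```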
- Click Save to save the Transformation stage.
- An alert pop-up message appears, prompting you to refer to your assessment to determine the anticipated quota deduction required to convert your scripts to the target. Click Ok.
- Click the Execute icon to execute the integrated or standalone pipeline. This takes you to the pipeline listing page, where your pipeline appears in the Running state. The state changes to Success when the pipeline completes successfully.
- Click on your pipeline card to see reports.
To view the On-premises to Cloud Transformation report, see On-premises to Cloud Transformation Report.