
Netezza Query Execution Logs Extraction and Source Code Prerequisites

This topic describes the prerequisites for extracting Netezza query execution logs and source code for assessment.

In This Topic:

  • Introduction
  • Assessment Process
  • Database Object DDL Extraction
  • Source Code Assessment

Introduction

LeapLogic Assessment profiles the existing inventory, identifies complexity, draws out the dependency structure, and provides comprehensive recommendations for migration to a modern data platform.

Assessment checklist

| Information | Format | Availability? |
| --- | --- | --- |
| Netezza Query Logs | File Export in text format | Yes/No |
| Total I/O Usage by Days | File Export in text format | Yes/No |
| Hardware Config for Total CPU Calculation | Inline in the document | Yes/No |
| Database Object Count | File Export in text format | Yes/No |
| Database Volume | File Export in text format | Yes/No |
| High Data Volume Tables | File Export in text format | Yes/No |
| Total Number of Databases | File Export in text format | Yes/No |
| Total Number of Users | File Export in text format | Yes/No |
| Data for Partitioning and Bucketing | File Export in text format | Yes/No |
| Source Code Assessment | File Export | Yes/No |


Assessment Process

All the environment-specific variables are highlighted. The start and end dates should be consistent across all the queries.

Logs

This requires query execution logs in CSV format, retrieved from the Netezza history database tables. The file must be ~~-separated and contain the following columns:

IDWWM~~NPSID~~NPSINSTANCEID~~OPID~~UserName~~AppId~~ClientID~~StartTime~~AmpCpuTime~~TotalIoCount~~ParserCpuTime~~FirstRespTime~~FirstStepTime~~ProcID~~QueryID~~MaxAmpCpuTime~~MaxAmpIO~~TotalCpu~~TotalIO~~QueryExecutionTime~~SchemaName~~SQLROWNO~~StatementType~~QueryText

Netezza query logs with wrapper script

Script for extracting artifacts from the Netezza history database (Version 3)

nz_query_log.ksh (Download)

Script for extracting artifacts from the Netezza history database (Version 1)

nz_query_log.ksh (Download)

Download the applicable file and rename it to nz_query_log.ksh.

Follow the steps below to execute the wrapper script that extracts the query logs from Netezza.

  1. Log in to your Netezza server or Netezza client machine (Linux-based) where the Netezza client utilities are installed.
  2. Create a folder and copy the file nz_query_log.ksh into it.
  3. Edit nz_query_log.ksh with the environment details and the date range for which you want to extract the logs, as specified below:

Edit the values of the following configuration parameters as per your specific use case.

# Path of the folder where you want to generate the logs file.
OUT_FILE_PATH='/export/home/nz/nz_hist_log'

# Input variables (start date, end date, Netezza db username, Netezza db password)
INPUT_START_DT=2020-04-01
INPUT_END_DT=2020-05-31
NZ_USER=admin
NZ_PASSWORD=password

# Provide a comma-separated list of database names for which you want to generate
# the log file. Leave this parameter blank to generate log files for all the databases.
LOG_ASSESSMENT_DATABASES=NEWTPCDS,TPDS

# Provide the name of the history database
HISTDB=HISTDB

# Provide the name of the history database user
HISTUSER=HISTDBOWNER

  4. Give execute permission to the nz_query_log.ksh file by running the command:

chmod +x nz_query_log.ksh

  5. Run the following command to execute the shell script:
    1. Ensure that the current directory is the folder created above.
    2. Run the command:

    ./nz_query_log.ksh

  6. The preceding step generates the output file in the form of query logs inside the output folder specified by the OUT_FILE_PATH parameter in the nz_query_log.ksh file.

Except for LOG_ASSESSMENT_DATABASES, setting all the other mentioned variables is mandatory.

The output is created in the form of query log files for the given date range. For example, with the start and end dates set in the shell script variables above, the log file is generated with the name nz_out.log.2020-04-01_2020-05-31.
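As a quick sanity check (a minimal sketch; the file path is based on the example above, and it assumes each record stays on a single line), you can confirm that every record in the generated file carries the 24 ~~-separated columns listed earlier:

# Hypothetical path; adjust to your OUT_FILE_PATH and date range.
LOG_FILE=/export/home/nz/nz_hist_log/nz_out.log.2020-04-01_2020-05-31

# Report any record whose ~~-separated field count is not 24.
awk -F'~~' 'NF != 24 {print "line " NR ": " NF " fields"}' "$LOG_FILE"

# Inspect the first record.
head -1 "$LOG_FILE"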

Total I/O Usage by Days

Save the results of this query in a CSV file:

select cast(START_TIME as date) as startDate,
       cast(END_TIME as date) as endDate,
       SUM(HOST_DISK_READ_SECS + HOST_DISK_WRITE_SECS + HOST_FABRIC_SECS +
           SPU_DISK_READ_SECS + SPU_DISK_WRITE_SECS + SPU_FABRIC_SECS +
           SPU_DATA_DISK_READ_SECS + SPU_DATA_DISK_WRITE_SECS +
           SPU_TEMP_DISK_READ_SECS + SPU_TEMP_DISK_WRITE_SECS) as TOTALREDWRITE
from _V_SCHED_GRA_EXT
group by 1, 2
-- limit 10

Replace startDate and endDate with the actual assessment start and end dates.
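Each query in this section can be exported straight to a delimited file from the command line. A minimal sketch, assuming the nzsql client is installed and supports the psql-style -A (unaligned), -F (field separator), -t (tuples only; omits the header row), -f (script file), and -o (output file) flags; the host, credentials, and file names are placeholders:

# Hypothetical connection details; replace with your environment's values.
NZ_HOST=192.168.218.19
NZ_DB=SYSTEM
NZ_USER=admin
NZ_PW=password

# Save the query above as total_io_usage.sql, then export its result as CSV.
nzsql -host "$NZ_HOST" -d "$NZ_DB" -u "$NZ_USER" -pw "$NZ_PW" \
      -A -F ',' -t -f total_io_usage.sql -o total_io_usage.csv

The same pattern applies to the remaining queries in this section: save each query to a .sql file and point -f and -o at the appropriate names.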

Hardware Configuration

The hardware configuration is required for calculating total CPU utilization.

  • Netezza series and specs (Cores, RAM, HDD/SSD)
  • Number of Netezza nodes
  • Total available AMPCPU seconds per day
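For example (a purely hypothetical configuration): an appliance with 2 nodes of 20 cores each would have 2 × 20 × 86,400 = 3,456,000 available CPU seconds per day, the capacity figure against which the CPU seconds consumed in the query logs can be compared.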

Database Object Count

Save its results in a CSV file.

select database::nvarchar(64) as database, OBJTYPE, count(OBJNAME) as Tables_count
from _V_OBJ_RELATION_XDB
group by database, OBJTYPE;

Database Volume

Save its results in a CSV file.

SELECT ORX.database::nvarchar(64) AS "DatabaseName",
       case when sum(SOD.allocated_bytes) is null then 0
            else SUM(SOD.allocated_bytes)/1073741824 end AS "AllocatedSpace_GB"
FROM _V_SYS_OBJECT_DSLICE_INFO SOD
INNER JOIN _V_OBJ_RELATION_XDB ORX ON ORX.objid = SOD.tblid
GROUP BY "DatabaseName"
ORDER BY "DatabaseName";

High Data Volume Tables

This query returns the tables with high data volume. Save its results in a CSV file.

select objname as table_name,
       database as db_name,
       (allocated_bytes/1048576) as allocated_mbytes
from _v_sys_relation_xdb sys,
     _v_sys_object_dslice_info ds
where ds.tblid = sys.objid
  and dsid in (1,2,3,4)
  and allocated_mbytes > 10240
order by allocated_mbytes desc,
         table_name,
         db_name,
         dsid;

This SQL collects tables with a volume of 10 GB or more.

Note:

The filter condition can be changed to collect lower volumes.

Databases

This query lists the distinct databases, giving the total number of databases. Save its results in a CSV file.

select distinct database::nvarchar(64) from _V_OBJ_RELATION_XDB;

Users

This query lists all users, giving the total number of users. Save its results in a CSV file.

SELECT GROUPNAME,OWNER,USERNAME FROM _V_GROUPUSERS;

Data for Partitioning and Bucketing

This query extracts table size and column details, which are used for partitioning and bucketing recommendations. Save the result in a CSV file.

Note that this query returns details only for the database you are connected to, so run it against every database in scope (see the loop sketch after the query below). Also, generate statistics before executing this query.

select DATABASE_NAME, Table_Name, TABLE_SIZE_BYTES, NUM_ROWS, Column_Name,
       to_number(num_of_Unique_Values, 999999999999999999999999999999) as num_unique_val
from (
    SELECT
        _V_TABLE."DATABASE" as database_name,
        _V_TABLE.TABLENAME as Table_Name,
        case when tbl_stat.used_bytes is null then 0
             else tbl_stat.used_bytes end as table_size_bytes,
        _V_TABLE.RELTUPLES as num_rows,
        SUBSTR(_v_relation_column.attname || '   ', 1, 25) as Column_Name,
        case when _v_relation_column.attdispersion = '0' then '0'
             when _v_relation_column.attdispersion = '-1' then _V_TABLE.RELTUPLES::varchar(100)
             else TO_CHAR((CAST((1.0/_v_relation_column.attdispersion) AS BIGINT)), ' 999,999,999,999,999 ')
        end as num_of_Unique_Values
    FROM _v_relation_column
    left outer join _v_statistic
        on (_v_relation_column.objid = _v_statistic.objid
            AND _v_relation_column.attnum = _v_statistic.attnum)
    inner join _V_TABLE
        on (_v_relation_column.objid = _V_TABLE.OBJID)
    inner join _v_table_storage_stat as tbl_stat
        on (tbl_stat.OBJID = _V_TABLE.OBJID)
    WHERE
        _V_TABLE.OBJTYPE = 'TABLE'
        AND (_v_relation_column.schema = current_schema
             OR upper(_v_relation_column.schema) in ('DEFINITION_SCHEMA', 'INFORMATION_SCHEMA'))
    ORDER BY
        _V_TABLE.TABLENAME,
        _v_relation_column.attnum
) as a;
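A minimal sketch of the per-database loop mentioned above, assuming the query is saved as partition_bucketing.sql and that nzsql accepts the same psql-style flags as in the earlier export example; the host, credentials, and database names are placeholders:

#!/bin/bash

# Hypothetical connection details; replace with your environment's values.
NZ_HOST=192.168.218.19
NZ_USER=admin
NZ_PASSWORD=password

# Run the partitioning/bucketing query once per in-scope database, writing
# one CSV per database. Statistics are generated first so that the
# attdispersion-based unique-value estimates are current.
for DB in NEWTPCDS TPDS; do
    nzsql -host "$NZ_HOST" -d "$DB" -u "$NZ_USER" -pw "$NZ_PASSWORD" \
          -c "GENERATE STATISTICS;"
    nzsql -host "$NZ_HOST" -d "$DB" -u "$NZ_USER" -pw "$NZ_PASSWORD" \
          -A -F ',' -t -f partition_bucketing.sql -o "partition_bucketing_${DB}.csv"
done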


Database Object DDL Extraction

LeapLogic needs specific artifacts to perform an assessment. You can either copy these artifacts from your Git repository or export them, including table DDL scripts, stored procedures, functions, macros, views, and so on, using the shell script provided below.

The shell script depends on the Netezza-supplied admin script nz_ddl, which is usually found in the /nz/support/bin directory.

#!/bin/bash

# Database connection parameters
export NZ_HOST="192.168.218.19"
export NZ_DATABASE="NEWTPCDS"
export NZ_USER="admin"
export NZ_PASSWORD="password"

# Netezza admin script folder
NZ_ADMIN_SCRIPTS="/nz/support/bin"

# Output directory
OUTPUT_DIR="/export/home/nz/ddlExtractor/ddls"

# A comma-separated list of databases from which DDLs should be extracted.
DATABASE_LIST="NEWTPCDS,SYSTEM_BKP,AATESTQLV1,AATESTQLV2"

IFS=',' read -r -a databases <<< "$DATABASE_LIST"

if [ -z "$DATABASE_LIST" ]; then
    sh $NZ_ADMIN_SCRIPTS/nz_ddl > $OUTPUT_DIR/"all_ddl_output.sql"
elif [ ${#databases[@]} -eq 1 ]; then
    DATABASE=${databases[0]}
    sh $NZ_ADMIN_SCRIPTS/nz_ddl $DATABASE > $OUTPUT_DIR/"${DATABASE}_ddl_output.sql"
else
    for DATABASE in "${databases[@]}"; do
        sh $NZ_ADMIN_SCRIPTS/nz_ddl $DATABASE > $OUTPUT_DIR/"${DATABASE}_ddl_output.sql"
    done
fi

Update the connection and path details at the top of the script and run it on the machine where the nz_ddl admin script is available.

The script is also attached for reference:

nz_export_ddl (Download)
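After the script completes, a quick verification (a sketch; the directory matches the example above, and the CREATE pattern is an assumption about the nz_ddl output format) confirms that DDLs were extracted for each database:

# List the generated DDL files with their sizes.
ls -lh /export/home/nz/ddlExtractor/ddls/

# Rough per-file object count, assuming nz_ddl emits CREATE statements.
grep -ci "^CREATE" /export/home/nz/ddlExtractor/ddls/*_ddl_output.sql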


Source Code Assessment

Provide the following active or in-scope Netezza source code artifacts, as applicable to the migration scope.

| Code Artifact | Criticality | Remarks |
| --- | --- | --- |
| Orchestration Scripts (Control-M / AutoSys / cron, etc.) | Must | To identify interdependencies across scheduler scripts/jobs, queries, and dependent workloads |
| Procedures / Functions | Must | To identify workload complexity, query patterns, query count, etc. for effort estimation and technical debt |
| Views | Must | To identify view complexity, patterns, and effort estimations |
| Shell Scripts | Must | To identify count, dependencies, SQL queries and PL/SQL, logic (for example: email, notification), and effort estimations |
| DDL | Must | To identify column usage and provide recommendations on column-level lineage and query optimization on the target system |
| DML / SQL Files | Must | To identify count, dependencies, SQL queries, and effort estimations |

Note:

Limit: It is assumed that the orchestration scripts are the trigger points for every use-case execution in the existing setup. If the customer is not comfortable sharing all the workloads, they can share only the workloads that are referenced or executed through the orchestration scripts. However, in such scenarios, the scope and effort estimates will be based on the given workloads only.


To learn more, contact our support team or write to: info@leaplogic.io
