
Snowflake Artifact Extraction Prerequisites

This topic describes the prerequisites for Snowflake artifact extraction.

In This Topic:

  • Introduction
  • Artifact Extraction
  • Getting Help


Introduction

LeapLogic’s Assessment profiles the existing inventory, identifies complexity, performs dependency analysis, and provides recommendations for migration to a modern data platform.


Artifact Extraction

LeapLogic requires certain artifacts to perform an assessment. As a prerequisite, you need the ACCOUNTADMIN privilege to extract this data (a role-based alternative is sketched after the list below). You can copy the artifacts from your Git instance or from the Snowflake repository where they are stored. LeapLogic needs the following artifacts as files (for example, .sql files):

  • DDL scripts
  • Snowflake Stored Procedures
  • Functions
  • Query execution logs
  • DML scripts
  • Snowpark scripts
  • Snowflake Scripting files
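The queries in this topic read from the SNOWFLAKE.ACCOUNT_USAGE views, which is why ACCOUNTADMIN is required. As a minimal sketch, assuming your administrator prefers a dedicated role over running everything as ACCOUNTADMIN, the same read access can be granted through IMPORTED PRIVILEGES (the role and user names below are hypothetical):

-- Hypothetical alternative to running everything as ACCOUNTADMIN:
-- grant ACCOUNT_USAGE read access to a dedicated extraction role.
USE ROLE ACCOUNTADMIN;
CREATE ROLE IF NOT EXISTS LEAPLOGIC_EXTRACT_ROLE;
GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE LEAPLOGIC_EXTRACT_ROLE;
GRANT ROLE LEAPLOGIC_EXTRACT_ROLE TO USER extract_user;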

Bulk Export – DDL Scripts

To export the required DDLs, refer to the script below and change the following parameters as applicable.

  • Provide a comma-separated list of database names to extract the required DDLs. Keep it empty to extract DDLs from all the databases.
  • Provide a path where you want to store the exported DDLs.
  • Provide the Snowflake connection details to establish a connection to the environment.

This script exports database objects such as databases, external tables, file formats, schemas, sequences, stored procedures, tables, tags, tasks, UDFs (including external functions), and views.

BulkExportSnowflakeDDLDownload
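The packaged script above is the supported path. As a rough illustration of what the export does, the following minimal sketch pulls the complete DDL for a single database using Snowflake’s built-in GET_DDL function (the database name is a placeholder; TRUE requests fully qualified object names):

-- Minimal sketch: export the full DDL of one database.
-- 'MY_DB' is a placeholder database name.
SELECT GET_DDL('DATABASE', 'MY_DB', TRUE);

Save the query output to a .sql file; the bundled script automates this across databases using your connection settings.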

Bulk Export – Query Execution Logs

You can extract query execution logs in two ways: using the COPY INTO command or using the Snowflake UI (Snowsight or Classic UI). The COPY INTO approach is recommended, especially when the query log volume is substantial.

Bulk Export – Using COPY INTO Command

To use the COPY INTO command, first create a named stage to store the output of the COPY command. Refer to the following command to create a stage in your database.

CREATE OR REPLACE STAGE database_name.schema_name.querylog_stage;

where:

  • database_name – represents the name of the database where the stage object is to be created.
  • schema_name – represents the name of the schema where the stage object is to be created.
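For example, with a hypothetical ANALYTICS_DB database and PUBLIC schema, the command becomes:

CREATE OR REPLACE STAGE ANALYTICS_DB.PUBLIC.querylog_stage;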

To export the necessary query execution logs, use the following COPY INTO command as a reference. Change the start date filter au.START_TIME >= current_date - 30 as required; the value -30 retrieves records from the past 30 days.

COPY INTO @querylog_stage
FROM (
SELECT
      'IDWWM' || '~~' ||
      au.USER_NAME || '~~' ||
      s.CLIENT_APPLICATION_ID || '~~' ||
      'client_u' || '~~' ||
      START_TIME || '~~' ||
      (QUEUED_PROVISIONING_TIME + QUEUED_REPAIR_TIME + QUEUED_OVERLOAD_TIME)::varchar || '~~' ||
      (BYTES_SCANNED + BYTES_WRITTEN + BYTES_WRITTEN_TO_RESULT + BYTES_READ_FROM_RESULT)::varchar || '~~' ||
      COMPILATION_TIME::varchar || '~~' ||
      START_TIME || '~~' ||
      START_TIME || '~~' ||
      0::varchar || '~~' ||
      au.QUERY_ID || '~~' ||
      (max(QUEUED_PROVISIONING_TIME + QUEUED_REPAIR_TIME + QUEUED_OVERLOAD_TIME) OVER (ORDER BY NULL))::varchar || '~~' ||
      (max(BYTES_SCANNED + BYTES_WRITTEN + BYTES_WRITTEN_TO_RESULT + BYTES_READ_FROM_RESULT) OVER (ORDER BY NULL))::varchar || '~~' ||
      (sum(QUEUED_PROVISIONING_TIME) OVER (ORDER BY NULL)
       + sum(QUEUED_REPAIR_TIME) OVER (ORDER BY NULL)
       + sum(QUEUED_OVERLOAD_TIME) OVER (ORDER BY NULL))::varchar || '~~' ||
      (sum(BYTES_SCANNED) OVER (ORDER BY NULL)
       + sum(BYTES_WRITTEN) OVER (ORDER BY NULL)
       + sum(BYTES_WRITTEN_TO_RESULT) OVER (ORDER BY NULL)
       + sum(BYTES_READ_FROM_RESULT) OVER (ORDER BY NULL))::varchar || '~~' ||
      (TOTAL_ELAPSED_TIME / 1000)::varchar || '~~' ||
      COALESCE(DATABASE_NAME, 'Not Present') || '~~' ||
      0::varchar || '~~' ||
      QUERY_TYPE || '~~' ||
      l.REPORTED_CLIENT_TYPE || '~~' ||
      ' ' || '~~' ||
      QUERY_TEXT || '~~' ||
      au.execution_time / (1000 * 60 * 60) * wh.credits_per_hour || '~~' ||
      au.WAREHOUSE_SIZE || '~~' ||
      au.WAREHOUSE_TYPE || '~~' ||
      nvl(au.warehouse_name, 'NA') || '~~' ||
      nvl(au.warehouse_id, 0)
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY AS au
JOIN snowflake.account_usage.sessions s
      ON au.session_id = s.session_id
JOIN snowflake.account_usage.login_history l
      ON l.event_id = s.login_event_id
JOIN (
      SELECT 'X-Small' AS warehouse_size, 1 AS credits_per_hour UNION ALL
      SELECT 'Small' AS warehouse_size, 2 AS credits_per_hour UNION ALL
      SELECT 'Medium' AS warehouse_size, 4 AS credits_per_hour UNION ALL
      SELECT 'Large' AS warehouse_size, 8 AS credits_per_hour UNION ALL
      SELECT 'X-Large' AS warehouse_size, 16 AS credits_per_hour UNION ALL
      SELECT '2X-Large' AS warehouse_size, 32 AS credits_per_hour UNION ALL
      SELECT '3X-Large' AS warehouse_size, 64 AS credits_per_hour UNION ALL
      SELECT '4X-Large' AS warehouse_size, 128 AS credits_per_hour
) AS wh
      ON au.warehouse_size = wh.warehouse_size
WHERE au.START_TIME >= current_date - 30
)
FILE_FORMAT = (TYPE = CSV)
OVERWRITE = TRUE
MAX_FILE_SIZE = 4900000000;

Keep in mind that the COPY INTO command operates in parallel, which means it will generate multiple files in the stage.
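As a quick check, you can list the files the COPY command produced before downloading them (assuming the stage created earlier):

LIST @querylog_stage;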

Download Query Log Files from the Named Stage

There are two options for downloading query execution log files from the named stage: the Snowflake UI (Snowsight) and the Snowflake SnowSQL CLI (command-line interface).

Download Query Log Files Using Snowsight

Follow the steps below to download query log files from the named stage using the Snowflake Snowsight UI:

    1. Go to the ‘Data’ section in the Snowsight left panel.
    2. Choose ‘Databases’ under the Data section.
    3. Search for the database and schema that were used to create the stage object.
    4. Select ‘Stages’ under the chosen schema.
    5. Click the stage name, e.g., QUERYLOG_STAGE.
    6. (Optional) If prompted, choose ‘Enable Directory Table.’
    7. (Optional) Select the warehouse name from the dropdown next to the search option if not already selected.
    8. Click the three dots (…) on the file row and choose the download option.

Download Query Log Files Using the SnowSQL CLI

To download files from the named stage using snowsql, first install the SnowSQL CLI on your system. Then follow these steps:
    1. Log into Snowflake using snowsql:

    snowsql -a account.us-east-1 -u username -d database -s schema -w warehouse -r role;

      Here is the key information for the parameters:

      • -a: Snowflake account name
      • -u: Snowflake username
      • -d: Snowflake database used to create the stage object
      • -s: Snowflake Schema used to create the stage object
      • -w: Warehouse name to use
      • -r: Role used to create the stage object.
    2. Verify the content of the stage object:

    list @QUERYLOG_STAGE;

    3. Use the GET command to download the query log files.

      Windows:

    GET @QUERYLOG_STAGE file://C:\\path\\tmp\\data;

      This command downloads all files in the QUERYLOG_STAGE stage to the local Windows directory C:\path\tmp\data.

      Linux:

    GET @QUERYLOG_STAGE file:///path/tmp/data/;

    This command downloads all files in the QUERYLOG_STAGE stage to the local Linux directory /path/tmp/data/.
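
    GET also accepts an optional PATTERN parameter (a regular expression), which is useful when the stage holds other files; the pattern below is only an example matching gzipped CSV parts:

    GET @QUERYLOG_STAGE file:///path/tmp/data/ PATTERN='.*[.]csv[.]gz';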

Bulk Export – Using Snowflake UI (Snowsight or Classic UI)

To export the required query execution logs, refer to the script below. Change the date range in au.START_TIME BETWEEN ‘2023-01-01’ AND ‘2023-12-29’ as required; the recommended time frame is six months. Execute the script on the Snowflake UI console and download the output after execution.

SELECT
      'IDWWM' || '~~' ||
      au.USER_NAME || '~~' ||
      s.CLIENT_APPLICATION_ID || '~~' ||
      'client_u' || '~~' ||
      START_TIME || '~~' ||
      (QUEUED_PROVISIONING_TIME + QUEUED_REPAIR_TIME + QUEUED_OVERLOAD_TIME)::varchar || '~~' ||
      (BYTES_SCANNED + BYTES_WRITTEN + BYTES_WRITTEN_TO_RESULT + BYTES_READ_FROM_RESULT)::varchar || '~~' ||
      COMPILATION_TIME::varchar || '~~' ||
      START_TIME || '~~' ||
      START_TIME || '~~' ||
      0::varchar || '~~' ||
      au.QUERY_ID || '~~' ||
      (max(QUEUED_PROVISIONING_TIME + QUEUED_REPAIR_TIME + QUEUED_OVERLOAD_TIME) OVER (ORDER BY NULL))::varchar || '~~' ||
      (max(BYTES_SCANNED + BYTES_WRITTEN + BYTES_WRITTEN_TO_RESULT + BYTES_READ_FROM_RESULT) OVER (ORDER BY NULL))::varchar || '~~' ||
      (sum(QUEUED_PROVISIONING_TIME) OVER (ORDER BY NULL)
       + sum(QUEUED_REPAIR_TIME) OVER (ORDER BY NULL)
       + sum(QUEUED_OVERLOAD_TIME) OVER (ORDER BY NULL))::varchar || '~~' ||
      (sum(BYTES_SCANNED) OVER (ORDER BY NULL)
       + sum(BYTES_WRITTEN) OVER (ORDER BY NULL)
       + sum(BYTES_WRITTEN_TO_RESULT) OVER (ORDER BY NULL)
       + sum(BYTES_READ_FROM_RESULT) OVER (ORDER BY NULL))::varchar || '~~' ||
      (TOTAL_ELAPSED_TIME / 1000)::varchar || '~~' ||
      COALESCE(DATABASE_NAME, 'Not Present') || '~~' ||
      0::varchar || '~~' ||
      QUERY_TYPE || '~~' ||
      l.REPORTED_CLIENT_TYPE || '~~' ||
      ' ' || '~~' ||
      QUERY_TEXT || '~~' ||
      au.execution_time / (1000 * 60 * 60) * wh.credits_per_hour || '~~' ||
      au.WAREHOUSE_SIZE || '~~' ||
      au.WAREHOUSE_TYPE || '~~' ||
      nvl(au.warehouse_name, 'NA') || '~~' ||
      nvl(au.warehouse_id, 0)
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY AS au
JOIN snowflake.account_usage.sessions s
      ON au.session_id = s.session_id
JOIN snowflake.account_usage.login_history l
      ON l.event_id = s.login_event_id
JOIN (
      SELECT 'X-Small' AS warehouse_size, 1 AS credits_per_hour UNION ALL
      SELECT 'Small' AS warehouse_size, 2 AS credits_per_hour UNION ALL
      SELECT 'Medium' AS warehouse_size, 4 AS credits_per_hour UNION ALL
      SELECT 'Large' AS warehouse_size, 8 AS credits_per_hour UNION ALL
      SELECT 'X-Large' AS warehouse_size, 16 AS credits_per_hour UNION ALL
      SELECT '2X-Large' AS warehouse_size, 32 AS credits_per_hour UNION ALL
      SELECT '3X-Large' AS warehouse_size, 64 AS credits_per_hour UNION ALL
      SELECT '4X-Large' AS warehouse_size, 128 AS credits_per_hour
) AS wh
      ON au.warehouse_size = wh.warehouse_size
WHERE au.START_TIME BETWEEN '2023-01-01' AND '2023-12-29';

Bulk Export – Warehouse Artifacts

Additional warehouse artifacts are needed for a better assessment. To export the results of the following queries as delimited files, refer to the scripts below.

-- Warehouse credits used to date
SELECT WAREHOUSE_NAME, sum(CREDITS_USED) AS TOTAL_CREDIT_USED
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
GROUP BY 1
ORDER BY 2 DESC;

-- Warehouse load
SELECT WAREHOUSE_NAME, sum(AVG_RUNNING + AVG_QUEUED_LOAD) AS Avg_warehouse_load
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_LOAD_HISTORY
GROUP BY 1
ORDER BY 2 DESC;

-- Warehouse usage month over month
SELECT WAREHOUSE_NAME, to_char(START_TIME, 'MON-YYYY') AS "MONTH", sum(CREDITS_USED) AS TOTAL_CREDIT_USED
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
GROUP BY 1, 2
ORDER BY 1, 3 DESC;

-- Database objects
SELECT
      TABLE_CATALOG AS databasename,
      CASE
            WHEN IS_TRANSIENT = 'YES' THEN 'TRANSIENT'
            ELSE TABLE_TYPE
      END AS tablekind,
      COUNT(TABLE_NAME)
FROM SNOWFLAKE.ACCOUNT_USAGE."TABLES"
GROUP BY 1, 2
ORDER BY 1;

-- Database volume
SELECT TABLE_CATALOG AS databasename, cast(sum(ACTIVE_BYTES)/1024/1024/1024 AS decimal(18,2))
FROM SNOWFLAKE.ACCOUNT_USAGE.TABLE_STORAGE_METRICS
GROUP BY 1
HAVING cast(sum(ACTIVE_BYTES)/1024/1024/1024 AS decimal(18,2)) > 10;

-- High data volume tables, last DML/COPY INTO per table
SELECT t1.TABLE_CATALOG AS databasename,
       t1.TABLE_NAME AS tablename,
       t1.row_count AS num_rows,
       cast(ACTIVE_BYTES/1024/1024/1024 AS decimal(18,2)) AS currentsize_gb,
       NULL AS ConstraintType,
       NULL AS ConstraintText,
       TO_CHAR(TO_TIMESTAMP(GET(GREATEST(ARRAY_CONSTRUCT(t1.LAST_ALTERED),
             ARRAY_CONSTRUCT(l1.LAST_LOAD_TIME)), 0)), 'YYYY-MON-DD') AS LAST_DML_COPY
FROM snowflake.account_usage."TABLES" AS t1
JOIN SNOWFLAKE.ACCOUNT_USAGE.TABLE_STORAGE_METRICS AS t2
      ON (t1.TABLE_ID = t2.ID)
LEFT JOIN SNOWFLAKE.ACCOUNT_USAGE.LOAD_HISTORY l1
      ON (t1.TABLE_ID = l1.TABLE_ID)
WHERE table_type = 'BASE TABLE'
      AND t1.TABLE_CATALOG IN ('NEWTPCDS', 'TEST_DB') -- Provide database names
ORDER BY 3 DESC;

-- Databases and users
SELECT DATABASE_NAME, count(DISTINCT USER_NAME)
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
GROUP BY DATABASE_NAME;

-- Warehouse details
SELECT
      WAREHOUSE_NAME, to_char("TIMESTAMP", 'MON-YYYY') AS "MONTH",
      SUM(CASE WHEN EVENT_NAME = 'SPINUP_CLUSTER' THEN 1 ELSE 0 END) AS NO_TIMES_SPINUP,
      SUM(CASE WHEN EVENT_NAME = 'RESIZE_CLUSTER' THEN 1 ELSE 0 END) AS NO_TIMES_RESIZED,
      SUM(CASE WHEN EVENT_NAME = 'RESUME_CLUSTER' THEN 1 ELSE 0 END) AS NO_TIMES_RESUMED
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_EVENTS_HISTORY
WHERE EVENT_STATE LIKE '%COMPLETED'
GROUP BY 1, 2
ORDER BY NO_TIMES_RESUMED DESC;
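To capture these results as delimited files, one option is the same stage-unload pattern used for the query logs above. The sketch below unloads the first query to the stage created earlier (the warehouse_credits path prefix is an example):

-- Example: unload the warehouse-credits query to the named stage as CSV.
COPY INTO @querylog_stage/warehouse_credits
FROM (
      SELECT WAREHOUSE_NAME, sum(CREDITS_USED) AS TOTAL_CREDIT_USED
      FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
      GROUP BY 1
)
FILE_FORMAT = (TYPE = CSV)
OVERWRITE = TRUE;

Then download the output with GET, as described earlier.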

Other Scripts/Artifacts

Copy any other scripts, such as DML scripts, Snowpark scripts, and Snowflake Scripting files, from your Git instance or the Snowflake repository and provide these artifacts to the LeapLogic team.


Getting Help

Contact LeapLogic technical support at info@leaplogic.io

