Category | Types | Input |
Big Data |
Databricks Lakehouse |
- Host address: Provide the host address to connect to the server instance.
- Port number: Provide the port number, for example, 443.
- Cluster name: Specify the cluster name.
- JDBC URL: Provide the connection URL to identify the database and to connect to it, for example, jdbc:spark://<address>;<transportMode>;<httpPath>;<AuthenticationMechanism>;<UID>;<Password>.
|
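The JDBC URL template above can be filled in as follows. A minimal sketch, assuming the Simba Spark driver conventions for Databricks (AuthMech=3 with the literal username `token` for a personal access token); all host, path, and token values are hypothetical placeholders.

```python
# Sketch: assembling a Databricks Lakehouse JDBC URL from the fields above.
# All values below are hypothetical placeholders, not real credentials.
host = "adb-1234567890123456.7.azuredatabricks.net"
port = 443
http_path = "sql/protocolv1/o/1234567890123456/0123-456789-abcde123"  # cluster-specific (hypothetical)
token = "<personal-access-token>"

# AuthMech=3 selects username/password authentication; with a personal
# access token, the UID is the literal string "token".
jdbc_url = (
    f"jdbc:spark://{host}:{port}/default;"
    f"transportMode=http;ssl=1;"
    f"httpPath={http_path};"
    f"AuthMech=3;UID=token;PWD={token}"
)
print(jdbc_url)
```
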
Databricks |
Google Cloud BigQuery |
- Project ID: Specify the project ID.
- Authentication Email Address: Provide a valid email address that is used for authentication.
- Authentication Key File: Upload the .json file that contains the authentication credentials.
|
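The key file referred to above is a Google Cloud service-account JSON key. A minimal sketch of its structure, with all identifiers hypothetical and the private key redacted:

```python
# Sketch: structure of a service-account key file (all values hypothetical).
import json

key_file_contents = """{
  "type": "service_account",
  "project_id": "my-sample-project",
  "private_key_id": "abcdef0123456789",
  "private_key": "-----BEGIN PRIVATE KEY-----\\n...\\n-----END PRIVATE KEY-----\\n",
  "client_email": "etl-runner@my-sample-project.iam.gserviceaccount.com"
}"""

key = json.loads(key_file_contents)
# client_email corresponds to the authentication email address, and
# project_id should match the Project ID entered above.
print(key["client_email"], key["project_id"])
```
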
Hive |
- Variant: Provide the Hive variant, such as EMR Hive or Azure HDInsight.
- Hive version: Provide the Hive version, such as 1.1.x, 1.2.x, 2.1.x, or 3.x.
- Metastore URL: Provide the URL of the remote server from which metadata is obtained. For example, thrift://impetus-dsrv13.impetus.co.in:9083.
- JDBC URL: Provide the JDBC URL to identify the database and to connect to it.
- Edge Node Host: Provide the edge node host address for communication with other nodes in the cluster.
- Edge Node Port: Provide the edge node port number.
- Edge Node Username: Provide the edge node username.
- Edge Node Password: Provide the edge node password.
- Authentication type: Choose the authentication type:
  - Kerberos: a network authentication protocol.
  - Non-Kerberos: any authentication type other than Kerberos.
|
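The Hive JDBC URL differs slightly between the two authentication types above. A minimal sketch using the standard HiveServer2 URL scheme; host, port, database, and realm are hypothetical placeholders.

```python
# Sketch: typical Hive JDBC URLs for the two authentication types.
# Host, port, database, and Kerberos realm are hypothetical.
host, port, database = "hive-server.example.com", 10000, "default"

# Non-Kerberos: a plain HiveServer2 URL.
non_kerberos_url = f"jdbc:hive2://{host}:{port}/{database}"

# Kerberos: the server principal is appended; _HOST is expanded by the
# driver to the actual HiveServer2 host name.
kerberos_url = f"jdbc:hive2://{host}:{port}/{database};principal=hive/_HOST@EXAMPLE.COM"
print(non_kerberos_url)
print(kerberos_url)
```
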
Spark |
- Distro version: Specify the distro version, such as HDP-2.6, HDP-3.1, CDH-6, or CDH-7.
- Metastore URL: Provide the URL of the remote server from which metadata is obtained.
- Hive JDBC URL: Provide the Hive JDBC connection URL.
- Spark JDBC URL: Provide the Spark JDBC connection URL.
- Edge Node Host: Provide the edge node host address for communication with other nodes in the cluster.
- Edge Node Port: Provide the edge node port number.
- Edge Node Username: Provide the edge node username.
- Edge Node Password: Provide the edge node password.
- Authentication parameters: Provide additional authentication parameters as keys and their values.
- Authentication type: Choose the authentication type:
  - Kerberos: a network authentication protocol.
  - Non-Kerberos: any authentication type other than Kerberos.
|
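The Spark JDBC URL typically points at the Spark Thrift Server, which speaks the HiveServer2 protocol, and the authentication parameters are extra key-value pairs appended to it. A minimal sketch; the host name is hypothetical, and port 10016 is assumed from the common HDP default for the Spark Thrift Server (check your distro).

```python
# Sketch: a Spark Thrift Server JDBC URL with extra authentication
# parameters appended as key=value pairs (all values hypothetical).
edge_node_host = "edge-node.example.com"
spark_jdbc_url = f"jdbc:hive2://{edge_node_host}:10016/default"

# "Authentication parameters" from the form become URL suffixes.
auth_params = {"ssl": "true", "sslTrustStore": "/etc/security/truststore.jks"}
spark_jdbc_url += "".join(f";{k}={v}" for k, v in auth_params.items())
print(spark_jdbc_url)
```
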
DDL |
Greenplum |
- DDL files: Upload the DDL files.
- Schema name: Provide the required schema name; otherwise, the default schema name is used and all the tables are created in the ‘Default schema’.
|
Netezza |
Oracle |
SQL Server |
Teradata |
Vertica |
ETL |
AWS Glue |
Nil |
File System |
Amazon S3 |
Nil |
Azure Data Lake Storage |
Choose the required version. |
DBFS |
- API Version: Provide the API version, such as 2.0.
- Instance Name: Provide the instance name.
|
File Transfer Protocol |
Provide the Host address and Port number. |
Secured File Transfer Protocol |
Unix File System |
Google Cloud Storage |
Upload the authentication key file (JSON file format). |
HDFS |
- URI: Provide the URI (Uniform Resource Identifier) used to access HDFS.
- Authentication Type: Choose the authentication type:
  - Kerberos: a network authentication protocol.
  - Non-Kerberos: any authentication type other than Kerberos.
|
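An HDFS URI names the NameNode (or HA nameservice) plus a path. A minimal sketch; the host and path are hypothetical, and 8020 is the common NameNode RPC port.

```python
# Sketch: anatomy of an HDFS URI (hypothetical host and path).
from urllib.parse import urlparse

uri = "hdfs://namenode.example.com:8020/data/landing"
parts = urlparse(uri)
# scheme identifies the file system, netloc the NameNode, path the directory.
print(parts.scheme, parts.hostname, parts.port, parts.path)
```
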
MPP |
Teradata |
Provide the Host address and Port number. |
Netezza |
RDBMS |
Azure Synapse |
Provide the JDBC connection URL to identify the database and to connect to it. |
SQL Server |
Oracle |
Provide the Host address and Port number. |
Redshift |
Vertica |
Other |
Greenplum |
Snowflake |
Provide the Host address. |
Cloud SQL for Postgres |
- Public IP: Provide the public IP address.
- Instance Name: Provide the instance name.
- Authentication Key File: Provide the authentication key file (JSON file format).
|
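With the public IP from the fields above, the instance is reachable through a standard libpq-style connection string. A minimal sketch; the IP (from the documentation range), instance name, database, and user are all hypothetical, and the key file plays the same service-account role as for BigQuery.

```python
# Sketch: a libpq-style connection string for a Cloud SQL Postgres
# instance reached via its public IP (all values hypothetical).
public_ip = "203.0.113.10"
instance_name = "orders-db"
conninfo = f"host={public_ip} port=5432 dbname=orders user=etl_user sslmode=require"
print(instance_name, "->", conninfo)
```
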
Business Intelligence |
Power BI |
Provide the workspace name. |
Amazon QuickSight |
Nil |