This section describes how to create a connection to your Snowflake data warehouse.
Snowflake is an S3-based data warehouse service hosted in the cloud. Auto-scaling, automatic failover, and other features simplify the deployment and management of your enterprise's data warehouse. For more information, see https://www.snowflake.com.
Tip: This connection type can be used to connect to Snowflake instances hosted in Azure.
- Read: Supported
- Write: Supported
Prerequisites
If you are connecting the platform to any relational source of data, such as Redshift or Oracle, you must add the platform service to your whitelist for those resources. See Whitelist Platform Service.
- PUBLIC schema: If you do not create an external staging database:
  - A `PUBLIC` schema is required in your default database.
  - If you do not provide a stage database, a temporary stage is created for you under the `PUBLIC` schema in the default database.
- S3 bucket: The user-created stage must point to the same S3 bucket as the default bucket in use by the platform.
- IAM role requirements: If you are accessing AWS and using IAM roles, verify that the appropriate permissions have been assigned to the role to access Snowflake and its backing S3 buckets. For more information, see Required AWS Account Permissions.
- Staging database: Snowflake supports the use of a stage for reading and writing data to S3 during job executions.
NOTE: If a stage is not deployed, the user must have write permissions to the default database, which is used instead for staging your data in Snowflake. These permissions must be included in the AWS credentials applied to the user account.
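The staging prerequisites above can be sketched in Snowflake SQL. This is a minimal example with hypothetical schema, bucket, and placeholder credential values; per the prerequisites, the stage must point to the same S3 bucket as the platform's default bucket:

```sql
-- Hypothetical names and placeholder credentials; adjust to your deployment.
-- The URL must point to the same S3 bucket as the platform's default bucket.
CREATE STAGE MY_SCHEMA.MY_STAGE
  URL = 's3://my-default-bucket/staging/'
  CREDENTIALS = (AWS_KEY_ID = '<your-key-id>' AWS_SECRET_KEY = '<your-secret-key>');
```

The resulting `MY_SCHEMA.MY_STAGE` value is what you would later enter in the connection's Stage property.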
Prerequisites for OAuth 2.0
If you are connecting to your Snowflake deployment using OAuth 2.0 authentication, additional configuration is required:
- OAuth 2.0 must be enabled and configured for use in the product. For more information, see Enable OAuth 2.0 Authentication.
- OAuth 2.0 requirements:
- Create a security integration in your Snowflake deployment.
- Create an OAuth 2.0 client in the platform that connects using the security integration.
- For more information, see OAuth 2.0 for Snowflake.
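As a sketch of the Snowflake-side requirement above, a custom OAuth security integration can be created with a statement like the following (the integration name and redirect URI are hypothetical):

```sql
-- Hypothetical integration name and redirect URI; adjust to your deployment.
CREATE SECURITY INTEGRATION PLATFORM_OAUTH
  TYPE = OAUTH
  ENABLED = TRUE
  OAUTH_CLIENT = CUSTOM
  OAUTH_CLIENT_TYPE = 'CONFIDENTIAL'
  OAUTH_REDIRECT_URI = 'https://app.example.com/oauth2/callback';
```

Snowflake generates an OAuth client ID and secret for the integration, which are then used when registering the OAuth 2.0 client in the product.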
Limitations
- You cannot perform ad-hoc publication to Snowflake.
- SSO connections are not supported.
- The cluster with which you are integrating must be hosted in a public subnet.
- To ingest data from a table, one of the following must be true:
  - A named stage has been created for the table. For more information, see the Snowflake documentation.
  - The platform is permitted to create a temporary stage, which requires:
    - Write permissions on the table's database, and
    - A schema named PUBLIC that exists and is accessible.
- No schema validation is performed as part of writing results to Snowflake.
- Credentials and permissions are not validated when you are modifying the destination for a publishing job.
For Snowflake, no validation is performed to determine whether the target is a view, which is not a supported target.
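The temporary-stage permissions described in the limitations above can be granted with statements along these lines (role and database names are hypothetical, and the exact privileges required may vary by deployment):

```sql
-- Hypothetical role and database names; adjust to your deployment.
GRANT USAGE ON DATABASE MY_DB TO ROLE PLATFORM_ROLE;
GRANT USAGE, CREATE STAGE ON SCHEMA MY_DB.PUBLIC TO ROLE PLATFORM_ROLE;
```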
Create Connection
You can create Snowflake connections through the following methods.
Create through application
Any user can create a Snowflake connection through the application.
Steps:
- Log in to the application.
- In the left nav bar, click the Connections icon.
- In the Create Connection page, click the Snowflake connection card.
- Specify the properties for your Snowflake database connection. The following parameters are specific to Snowflake connections.

NOTE: In Snowflake connections, property values are case-sensitive. Snowflake-related locations are typically specified in capital letters.
Property | Description
---|---
Account Name | Snowflake account to use. Suppose your hostname is `mycompany.snowflakecomputing.com`. In that case, your account name is `mycompany`. NOTE: Your full account name might include additional segments that identify the region and cloud platform where your account is hosted.
Warehouse | The name of the warehouse to use when connected. This value can be an empty string. If specified, the warehouse should be an existing warehouse for which the default role has privileges.
Stage | If you have deployed a Snowflake stage for managing file conversion to Snowflake tables, you can enter its name here. A stage is a database object that points to an external location on S3. It must be an external stage containing access credentials. If a stage is used, this value is typically the schema and the name of the stage. Example value: `MY_SCHEMA.MY_STAGE`. If a stage is not specified, a temporary stage is created using the current user's AWS credentials. NOTE: Without a defined stage, you must have write permissions to the database from which you import, because this database is used to create the temporary stage. For more information on stages, see https://docs.snowflake.net/manuals/sql-reference/sql/create-stage.html.
Credential Type | Select the type of credentials to provide with the connection.
Database for Stage | (optional) If you are using a stage, you can specify a database other than the default one to host the stage. NOTE: If you are creating a read-only connection to Snowflake, this field is required, and the accessing user must have write permission to the specified database. If no value is specified, your stage must be in the default database.
For more information, see Create Connection Window.
Connection URL
The properties that you provide are inserted into the following URL, which connects the platform to the Snowflake database:
```
jdbc:snowflake://<account_name>.snowflakecomputing.com/?db=<database>&warehouse=<warehouse><connect-string-options>
```
where:
- `<database>`: the name of the default database to which to connect. This value can be empty.
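For example, with the hypothetical values `mycompany` for the account name, `MY_DB` for the database, and `MY_WH` for the warehouse, the resulting URL would be:

```
jdbc:snowflake://mycompany.snowflakecomputing.com/?db=MY_DB&warehouse=MY_WH
```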
Connect string options
The connect string options are optional. If you are passing additional properties and values to complete the connection, the connect string options must be structured in the following manner:
```
&<prop1>=<val1>&<prop2>=<val2>...
```
where:
- `<prop>`: the name of the property
- `<val>`: the value for the property

Delimiters:
- `&`: any set of connect string options must begin with an ampersand (`&`).
- `=`: property names and values must be separated with an equal sign (`=`).
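For example, to pass the Snowflake JDBC parameters `role` and `CLIENT_SESSION_KEEP_ALIVE` (the values shown are illustrative):

```
&role=MY_ROLE&CLIENT_SESSION_KEEP_ALIVE=true
```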
Disable SSL connections
By default, connections to Snowflake use SSL. To disable SSL, add the following string to your connect string options:
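The Snowflake JDBC driver exposes an `ssl` connection parameter for this purpose, so the option should look like the following:

```
&ssl=off
```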
Connect through proxy
If you require connection to Snowflake through a proxy server, additional connect string options are required. For more information, see
https://docs.snowflake.net/manuals/user-guide/jdbc-configure.html#specifying-a-proxy-server-in-the-jdbc-connection-string.
This connection uses the Snowflake JDBC driver.
Create via API
For more information, see the API reference documentation for the `createConnection` operation.

API:
- Type: `snowflake`
- vendor: `snowflake`
Troubleshooting
Error Message | Description
---|---
Null values in some columns for all rows | When there are spaces or special characters in column names, null values can be inserted for all rows in the column. The workaround is to remove any special characters and spaces from column names.
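As one way to apply that workaround on the Snowflake side, an offending column can be renamed before ingestion (the database, table, and column names here are hypothetical):

```sql
-- Hypothetical names; quotes preserve the original spaced column name.
ALTER TABLE MY_DB.PUBLIC.MY_TABLE RENAME COLUMN "order id" TO ORDER_ID;
```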
Using Snowflake Connections

Uses of Snowflake
The platform can use Snowflake for the following tasks:
- Create datasets by reading from Snowflake tables.
- Write to Snowflake tables with your job results.
Before you begin using Snowflake
Secure access
SSL is the default connection method.
Storing data in Snowflake
Your Snowflake administrator should provide database access for storing datasets. Users should know where shared data is located and where personal data can be saved without interfering with or confusing other users.
NOTE: The platform does not modify source data in Snowflake. Datasets sourced from Snowflake are read without modification from their source locations.
Reading from Snowflake
You can create a dataset from a table stored in Snowflake.
Using Snowflake as a running environment
If your source data and output targets are located in Snowflake, you may be able to execute your transformations inside Snowflake.
Writing to Snowflake
You can write data back to Snowflake using one of the following methods:
Data Validation issues:
- No validation of the connection or of any required permissions is performed during job execution. As a result, you may be permitted to launch your job even if you lack sufficient connectivity or permissions to access the data; the corresponding publish job then fails at runtime.
- Prior to publication, no validation is performed on whether a target is a table or a view, so a job that writes to a view fails at runtime.