After you have installed the software and databases in your Microsoft Azure infrastructure, complete these steps to perform the basic integration between the platform and Azure resources, such as the backend storage layer and the running environment cluster.
NOTE: This section includes only basic configuration for required platform functions and integrations with Azure. Please use the links in this section to access additional details on these key features.
Tip: When you save changes from within the application, your configuration is automatically validated, and the platform is automatically restarted.
These steps require admin access to your Azure deployment.
For additional details, see Configure for Azure.
For additional details, see Configure Azure Key Vault.
In the Azure console, you must create or modify the backend datastore for use with the platform. Supported datastores:
NOTE: You should review the limitations for your selected datastore before configuring the platform to use it. After the base storage layer has been defined in the platform, it cannot be modified.
| Datastore | Notes | Documentation |
|---|---|---|
| ADLS Gen2 | Supported for use with Azure Databricks cluster only. | See Enable ADLS Gen2 Access. |
| ADLS | | See Enable ADLS Access. |
| WASB | Only the WASBS protocol is supported. | See Enable WASB Access. |
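Each of these datastores is addressed through a distinct storage URI scheme. As a rough illustration (the account, filesystem, and container names below are placeholders, not values from your deployment), the base storage URIs take these forms:

```python
# Sketch: base storage URI formats for the supported Azure datastores.
# Account, filesystem, and container names are placeholders.

def adls_gen2_uri(filesystem: str, account: str, path: str) -> str:
    # ADLS Gen2 uses the abfss:// scheme against the .dfs endpoint.
    return f"abfss://{filesystem}@{account}.dfs.core.windows.net/{path.lstrip('/')}"

def adls_uri(account: str, path: str) -> str:
    # ADLS (Gen1) uses the adl:// scheme.
    return f"adl://{account}.azuredatalakestore.net/{path.lstrip('/')}"

def wasbs_uri(container: str, account: str, path: str) -> str:
    # WASB access must use the encrypted wasbs:// scheme, not wasb://.
    return f"wasbs://{container}@{account}.blob.core.windows.net/{path.lstrip('/')}"

print(adls_gen2_uri("data", "mystorageacct", "/raw/input.csv"))
# -> abfss://data@mystorageacct.dfs.core.windows.net/raw/input.csv
```

The URI you configure must match the datastore you selected; mixing schemes (for example, pointing a WASB configuration at an abfss:// path) is a common source of integration errors.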
In the Azure console, you must create or modify the running environment where jobs are executed by the platform. Supported running environments:
NOTE: You should review the limitations for your selected running environment before configuring the platform to use it.
| Running Environment | Documentation |
|---|---|
| Azure Databricks | See Configure for Azure Databricks. |
| HDI | See Configure for HDInsight. |
Please complete the following steps to configure the platform and integrate it with Azure resources.
For additional details:
The platform supports integration with the following backend datastores on Azure.
For additional details, see Enable ADLS Gen2 Access.
For additional details, see Enable ADLS Access.
For additional details, see Enable WASB Access.
Checkpoint: At this point, you should be able to load data from your backend datastore, if data is available. You can try to run a small job on Photon, which is native to the platform. You cannot yet run jobs on an integrated cluster.
The platform can run jobs on the following running environments.
NOTE: You may integrate with only one of these environments.
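Because only one running environment may be integrated at a time, a sanity check on your configuration amounts to verifying that exactly one environment is enabled. A minimal sketch, assuming illustrative configuration keys (these are not actual platform setting names):

```python
# Sketch: enforce that exactly one running environment is enabled.
# The configuration keys below are illustrative, not actual platform settings.

SUPPORTED_ENVIRONMENTS = ("azure-databricks", "hdinsight")

def validate_running_environment(config: dict) -> str:
    """Return the single enabled environment, or raise if zero or multiple are enabled."""
    enabled = [env for env in SUPPORTED_ENVIRONMENTS if config.get(env, False)]
    if len(enabled) != 1:
        raise ValueError(
            f"Exactly one running environment must be enabled; found: {enabled or 'none'}"
        )
    return enabled[0]

print(validate_running_environment({"azure-databricks": True, "hdinsight": False}))
# -> azure-databricks
```

If both environments appear enabled after an upgrade or migration, disable one before restarting the platform.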
For additional details, see Configure for Azure Databricks.
For additional details, see Configure for HDInsight.
Checkpoint: At this point, you should be able to load data from your backend datastore and run jobs on an integrated cluster.
The platform supports the following methods of authentication when hosted in Azure.
The platform can be configured to integrate with your enterprise's Azure Active Directory provider. For more information, see Configure SSO for Azure AD.
If you are not applying your enterprise SSO authentication to the platform, platform users must be created and managed through the application.
Users can be permitted to self-register their accounts and manage their password reset requests:
NOTE: Self-created accounts are permitted to import data, generate samples, run jobs, and generate and download results. Admin roles must be assigned manually through the application.
If users are not permitted to create their own accounts, an administrator must create them:
For more information on creating user accounts via API, see API People Create v4 in the Developer Guide.
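As a hedged sketch of creating accounts programmatically: the endpoint path, payload fields, and authorization scheme below are assumptions for illustration only; consult API People Create v4 in the Developer Guide for the actual contract.

```python
# Sketch: build a user-creation request against the v4 People API.
# The endpoint path, field names, and Bearer-token auth are assumptions;
# see API People Create v4 in the Developer Guide for the real contract.
import json
from urllib import request

def build_create_person_request(base_url: str, token: str,
                                email: str, name: str, password: str):
    payload = {"email": email, "name": name, "password": password}
    return request.Request(
        url=f"{base_url}/v4/people",             # hypothetical endpoint path
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # hypothetical auth scheme
        },
        method="POST",
    )

req = build_create_person_request(
    "https://example.internal", "TOKEN",
    "analyst@example.com", "Data Analyst", "s3cret",
)
print(req.full_url, req.method)
```

The request is only constructed here, not sent; an admin token with sufficient privileges would be required to execute it against a live deployment.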
Checkpoint: Users who are authenticated or have been provisioned user accounts should be able to log in to the platform and begin using the product.
You can access complete product documentation online and in PDF format. From within the product, select Help menu > Documentation.