This section describes how you can interact with Databricks Tables through the platform.
- Databricks Tables enables interaction with flat-format files as schematized datasets.
- For more information, see https://docs.microsoft.com/en-us/azure/databricks/data/tables.
NOTE: Use of Azure Databricks Tables requires installation on Azure, integration with Azure Databricks, and an Azure Databricks connection. For more information, see Configure for Azure Databricks.
Uses of Databricks Tables
The platform supports the following uses of Databricks Tables:
- Create datasets by reading from Databricks Tables.
- Write data to Databricks Tables.
| Table Type | Support |
|---|---|
| Databricks managed tables | Read/Write |
| Databricks unmanaged tables | Read |
| Delta tables (managed and unmanaged) | Read |
The underlying format for Databricks Tables is Parquet.
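The distinction between managed and unmanaged tables in the matrix above can be illustrated with Spark SQL DDL. This is a sketch only: the table names and DBFS path are hypothetical, and the platform issues its own statements internally.

```python
# Illustrative Spark SQL DDL; table names and the DBFS path below are
# hypothetical, and the platform generates its own statements internally.

# Managed table: the metastore owns both the metadata and the underlying
# Parquet files, so DROP TABLE also deletes the data.
MANAGED_DDL = """\
CREATE TABLE sales.orders (
    order_id BIGINT,
    amount   DOUBLE
) USING PARQUET
"""

# Unmanaged (external) table: the metastore holds only metadata; the
# Parquet files live at an explicit LOCATION and survive a DROP TABLE.
UNMANAGED_DDL = """\
CREATE TABLE sales.orders_ext (
    order_id BIGINT,
    amount   DOUBLE
) USING PARQUET
LOCATION 'dbfs:/mnt/raw/orders'
"""
```

The presence or absence of the `LOCATION` clause is what separates the two types, which is why the support matrix treats them differently: the platform can safely manage the lifecycle of data it writes only for managed tables.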
Limitations
- Access to external Hive metastores is not supported.
- Ad-hoc publishing to Azure Databricks is not supported.
- Creation of datasets with custom SQL is not supported.
- Use of partitioned tables in Databricks Tables is not supported.
Before You Begin Using Databricks Tables
- Databricks Tables deployment: Your administrator must enable use of Databricks Tables. See Create Databricks Tables Connections.
- Databricks Personal Access Token: You must acquire a Databricks personal access token and save it in your account. For more information, see Databricks Personal Access Token Page.
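For context on what the personal access token is used for: Databricks REST APIs authenticate by passing the token as an HTTP Bearer credential. The sketch below builds (but does not send) such a request using only the standard library; the workspace URL and token value are placeholders, not real credentials.

```python
import urllib.request

# Placeholders: substitute your own workspace URL and personal access token.
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapiXXXXXXXXXXXX"

def databricks_request(host: str, token: str, endpoint: str) -> urllib.request.Request:
    """Build an authenticated request to a Databricks REST API endpoint.

    Personal access tokens are sent as a Bearer credential in the
    Authorization header. The request is constructed but not sent here.
    """
    return urllib.request.Request(
        url=f"{host}{endpoint}",
        headers={"Authorization": f"Bearer {token}"},
    )

# Example: an authenticated call to the (real) Clusters API endpoint.
req = databricks_request(DATABRICKS_HOST, TOKEN, "/api/2.0/clusters/list")
```

The platform manages this exchange for you once the token is saved to your account; the sketch only shows why the token must be valid for your workspace.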
Storing Data in Databricks Tables
Reading from Databricks Tables
You can create a dataset from a table in Databricks Tables.
- Read support is also available for Databricks Delta Lake.
- For more information, see Databricks Tables Browser.
For more information on how data types are imported from Databricks Tables, see Databricks Tables Data Type Conversions.
Writing to Databricks Tables
You can write data back to Databricks Tables as part of normal job execution:
- Job results are written to DBFS as a managed table in Parquet format.
- To enable the write, create a new publishing action that targets Databricks Tables. See Run Job Page.

For more information on how data is converted when writing to Databricks Tables, see Databricks Tables Data Type Conversions.
Ad-hoc Publishing to Databricks Tables
Not supported.