A table resides in a schema and contains rows of data. The default table type created in Azure Databricks is a Unity Catalog managed table.
The primary differentiator for table types in Azure Databricks is the owning catalog, as described in the following table:
| Table type | Managing catalog | Read/write support | Performance optimization | Storage cost optimization |
|---|---|---|---|---|
| Managed | Unity Catalog | Yes | Yes | Yes |
| External | None (files only) | Yes | Manual only | Manual only |
| Foreign | An external system or catalog service | Read only | No | No |
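If you're not sure which type an existing table is, you can inspect it from a notebook. The following is a minimal sketch, assuming a Databricks notebook where the `spark` session is predefined; `main.default.my_table` is a placeholder name, not a table that exists in your workspace.

```python
# Minimal sketch: check whether a table is managed, external, or foreign.
# Assumes a Databricks notebook where `spark` is predefined; replace the
# placeholder name with a table you can access.
details = spark.sql("DESCRIBE TABLE EXTENDED main.default.my_table")

# The extended description includes a "Type" row (for example, MANAGED or EXTERNAL).
details.filter("col_name = 'Type'").show(truncate=False)
```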
For example, a table named `prod.people_ops_employees` might contain data about five employees. The metadata is registered in Unity Catalog and the data is stored in cloud storage.
Storage formats: Delta Lake and Apache Iceberg
Table types in Azure Databricks define how data is owned and accessed. Separately, the storage format defines how the data is physically structured and tracked on disk.
Azure Databricks supports two primary open table formats: Delta Lake and Apache Iceberg. These formats add a transactional storage layer that tracks metadata and enables ACID compliance, time travel, and other features.
- Delta Lake is the default storage format for managed and external tables in Azure Databricks.
- Apache Iceberg is supported on managed and foreign tables in Azure Databricks. This format is useful when you're integrating with the Iceberg ecosystem.
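As a rough illustration of how the format is chosen at creation time, here is a minimal sketch with placeholder catalog and schema names. It assumes a Databricks notebook with Unity Catalog enabled and, for the second statement, a workspace that supports managed Apache Iceberg tables.

```python
# Minimal sketch: picking the storage format when creating a table.
# Placeholder names; assumes a Databricks notebook where `spark` is predefined.

# Delta Lake is the default format, so no USING clause is required.
spark.sql("CREATE TABLE IF NOT EXISTS main.default.events_delta (id INT, ts TIMESTAMP)")

# A managed Apache Iceberg table, assuming the workspace supports the ICEBERG format.
spark.sql("CREATE TABLE IF NOT EXISTS main.default.events_iceberg (id INT, ts TIMESTAMP) USING ICEBERG")
```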
Managed tables
Managed tables manage underlying data files alongside the metastore registration. Databricks recommends that you use managed tables whenever you create a new table. Unity Catalog managed tables are the default when you create tables in Azure Databricks. See Unity Catalog managed tables in Azure Databricks for Delta Lake and Apache Iceberg.
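For reference, a minimal sketch of creating a managed table follows; the names are placeholders, and it assumes a Databricks notebook with Unity Catalog enabled where `spark` is predefined.

```python
# Minimal sketch: a managed table. No LOCATION clause is specified, so Unity
# Catalog manages both the metadata and the underlying data files.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.default.orders (
        order_id BIGINT,
        amount   DECIMAL(10, 2)
    )
""")

# Dropping a managed table removes the registration, and Unity Catalog also
# handles deletion of the underlying data files.
```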
External tables
External tables, sometimes called unmanaged tables, reference data stored outside of Databricks in an external storage system, such as cloud object storage. They decouple the management of underlying data files from metastore registration. Unity Catalog supports external tables in several formats, including Delta Lake. Unity Catalog external tables can store data files using common formats readable by external systems. See Work with external tables.
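A minimal sketch of creating an external table follows; the storage path and object names are placeholders, and this assumes an external location for that path has already been configured in Unity Catalog.

```python
# Minimal sketch: an external table. The LOCATION clause points at cloud object
# storage that you manage outside of Unity Catalog; the path is a placeholder.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.default.raw_clicks (
        user_id    STRING,
        clicked_at TIMESTAMP
    )
    USING DELTA
    LOCATION 'abfss://data@examplestorage.dfs.core.windows.net/raw/clicks'
""")

# Dropping an external table removes the metastore entry but leaves the data
# files in the storage location untouched.
```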
Foreign tables
Foreign tables represent data stored in external systems connected to Azure Databricks through Lakehouse Federation. Foreign tables are read-only on Azure Databricks. See Work with foreign tables.
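As a minimal, hypothetical sketch of querying a foreign table once a Lakehouse Federation connection and foreign catalog have been set up (the `postgres_prod` catalog and table names below are placeholders):

```python
# Minimal sketch: reading from a foreign table exposed through Lakehouse
# Federation. Placeholder names; the connection and foreign catalog must
# already exist. Foreign tables are read-only, so writes would fail.
customers = spark.sql("SELECT * FROM postgres_prod.sales.customers LIMIT 10")
customers.show()
```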
Tables in Unity Catalog
In Unity Catalog, tables sit at the third level of the three-level namespace (`catalog.schema.table`).
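A minimal sketch of referencing a table through the three-level namespace follows; the names are placeholders, and it assumes a Databricks notebook where `spark` is predefined.

```python
# Minimal sketch: addressing a table with its fully qualified three-level name.
spark.sql("SELECT count(*) FROM main.default.orders").show()

# Alternatively, set the default catalog and schema, then use the short name.
spark.sql("USE CATALOG main")
spark.sql("USE SCHEMA default")
spark.sql("SELECT count(*) FROM orders").show()
```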
Basic table permissions
Most table operations require `USE CATALOG` and `USE SCHEMA` permissions on the catalog and schema containing a table.
The following table summarizes the additional permissions needed for common table operations in Unity Catalog:
| Operation | Permissions |
|---|---|
| Create a table | `CREATE TABLE` on the containing schema |
| Query a table | `SELECT` on the table |
| Update, delete, merge, or insert data into a table | `SELECT` and `MODIFY` on the table |
| Drop a table | `MANAGE` on the table |
| Replace a table | `MANAGE` on the table and `CREATE TABLE` on the containing schema |
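The following sketch shows how these privileges might be granted. The principal names and objects are placeholders, and it assumes you have permission to manage grants on them.

```python
# Minimal sketch: granting the privileges listed above. Principal, catalog,
# schema, and table names are placeholders.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.default TO `analysts`")

# Query a table.
spark.sql("GRANT SELECT ON TABLE main.default.orders TO `analysts`")

# Update, delete, merge, or insert data (together with SELECT).
spark.sql("GRANT MODIFY ON TABLE main.default.orders TO `data_engineers`")

# Create tables in the schema.
spark.sql("GRANT CREATE TABLE ON SCHEMA main.default TO `data_engineers`")
```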
For more on Unity Catalog permissions, see Manage privileges in Unity Catalog.