Enable Serverless SQL Warehouses and Serverless Notebooks with private connectivity to customer storage and Azure services.
Applies to: Non-PL and Full-Private deployment patterns
Serverless compute runs in a Databricks-managed Azure VNet, not in your VNet as classic clusters do.
| Aspect | Classic Clusters | Serverless Compute |
|---|---|---|
| Where it runs | Customer VNet | Databricks SaaS VNet |
| Storage Access | Direct (via Service Endpoints in VNet) | Via NCC configuration |
| Approval | N/A (runs in your VNet) | Varies by connectivity option |
| Use Cases | ETL, ML training, batch jobs | SQL Warehouses, ad-hoc queries |
After `terraform apply`, your workspace already has an NCC (Network Connectivity Configuration) attached.

Before Starting: note your NCC ID (`terraform output ncc_id`).

What You'll Configure:
Traffic Flow:

```
Serverless Compute → NCC → Service Endpoint → Storage Account
 (Databricks VNet)         (Azure backbone)    (Your subscription)
                                                      ↓
                                               Firewall Rules
```
In the workspace UI, go to Workspace → Settings → Network → Serverless Compute and confirm the attached NCC is named `<workspace-prefix>-ncc`.

Databricks serverless will access your storage from specific subnets, and you need to allow those subnets in your storage firewall.
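Before adding rules, it can help to confirm the account's current firewall posture. A minimal sketch; the `networkRuleSet.defaultAction` query path is an assumption about the CLI's JSON output:

```bash
# Inspect the storage firewall's default action ("Deny" means only
# explicitly allowed networks can reach the account).
az storage account show \
  --name <storage-account-name> \
  --resource-group <rg-name> \
  --query "networkRuleSet.defaultAction"
```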
After enabling serverless, Databricks will display the serverless subnet IDs in the UI:
Workspace → Settings → Network → Serverless Compute → View Details
Example Subnet IDs:

```
/subscriptions/.../resourceGroups/databricks-rg-<workspace>/providers/Microsoft.Network/virtualNetworks/workers-vnet/subnets/serverless-public
/subscriptions/.../resourceGroups/databricks-rg-<workspace>/providers/Microsoft.Network/virtualNetworks/workers-vnet/subnets/serverless-private
```
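For convenience, the commands below assume the two IDs are exported as shell variables. A sketch; the values are placeholders you replace with the real IDs copied from the UI page above:

```bash
# Paste the real subnet resource IDs from the serverless UI page.
SERVERLESS_PUBLIC_SUBNET_ID="<serverless-public subnet resource ID>"
SERVERLESS_PRIVATE_SUBNET_ID="<serverless-private subnet resource ID>"
```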
For UC Metastore Storage:
```bash
# Get the UC metastore storage account name
UC_METASTORE_STORAGE=$(terraform output -raw external_storage_account_name)

# Add both serverless subnets to the storage firewall
az storage account network-rule add \
  --account-name $UC_METASTORE_STORAGE \
  --subnet "$SERVERLESS_PUBLIC_SUBNET_ID" \
  --resource-group <rg-name>

az storage account network-rule add \
  --account-name $UC_METASTORE_STORAGE \
  --subnet "$SERVERLESS_PRIVATE_SUBNET_ID" \
  --resource-group <rg-name>
```
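To confirm both rules landed, a quick check; the JMESPath query is an assumption about the CLI's JSON output:

```bash
# List the VNet rules now attached to the storage account.
az storage account network-rule list \
  --account-name $UC_METASTORE_STORAGE \
  --resource-group <rg-name> \
  --query "virtualNetworkRules[].virtualNetworkResourceId"
```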
For External Customer Storage (if applicable): repeat the same `network-rule add` commands for each external storage account, as sketched below.
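A minimal sketch of that repetition; `EXTERNAL_ACCOUNTS` and its values are hypothetical placeholders for your account names:

```bash
# Hypothetical list of external customer storage accounts.
EXTERNAL_ACCOUNTS=("extstorage1" "extstorage2")

# Apply both serverless subnet rules to every external account.
for ACCT in "${EXTERNAL_ACCOUNTS[@]}"; do
  for SUBNET_ID in "$SERVERLESS_PUBLIC_SUBNET_ID" "$SERVERLESS_PRIVATE_SUBNET_ID"; do
    az storage account network-rule add \
      --account-name "$ACCT" \
      --subnet "$SUBNET_ID" \
      --resource-group <rg-name>
  done
done
```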
Next, create a serverless SQL warehouse (Workspace → SQL Warehouses → Create SQL Warehouse), for example named Serverless Test Warehouse, and run:

```sql
-- Test Unity Catalog access
SHOW CATALOGS;

-- Test external location access
SELECT * FROM <catalog>.<schema>.<table> LIMIT 10;
```
If you want to disable public access completely:
```bash
# Disable public network access (use with caution!)
az storage account update \
  --name $UC_METASTORE_STORAGE \
  --resource-group <rg-name> \
  --public-network-access Disabled
```
⚠️ Warning: This will break classic clusters unless you also add your VNet subnets to the firewall or use Private Endpoints.
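To verify the change took effect, a minimal check; the `publicNetworkAccess` query path is an assumption about the CLI's JSON output:

```bash
# Expected output after the update: "Disabled"
az storage account show \
  --name $UC_METASTORE_STORAGE \
  --resource-group <rg-name> \
  --query "publicNetworkAccess"
```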
Recommended Approach:
```
Serverless Compute → NCC → Private Endpoint → Storage Account
 (Databricks VNet)         (Private Link)      (Your subscription)
                                                      ↓
                                               Manual Approval
```
Key Difference: Databricks creates Private Endpoint connections from its managed VNet to your storage, and each connection requires your manual approval.
In the workspace UI, go to Workspace → Settings → Network → Serverless Compute and request Private Endpoint connections for your storage accounts.

What Happens: Databricks creates pending Private Endpoint connections named `databricks-*` on each target storage account.

To approve them, open Azure Portal → Storage Accounts → <uc-metastore-storage> → Networking → Private endpoint connections and approve each pending `databricks-*` connection.

Approval Timeline: ~2-5 minutes per storage account
```bash
# List pending private endpoint connections
az storage account private-endpoint-connection list \
  --account-name <storage-account-name> \
  --resource-group <rg-name> \
  --query "[?properties.privateLinkServiceConnectionState.status=='Pending']"

# Approve a connection
az storage account private-endpoint-connection approve \
  --account-name <storage-account-name> \
  --resource-group <rg-name> \
  --name <connection-name> \
  --description "Approved for Databricks serverless"
```
In Databricks UI:
Workspace → Settings → Network → Serverless Compute → View Private Link Status
Expected: All connections show “Connected” or “Approved”
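The same status can be confirmed from the CLI; a sketch reusing the list command shown earlier, where the JMESPath projection is an assumption about the output shape:

```bash
# Summarize each connection's approval state in a table.
az storage account private-endpoint-connection list \
  --account-name <storage-account-name> \
  --resource-group <rg-name> \
  --query "[].{name:name, status:properties.privateLinkServiceConnectionState.status}" \
  -o table
```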
Once Private Link is working:
```bash
# Disable public network access
az storage account update \
  --name <storage-account-name> \
  --resource-group <rg-name> \
  --public-network-access Disabled
```
⚠️ Important: with public network access disabled, anything not coming through a Private Endpoint, including classic clusters in your VNet, loses access to the account.
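If classic clusters still need this account, one option is a Private Endpoint from your own VNet. A minimal sketch; all names and the subnet are hypothetical placeholders:

```bash
# Create a Private Endpoint in your VNet for the storage account's dfs endpoint.
az network private-endpoint create \
  --name pe-ucstorage-dfs \
  --resource-group <rg-name> \
  --vnet-name <your-vnet-name> \
  --subnet <private-endpoint-subnet-name> \
  --private-connection-resource-id <storage-account-id> \
  --group-id dfs \
  --connection-name pe-ucstorage-dfs-conn
```

You would also need a matching private DNS zone (`privatelink.dfs.core.windows.net`) so the endpoint resolves privately from your VNet.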
Creating and testing a serverless SQL warehouse works the same as in Option A.
```sql
-- 1. Test catalog access
SHOW CATALOGS;

-- 2. Test schema access
SHOW SCHEMAS IN <catalog>;

-- 3. Test table access
SELECT * FROM <catalog>.<schema>.<table> LIMIT 10;

-- 4. Test write operations
CREATE TABLE <catalog>.<schema>.test_table AS
SELECT 1 AS id, 'test' AS name;
```
```python
# Test Unity Catalog access
catalogs = spark.sql("SHOW CATALOGS").collect()
print(f"Found {len(catalogs)} catalogs")

# Test external location read
df = spark.read.table("<catalog>.<schema>.<table>")
display(df.limit(10))

# Test external location write
df.write.mode("overwrite").saveAsTable("<catalog>.<schema>.test_table")
```
Symptoms:

```
Error: Unable to connect to storage
Error: Network connectivity issue
```

Check:

```bash
terraform output ncc_id
# Should return: ncc-<id>
```

Possible Causes: the NCC is not attached to the workspace, the serverless subnets are missing from the storage firewall (Option A), or the Private Endpoint connections are still pending approval (Option B).

Solution: re-run the firewall or approval step above for the affected storage account, then retry the query. A quick triage sketch follows.
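A one-shot triage sketch combining the checks above; the query paths are assumptions about the CLI's JSON output:

```bash
# Show the account's network posture in a single call.
az storage account show \
  --name <storage-account-name> \
  --resource-group <rg-name> \
  --query "{public:publicNetworkAccess, defaultAction:networkRuleSet.defaultAction}"
```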
Symptoms:

```
Error: Access denied to path abfss://...
Error: Permission denied
```

Check:

```bash
# Verify the Access Connector has "Storage Blob Data Contributor"
az role assignment list \
  --assignee <access-connector-principal-id> \
  --scope <storage-account-id>
```
```sql
-- Verify the storage credential exists
SHOW STORAGE CREDENTIALS;

-- Verify the external location exists
SHOW EXTERNAL LOCATIONS;

-- Grant permissions
GRANT USE CATALOG ON CATALOG <catalog> TO <user>;
GRANT CREATE SCHEMA ON CATALOG <catalog> TO <user>;
```
Possible Causes: the Access Connector's managed identity is missing the "Storage Blob Data Contributor" role, the storage credential or external location was never created, or the user lacks Unity Catalog grants.

Solution: fix whichever check above failed: assign the role (see the sketch below), recreate the credential or external location, or issue the missing grants.
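If the role assignment is missing, a minimal fix sketch; the IDs are placeholders:

```bash
# Grant the Access Connector's managed identity access to the storage account.
az role assignment create \
  --assignee <access-connector-principal-id> \
  --role "Storage Blob Data Contributor" \
  --scope <storage-account-id>
```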
Note: approving Private Endpoint connections requires the `Microsoft.Storage/storageAccounts/privateEndpointConnectionsApproval/action` permission on the storage account.
Applies to: Non-PL and Full-Private patterns | Status: ✅ Serverless Ready (requires post-deployment setup)