How to Configure Azure Databricks Unity Catalog with Terraform Part 4

Guillermo Musumeci
10 min read · Oct 19, 2023

In this story, we will learn how to configure Azure Databricks Unity Catalog with Terraform, and we will discuss how to design External Storage Accounts for Multiple Applications.

In particular, we will learn:

  • Creating a Databricks External Storage Account for Multiple Applications
  • Creating the Databricks Access Connector for the External Azure Storage Account
  • Creating Databricks Storage Credentials
  • Creating the External Azure Storage Account
  • Creating an Azure Storage Container for the External Azure Storage Account
  • Assigning Storage Blob Data Contributor Permissions to the Databricks Access Connector
  • Creating a Databricks External Location
  • Assigning Permissions to the Databricks External Location (Optional Step)
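
The steps above fit together roughly as in the following Terraform sketch. All names (resource group, storage account, container, group `data-engineers`) are placeholders for illustration, not the exact code we will build later in the story:

```hcl
# Databricks Access Connector with a system-assigned managed identity
resource "azurerm_databricks_access_connector" "external" {
  name                = "dbx-access-connector-external"
  resource_group_name = azurerm_resource_group.this.name
  location            = azurerm_resource_group.this.location

  identity {
    type = "SystemAssigned"
  }
}

# External ADLS Gen2 storage account (hierarchical namespace enabled)
resource "azurerm_storage_account" "external" {
  name                     = "stkopicloudexternal"
  resource_group_name      = azurerm_resource_group.this.name
  location                 = azurerm_resource_group.this.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
  is_hns_enabled           = true
}

# One container per application
resource "azurerm_storage_container" "app1" {
  name                  = "app1"
  storage_account_name  = azurerm_storage_account.external.name
  container_access_type = "private"
}

# Grant the Access Connector identity access to the storage account
resource "azurerm_role_assignment" "external" {
  scope                = azurerm_storage_account.external.id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = azurerm_databricks_access_connector.external.identity[0].principal_id
}

# Unity Catalog storage credential backed by the Access Connector
resource "databricks_storage_credential" "external" {
  name = "external-storage-credential"

  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.external.id
  }
}

# External location pointing at the application container
resource "databricks_external_location" "app1" {
  name            = "app1-external-location"
  url             = "abfss://${azurerm_storage_container.app1.name}@${azurerm_storage_account.external.name}.dfs.core.windows.net/"
  credential_name = databricks_storage_credential.external.name
}

# Optional: grant privileges on the external location
resource "databricks_grants" "app1" {
  external_location = databricks_external_location.app1.id

  grant {
    principal  = "data-engineers"
    privileges = ["CREATE_EXTERNAL_TABLE", "READ_FILES", "WRITE_FILES"]
  }
}
```

Note the ordering: the role assignment must exist before Unity Catalog can validate the storage credential, which is why the Access Connector's managed identity is wired into both the `azurerm_role_assignment` and the `databricks_storage_credential`.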

List of my Azure Databricks-related stories:

1. Intro: What is Azure Databricks, and What is it Used For?

Azure Databricks is a unified, open analytics platform for building, deploying, sharing, and maintaining enterprise-grade data, analytics, and AI solutions at scale.

The Azure Databricks Lakehouse Platform integrates with cloud storage and security in your cloud account and manages and deploys cloud infrastructure on your behalf.

Companies use Azure Databricks to process, store, clean, share, analyze, model, and monetize their datasets with solutions from BI to machine learning.

They use the Azure Databricks platform to build and deploy data engineering workflows, machine learning models, analytics dashboards, and more.


