How to Configure Azure Databricks Unity Catalog with Terraform Part 3

Guillermo Musumeci
9 min read · Oct 15, 2023

In this story, we will learn how to configure Azure Databricks Unity Catalog with Terraform.

In particular, we will learn:

  • Creating Databricks Access Connector for the External Storage Account
  • Creating Databricks Storage Credential
  • Creating the External Azure Storage Account
  • Creating Azure Storage Container for the External Storage Account
  • Assigning Permissions to the Databricks Access Connector on the Azure External Storage Account
  • Creating a Databricks External Location
  • Assigning Permissions to the Databricks External Location
  • Validating the Creation of the Azure Databricks External Location
  • Creating a Databricks Catalog for the External Table
  • Creating a Databricks Schema for the External Table
  • Creating a Databricks External Table
  • Validating the Creation of the Databricks External Table
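The first few steps above can be sketched in Terraform roughly as follows. This is a minimal illustration, not the article's code: all names, the resource group, and the region are placeholder assumptions, and it presumes the `azurerm` and `databricks` providers are already configured against an existing Unity Catalog-enabled workspace.

```hcl
# Access connector whose managed identity Databricks uses to reach the storage account
resource "azurerm_databricks_access_connector" "external" {
  name                = "dbx-access-connector-ext" # placeholder name
  resource_group_name = "rg-databricks-demo"       # placeholder resource group
  location            = "westeurope"               # placeholder region

  identity {
    type = "SystemAssigned"
  }
}

# External storage account; hierarchical namespace (ADLS Gen2) is required
resource "azurerm_storage_account" "external" {
  name                     = "stdbxexternaldemo" # placeholder, must be globally unique
  resource_group_name      = "rg-databricks-demo"
  location                 = "westeurope"
  account_tier             = "Standard"
  account_replication_type = "LRS"
  is_hns_enabled           = true
}

resource "azurerm_storage_container" "external" {
  name                  = "external"
  storage_account_name  = azurerm_storage_account.external.name
  container_access_type = "private"
}

# Let the access connector's managed identity read and write the storage account
resource "azurerm_role_assignment" "connector" {
  scope                = azurerm_storage_account.external.id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = azurerm_databricks_access_connector.external.identity[0].principal_id
}

# Unity Catalog storage credential backed by the access connector
resource "databricks_storage_credential" "external" {
  name = "external-storage-credential"

  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.external.id
  }
}

# External location pointing the credential at the container
resource "databricks_external_location" "external" {
  name            = "external-location"
  url             = "abfss://${azurerm_storage_container.external.name}@${azurerm_storage_account.external.name}.dfs.core.windows.net/"
  credential_name = databricks_storage_credential.external.name
}
```

The remaining steps (grants, catalog, schema, and the external table itself) build on these resources and are covered in detail in the sections below.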


1. Intro: What is Azure Databricks, and What is it Used For?

Azure Databricks is a unified, open analytics platform for building, deploying, sharing, and maintaining enterprise-grade data, analytics, and AI solutions at scale.

The Azure Databricks Lakehouse Platform integrates with cloud storage and security in your cloud account and manages and deploys cloud infrastructure on your behalf.

Companies use Azure Databricks to process, store, clean, share, analyze, model, and monetize their datasets with solutions from BI to machine learning.
