
How to Optimize and Reduce the Cost of Azure Databricks Clusters up to 90%

Guillermo Musumeci
9 min read · Apr 26, 2024


Over the last few months, I have been helping a FinOps team to identify and optimize the cost of hundreds of Azure Databricks Clusters.

I decided to share the optimization of a real Databricks cluster, where I reduced expenses by 92% and saved around 190K/year.

I hope this story can help you optimize and reduce your costs.

Intro — The Importance of the Data

When I worked as a Solution Architect at Amazon Web Services (AWS), I learned that we cannot make informed decisions without good data.

Before we start making changes to our Databricks clusters, we need to understand how to calculate the real cost of the cluster we want to optimize, which is not trivial (Part 1 of this article).
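The reason this calculation is not trivial is that Azure Databricks bills two separate components: the underlying Azure VMs and the Databricks Units (DBUs) consumed, with the DBU price depending on your tier and workload type. As a rough sketch of the math, here is a minimal estimator; all rates below are illustrative placeholders, not current Azure list prices, so check the pricing for your region, VM size, and tier before relying on the numbers:

```python
# Sketch: estimating the monthly cost of an Azure Databricks cluster.
# Azure bills two components separately: the underlying VMs and the DBUs
# consumed. All rates below are assumed/illustrative, not real list prices.

VM_PRICE_PER_HOUR = 0.598     # assumed hourly rate for the chosen VM size
DBU_PER_NODE_PER_HOUR = 1.5   # assumed DBU consumption rate for that VM size
DBU_PRICE = 0.55              # assumed $/DBU (varies by tier and workload type)

def cluster_cost_per_month(workers: int, hours_per_month: float) -> float:
    """Total monthly cost: driver + workers, VM cost plus DBU cost."""
    nodes = workers + 1  # one driver node plus the workers
    vm_cost = nodes * VM_PRICE_PER_HOUR * hours_per_month
    dbu_cost = nodes * DBU_PER_NODE_PER_HOUR * DBU_PRICE * hours_per_month
    return vm_cost + dbu_cost

# Example: a 4-worker cluster running 24/7 (~730 hours/month)
print(f"${cluster_cost_per_month(4, 730):,.2f}/month")
```

Even this simplified model makes the two biggest cost levers visible: the number of node-hours and the DBU rate, which is why right-sizing and reducing running hours dominate the savings discussed later.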

Then, we need to speak with the cluster/application owner to understand their needs (e.g., does the cluster need to run 24/7? Does the workload need lots of processing power, or can it wait a few minutes?) and then optimize the Databricks cluster (Part 2 of this article).
