Only pay for what you use
No upfront costs. Pay only for the compute resources you use, billed at per-second granularity, with simple pay-as-you-go pricing or discounted usage-commitment options.

Databricks Unit (DBU)
A Databricks Unit (“DBU”) is a normalized unit of processing power on the Databricks unified platform, used for measurement and pricing purposes. The number of DBUs a workload consumes is driven by processing metrics, which may include the compute resources used and the amount of data processed. For example, 1 DBU is the equivalent of Databricks running on an i3.xlarge machine with the Databricks 8.1 standard runtime for an hour. See the full list of supported instances and details.
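To make the DBU model concrete, here is a minimal worked sketch of estimating the Databricks portion of a job's cost. The DBU prices come from the table below; the cluster size, per-node DBU rate, and runtime are illustrative assumptions, and the underlying Azure VM charges are billed separately.

```python
# Hypothetical worked example of the DBU charge for a job (assumed numbers,
# not an official calculator). DBU prices are taken from the table below;
# the underlying Azure VM cost is billed separately by Azure.

JOBS_COMPUTE_STANDARD = 0.15  # $ per DBU, Standard plan
JOBS_COMPUTE_PREMIUM = 0.30   # $ per DBU, Premium plan

def dbu_charge(dbus_per_node_hour: float, nodes: int, hours: float, price_per_dbu: float) -> float:
    """DBU charge = DBUs per node-hour x nodes x hours x price per DBU."""
    return dbus_per_node_hour * nodes * hours * price_per_dbu

# Assume a 4-node cluster whose nodes each consume 1 DBU per hour
# (the i3.xlarge example above), running a 2-hour Jobs Compute pipeline:
print(f"${dbu_charge(1.0, 4, 2, JOBS_COMPUTE_STANDARD):.2f}")  # $1.20 on Standard
print(f"${dbu_charge(1.0, 4, 2, JOBS_COMPUTE_PREMIUM):.2f}")   # $2.40 on Premium
```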
| | Standard | Premium |
|---|---|---|
| | One platform for your data analytics and ML workloads | Data analytics and ML at scale across your business |
| **Classic Compute** | | |
| Jobs Light Compute: Run data engineering pipelines on Databricks’ equivalent of open source Apache Spark for simple, non-critical workloads. | $0.07/DBU | $0.22/DBU |
| Jobs Compute: Run data engineering pipelines to build data lakes and manage data at scale. | $0.15/DBU | $0.30/DBU |
| SQL Compute (Preview): Run SQL queries for BI reporting, analytics, and visualization to get timely insights from data lakes. | - | $0.22/DBU |
| All-Purpose Compute: Run interactive data science and machine learning workloads. Also good for data engineering, BI, and data analytics. | $0.40/DBU | $0.55/DBU |
| | **Workspace for production jobs, analytics, and ML** | **Workspace for production jobs, analytics, and ML** |
| Managed Apache Spark | ✓ | ✓ |
| Optimized Delta Lake | ✓ | ✓ |
| Cluster Autopilot | ✓ | ✓ |
| Jobs Scheduling & Workflow | ✓ | ✓ |
| Databricks SQL Workspace (Preview) | | ✓ |
| Databricks SQL Optimization (Preview) | | ✓ |
| Notebooks & Collaboration | ✓ | ✓ |
| Connectors & Integration | ✓ | ✓ |
| Databricks Runtime for ML | ✓ | ✓ |
| Managed MLflow | ✓ | ✓ |
| | **Up to 50x faster than Apache Spark** | **Autoscaling for optimized performance** |
| Optimized Runtime Engine | ✓ | ✓ |
| Autoscaling | ✓ | ✓ |
| Optimized Autoscaling | ✓ | ✓ |
| | **Databricks Workspace administration** | **Audit logs & automated policy controls** |
| Administration Console | ✓ | ✓ |
| Clusters for running production jobs | ✓ | ✓ |
| Alerting and monitoring with retries | ✓ | ✓ |
| | **Secure network architecture** | **Extend your cloud native security for company-wide adoption** |
| Single Sign-On (SSO) | ✓ | ✓ |
| VNET Injection | ✓ | ✓ |
| Secure Cluster Connectivity | ✓ | ✓ |

✓ = included in that plan; a blank cell means the feature is not available in that plan.
Jobs Light Compute is Databricks’ equivalent of open source Apache Spark. It targets simple, non-critical workloads that don’t need the benefits provided by Jobs Compute.
The pricing shown above is for informational purposes for Azure Databricks services only. It does not include pricing for any other required Azure resources (e.g., compute instances). Please visit the Microsoft Azure Databricks pricing page for more details, including official pricing by instance type.
* Usage will be metered as Standard Jobs Compute DBUs.
Pay-as-you-go with a 14-day free trial, or contact us for commitment-based discounting or custom requirements.