Only pay for what you use
No upfront costs. Pay only for the compute resources you use, at per-second granularity, with simple pay-as-you-go or discounted usage-commitment pricing options.

Databricks Unit (DBU)
A Databricks Unit (“DBU”) is a unit of processing capability per hour, billed on per-second usage. Databricks supports many GCP virtual machine types. The larger the VM, the more DBUs it consumes per hour. See the full list of supported VMs and details.
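To make the DBU arithmetic concrete, here is a minimal sketch in Python with illustrative numbers: the per-VM DBU rating is an assumption, while $0.15/DBU is the Standard-plan Jobs Compute rate from the table below. GCP VM charges are billed separately.

```python
# Hypothetical estimate of the Databricks platform cost for one cluster run.
# The DBU rating of the VM type (2.0 DBUs/hour) is an illustrative assumption;
# $0.15/DBU is the Standard-plan Jobs Compute rate from the pricing table.

def dbu_cost(dbus_per_hour: float, num_nodes: int, runtime_seconds: int,
             price_per_dbu: float) -> float:
    """DBUs accrue at an hourly rate but are billed at per-second granularity."""
    hours = runtime_seconds / 3600.0
    return dbus_per_hour * num_nodes * hours * price_per_dbu

# A 4-node jobs cluster running for 45 minutes:
cost = dbu_cost(dbus_per_hour=2.0, num_nodes=4,
                runtime_seconds=45 * 60, price_per_dbu=0.15)
print(f"${cost:.2f}")  # -> $0.90
```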
STANDARD: One platform for your data analytics and ML workloads
PREMIUM: Data analytics and ML at scale and for mission critical enterprise workloads

| Classic Compute | STANDARD | PREMIUM |
|---|---|---|
| JOBS COMPUTE: Run data engineering pipelines to build data lakes and manage data at scale. | $0.15/DBU | $0.22/DBU |
| ALL-PURPOSE COMPUTE: Run interactive data science and machine learning workloads. Also good for data engineering, BI and data analytics. | $0.40/DBU | $0.55/DBU |
| Workspace for production jobs, analytics, and ML | | |
| Managed Apache Spark | ✓ | ✓ |
| Optimized Delta Lake | ✓ | ✓ |
| Cluster Autopilot | ✓ | ✓ |
| Jobs Scheduling & Workflow | ✓ | ✓ |
| Notebooks & Collaboration | ✓ | ✓ |
| Databricks Runtime for ML | ✓ | ✓ |
| Up to 50x faster than Apache Spark | | |
| Optimized Runtime Engine | ✓ | ✓ |
| Databricks Workspace administration | | |
| Administration Console | ✓ | ✓ |
| Single sign-on (Premium: Extend your cloud native security for company wide adoption) | | |
| Single Sign-On (SSO) | ✓ | ✓ |
| Role-based Access Control | - | ✓ |
| Token Management API | - | ✓ |
The pricing is for the Databricks platform only. It does not include pricing for any required GCP resources (e.g. compute instances). A Databricks Unit (DBU) is a unit of processing capability per hour, billed on per-second usage. View the supported instance types.
Pay-as-you-go with a 14-day free trial, or contact us for commitment-based discounting.
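As a back-of-the-envelope illustration of that note, total cluster cost is the DBU charge plus the separately billed GCP VM charge. Every rate below is a hypothetical placeholder except the $0.15/DBU Jobs Compute price from the table above.

```python
# Rough total hourly cost of a 4-node cluster: Databricks DBUs + GCP VM charges.
nodes = 4
dbus_per_node_hour = 2.0   # DBU rating of the chosen VM type (assumed)
price_per_dbu = 0.15       # Jobs Compute, Standard plan (from the table above)
vm_price_per_hour = 0.20   # GCP on-demand price per VM (illustrative only)

hourly_total = nodes * (dbus_per_node_hour * price_per_dbu + vm_price_per_hour)
print(f"${hourly_total:.2f}/hour")  # 4 * (0.30 + 0.20) = $2.00/hour
```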
Customer success offerings
Databricks provides a range of customer success plans and support to maximize your return on investment with realized impact.
Training
Build data and AI experts
Support
World class production operations at scale
Professional services
Accelerate your business outcomes
Estimate Your Price
Use our comprehensive price calculator to estimate your Databricks pricing for different workloads and learn about the supported instance types.
GCP pricing FAQ


What is a DBU?
A Databricks Unit (“DBU”) is a unit of processing capability per hour, billed on per-second usage. Databricks supports many GCP virtual machine types. The larger the VM, the more DBUs it consumes per hour. See the full list of supported VMs and details.


What is the difference between Jobs workloads and All-Purpose workloads?
Jobs workloads are workloads running on Jobs clusters. Jobs clusters are clusters that are both started and terminated by the same Job; only one job can run on a Jobs cluster, for isolation purposes. All-Purpose workloads are workloads running on All-Purpose clusters, i.e. any clusters that are not classified as Jobs clusters. They can be used for various purposes such as running commands within Databricks notebooks, connecting via JDBC/ODBC for BI workloads, or running MLflow experiments on Databricks. Multiple users can share an All-Purpose cluster for collaborative interactive analysis.
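The pricing consequence is worth noting: the same workload bills at a different per-DBU rate depending on which cluster type runs it. A minimal sketch with the Standard-plan rates from the table above and an assumed DBU consumption:

```python
# Hypothetical: one pipeline run consuming 6 DBUs, priced on each cluster type
# (Standard plan rates from the pricing table above).
workload_dbus = 6.0                       # assumed DBU consumption of the run
jobs_cost = workload_dbus * 0.15          # Jobs Compute rate
all_purpose_cost = workload_dbus * 0.40   # All-Purpose Compute rate
print(f"${jobs_cost:.2f} vs ${all_purpose_cost:.2f}")  # $0.90 vs $2.40
# Scheduling production pipelines on Jobs clusters is the cheaper option.
```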


Do you provide technical support?
We offer technical support with annual commitments. Contact us to learn more or get started.


For inter-node encryption for Spark, is there anything I need to do to turn it on?
Yes. We provide our customers with the ability to decide for themselves whether the tradeoffs for additional encryption are necessary given the workloads being processed. Please contact us to enable it.