Only pay for what you use

No upfront costs. Pay only for the compute resources you use, billed at per-second granularity, with simple pay-as-you-go or discounted usage-commitment pricing options.

STANDARD
One platform for your data analytics and ML workloads

PREMIUM
Data analytics and ML at scale and for mission critical enterprise workloads

Classic Compute

JOBS COMPUTE
Run data engineering pipelines to build data lakes and manage data at scale.
Standard: $0.15/DBU
Premium: $0.22/DBU

ALL-PURPOSE COMPUTE
Run interactive data science and machine learning workloads. Also good for data engineering, BI and data analytics.
Standard: $0.40/DBU
Premium: $0.55/DBU

Databricks Workspace
Standard and Premium: Workspace for production jobs, analytics, and ML
Includes: Managed Apache Spark, Optimized Delta Lake, Cluster Autopilot, Jobs Scheduling & Workflow, Notebooks & Collaboration, Databricks Runtime for ML

Performance
Standard and Premium: Up to 50x faster than Apache Spark
Includes: Optimized Runtime Engine

Governance & manageability
Standard and Premium: Databricks Workspace administration
Includes: Administration Console

Enterprise security
Standard: Single sign-on (includes Single Sign-On (SSO))
Premium: Extend your cloud native security for company wide adoption (includes Single Sign-On (SSO), Role-based Access Control, Token Management API)

The pricing above is for the Databricks platform only. It does not include pricing for any required GCP resources (e.g., compute instances). A Databricks Unit (DBU) is a unit of processing capability per hour, billed on per-second usage. View the supported instance types.

Pay as you go with a 14-day free trial, or contact us for commitment-based discounting.
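
As a rough illustration of how DBU billing adds up, here is a minimal sketch. The DBU-per-hour figure is a placeholder, since actual consumption depends on the VM types you choose (see the supported instance types), and GCP VM charges are billed separately.

```python
# Illustrative Databricks-platform cost estimate (GCP VM costs not included).
# The DBU/hour figure below is a placeholder, not a published rate.

JOBS_COMPUTE_STANDARD = 0.15   # $ per DBU, Standard tier, Jobs Compute
JOBS_COMPUTE_PREMIUM = 0.22    # $ per DBU, Premium tier, Jobs Compute

def estimate_cost(dbu_per_hour: float, hours: float, price_per_dbu: float) -> float:
    """Platform cost = DBUs consumed per hour x hours run x price per DBU."""
    return dbu_per_hour * hours * price_per_dbu

# Hypothetical pipeline: a jobs cluster consuming 8 DBU/hour,
# running 2.5 hours per day for 30 days.
dbu_per_hour = 8
hours = 2.5 * 30

print(f"Standard: ${estimate_cost(dbu_per_hour, hours, JOBS_COMPUTE_STANDARD):,.2f}")
print(f"Premium:  ${estimate_cost(dbu_per_hour, hours, JOBS_COMPUTE_PREMIUM):,.2f}")
```

Because billing is per-second, partial hours are prorated; the hourly DBU rate is simply the metering unit.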

Customer success offerings

Databricks provides a range of customer success plans and support to help you maximize your return on investment.

Training

Build data and AI experts

Support

World-class production operations at scale

Professional services

Accelerate your business outcomes

Estimate Your Price

Use our comprehensive price calculator to estimate your Databricks pricing for different workloads and learn about the supported instance types.

GCP pricing FAQ

What is a DBU?

A Databricks Unit (“DBU”) is a unit of processing capability per hour, billed on per-second usage. Databricks supports many GCP virtual machine types. The larger the VM, the more DBUs you consume per hour. See the full list of supported VMs and details.

What is the difference between Jobs workloads and All-Purpose workloads?

Jobs workloads are workloads running on Jobs clusters. Jobs clusters are clusters that are both started and terminated by the same job, and only one job can run on a Jobs cluster, for isolation purposes. All-Purpose workloads are workloads running on All-Purpose clusters. All-Purpose clusters are clusters that are not classified as Jobs clusters. They can be used for various purposes, such as running commands within Databricks notebooks, connecting via JDBC/ODBC for BI workloads, and running MLflow experiments on Databricks. Multiple users can share an All-Purpose cluster for interactive, collaborative analysis.
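
A minimal sketch of how this difference shows up when defining a job, assuming the Jobs API 2.1. The workspace URL, token, notebook paths, cluster ID, runtime version, and machine type below are placeholders, not values from this page.

```python
import requests

WORKSPACE_URL = "https://<your-workspace>.gcp.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                               # placeholder
headers = {"Authorization": f"Bearer {TOKEN}"}

# Jobs Compute: the task declares a new_cluster, so Databricks creates a
# jobs cluster when the run starts and terminates it when the run ends.
job_on_jobs_cluster = {
    "name": "nightly-etl",
    "tasks": [{
        "task_key": "etl",
        "notebook_task": {"notebook_path": "/Repos/etl/ingest"},  # placeholder
        "new_cluster": {
            "spark_version": "13.3.x-scala2.12",  # placeholder runtime version
            "node_type_id": "n2-highmem-4",       # placeholder GCP machine type
            "num_workers": 4,
        },
    }],
}

# All-Purpose Compute: the task points at an already-running interactive
# cluster (existing_cluster_id) that other users and notebooks can share.
job_on_all_purpose_cluster = {
    "name": "adhoc-report",
    "tasks": [{
        "task_key": "report",
        "notebook_task": {"notebook_path": "/Repos/reports/daily"},  # placeholder
        "existing_cluster_id": "0123-456789-abcde123",               # placeholder
    }],
}

for payload in (job_on_jobs_cluster, job_on_all_purpose_cluster):
    resp = requests.post(f"{WORKSPACE_URL}/api/2.1/jobs/create",
                         headers=headers, json=payload)
    resp.raise_for_status()
    print(payload["name"], "-> job_id", resp.json()["job_id"])
```

Runs on the ephemeral new_cluster are billed at the Jobs Compute DBU rate, while runs attached to the shared interactive cluster are billed at the All-Purpose Compute rate.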

Do you provide technical support?

We offer technical support with annual commitments. Contact us to learn more or get started.

For intra-node encryption for Spark, is there anything I need to do to turn it on?

Yes. We provide our customers with the ability to decide for themselves whether the tradeoffs for additional encryption are necessary given the workloads being processed. Please contact us to enable it.