Only pay for what you use

No up-front costs. Pay only for the compute resources you use, at per-second granularity, with simple pay-as-you-go pricing or discounted usage-commitment options.

STANDARD: One platform for your data analytics and ML workloads
PREMIUM: Data analytics and ML at scale across your business
ENTERPRISE: Data analytics and ML for your mission-critical workloads

Classic Compute

JOBS LIGHT COMPUTE
Run data engineering pipelines to build data lakes.
Standard $0.07/DBU | Premium $0.10/DBU | Enterprise $0.13/DBU

JOBS COMPUTE
Run data engineering pipelines to build data lakes and manage data at scale.
Standard $0.10/DBU | Premium $0.15/DBU | Enterprise $0.20/DBU

SQL COMPUTE (PREVIEW)
Run SQL queries for BI reporting, analytics, and visualization to get timely insights from data lakes.
Standard – | Premium $0.22/DBU | Enterprise $0.22/DBU

ALL-PURPOSE COMPUTE
Run interactive data science and machine learning workloads. Also good for data engineering, BI, and data analytics.
Standard $0.40/DBU | Premium $0.55/DBU | Enterprise $0.65/DBU

Serverless Compute

SERVERLESS SQL COMPUTE (PREVIEW)
Serverless SQL provides instant, managed compute hosted in Databricks' cloud provider account.
Standard – | Premium $0.55/DBU (VM costs included), contact us | Enterprise $0.55/DBU (VM costs included), contact us

Databricks Workspace
Workspace for production jobs, analytics, and ML (all plans)
Standard: Managed Apache Spark, Optimized Delta Lake, Cluster Autopilot, Jobs Scheduling & Workflow, Notebooks & Collaboration, Connectors & Integration, Databricks Runtime for ML, Managed MLflow
Premium and Enterprise: all Standard features, plus Databricks SQL Workspace (preview) and Databricks SQL Optimization (preview)

Performance
Standard (up to 50x faster than Apache Spark): Optimized Runtime Engine
Premium (autoscaling for optimized performance): Optimized Runtime Engine, Optimized Autoscaling
Enterprise (optimized performance): Optimized Runtime Engine, Optimized Autoscaling

Governance & manageability
Standard (Databricks Workspace administration): Administration Console
Premium and Enterprise (audit logs & automated policy controls): Administration Console, Audit Logs, Cluster Policies

Enterprise security
Standard (single sign-on): Single Sign-On (SSO)
Premium (extend your cloud-native security for company-wide adoption): Single Sign-On (SSO), Role-based Access Control, Federated IAM, Customer Managed VPC, Secure Cluster Connectivity, Token Management API
Enterprise (advanced compliance and security for mission-critical data): all Premium features, plus Customer Managed Keys, IP Access List, HIPAA Compliance

Jobs Light Compute is Databricks’ equivalent of open source Apache Spark. It targets simple, non-critical workloads that don’t need the benefits provided by Jobs Compute.

The pricing above is for the Databricks platform only. It does not include the cost of any required AWS resources (e.g., compute instances). A Databricks Unit (DBU) is a unit of processing capability per hour, billed on per-second usage. View the supported instance types.

Pay as you go with a 14-day free trial, or contact us for commitment-based discounting or custom requirements (e.g., dedicated deployments such as Private Cloud).

Customer success offerings

Databricks provides a range of customer success plans and support to maximize your return on investment with realized impact.

Training

Build data and AI experts

Support

World class production operations at scale

Professional services

Accelerate your business outcomes

Estimate your price

Use our comprehensive price calculator to estimate your Databricks pricing for different workloads and to learn about the supported instance types.

AWS pricing FAQ

What is a DBU?

A Databricks Unit (“DBU”) is a normalized unit of processing power on the Databricks unified platform used for measurement and pricing purposes. The number of DBUs a workload consumes is driven by processing metrics which may include the compute resources used and the amount of data processed. For example, 1 DBU is the equivalent of Databricks running on an i3.xlarge machine with the Databricks 8.1 standard runtime for an hour. See the full list of supported instances and details.
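As a rough illustration of how this translates into a bill, the platform cost of a workload is the DBUs it consumes multiplied by the per-DBU rate for your plan and compute type. The cluster size, runtime, and per-node DBU rate below are hypothetical examples, not quoted figures:

```python
# Sketch: estimating the Databricks platform cost from DBU consumption.
# The per-node DBU rate varies by instance type; 1.0 here matches the
# i3.xlarge example above. EC2 charges are billed separately by AWS.

JOBS_COMPUTE_PREMIUM_RATE = 0.15  # $/DBU, from the pricing table above

def platform_cost(num_nodes, hours, rate_per_dbu, dbu_per_node_hour=1.0):
    """Return the Databricks platform cost (excluding cloud VM charges)."""
    dbus = num_nodes * hours * dbu_per_node_hour
    return dbus * rate_per_dbu

# A hypothetical 10-node job running for 2 hours on the Premium plan:
# 10 nodes x 2 hours x 1 DBU/node-hour = 20 DBUs, at $0.15/DBU.
cost = platform_cost(num_nodes=10, hours=2,
                     rate_per_dbu=JOBS_COMPUTE_PREMIUM_RATE)
print(f"${cost:.2f}")
```

Because billing is per second, a job that runs for 30 minutes consumes half the DBUs of the same cluster running for an hour.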

What is the difference between Jobs workloads and All-Purpose workloads?

Jobs workloads are workloads running on Jobs clusters. Jobs clusters are started and terminated by the same job, and only one job can run on a Jobs cluster, for isolation purposes. All-Purpose workloads are workloads running on All-Purpose clusters, i.e., any clusters not classified as Jobs clusters. They can be used for various purposes, such as running commands within Databricks notebooks, connecting via JDBC/ODBC for BI workloads, or running MLflow experiments on Databricks. Multiple users can share an All-Purpose cluster for collaborative, interactive analysis.
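As a sketch of the distinction, a Jobs cluster is declared inline in the job definition (the `new_cluster` block of a Databricks Jobs API job settings payload), so it exists only for that run; pointing the same task at an existing interactive cluster instead makes it an All-Purpose workload. All concrete values below (name, notebook path, Spark version, cluster ID) are illustrative placeholders:

```python
# Sketch of a Databricks Jobs API job definition using a Jobs cluster.
# The cluster is defined inline ("new_cluster"), so Databricks starts it
# for this run and terminates it afterward, billed at the Jobs Compute rate.
# All field values are hypothetical placeholders.

job_definition = {
    "name": "nightly-etl",
    "new_cluster": {                      # inline cluster -> Jobs cluster
        "spark_version": "8.1.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 4,
    },
    "notebook_task": {
        "notebook_path": "/Repos/etl/build_data_lake",
    },
}

# Running the same task on a pre-existing interactive cluster
# ("existing_cluster_id") would make it an All-Purpose workload,
# billed at the higher All-Purpose Compute rate.
all_purpose_variant = dict(job_definition)
all_purpose_variant.pop("new_cluster")
all_purpose_variant["existing_cluster_id"] = "1234-567890-abcde123"  # placeholder
```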

What is the difference between Serverless compute and Classic compute?

For Classic compute, Databricks deploys cluster resources into your AWS VPC and you are responsible for paying for EC2 charges. For Serverless compute, Databricks deploys the cluster resources into a VPC in Databricks' AWS account and you are not required to separately pay for EC2 charges. Please see here for more details.
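The billing difference can be sketched numerically: with Classic compute you pay Databricks for DBUs and AWS separately for EC2, while the Serverless rate is a single Databricks charge with VM costs included. The DBU rates below come from the table above, but the DBU consumption and EC2 price are made-up placeholders:

```python
# Sketch comparing the two billing models for one hour of SQL compute.
# DBU rates are from the pricing table; the DBU count and EC2 hourly
# price are hypothetical examples.

def classic_hourly(dbu_rate, dbus, ec2_hourly):
    # Classic: Databricks bills DBUs; AWS bills the EC2 instances separately.
    return dbu_rate * dbus + ec2_hourly

def serverless_hourly(dbu_rate, dbus):
    # Serverless: one Databricks charge; VM costs are already included.
    return dbu_rate * dbus

# Premium plan, a cluster consuming 4 DBUs/hour:
classic = classic_hourly(dbu_rate=0.22, dbus=4, ec2_hourly=1.25)  # SQL Compute
serverless = serverless_hourly(dbu_rate=0.55, dbus=4)             # Serverless SQL
print(round(classic, 2), round(serverless, 2))
```

Which model is cheaper depends on your instance mix and utilization; the point of the sketch is only that the Serverless rate is all-inclusive while Classic has two separate bills.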

There are two cluster options for jobs – Jobs cluster and Jobs Light cluster. How do I decide which one to use?

Jobs Light cluster is Databricks’ equivalent of open-source Apache Spark. It targets simple, non-critical workloads that don’t need the performance, reliability, or autoscaling benefits provided by Databricks’ proprietary technologies. In comparison, the Jobs cluster provides you with all of the aforementioned benefits to boost your team productivity and reduce your total cost of ownership.

What does the free trial include?

The 14-day free trial gives you access to either the Standard or the Premium feature set, depending on your choice of plan. Contact us if you are interested in the Databricks Enterprise or Dedicated plan for custom deployment and other enterprise customizations.

What happens after the free trial?

At the end of the trial, you are automatically subscribed to the plan that you have been on during the free trial. You can cancel your subscription at any time.

What is Databricks Community Edition?

Databricks Community Edition is a free, limited functionality platform designed for anyone who wants to learn Spark. Sign up here.

How will I be billed?

By default, you will be billed monthly based on per-second usage on your credit card. Contact us for more billing options, such as billing by invoice or an annual plan.

Do you provide technical support?

We offer technical support with our annual commitments. For self-serve options, customers are also encouraged to check the technical documentation. Contact us to learn more.

I want to process protected health information (PHI) within Databricks / I want a HIPAA-compliant deployment. Is there anything I need to know to get started?

You must contact us for a HIPAA-compliant deployment. Please note that before processing any PHI data in Databricks, a signed business associate agreement (BAA) must be in place between your organization and (a) Databricks, Inc. and (b) Amazon Web Services, since you must have your own AWS account to deploy Databricks on AWS. Please see here for more details.

For features marked as “(Preview)”, what does that mean? Will these features be automatically turned on?

Preview features are not turned on automatically. Please contact us to get access to preview features.

For intra-node encryption for Spark, is there anything I need to do to turn it on?

Yes. We provide our customers with the ability to decide for themselves whether the tradeoffs for additional encryption are necessary given the workloads being processed. Please contact us to enable it.