
The way to the Lakehouse just got faster

Try Databricks as you’ve never tried it before – get your 14-day trial up and running via AWS Marketplace

AWS Pricing

Databricks is deeply integrated with AWS security and data services to manage all your AWS data on a simple, open lakehouse.


Try for free | Learn more

Only pay for what you use

No up-front costs. Pay only for the compute resources you use, at per-second granularity, with simple pay-as-you-go pricing or committed-use discounts.

Standard: One platform for your data analytics and ML workloads
Premium: Data analytics and ML at scale across your business
Enterprise: Data analytics and ML for your mission-critical workloads

Classic Compute

| Workload | Description | Standard | Premium | Enterprise |
| --- | --- | --- | --- | --- |
| Jobs Light Compute | Run data engineering pipelines to build data lakes. Jobs Light Compute is Databricks' equivalent of open-source Apache Spark™. It targets simple, non-critical workloads that don't need the benefits provided by Jobs Compute. | $0.07 / DBU | $0.10 / DBU | $0.13 / DBU |
| Jobs Compute / Jobs Compute Photon | Run data engineering pipelines to build data lakes and manage data at scale. | $0.10 / DBU | $0.15 / DBU | $0.20 / DBU |
| Delta Live Tables / Delta Live Tables Photon | Easily build high-quality streaming or batch ETL pipelines using Python or SQL, with the DLT edition that best fits your workload. Learn more | $0.20 - $0.36 / DBU | $0.20 - $0.36 / DBU | $0.20 - $0.36 / DBU |
| SQL Compute | Run SQL queries for BI reporting, analytics and visualization to get timely insights from data lakes. | - | $0.22 / DBU | $0.22 / DBU |
| All-Purpose Compute / All-Purpose Compute Photon | Run interactive data science and machine learning workloads. Also good for data engineering, BI and data analytics. | $0.40 / DBU | $0.55 / DBU | $0.65 / DBU |

Serverless Compute

| Workload | Description | Standard | Premium | Enterprise |
| --- | --- | --- | --- | --- |
| Serverless SQL Compute (Preview) | Serverless SQL provides instant, managed compute hosted in Databricks' cloud provider account. Compute infrastructure is included; network egress charges may apply. | - | $0.70 / DBU | $0.70 / DBU |

Add-on Products

| Product | Description | Standard | Premium | Enterprise |
| --- | --- | --- | --- | --- |
| Enhanced Security and Compliance | Provides enhanced security and controls for your compliance needs | - | - | 15% of product spend (free during Preview) |

Databricks Workspace: workspace for production jobs, analytics, and ML.

| Feature | Standard | Premium | Enterprise |
| --- | --- | --- | --- |
| Managed Apache Spark™ | ✓ | ✓ | ✓ |
| Optimized Delta Lake | ✓ | ✓ | ✓ |
| Cluster Autopilot | ✓ | ✓ | ✓ |
| Jobs Scheduling & Workflow | ✓ | ✓ | ✓ |
| Databricks SQL Workspace | - | ✓ | ✓ |
| Databricks SQL Optimization | - | ✓ | ✓ |
| Notebooks & Collaboration | ✓ | ✓ | ✓ |
| Connectors & Integration | ✓ | ✓ | ✓ |
| Databricks Runtime for ML | ✓ | ✓ | ✓ |
| Managed MLflow | ✓ | ✓ | ✓ |

Performance: up to 50x faster than Apache Spark™ (Standard), autoscaling for optimized performance (Premium), optimized performance (Enterprise).

| Feature | Standard | Premium | Enterprise |
| --- | --- | --- | --- |
| Optimized Runtime Engine | ✓ | ✓ | ✓ |
| Optimized Autoscaling | - | ✓ | ✓ |

Governance and Manageability: Databricks Workspace administration (Standard); audit logs & automated policy controls (Premium, Enterprise).

| Feature | Standard | Premium | Enterprise |
| --- | --- | --- | --- |
| Administration Console | ✓ | ✓ | ✓ |
| Audit Logs | - | ✓ | ✓ |
| Cluster Policies | - | ✓ | ✓ |

Enterprise Security: secured cloud & network architecture with authentication such as single sign-on (Standard); extend your cloud-native security for company-wide adoption (Premium); advanced compliance and security for mission-critical data (Enterprise).

| Feature | Standard | Premium | Enterprise |
| --- | --- | --- | --- |
| Single Sign-On (SSO) | ✓ | ✓ | ✓ |
| Role-based Access Control | - | ✓ | ✓ |
| Federated IAM | - | ✓ | ✓ |
| Customer Managed VPC | - | ✓ | ✓ |
| Secure Cluster Connectivity | - | ✓ | ✓ |
| Token Management API | - | ✓ | ✓ |
| Customer Managed Keys | - | - | ✓ |
| IP Access List | - | - | ✓ |
| Enhanced Security Monitoring ¹ | - | - | ✓ |
| HIPAA Compliance Controls ¹ | - | - | ✓ |
| PCI-DSS Compliance Controls ¹ | - | - | ✓ |
| FedRAMP-Moderate Compliance Controls ¹ | - | - | ✓ |


¹ Available as add-on.

Pay as you go with a 14-day free trial, or contact us for committed-use discounts or custom requirements (e.g., dedicated deployments like Private Cloud).

The pricing is for the Databricks platform only. It does not include pricing for any required AWS resources (e.g., compute instances).

A Databricks Unit (DBU) is a unit of processing capability per hour, billed on per-second usage. View the types of supported instances.
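To make the per-second granularity concrete, here is a minimal sketch of the proration arithmetic, not an official billing formula; the function name and the 8 DBU/hour figure are illustrative assumptions, and the $0.15/DBU rate is the Premium Jobs Compute rate from the table above.

```python
# Minimal sketch of per-second DBU billing (illustrative only, not an
# official Databricks billing formula). A cluster emits some number of
# DBUs per hour depending on its instance types; the platform charge is
# prorated to the second. AWS resource charges (e.g., EC2) are billed
# separately by AWS and are not included here.

def platform_cost(dbu_rate_usd: float, dbu_per_hour: float, runtime_seconds: int) -> float:
    """Databricks platform charge for one cluster run."""
    return dbu_rate_usd * dbu_per_hour * (runtime_seconds / 3600)

# Example: Jobs Compute on the Premium plan ($0.15/DBU), a cluster
# emitting 8 DBU/hour (hypothetical), running for 40 minutes:
print(f"${platform_cost(0.15, 8, 40 * 60):.2f}")  # -> $0.80
```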

Customer success offerings

Databricks provides a range of customer success plans and support to maximize your return on investment with realized impact.

Training


Build data and AI experts

Learn more →

Support


World-class production operations at scale

Professional services


Accelerate your business outcomes

Learn more →

Estimate your price

Use our comprehensive price calculator to estimate your cost for different Databricks workloads across the supported instance types.

Calculate price →

AWS pricing FAQ

What is a DBU?

A Databricks Unit (DBU) is a normalized unit of processing power on the Databricks Lakehouse Platform, used for measurement and pricing purposes. The number of DBUs a workload consumes is driven by processing metrics, which may include the compute resources used and the amount of data processed. For example, 1 DBU is the equivalent of Databricks running on an i3.xlarge machine with the Databricks 8.1 standard runtime for an hour. See the full list of supported instances and details.
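As a worked example combining only figures stated on this page (1 DBU per hour for an i3.xlarge, and the $0.15/DBU Jobs Compute rate on the Premium plan), with a hypothetical cluster size and runtime:

```python
# 10 x i3.xlarge workers (1 DBU/hour each, per the definition above)
# running a Jobs Compute workload on the Premium plan for 2 hours.
nodes, hours = 10, 2.0                  # hypothetical cluster size and runtime
dbus_consumed = nodes * hours * 1.0     # 20 DBUs (i3.xlarge = 1 DBU/hour)
platform_charge = dbus_consumed * 0.15  # $3.00, excluding AWS EC2 charges
print(dbus_consumed, platform_charge)   # -> 20.0 3.0
```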

What is the difference between Jobs workloads and All-Purpose workloads?

Jobs workloads run on Jobs clusters: clusters that are both started and terminated by the same job, with only one job per cluster for isolation purposes. All-Purpose workloads run on All-Purpose clusters: any cluster not classified as a Jobs cluster. All-Purpose clusters can be used for various purposes, such as running commands within Databricks notebooks, connecting via JDBC/ODBC for BI workloads, or running MLflow experiments on Databricks, and multiple users can share one for collaborative interactive analysis.
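As a sketch of how the distinction surfaces in practice: with the Databricks Jobs REST API (Jobs API 2.0 shown here; the host, token, cluster ID, and notebook paths are placeholders), a job either creates its own short-lived jobs cluster via new_cluster, billed at Jobs Compute rates, or attaches to a long-running all-purpose cluster via existing_cluster_id, billed at All-Purpose Compute rates.

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder

# A job that creates (and terminates) its own Jobs cluster,
# billed at the Jobs Compute rate:
job_on_jobs_cluster = {
    "name": "nightly-etl",
    "new_cluster": {
        "spark_version": "8.1.x-scala2.12",  # placeholder runtime version
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
    },
    "notebook_task": {"notebook_path": "/Repos/etl/nightly"},
}

# The same task pointed at a long-running All-Purpose cluster,
# billed at the All-Purpose Compute rate:
job_on_all_purpose = {
    "name": "adhoc-run",
    "existing_cluster_id": "<cluster-id>",  # placeholder
    "notebook_task": {"notebook_path": "/Repos/etl/nightly"},
}

resp = requests.post(
    f"{HOST}/api/2.0/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_on_jobs_cluster,
)
print(resp.json())  # e.g. {"job_id": ...}
```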

Who pays for the underlying compute resources?

For Classic compute, Databricks deploys cluster resources into your AWS VPC, and you are responsible for the EC2 charges. For Serverless compute, Databricks deploys the cluster resources into a VPC in Databricks' AWS account, and you do not pay separately for EC2. Please see here for more details.
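A minimal sketch of the resulting cost structures under that description; the function names are illustrative, and ec2_hourly is a placeholder for the On-Demand rate AWS bills you directly:

```python
# Classic compute: two bills, Databricks (DBUs) plus AWS (EC2 instances).
def classic_hourly_cost(dbu_rate: float, dbu_per_hour: float, ec2_hourly: float) -> float:
    return dbu_rate * dbu_per_hour + ec2_hourly

# Serverless compute: one bill; infrastructure is included in the DBU rate
# (network egress may be charged separately, as noted below).
def serverless_hourly_cost(dbu_rate: float, dbu_per_hour: float) -> float:
    return dbu_rate * dbu_per_hour
```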

Will I be charged for network egress with Serverless compute?

If your source data is in a different AWS region than the Databricks Serverless environment, AWS may charge you network egress fees. Databricks is currently waiving charges for egress from the Serverless environment to your destination region, but we may charge for such egress at market-competitive rates in the future.

How does Jobs Light Compute compare to Jobs Compute?

A Jobs Light cluster is Databricks' equivalent of open-source Apache Spark. It targets simple, non-critical workloads that don't need the performance, reliability, or autoscaling benefits provided by Databricks' proprietary technologies. In comparison, a Jobs cluster provides all of those benefits to boost your team's productivity and reduce your total cost of ownership.

What does the 14-day free trial include?

The 14-day free trial gives you access to either the Standard or the Premium feature set, depending on your choice of plan. Contact us if you are interested in the Databricks Enterprise plan or a dedicated plan for custom deployment and other enterprise customizations. Please note that you will still be charged by your cloud provider for resources (e.g., compute instances) used within your account during the free trial.

What happens at the end of the trial?

At the end of the trial, you are automatically subscribed to the plan you were on during the free trial. You can cancel your subscription at any time.

Is there a free version of Databricks?

Databricks Community Edition is a free, limited-functionality platform designed for anyone who wants to learn Spark. Sign up here.

How will I be billed?

By default, you will be billed monthly, based on per-second usage, on your credit card. Contact us for more billing options, such as billing by invoice or an annual plan.

Do you offer technical support?

We offer technical support with our annual commitments. For self-serve options, customers are also encouraged to check the technical documentation. Contact us to learn more.

How do I run a HIPAA-compliant deployment?

You must contact us for a HIPAA-compliant deployment. Please note that before processing any PHI data in Databricks, a signed business associate agreement (BAA) must be in place between your organization and (a) Databricks, Inc. and (b) Amazon Web Services, since you must have your own AWS account to deploy Databricks on AWS. Please see here for more details.

How do I get access to preview features?

Please contact us to get access to preview features.

Can I enable additional encryption?

Yes. We give our customers the ability to decide for themselves whether the tradeoffs of additional encryption are warranted for the workloads being processed. Please contact us to enable it.

Ready to get started?