Only pay for what you use
No upfront costs. Pay only for the compute resources you use, at per-second granularity, with simple pay-as-you-go or discounted usage-commitment pricing options.

Jobs Light Compute vs. Jobs Compute
Databricks offers two flavors of its data processing engine for running production jobs: Jobs Compute and Jobs Light Compute. Choose either one, or use both to maximize your ROI.
JOBS LIGHT COMPUTE
Jobs Light Compute is Databricks’ equivalent of open-source Apache Spark. It targets simple, non-critical workloads that don’t need the benefits provided by Jobs Compute.
JOBS COMPUTE
Jobs Compute runs on Databricks’ optimized engine, which provides advanced performance, reliability, and autoscaling benefits to boost your team’s productivity and reduce total cost of ownership.

Databricks Unit (DBU)
A Databricks Unit (“DBU”) is a unit of processing capability per hour, billed on a per-second basis. Databricks supports many AWS EC2 instance types; the larger the instance, the more DBUs it consumes per hour. For example, 1 DBU is the equivalent of Databricks running on a c4.2xlarge machine for an hour. See the full list of supported instances and details.
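As an illustration of how per-second DBU billing works, the sketch below estimates the Databricks platform cost of a single job run. The `estimate_dbu_cost` helper and the sample figures are hypothetical, assuming a c4.2xlarge consumes 1 DBU per hour (per the definition above) and the Standard Jobs Compute rate of $0.07/DBU:

```python
def estimate_dbu_cost(dbu_per_hour: float, runtime_seconds: float,
                      rate_per_dbu: float) -> float:
    """Estimate the Databricks platform cost of one cluster run.

    dbu_per_hour:    DBUs the cluster consumes per hour (e.g. 1 DBU/hour
                     for a single c4.2xlarge, per the definition above).
    runtime_seconds: how long the cluster ran; billing is per-second,
                     so partial hours are prorated.
    rate_per_dbu:    the $/DBU rate for your plan and workload type.
    """
    dbus_consumed = dbu_per_hour * (runtime_seconds / 3600)
    return dbus_consumed * rate_per_dbu

# A 90-minute job on one c4.2xlarge (1 DBU/hour) at the Standard
# Jobs Compute rate of $0.07/DBU:
cost = estimate_dbu_cost(dbu_per_hour=1, runtime_seconds=90 * 60,
                         rate_per_dbu=0.07)
print(f"${cost:.3f}")  # $0.105
```

Note that this covers only the DBU charge: the underlying EC2 instances are billed separately by AWS.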
| | Standard | Premium | Enterprise |
|---|---|---|---|
| | One platform for your data analytics and ML workloads | Data analytics and ML at scale across your business | Data analytics and ML for your mission-critical workloads |
| **Jobs Compute**: Run data engineering pipelines to build data lakes and manage data at scale. See details. | From $0.07/DBU | From $0.10/DBU | From $0.13/DBU |
| **SQL Compute (Preview)**: Run SQL queries for BI reporting, analytics, and visualization to get timely insights from data lakes. | – | $0.15/DBU | $0.15/DBU |
| **All-Purpose Compute**: Run interactive data science and machine learning workloads. Also good for data engineering, BI, and data analytics. | $0.40/DBU | $0.55/DBU | $0.65/DBU |
| | *Workspace for production jobs, analytics, and ML* | *Workspace for production jobs, analytics, and ML* | *Workspace for production jobs, analytics, and ML* |
| Managed Apache Spark | ✓ | ✓ | ✓ |
| Optimized Delta Lake | ✓ | ✓ | ✓ |
| Cluster Autopilot | ✓ | ✓ | ✓ |
| Jobs Scheduling & Workflow | ✓ | ✓ | ✓ |
| SQL Analytics Workspace (Preview) | | ✓ | ✓ |
| SQL Analytics Optimization (Preview) | | ✓ | ✓ |
| Notebooks & Collaboration | ✓ | ✓ | ✓ |
| Connectors & Integration | ✓ | ✓ | ✓ |
| Databricks Runtime for ML | ✓ | ✓ | ✓ |
| Managed MLflow | ✓ | ✓ | ✓ |
| | *Up to 50x faster than Apache Spark* | *Autoscaling for optimized performance* | *Optimized performance* |
| Optimized Runtime Engine | ✓ | ✓ | ✓ |
| Optimized Autoscaling | | ✓ | ✓ |
| | *Databricks Workspace administration* | *Audit logs & automated policy controls* | *Audit logs & automated policy controls* |
| Administration Console | ✓ | ✓ | ✓ |
| Audit Logs | | ✓ | ✓ |
| Cluster Policies | | ✓ | ✓ |
| | *Single sign-on* | *Extend your cloud-native security for company-wide adoption* | *Advanced compliance and security for mission-critical data* |
| Single Sign-On (SSO) | ✓ | ✓ | ✓ |
| Role-Based Access Control | | ✓ | ✓ |
| Federated IAM | | ✓ | ✓ |
| Customer Managed VPC | | ✓ | ✓ |
| Secure Cluster Connectivity | | ✓ | ✓ |
| Token Management API | | ✓ | ✓ |
| Customer Managed Keys | | | ✓ |
| IP Access List | | | ✓ |
| HIPAA Compliance | | | ✓ |
Pricing is for the Databricks platform only; it does not include the cost of any required AWS resources (e.g., compute instances). A Databricks Unit (DBU) is a unit of processing capability per hour, billed on a per-second basis. View the supported instance types.
Pay as you go with a 14-day free trial, or contact us for commitment-based discounting or custom requirements (e.g., dedicated deployments such as Private Cloud).
Customer success offerings
Databricks provides a range of customer success plans and support to maximize your return on investment with realized impact.
Training
Build data and AI experts
Support
World-class production operations at scale
Professional Services
Accelerate your business outcomes
Estimate your price
Use our comprehensive price calculator to estimate your Databricks pricing for different workloads and learn about the supported instance types.
AWS pricing FAQ


What is a DBU?
A Databricks Unit (DBU) is a unit of processing capability per hour, billed on a per-second basis. Databricks supports many AWS EC2 instance types; the larger the instance, the more DBUs it consumes per hour. For example, 1 DBU is the equivalent of Databricks running on a c4.2xlarge machine for an hour. See the full list of supported instances and their specifications.


What is the difference between Jobs workloads and All-Purpose workloads?
Jobs workloads run on Jobs clusters, which are started and terminated by the same job; only one job can run on a Jobs cluster, for isolation purposes. All-Purpose workloads run on All-Purpose clusters, i.e., any clusters not classified as Jobs clusters. They can be used for a variety of purposes, such as running commands within Databricks notebooks, connecting via JDBC/ODBC for BI workloads, or running MLflow experiments on Databricks. Multiple users can share an All-Purpose cluster for collaborative interactive analysis.


There are two cluster options for jobs – Jobs cluster and Jobs Light cluster. How do I decide which one to use?
Jobs Light cluster is Databricks’ equivalent of open-source Apache Spark. It targets simple, non-critical workloads that don’t need the performance, reliability, or autoscaling benefits provided by Databricks’ proprietary technologies. In comparison, the Jobs cluster provides you with all of the aforementioned benefits to boost your team productivity and reduce your total cost of ownership.


What does the free trial include?
The 14-day free trial gives you access to either the Standard or Premium feature set, depending on your choice of plan. Contact us if you are interested in the Databricks Enterprise or Dedicated plan for custom deployments and other enterprise customizations.


What happens at the end of the free trial?
At the end of the trial, you are automatically subscribed to the plan you were on during the free trial. You can cancel your subscription at any time.


What is Databricks Community Edition?
Databricks Community Edition is a free, limited-feature platform designed for anyone who wants to get started with Spark. Sign up here.


How will I be billed?
By default, you will be billed monthly to your credit card, based on per-second usage. Contact us for other billing options, such as billing by invoice or an annual plan.


Do you offer technical support?
We offer technical support with our annual commitments. For self-serve options, customers are also encouraged to check the technical documentation. Contact us to learn more.


I want to process protected health information (PHI) within Databricks / I want a HIPAA-compliant deployment. Is there anything I need to know to get started?
You must contact us for a HIPAA-compliant deployment. Please note that prior to processing any PHI data in Databricks, a signed business associate agreement (BAA) must be in place between your organization and (a) Databricks, Inc.; and (b) because you must have your own account with AWS to deploy Databricks on AWS, Amazon Web Services. Please see here for more details.


For features marked as “(Preview)”, what does that mean? Will these features be automatically turned on?
Please contact us to get access to preview features.


For intra-node encryption for Spark, is there anything I need to do to turn it on?
Yes. We provide our customers with the ability to decide for themselves whether the tradeoffs for additional encryption are necessary given the workloads being processed. Please contact us to enable it.