Databricks on Google Cloud
Databricks is deeply integrated with Google Cloud security and data services, so you can manage all your Google Cloud data on a simple, open lakehouse.

Only pay for what you use
No up-front costs. Pay only for the compute resources you use, at per-second granularity, with simple pay-as-you-go pricing or committed-use discounts.
| | Standard: One platform for your data analytics and ML workloads | Premium: Data analytics and ML at scale and for mission-critical enterprise workloads |
|---|---|---|
| **Jobs Compute / Jobs Compute Photon** Run data engineering pipelines to build data lakes and manage data at scale. | $0.15 / DBU | $0.22 / DBU |
| **SQL Compute (Preview)** Run SQL queries for BI reporting, analytics and visualization to get timely insights from data lakes with your favorite SQL and BI tools. | - | $0.22 / DBU |
| **DLT Advanced Compute / DLT Advanced Compute Photon (Preview)** Easily build high-quality streaming or batch ETL pipelines using Python or SQL, perform CDC, and trust your data with quality expectations and monitoring. | $0.40 / DBU | $0.40 / DBU |
| **All-Purpose Compute / All-Purpose Compute Photon** Run interactive data science and machine learning workloads. Also good for data engineering, BI and data analytics. | $0.40 / DBU | $0.55 / DBU |
| **Add-on Products** | | |
| **GCP HIPAA Compliance** Provides enhanced security and controls for your HIPAA compliance needs. | - | 10% of Product Spend (free during Preview) |
| **Databricks Workspace** | Workspace for production jobs, analytics, and ML | Workspace for production jobs, analytics, and ML |
| Managed Apache Spark™ | ✓ | ✓ |
| Optimized Delta Lake | ✓ | ✓ |
| Cluster Autopilot | ✓ | ✓ |
| Jobs Scheduling & Workflow | ✓ | ✓ |
| Databricks SQL Workspace (Preview) | - | ✓ |
| Databricks SQL Optimization (Preview) | - | ✓ |
| Notebooks & Collaboration | ✓ | ✓ |
| Databricks Runtime for ML | ✓ | ✓ |
| Connectors & Integration | ✓ | ✓ |
| Managed MLflow | ✓ | ✓ |
| **Performance** | Up to 50x faster than Apache Spark™ | Up to 50x faster than Apache Spark™ |
| Optimized Runtime Engine | ✓ | ✓ |
| Optimized Autoscaling | ✓ | ✓ |
| **Governance and Manageability** | Databricks Workspace administration | Databricks Workspace administration |
| Administration Console | ✓ | ✓ |
| Audit Logs | ✓ | ✓ |
| Cluster Policies | ✓ | ✓ |
| **Enterprise Security** | Single sign-on | Extend your cloud-native security for company-wide adoption |
| Single Sign-On (SSO) | ✓ | ✓ |
| Role-based Access Control | - | ✓ |
| Token Management API | - | ✓ |
| Secure Cluster Connectivity | ✓ | ✓ |
| IP Access List | - | ✓ |
| HIPAA Compliance Controls¹ | - | ✓ |

¹ Available as add-on.
Pay as you go with a 14-day free trial, or contact us for committed-use discounts.
The pricing is for the Databricks platform only. It does not include the cost of any required GCP resources (e.g., compute instances).
A Databricks Unit (DBU) is a unit of processing capability per hour, billed on per-second usage. View the types of supported instances.
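As a rough illustration of how per-second DBU billing composes, the sketch below estimates platform cost as DBU rate × DBUs consumed, prorated to the second. The DBU consumption figure and run duration are made-up example numbers, not official rates for any instance type:

```python
# Illustrative estimate of the Databricks platform portion of a bill.
# Excludes GCP infrastructure costs; all consumption figures are examples.

def platform_cost(dbu_rate_per_hour: float, dbus_per_hour: float, seconds: int) -> float:
    """Cost = rate * DBUs consumed, prorated per second of runtime."""
    hours = seconds / 3600
    return dbu_rate_per_hour * dbus_per_hour * hours

# e.g. a Jobs Compute run on the Standard plan ($0.15/DBU), with a
# hypothetical cluster consuming 8 DBUs/hour, running for 45 minutes:
cost = platform_cost(0.15, 8, 45 * 60)
print(f"${cost:.2f}")  # $0.90
```

The price calculator linked below performs the same arithmetic across instance types and plans.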
Customer success offerings
Databricks provides a range of customer success plans and support to maximize your return on investment with realized impact.
Training
Building data and AI experts
Support
World-class production operations at scale
Professional services
Accelerating your business outcomes
Estimate your price
Use our comprehensive price calculator to estimate your cost for different Databricks workloads and the types of supported instances.
GCP pricing FAQ
A Databricks Unit (DBU) is a normalized unit of processing power on the Databricks Lakehouse Platform used for measurement and pricing purposes. The number of DBUs a workload consumes is driven by processing metrics, which may include the compute resources used and the amount of data processed.
Jobs workloads are workloads running on Jobs clusters: clusters that are both started and terminated by the same job, with only one job per cluster for isolation. All-Purpose workloads are workloads running on All-Purpose clusters, which are any clusters not classified as Jobs clusters. They can be used for various purposes, such as running commands within Databricks notebooks, connecting via JDBC/ODBC for BI workloads, or running MLflow experiments on Databricks. Multiple users can share an All-Purpose cluster for collaborative, interactive analysis.
A Jobs Light cluster is Databricks’ equivalent of open-source Apache Spark. It targets simple, non-critical workloads that don’t need the performance, reliability, or autoscaling benefits provided by Databricks’ proprietary technologies. In comparison, a Jobs cluster provides all of the aforementioned benefits to boost your team’s productivity and reduce your total cost of ownership.
The 14-day free trial gives you access to either the Standard or the Premium feature set, depending on your choice of plan. Contact us if you are interested in a Databricks Enterprise or Dedicated plan for custom deployment and other enterprise customizations.
At the end of the trial, you are automatically subscribed to the plan that you have been on during the free trial. You can cancel your subscription at any time.
By default, you will be billed monthly to your credit card based on per-second usage. Contact us for other billing options, such as billing by invoice or an annual plan.
We offer technical support with annual commitments. Contact us to learn more or get started.
Please contact us to get access to preview features.
Product Spend is calculated based on GCP product spend at list price, before the application of any discounts, usage credits, add-on uplifts, or support fees.
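For example, the GCP HIPAA Compliance add-on above is priced at 10% of Product Spend at list. A minimal sketch, with an assumed spend figure for illustration:

```python
# Illustrative: an add-on billed as a percentage of Product Spend at
# list price, i.e. before any discounts or credits are applied.
# The 10% rate matches the HIPAA add-on row; the spend is an example.

ADDON_RATE = 0.10

def addon_uplift(list_spend: float) -> float:
    """Add-on charge computed on undiscounted (list) product spend."""
    return list_spend * ADDON_RATE

# $2,000 of product spend at list incurs a $200 add-on charge, even if
# discounts reduce what is actually paid for the underlying product.
print(f"${addon_uplift(2000.0):.2f}")  # $200.00
```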
Ready to get started?

