Igniting the app economy with better ad experiences
Digital Turbine uses Databricks on Google Cloud to know their customers better
Tens of thousands of dollars in monthly savings on hosting and other costs
10 to 20% less time spent debugging issues between data systems
~20% less work for the data management team

Digital Turbine offers multiple technology solutions that drive superior mobile consumer experiences and results for the world’s leading telcos, advertisers and mobile publishers. Seeking to reduce the heavy maintenance burden of their extract, transform, load (ETL)–based data warehouses, the company switched to Databricks and then migrated to Google Cloud to increase efficiency and reduce costs. It’s now easy for teams across Digital Turbine to access data in a data lake — and they use Google Cloud tools such as BigQuery and Vertex AI to maximize their productivity within the Databricks Platform.
Separate data warehouses hinder data delivery
To deliver better outcomes for mobile advertisers, Digital Turbine (DT) manages vast amounts of on-device data. DT initially processed this data in several ETL-based data warehouses, but most of them ran on-premises and required heavy maintenance.
“When you have to manage multiple technologies to get a data pipeline to work, it puts tremendous strain on a data team,” Dan Ferrante, Director of Data Management, Digital Turbine, explained. “We often couldn’t deliver data in the format our data science and business intelligence (BI) teams needed. And when we faced challenges with revenue tracking tasks, there was confusion as everyone tried to figure out what was going on.”
DT needed a comprehensive, sustainable data vision. The company chose Apache Spark™ as their new data technology foundation and implemented it with Databricks on Google Cloud. Next, Dan looked for ways to maximize team efficiency while reducing hosting costs.
“Our vision was to build a data lake where we could scale up our business without increasing our costs,” Dan recalled. “We chose Google Cloud because we wanted a cloud partner that would help us interface with Databricks through a variety of data tools.”
Migrating to Google Cloud boosts efficiency for downstream users
DT appreciated that Google Cloud offered a host of tools that could help maximize team efficiency. The company used Databricks on Google Cloud to build a comprehensive data lake for all stakeholders. And because Google Cloud tools work well with Databricks Unity Catalog, DT has gained the flexibility to be technology-agnostic. “Our data scientists can use BigQuery in the Databricks environment — and tie that data out to any data in Databricks,” Dan explained. “Seeing the interactions between all our datasets helps drive business insights across product boundaries and opens new opportunities for driving intelligence. We wanted our product teams to concentrate on what they know best and use our data platforms to tie information together and increase its value.”
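Tying BigQuery data to data in Databricks, as Dan describes, is the kind of query that Unity Catalog's federation capability makes possible. A minimal sketch of what such a cross-source query might look like — all catalog, schema, table and column names here are hypothetical, assuming a BigQuery connection has been registered as a foreign catalog:

```sql
-- Hypothetical names: bq_fed is a Unity Catalog foreign catalog backed by
-- BigQuery, while main.ads.installs is a Delta table in Databricks.
SELECT
  i.app_id,
  i.install_ts,
  c.campaign_name
FROM main.ads.installs AS i            -- Delta table in the lakehouse
JOIN bq_fed.marketing.campaigns AS c   -- table living in BigQuery
  ON i.campaign_id = c.campaign_id
WHERE i.install_ts >= current_date() - INTERVAL 7 DAYS;
```

Because the BigQuery tables appear as ordinary three-part names, downstream users can join them with Delta tables without moving or copying data first.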
The data lakehouse enables downstream users to access data easily and build datasets across the company’s products. By building machine learning pipelines and using Google Cloud technology such as Vertex AI, DT has launched several other use cases that are helping the company deliver the right applications to the right users. For example, their Install Intelligence service relies on near real-time streaming data from Databricks to serve up applications to users. It can detect errors quickly and optimize applications on user devices to boost satisfaction. DT uses Vertex AI to decide in real time which applications users should see next.
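The near real-time pipelines feeding a service like Install Intelligence could be expressed as streaming tables in Databricks SQL. A hedged sketch — the bucket path, table name and columns are illustrative, not DT's actual schema:

```sql
-- Hypothetical source path and columns; STREAM read_files incrementally
-- picks up new files as they land in the bucket.
CREATE OR REFRESH STREAMING TABLE device_events
AS SELECT
  device_id,
  app_id,
  event_type,
  event_ts
FROM STREAM read_files('gs://example-landing/device-events/', format => 'json');
```

A downstream model, such as one served from Vertex AI, could then consume this continuously refreshed table to decide which applications a user should see next.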
When a series of acquisitions introduced new people and systems to DT, the company’s federated dataset allowed new employees to continue using BigQuery, Parquet and Delta Lake to get the data they needed. “New employees were able to use familiar tools to run queries and do exploratory analysis, which made the entire process more friendly and efficient,” Dan said.
Cloud solutions increase efficiency — and drive results
DT can now trust the datasets and data streams running on Structured Streaming on Google Cloud. As a result, the company’s data team has seen their workload reduced by about 20%.
When DT first began using Databricks SQL, it took 10 to 15 minutes to start up a cluster. On Databricks SQL Serverless, it takes just seconds. The company can now set aggressive time-outs for clusters rather than leaving them running continuously. “This makes it much easier for our downstream users to run queries,” Dan said. “And by optimizing our SQL queries with Databricks SQL Serverless and working with our Google Cloud partners, we’ve saved tens of thousands of dollars per month in hosting costs and in overall Databricks usage. That opens up our ability to bring in new features and launch new use cases.”
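The aggressive time-outs Dan mentions correspond to a warehouse's auto-stop setting. A sketch of the relevant fields in a Databricks SQL warehouse definition — the name and values are illustrative, and the field names follow the SQL Warehouses REST API as an assumption:

```json
{
  "name": "analytics-serverless",
  "cluster_size": "Small",
  "enable_serverless_compute": true,
  "auto_stop_mins": 5
}
```

Because serverless warehouses start in seconds, a short auto-stop window like this avoids paying for idle compute without making downstream users wait on cluster startup.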
With Databricks on Google Cloud, DT has cut delivery time for most new releases from three weeks to less than a week. The company’s product managers, data analysts and data scientists are more productive because they have the information they need. Some engineers have even used Gemini to create a chatbot that allows them to intelligently query their primary ticketing mechanism when building applications and deploying tickets. “Databricks on Google Cloud enables us to be technology-agnostic so that everyone can use the best tool for each task,” Dan said. “We’ve become far more efficient, and we’re improving our integrations. We’ve already reduced the time we spend debugging issues between data systems by 10 to 20%.”
Dan emphasized the importance of choosing the right solutions for the right data challenges. “The platform proved to be the right choice for the variety of goals we set, enhancing cost efficiency and strengthening our ability to deliver,” he explained. “We’re looking forward to leveraging Vertex AI and Spark to develop more machine learning tools that drive even better results for our customers.”