CUSTOMER STORY
Building omnichannel loyalty and engagement for retailers

INDUSTRY: Technology and software

SOLUTION: Customer retention, customer segmentation

PLATFORM USE CASE: Lakehouse, Delta Lake, ETL

CLOUD: AWS

 

Punchh, now part of PAR Technology Corporation, empowers retailers to deliver a comprehensive omnichannel experience through personalized loyalty programs, individual buying incentives and user-friendly mobile apps. But with significant data growth, Punchh quickly outgrew their production system and sought a cost-effective, centralized lakehouse environment that enabled reliable data analytics and AI across teams. Using the Databricks Lakehouse Platform on AWS, Punchh has accelerated ETL pipelines, democratized data and analytics, and improved BI and reporting—increasing customer retention and loyalty.

Needing faster and more accessible analytics at scale

Punchh originally used MySQL on AWS, but they quickly outgrew the system and needed a scalable, cost-effective, data-driven solution. “We wanted to have a data warehouse in place wherein we can do faster analytics. It was critical that we have the data available in the platform in a much faster and equitable way so that our analytics team can easily create intuitive reports to help us with decision-making,” explains Vibhor, Senior Big Data Engineer. The team landed on the idea of an enterprise data warehouse, but the costs were high and it did not support all forms of analytics, so they explored different data lake tools.

Hadoop-like systems seemed attractive, but they were difficult to scale and failed to provide ACID compliance and data integrity. The data warehouse struggled to support ML reliably because of the disparate storage and computing systems involved, and performance was poor, with models having to run for days to deliver insights. Compounding these issues were siloed data, siloed teams and slow raw data processing through Tableau. Vibhor said, “Some of the developers would use their own local version of Python or PySpark, which was not collaborative nor scalable.” Poor data access was the root cause of Punchh’s disjointed infrastructure, so they sought a centralized lakehouse platform on AWS to democratize data for analytics and ML.

Unifying data and analytics in a centralized lakehouse

With the Databricks Lakehouse Platform, Punchh is able to solve its number one problem: data access. Delta Lake, a data storage and management layer on the Lakehouse Platform, is able to deliver bronze, silver and gold levels of data to a variety of users with reliability and speed. From there, Punchh leverages the compute power of Databricks and Delta Lake to build fast ETL pipelines that provide clean data for SQL analytics and efficient ML. Intuitive Databricks notebooks foster collaboration across data teams, with access to all data and support for different programming interfaces in the same environment. Jagan Mangalampalli, Director of Big Data at Punchh, explained, “We are able to store raw data as well as refined data in the same lakehouse. We have some use cases where we need the raw data — for example, our ML team needs raw data for their models — and some other teams need the refined data, and we were able to meet those two use cases with this lakehouse architecture.”
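The bronze/silver/gold pattern described above can be sketched conceptually. This is a minimal, hypothetical illustration in plain Python — in a real lakehouse these layers would be Delta tables processed with PySpark, and all names and fields here are invented:

```python
# Hypothetical medallion-style refinement: raw -> cleaned -> aggregated.
# Plain lists stand in for what would be Delta tables in production.

bronze = [  # raw ingested loyalty events, warts and all
    {"user": "a1", "points": "120", "event": "purchase"},
    {"user": "a1", "points": "30", "event": "purchase"},
    {"user": None, "points": "15", "event": "purchase"},  # malformed record
]

# Silver layer: validated and typed — drop bad rows, cast fields.
silver = [
    {"user": r["user"], "points": int(r["points"])}
    for r in bronze
    if r["user"] is not None
]

# Gold layer: business-level aggregate ready for BI and reporting.
gold = {}
for r in silver:
    gold[r["user"]] = gold.get(r["user"], 0) + r["points"]

print(gold)  # {'a1': 150}
```

The point of the layering is that the ML team can read directly from bronze while reporting teams consume gold, all from the same storage — which is the dual use case Jagan describes.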

The analytics team uses Databricks SQL to QA new tables at a much faster rate, reducing costs and helping meet competitive service-level agreements (SLAs). The convenient SQL-native interface allows Punchh’s data team to query their data directly within Databricks and then share insights through rich visualizations and fast reporting via Tableau. Vibhor said, “Now with the Databricks environment, the analytics have become quite savvy to use. The Databricks notebooks and Databricks SQL allow us to do all analytics in one place. So that removes these silos as well.”

Delivering insights with speed and accuracy

With Databricks and Delta Lake, Punchh has increased processing performance 90x on its high-traffic campaign analytics dashboard. On their dashboard landing page, they reduced costs by 50% per request by building an ETL job that pre-aggregates 30 out of the 40 raw data metrics. Phylis Savari, Manager of Analytics at Punchh, said, “When you’re joining 12 tables on the fly every time a report is run by a customer, the cost to generate the report can be high. Databricks not only increased ETL performance but lowered overall costs, which solved those issues.” Anil Raparla, Director of Analytics at Punchh, added, “Now we are able to easily ingest and prepare data on a daily basis for customer reporting, and it’s all happening on Databricks.”
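The trade-off Phylis describes — computing expensive joins once in a scheduled ETL job rather than on every report request — can be sketched as follows. This is a hedged, hypothetical illustration (invented data and function names; the real job would run as PySpark on Databricks):

```python
# Hypothetical campaign data that would normally require multi-table joins.
raw_orders = [
    {"campaign": "c1", "amount": 10},
    {"campaign": "c1", "amount": 25},
    {"campaign": "c2", "amount": 5},
]

def report_on_the_fly(campaign):
    # Expensive path: scan and aggregate raw data on every request.
    return sum(o["amount"] for o in raw_orders if o["campaign"] == campaign)

# ETL step: pre-aggregate once, persist the result (e.g. a gold table).
pre_aggregated = {}
for o in raw_orders:
    pre_aggregated[o["campaign"]] = pre_aggregated.get(o["campaign"], 0) + o["amount"]

def report_pre_aggregated(campaign):
    # Cheap path at request time: a constant-time lookup.
    return pre_aggregated.get(campaign, 0)

# Both paths agree on the answer; only the cost per request differs.
assert report_on_the_fly("c1") == report_pre_aggregated("c1") == 35
```

Shifting the aggregation from query time to ETL time is what turns a per-request join cost into a one-time batch cost, which is the mechanism behind the 50% per-request savings cited above.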

Reports are loading in seconds rather than minutes, Punchh is exceeding customer SLAs, DevOps is seamless and development time is down. The boost in productivity at Punchh has increased customer satisfaction and retention, improving their customer net promoter score (NPS) by 12%, or 40 basis points. “Databricks is core to our business because its lakehouse architecture provides us a unified way to access, store and share actionable data,” Jagan said. “It’s going to be the core of our analytics strategy going forward. No question about that.”

  • 10x
    Faster time-to-insight
  • 30%
    Lower operational costs
  • 12%
    Increase in customer satisfaction

“Databricks is core to our business because its lakehouse architecture provides us a unified way to access, store and share actionable data. It’s going to be the core of our analytics strategy going forward. No question about that.”

– Jagan Mangalampalli, Director of Big Data, Punchh

Punchh leverages various data sources to generate reports via Tableau to inform strategic decisions that better engage and convert consumers — all powered by Databricks.