Customer Story
Revolutionizing Fashion with AI

INDUSTRY: Retail and consumer goods

VERTICAL USE CASE: Demand forecasting, safety stock analysis

PLATFORM USE CASE: Delta Lake, data science, machine learning, ETL

CLOUD: Azure


As a major disruptor and innovator in the fashion and retail industry, H&M relies on data at the core of everything they do. With stores opening globally at a rapid pace, they needed to improve supply chain and forecasting operations to streamline costs and maximize revenue. But their on-premises Hadoop system crippled their ability to ingest and analyze the data, generated by millions of customers, needed to power predictive models. Realizing they had reached their scalability ceiling, H&M moved to the Databricks Lakehouse Platform to simplify infrastructure management, enable performant data pipelines at scale, and streamline the machine learning lifecycle, allowing them to make data-driven decisions that accelerate business growth.

Legacy Architecture Unable to Support Company Growth

To improve supply chain efficiency, H&M chose to use data and AI to improve decisioning and operations. However, their legacy Hadoop-based architecture was inefficient and unable to scale to meet rapidly growing business requirements.

  • Massive volumes of data from over 5,000 stores in over 70 markets, with millions of customers every day.
  • Data engineering was challenged with fixed size clusters, complex infrastructure that was resource intensive and costly to scale, and data security issues.
  • Struggled to scale operations to support data science efforts against all of this data coming from various siloed data sources.
  • Time-to-market suffered because of significant DevOps delays, which impacted the ability of their data scientists to build, train, and deploy models quickly. It could take a whole year to go from ideation to production.

Simplifying Data Operations Boosts ML Innovations

Databricks provides H&M with a Unified Data Analytics Platform on Azure that has fostered a scalable, collaborative environment across data science and engineering. Data engineers and scientists can focus on the entire data lifecycle instead of managing clusters, and can train and operationalize models rapidly, with the goal of accelerating supply chain decisions for the business.

  • Fully managed platform with automated cluster management simplifies infrastructure management and operations at scale.
  • Collaborative notebook environment with support for multiple languages (SQL, Scala, Python, R) enables a diverse team of users to work together in their preferred language — creating a unified cross-team environment to fuel productivity.
  • Integration of the Databricks platform with Azure and technologies such as Apache Airflow and Kubernetes enables elastic model training at massive scale.

Smarter Decisioning, Dramatic Cost Savings

At H&M, even a 0.1% improvement in the accuracy of a single model has a huge impact on the business. With Databricks, H&M is making data more accessible to every decision maker, helping the business grow faster and stay relevant.

  • Improved operational efficiency: Features such as auto-scaling clusters have improved operations from data ingestion through the entire machine learning lifecycle, reducing operational costs by 70%.
  • Better cross-team collaboration: A unified analytics environment for both data scientists and engineers has dramatically reduced the number of components needed to go into production, with easy setup and management.
  • Huge business impact with faster time-to-insight: The ability to make more granular decisions has allowed H&M to improve strategic decisioning and business forecasting.
“Databricks is the core of our data business, it’s the place we go for insights.”

– Errol Koolmeister, Head of AI Technology and Architecture, H&M

Related Content

Technical Talk at Spark + AI Summit EU 2019