Overall Equipment Effectiveness

What is Overall Equipment Effectiveness?

Overall Equipment Effectiveness (OEE) is a measure of how well a manufacturing operation is utilized (facilities, time and material) compared to its full potential, during the periods when it is scheduled to run. It identifies the percentage of manufacturing time that is truly productive. OEE serves as a dashboard for the total performance of a discrete or continuous process. An OEE of 100% means that only good parts are produced (100% quality), at the maximum speed (100% performance), and without interruption (100% availability).
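The three components multiply into a single score. A minimal sketch in Python, using the standard definitions (availability = run time / planned production time, performance = ideal cycle time × total count / run time, quality = good count / total count); the shift figures below are hypothetical:

```python
def oee(planned_time_min, run_time_min, ideal_cycle_time_min,
        total_count, good_count):
    """Compute OEE from the standard availability/performance/quality factors."""
    availability = run_time_min / planned_time_min                      # uptime vs. schedule
    performance = (ideal_cycle_time_min * total_count) / run_time_min   # speed vs. ideal
    quality = good_count / total_count                                  # good parts vs. all parts
    return availability * performance * quality

# Hypothetical shift: 480 min planned, 420 min actually running,
# 1.0 min ideal cycle time, 400 parts produced, 380 of them good.
score = oee(480, 420, 1.0, 400, 380)
print(f"OEE = {score:.1%}")  # → OEE = 79.2%
```

Note that each factor individually looks healthy (87.5%, 95.2%, 95.0%), yet the combined OEE is under 80% — which is why the metric is measured as a product rather than tracked as three separate numbers.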

Measuring OEE is a manufacturing best practice. By measuring OEE and the underlying losses, important insights can be gained on how to systematically improve the manufacturing process. OEE is an effective metric for identifying losses, benchmarking progress, and improving the productivity of manufacturing equipment (i.e., eliminating waste).

Why is this important?

OEE has been used as the prime manufacturing metric for decades. Historically, OEE monitoring relied on manual, batch-based data collection directly from the machines, from which the metric was calculated. In itself, OEE is a reactive dashboard that monitors past, historical performance. Its true power lies in guiding the implementation of use cases that improve the components of OEE: deploying predictive maintenance to improve availability, or quality control leveraging computer vision to improve quality, both of which are predictive in function.

What are Databricks’ differentiated capabilities?

  • Databricks’ Lakehouse uses technologies that include Delta, Delta Live Tables, Auto Loader and Photon to enable customers to make data available for real-time decisions.
  • Lakehouse for MFG supports the largest data jobs at near real-time intervals. For example, customers are ingesting nearly 400 million events per day from transactional log systems at 15-second intervals. By contrast, because of the disruption to reporting and analysis that occurs during data processing, most customers load data to their data warehouse only during a nightly batch; some companies even load data weekly or monthly.
  • A Lakehouse event-driven architecture provides a simpler method of ingesting and processing batch and streaming data than legacy approaches, such as lambda architectures. This architecture handles the change data capture and provides ACID compliance to transactions.
  • Delta Live Tables simplifies the creation of data pipelines and automatically builds in lineage to assist with ongoing management.
  • The Lakehouse allows for real-time stream ingestion of data and analytics on streaming data. Data warehouses, by contrast, require extraction, transformation and loading, followed by an additional extraction from the warehouse, before any analytics can be performed.
  • Photon provides record-setting query performance, enabling users to query even the largest of data sets to power real-time decisions in BI tools.
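The change data capture handling mentioned above can be illustrated independently of any Spark API. A hypothetical sketch of merge (upsert/delete) semantics applied to a batch of change events in plain Python — Delta Lake's MERGE applies the same logic transactionally and at scale:

```python
def apply_cdc(table, changes):
    """Apply a batch of change events to a keyed table.

    `table` is a dict keyed by record id; `changes` is a list of
    (op, key, row) tuples mimicking a change feed. This is an
    illustrative stand-in for a transactional MERGE, not Databricks code.
    """
    for op, key, row in changes:
        if op == "delete":
            table.pop(key, None)
        else:  # "insert" and "update" both upsert the row
            table[key] = row
    return table

machines = {"m1": {"status": "running"}, "m2": {"status": "idle"}}
feed = [
    ("update", "m1", {"status": "down"}),
    ("insert", "m3", {"status": "running"}),
    ("delete", "m2", None),
]
print(apply_cdc(machines, feed))
# {'m1': {'status': 'down'}, 'm3': {'status': 'running'}}
```

In an event-driven lakehouse architecture, batches like `feed` arrive continuously from the stream, and ACID transactions guarantee that readers never observe a half-applied batch.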

What data challenges need to be addressed to build predictive OEE capabilities?

  • Handling the volume and variety of IoT data—To enable the predictive use cases embedded in OEE, manufacturers need the Lakehouse to handle all types of diverse data structures and schemas: everything from intermittent per-second readings of temperature, pressure, and vibration to fully unstructured data (e.g., images, video, text, spectral data) and other forms such as thermographic or acoustic signals, delivered from the edge through diverse supported drivers and protocols.
  • Managing the complexity of real-time data—To drive continuous process monitoring, throughput optimization or predictive maintenance, the Lakehouse enables real-time analytics on streaming data. It effectively ingests, stores, and processes streaming data in real time or near-real time in order to instantly deliver insights and action.
  • Freeing data from independent silos—Specialized processes (innovation platforms, QMS, MES, etc.) within the value chain tend to produce disparate data sources and data management platforms tailored to unique siloed solutions. These narrow point solutions limit enterprise value by considering only a fraction of the insight that cross-enterprise data can offer; in addition, duplicate siloed solutions divide the business, limiting collaboration opportunities. In contrast, the Lakehouse ingests, stores, manages, and processes streaming data from all points in the value chain, combines it with data historian, ERP, MES and QMS sources, and turns it into actionable insights.
  • Diverse analytical capabilities—Legacy data warehouses offer limited ability to provide insights and analytics into platform usage and performance. For Connected Manufacturing IoT solutions, the Lakehouse provides a wide range of analytical options—including everything from SQL analytics and search capabilities, to tools to support machine learning and modeling, along with tight integration with leading business intelligence (BI) solutions that offer specialized dashboard and business analytics capabilities.
  • Predictive modeling capabilities—Predictive modeling capabilities are key to delivering insights, and the Lakehouse provides notebook-driven machine learning capabilities used to predict and prevent disturbances before they impact operations.
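To make the predictive idea concrete, here is a deliberately simple statistical sketch in plain Python (not Databricks' specific tooling) that flags sensor readings drifting far from their historical baseline. In practice this role would be played by a trained model in a notebook running over streaming lakehouse data; all readings below are hypothetical:

```python
from statistics import mean, stdev

def flag_anomalies(history, new_readings, z_threshold=3.0):
    """Flag readings more than z_threshold standard deviations from the
    historical baseline -- a toy stand-in for a trained predictive model."""
    mu, sigma = mean(history), stdev(history)
    return [x for x in new_readings if abs(x - mu) / sigma > z_threshold]

# Hypothetical vibration readings (mm/s) from a machine's normal operation.
baseline = [2.0, 2.1, 1.9, 2.2, 2.0, 1.8, 2.1, 2.0, 1.9, 2.0]
incoming = [2.1, 2.0, 5.5, 1.9]

print(flag_anomalies(baseline, incoming))  # → [5.5]
```

A flagged reading like `5.5` would trigger a maintenance work order before the machine actually fails — improving the availability component of OEE rather than merely reporting on it after the fact.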
