MLflow lets you run experiments with any ML library, framework, or language, and automatically records the parameters, results, code, and data from each experiment so that you can compare results and identify the best-performing runs.
With Managed MLflow on Databricks, you can now track, share, visualize, and manage experiments securely from within the Databricks Workspace and notebooks.
MLflow lets you package projects in a standard format that integrates with Git and Anaconda, capturing dependencies such as libraries along with parameters and data.
With Managed MLflow on Databricks, you can now quickly launch reproducible runs remotely from your laptop as a Databricks job.
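The packaging format described above is an `MLproject` file at the project root. The sketch below is illustrative, not a complete project: the entry-point script `train.py` and the `learning_rate` parameter are hypothetical, and `conda.yaml` is assumed to list the project's library dependencies.

```yaml
# MLproject — declares the project's environment and entry points
name: my_project
conda_env: conda.yaml            # pins library dependencies for reproducibility
entry_points:
  main:
    parameters:
      learning_rate: {type: float, default: 0.01}
    command: "python train.py --learning-rate {learning_rate}"
```

A project laid out this way can be run locally with `mlflow run . -P learning_rate=0.05`, or launched remotely as a Databricks job with the `databricks` backend to `mlflow run`.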
MLflow lets you quickly deploy production models for batch inference on Apache Spark™, or as REST APIs using built-in integration with Docker containers, Azure ML, or Amazon SageMaker.
With Managed MLflow on Databricks, you can now operationalize and monitor production models using the Databricks Jobs Scheduler and auto-managed clusters that scale to meet business needs.