Near Real-Time Anomaly Detection with Delta Live Tables and Databricks Machine Learning
Why is Anomaly Detection Important?

Whether in retail, finance, cybersecurity, or any other industry, spotting anomalous behavior as soon as it happens...
Two weeks ago we held a live webinar – Databricks' Data Pipeline: Journey and Lessons Learned – to show how Databricks used Apache Spark to simplify our own log ETL pipeline. The webinar describes an architecture where you can develop your pipeline code in notebooks, create Jobs to productionize your notebooks, and utilize REST APIs to turn all of this into a continuous integration workflow.
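The REST API step described above can be driven from a continuous integration script. As a minimal sketch (the workspace URL, job ID, and parameter names here are placeholders, not values from the webinar), a CI job might assemble a Jobs API 2.1 `run-now` request like this:

```python
import json

def build_run_now_request(host, job_id, notebook_params=None):
    """Assemble the URL and JSON body for a Jobs API 2.1 run-now call.

    host and job_id are placeholders for your own workspace URL and
    job ID; this sketch only builds the request, it does not send it.
    """
    url = f"{host}/api/2.1/jobs/run-now"
    payload = {"job_id": job_id}
    if notebook_params:
        # Parameters passed through to the notebook task on this run.
        payload["notebook_params"] = notebook_params
    return url, json.dumps(payload)

# Example: trigger a hypothetical log-ETL job with a date parameter.
url, body = build_run_now_request(
    "https://example.cloud.databricks.com",  # placeholder workspace URL
    1042,                                    # placeholder job ID
    {"run_date": "2024-01-01"},
)
```

The request would then be POSTed with a bearer token for authentication; keeping the request-building logic separate makes it easy to unit test in CI before any job is actually triggered.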
We have answered the common questions raised by webinar viewers below. If you have additional questions, please check out the Databricks Forum.