In this session you will learn how H&M has created a reference architecture for deploying its machine learning models on Azure using Databricks, following DevOps principles. The architecture is currently used in production and has been iterated on multiple times to address pain points discovered along the way. The presenting team is responsible for ensuring that best practices are applied across all H&M use cases, covering hundreds of models across the entire H&M group.
This architecture not only lets data scientists use notebooks for exploration and modeling, but also gives engineers a way to build robust, production-grade code for deployment. The session will also cover topics such as lifecycle management, traceability, automation, scalability, and version control.
With 10+ years of experience across a wide variety of industries, Errol has reached an expert level in working with and extracting value from data, both hands-on and from a strategic perspective, by creating data products and leading large teams. In most cases he does so by leveraging open source technologies such as Spark, R, Python, and TensorFlow. Currently Errol is the lead data scientist at H&M, where he manages the data science and ML engineering teams delivering models into production.
With almost 15 years of experience in both software engineering and AI, Keven has become a specialist in big data and machine learning, from prototype to production. Keven started his career as a software engineer in the telecom industry, took on various technical leadership roles afterwards, and had the pleasure of filing 30+ patents as a researcher in the AI and IoT domains. Before joining H&M, Keven led the data engineering practice at ThinkBigAnalytics and worked with several large enterprises in the Nordics. At H&M, Keven enjoys both hands-on work developing machine learning products and building the engineering team and culture.