With almost 15 years’ experience in software engineering and AI, Keven has become a specialist in big data and machine learning, from prototype to production. He began his career as a software engineer in the telecom industry and later took on various technical leadership roles, filing more than 30 patents as a researcher in the AI and IoT domains. Before joining H&M, Keven led the data engineering practice at ThinkBigAnalytics, working with several large enterprises in the Nordics. At H&M, he enjoys both hands-on development of machine learning products and building the engineering team and culture.
In this session you will learn how H&M created a reference architecture for deploying machine learning models on Azure using Databricks, following DevOps principles. The architecture is currently in production and has been iterated on multiple times to address pain points discovered along the way. The presenting team is responsible for ensuring that best practices are applied across all H&M use cases, covering hundreds of models across the entire H&M group.
This architecture not only enables data scientists to use notebooks for exploration and modeling, but also gives engineers a way to build robust, production-grade code for deployment. The session will additionally cover lifecycle management, traceability, automation, scalability, and version control.