Data Brew
Season 2, Episode 4

Hyperparameter and Neural Architecture Search

Liam Li is a leading researcher in the fields of hyperparameter optimization and neural architecture search, and is the author of the seminal Hyperband paper. In this session, Liam discusses the evolution of hyperparameter optimization techniques and illustrates how every data scientist can benefit from neural architecture search.


Guest


Liam Li

Liam Li recently completed his PhD in Machine Learning at Carnegie Mellon University, where he was advised by Ameet Talwalkar. His thesis on efficient methods for automating machine learning showcases his work on Hyperband, large-scale hyperparameter tuning, and efficient neural architecture search. He has since joined Determined AI as a machine learning engineer, building a cutting-edge platform for deep learning that enables users to be vastly more productive. He remains active in the research and AutoML community and is a co-chair of the 2nd ICLR workshop on Neural Architecture Search.

Denny Lee 00:06

Welcome to Data Brew by Databricks with Denny and Brooke. This series allows us to explore various topics in the data and AI community. Whether we’re talking about data engineering or data science, we’ll interview subject matter experts to dive deeper into those topics, and while we’re at it, please do enjoy your morning brew. My name is Denny Lee and I’m a developer advocate at Databricks.
