
Deep Learning and Modern NLP

In this tutorial, we’ll cover the fundamental building blocks of neural network architectures and how they are used to tackle problems in modern natural language processing. Topics will include an overview of vector representations of language, text classification, named entity recognition, and sequence-to-sequence modeling approaches. Emphasis will be placed on the shape of these problems from the perspective of deep learning architectures, which helps develop an intuition for identifying which neural network techniques are most applicable to new problems practitioners may encounter.

This tutorial is targeted at anyone interested in natural language processing or deep learning. I’ll assume little prior experience with either, and will build intuition from the ground up using a highly visual approach to describing neural networks.

This tutorial would be ideal for data scientists currently working in, or interested in, NLP or deep learning, as well as analytics or business professionals interested in learning what types of problems can be solved with modern NLP techniques.

This will be both an instructor-led and hands-on interactive session. Instructions on how to obtain the tutorial material will be covered in class.

What you will learn:

– How to train neural networks for text classification tasks

– How to use neural networks to create vector representations of words and sentences

– How to build neural networks for sequence tagging tasks

– How to build a simple neural network for machine translation tasks
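To give a flavor of the first item above, here is a minimal from-scratch sketch of a neural text classifier: bag-of-words features fed into a one-hidden-layer network trained with gradient descent. The toy "reviews," network size, and hyperparameters are illustrative assumptions for this sketch, not material from the tutorial itself.

```python
import numpy as np

# Toy labeled documents (illustrative assumption): 1 = positive, 0 = negative.
docs = ["good great fun", "great movie good", "bad awful boring", "boring bad plot"]
labels = np.array([1.0, 1.0, 0.0, 0.0])

# Build a bag-of-words vocabulary and featurizer.
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}

def featurize(doc):
    x = np.zeros(len(vocab))
    for w in doc.split():
        if w in index:
            x[index[w]] += 1.0
    return x

X = np.stack([featurize(d) for d in docs])

# One hidden layer (tanh) and a sigmoid output; sizes are arbitrary choices.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(len(vocab), 8))
W2 = rng.normal(scale=0.1, size=8)

def forward(X):
    h = np.tanh(X @ W1)                   # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2)))   # predicted probability of class 1
    return h, p

# Batch gradient descent on the cross-entropy loss.
lr = 0.5
for _ in range(2000):
    h, p = forward(X)
    grad_out = p - labels                         # dLoss/dLogit for sigmoid + cross-entropy
    grad_h = np.outer(grad_out, W2) * (1 - h**2)  # backprop through tanh
    W2 -= lr * (h.T @ grad_out) / len(docs)
    W1 -= lr * (X.T @ grad_h) / len(docs)

def predict(doc):
    _, p = forward(featurize(doc)[None, :])
    return int(p[0] > 0.5)
```

After training, `predict` maps unseen combinations of known words to a class, e.g. `predict("good fun")` should return 1. Real systems replace the bag-of-words input with learned word embeddings, which is where the tutorial’s vector-representation material comes in.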


What you will need:

– A fully-charged laptop (8–16 GB of memory) with Chrome or Firefox and Anaconda Python 3

– Intermediate knowledge of Python, and of machine learning and deep learning concepts


See More Spark + AI Summit in San Francisco 2019 Videos

About Zachary S. Brown

Zachary is currently a Lead Data Scientist at S&P Global Market Intelligence, where he leads a small team with a focus on modern natural language processing and its application to content classification and data extraction. Zachary received his Ph.D. in Computational Physics from The College of William & Mary in 2014, where he calculated features of the strong force using simulations on high-performance computing clusters. He has a passion for education, and has led and contributed to data science education initiatives at Capital One, Cloudera, and most recently at S&P Global. In his free time, he helps to organize the local Data Science Community Meetup and occasionally teaches college physics courses in Richmond, Virginia.