Just Enough Scala for Spark - Databricks



Apache Spark is written in Scala. Hence many, if not most, data engineers adopting Spark are also adopting Scala, while Python and R remain popular with data scientists. Fortunately, you don't need to master Scala to use Spark effectively. This session teaches the core features of Scala you need to know to be effective with Spark's Scala API. Topics include:

1. Classes, methods, and functions
2. Immutable vs. mutable values
3. Type inference
4. Pattern matching
5. Scala collections and the common operations on them (the basis of Spark's RDD API)
6. Really useful Scala types: case classes, tuples, and options
7. Effective use of the Spark shell (the Scala interpreter)
8. Common mistakes and how to avoid them
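To give a flavor of the features the session covers, here is a minimal, self-contained sketch. The names (`Person`, `greet`, and the sample data) are illustrative inventions, not material from the talk itself; each comment notes which topic from the list a line demonstrates.

```scala
// Illustrative sketch of the Scala features listed above.
// All identifiers and data here are hypothetical examples.
object JustEnoughScala {

  // Case class: immutable fields, plus equals/hashCode/toString for free.
  case class Person(name: String, age: Int)

  // A method; the return type could be inferred but is spelled out here.
  def greet(p: Person): String = s"Hello, ${p.name}"

  def main(args: Array[String]): Unit = {
    // Immutable vs. mutable: prefer val; var allows reassignment.
    val fixed = 10        // type Int is inferred
    var counter = 0       // reassignable
    counter += fixed

    // Tuples group values without defining a class.
    val pair: (String, Int) = ("spark", 3)

    // Options make "possibly missing" explicit instead of using null.
    val maybeAge: Option[Int] = Some(42)

    // Pattern matching destructures values and handles each case.
    val described = maybeAge match {
      case Some(n) if n >= 18 => "adult"
      case Some(_)            => "minor"
      case None               => "unknown"
    }

    // Collections: filter/map (and friends like reduce) have the same
    // shape as the operations on Spark's RDD API.
    val people = List(Person("Ada", 36), Person("Linus", 12))
    val adults = people.filter(_.age >= 18).map(_.name)

    println(greet(people.head))   // Hello, Ada
    println(described)            // adult
    println(adults)               // List(Ada)
  }
}
```

In the Spark shell, the body of `main` can be entered line by line at the `scala>` prompt, which is a convenient way to experiment with these constructs before using them against RDDs or DataFrames.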