This 1-day course provides an overview of the core features of Scala that you need to know to use Apache Spark effectively. You’ll learn the most important Scala syntax, idioms, and APIs for Spark development.
Each topic includes lecture content along with hands-on use of Scala through an elegant web-based notebook environment. Students may keep the notebooks and continue to use them with the free Databricks Community Edition offering; all examples are guaranteed to run in that environment. Alternatively, each notebook can be exported as source code and run within any Spark environment.
- Understand the basics of Scala programming, without delving into the more advanced areas of Scala that aren’t necessary for Spark.
- Compare and contrast Scala with languages like Python and Java.
- Learn how to write classes, functions, and full programs in Scala.
- Understand the basics of Scala’s compile-time type system.
- Learn how Scala’s compact, powerful syntax can help you write better Spark jobs with less code.
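As a taste of that last objective, here is a word count written with plain Scala collection methods; the same method-chaining style carries over directly to Spark's RDD and Dataset APIs. This is a minimal sketch, and the sample lines are invented for illustration.

```scala
// Word count over a small in-memory dataset (invented sample data).
val lines = List("spark and scala", "scala and more scala")

val counts = lines
  .flatMap(_.split(" "))                       // split each line into words
  .groupBy(identity)                           // group identical words together
  .map { case (word, ws) => (word, ws.size) }  // count each group

println(counts("scala"))  // 3
```

In Spark, the same logic is typically one short chain as well (e.g. `flatMap`, `map`, and a key-based reduction), which is the "less code" point in practice.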
- Brief comparison of Scala and Java
- Brief overview of the Scala language
- How to compile a Scala program
- The Scala shell (REPL)
- Brief overview of tooling
- developing Scala in an IDE
- sbt, the standard Scala build tool
- Basic Scala syntax
- Scala variables, including mutable vs. immutable values
- Basic Scala types (value types such as Int and Boolean, strings, tuples)
- Control flow (loops, conditionals)
- Functions and lambdas (anonymous functions)
- Object-oriented programming in Scala
- classes, traits, and inheritance
- Scala collections and the common operations on them
- Type inference
- Overview of functional vs. imperative programming
- Case classes
- Pattern matching
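Several of the topics above (immutable values, type inference, collections, lambdas, case classes, and pattern matching) can be previewed in a few lines of plain Scala. This is a minimal sketch; the `Employee` class and the sample data are invented for illustration.

```scala
// Case class: an immutable record with equality, toString, and
// pattern-matching support generated by the compiler.
case class Employee(name: String, dept: String, salary: Double)

// `val` declares an immutable value; the List[Employee] type is inferred.
val employees = List(
  Employee("Ada", "eng", 120000),
  Employee("Grace", "eng", 130000),
  Employee("Ella", "sales", 90000)
)

// Collection operations with lambdas; the same filter/map style
// appears throughout Spark's APIs.
val engTotal = employees
  .filter(_.dept == "eng")
  .map(_.salary)
  .sum

// Pattern matching destructures a case class by position.
def describe(e: Employee): String = e match {
  case Employee(n, "eng", _) => s"$n is an engineer"
  case Employee(n, d, _)     => s"$n works in $d"
}

println(engTotal)                  // 250000.0
println(describe(employees.head))  // Ada is an engineer
```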