Course Outline
Scala primer
- A quick introduction to Scala
- Labs: Getting to know Scala
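As a taste of what the primer labs cover, here is a minimal, self-contained sketch of the kind of Scala used throughout the course (immutable values, a case class, and collection transforms); the Employee data is invented for illustration.

```scala
// Minimal Scala warm-up: immutable values, a case class, and collection transforms.
object ScalaWarmup {
  case class Employee(name: String, salary: Double)

  def main(args: Array[String]): Unit = {
    val staff = List(Employee("Ana", 52000), Employee("Bo", 61000), Employee("Cy", 47000))

    // map/filter here are the same operations Spark later applies to distributed data
    val wellPaid = staff.filter(_.salary > 50000).map(_.name)
    println(wellPaid)                      // List(Ana, Bo)

    val total = staff.map(_.salary).sum
    println(f"Total payroll: $total%.2f")
  }
}
```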
Spark Basics
- Background and history
- Spark and Hadoop
- Spark concepts and architecture
- Spark ecosystem (Core, Spark SQL, MLlib, Streaming)
- Labs: Installing and running Spark
First Look at Spark
- Running Spark in local mode
- Spark web UI
- Spark shell
- Analyzing a dataset – part 1
- Inspecting RDDs
- Labs: Spark shell exploration
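A rough sketch of the kind of spark-shell exploration this module covers; the file path is a placeholder for whatever dataset the lab supplies, and `sc` is the SparkContext the shell creates for you.

```scala
// Inside spark-shell (local mode) a SparkContext named `sc` is already defined.
// The file path below is just a placeholder for the lab's dataset.
val lines = sc.textFile("data/sample.txt")

lines.count()                               // how many lines are in the file
lines.first()                               // peek at the first line
lines.getNumPartitions                      // how the RDD is split across partitions
lines.filter(_.contains("error")).take(5)   // small, shell-friendly sample
```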
RDDs
- RDD concepts
- Partitions
- RDD Operations / transformations
- RDD types
- Key-Value pair RDDs
- MapReduce on RDD
- Caching and persistence
- Labs: Creating and inspecting RDDs; caching RDDs
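A minimal, standalone sketch of the RDD lab themes (transformations, key-value pair RDDs, a MapReduce-style word count, and caching); the input lines and app name are invented for illustration.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.storage.StorageLevel

// Standalone sketch: transformations, pair RDDs, MapReduce-style counting, caching.
object RddBasics {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("RddBasics").master("local[*]").getOrCreate()
    val sc = spark.sparkContext

    val lines = sc.parallelize(Seq("spark makes big data simple", "spark runs on hadoop"), numSlices = 2)

    // Classic MapReduce-style word count expressed as RDD transformations
    val counts = lines
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))                  // key-value pair RDD
      .reduceByKey(_ + _)

    counts.persist(StorageLevel.MEMORY_ONLY)   // cache before reusing the RDD
    println(counts.collect().mkString(", "))
    println(s"partitions = ${counts.getNumPartitions}")

    spark.stop()
  }
}
```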
Spark API programming
- Introduction to Spark API / RDD API
- Submitting the first program to Spark
- Debugging / logging
- Configuration properties
- Labs: Programming with the Spark API; submitting jobs
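A skeleton of the sort of application the submission lab builds; the app name and the configuration value shown are illustrative choices rather than required settings, and the master URL is assumed to come from spark-submit.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Skeleton of a standalone application to package and hand to spark-submit.
// No master is hard-coded here; spark-submit supplies it via --master.
object FirstSparkApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("FirstSparkApp")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")  // example property

    val sc = new SparkContext(conf)
    sc.setLogLevel("WARN")                     // quieter logs while debugging

    val data = sc.parallelize(1 to 1000)
    println(s"sum = ${data.sum()}, max = ${data.max()}")

    sc.stop()
  }
}
```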
Spark SQL
- SQL support in Spark
- DataFrames
- Defining tables and importing datasets
- Querying DataFrames using SQL
- Storage formats: JSON / Parquet
- Labs: Creating and querying DataFrames; evaluating data formats
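A sketch of the Spark SQL workflow this module walks through (load JSON, register a temporary view, query it with SQL, write Parquet); file paths and column names are placeholders for the lab dataset.

```scala
import org.apache.spark.sql.SparkSession

// Load JSON into a DataFrame, query it with SQL, and persist it as Parquet.
object SqlBasics {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("SqlBasics").master("local[*]").getOrCreate()

    val people = spark.read.json("data/people.json")     // schema inferred from the JSON
    people.printSchema()

    people.createOrReplaceTempView("people")              // make it queryable with SQL
    val adults = spark.sql("SELECT name, age FROM people WHERE age >= 18")
    adults.show()

    // Columnar Parquet is usually the better storage format for repeated analytics
    adults.write.mode("overwrite").parquet("out/adults.parquet")

    spark.stop()
  }
}
```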
MLlib
- MLlib intro
- MLlib algorithms
- Labs: Writing MLlib applications
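A small sketch of an MLlib (spark.ml) application: assemble a features column and fit k-means; the inline data points and column names are made up for illustration.

```scala
import org.apache.spark.ml.clustering.KMeans
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession

// Assemble numeric columns into a feature vector, fit k-means, inspect the centers.
object KMeansSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("KMeansSketch").master("local[*]").getOrCreate()
    import spark.implicits._

    val points = Seq((0.0, 0.1), (0.2, 0.0), (9.0, 9.1), (9.2, 8.9)).toDF("x", "y")

    val features = new VectorAssembler()
      .setInputCols(Array("x", "y"))
      .setOutputCol("features")
      .transform(points)

    val model = new KMeans().setK(2).setSeed(42L).fit(features)
    model.clusterCenters.foreach(println)

    spark.stop()
  }
}
```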
GraphX
- GraphX library overview
- GraphX APIs
- Labs: Processing graph data using Spark
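A sketch of the GraphX API on a tiny, invented property graph: build the graph from vertex and edge RDDs, then run PageRank over it.

```scala
import org.apache.spark.graphx.{Edge, Graph}
import org.apache.spark.sql.SparkSession

// Build a small property graph and run PageRank on it.
object GraphSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("GraphSketch").master("local[*]").getOrCreate()
    val sc = spark.sparkContext

    val vertices = sc.parallelize(Seq((1L, "alice"), (2L, "bob"), (3L, "carol")))
    val edges    = sc.parallelize(Seq(Edge(1L, 2L, "follows"), Edge(2L, 3L, "follows"), Edge(3L, 1L, "follows")))

    val graph = Graph(vertices, edges)
    println(s"vertices = ${graph.numVertices}, edges = ${graph.numEdges}")

    val ranks = graph.pageRank(tol = 0.001).vertices   // (vertexId, rank) pairs
    ranks.collect().foreach(println)

    spark.stop()
  }
}
```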
Spark Streaming
- Streaming overview
- Evaluating Streaming platforms
- Streaming operations
- Sliding window operations
- Labs: Writing Spark Streaming applications
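A sketch of a DStream-based streaming job with a sliding window word count; the socket source on localhost:9999 is a stand-in for whatever stream the lab uses. Note the local[2] master: one thread is taken by the receiver, so at least one more is needed to process the data.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Windowed word count over a text stream read from a socket.
object StreamingSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("StreamingSketch").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(5))    // 5-second micro-batches

    val lines  = ssc.socketTextStream("localhost", 9999)
    val counts = lines
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKeyAndWindow((a: Int, b: Int) => a + b, Seconds(30), Seconds(10))  // 30s window, sliding every 10s

    counts.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```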
Spark and Hadoop
- Hadoop Intro (HDFS / YARN)
- Hadoop + Spark architecture
- Running Spark on Hadoop YARN
- Processing HDFS files using Spark
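A sketch of an application intended to run on YARN and read from HDFS; the HDFS path is a placeholder, and the master is assumed to be supplied by spark-submit (--master yarn) rather than hard-coded.

```scala
import org.apache.spark.sql.SparkSession

// Read a file from HDFS and count matching lines; meant to be submitted to YARN.
object HdfsWordCount {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("HdfsWordCount").getOrCreate()
    val sc = spark.sparkContext

    val lines  = sc.textFile("hdfs:///data/logs/access.log")   // placeholder HDFS path
    val errors = lines.filter(_.contains("ERROR")).count()
    println(s"error lines = $errors")

    spark.stop()
  }
}
```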
Spark Performance and Tuning
- Broadcast variables
- Accumulators
- Memory management & caching
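A sketch of the two shared-variable mechanisms in this module, broadcast variables and accumulators; the lookup table and record values are invented for illustration.

```scala
import org.apache.spark.sql.SparkSession

// Broadcast a read-only lookup table to executors; count bad records with an accumulator.
object TuningSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("TuningSketch").master("local[*]").getOrCreate()
    val sc = spark.sparkContext

    // Broadcast: ship the lookup table to every executor once, not with every task closure
    val countryNames = sc.broadcast(Map("DE" -> "Germany", "FR" -> "France"))

    // Accumulator: updated on the executors, read back on the driver
    val badRecords = sc.longAccumulator("badRecords")

    val codes = sc.parallelize(Seq("DE", "FR", "??", "DE"))
    val resolved = codes.map { code =>
      countryNames.value.get(code) match {
        case Some(name) => name
        case None       => badRecords.add(1); "unknown"
      }
    }

    println(resolved.collect().mkString(", "))
    println(s"bad records = ${badRecords.value}")

    spark.stop()
  }
}
```

The accumulator total is only reliable after an action (here, collect()) has forced the transformation to run.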
Spark Operations
- Deploying Spark in production
- Sample deployment templates
- Configurations
- Monitoring
- Troubleshooting
Requirements
- Familiarity with Java, Scala, or Python (the labs use Scala and Python)
- Basic understanding of a Linux development environment (command-line navigation, editing files with vi or nano)
Testimonials (6)
Doing similar exercises in different ways really helps in understanding what each component (Hadoop/Spark, standalone/cluster) can do on its own and together. It gave me ideas on how I should test my application on my local machine when I develop vs. when it is deployed on a cluster.
Thomas Carcaud - IT Frankfurt GmbH
Course - Spark for Developers
Ajay was very friendly, helpful and also knowledgeable about the topic he was discussing.
Biniam Guulay - ICE International Copyright Enterprise Germany GmbH
Course - Spark for Developers
Ernesto did a great job explaining the high level concepts of using Spark and its various modules.
Michael Nemerouf
Course - Spark for Developers
The trainer made the class interesting and entertaining which helps quite a bit with all day training.
Ryan Speelman
Course - Spark for Developers
We know a lot more about the whole environment.
John Kidd
Course - Spark for Developers
Richard is very calm and methodical, with an analytic insight - exactly the qualities needed to present this sort of course.