
Apache Spark Training in Singapore and India

Save 22%
Original price Rs. 290,000.00
Current price Rs. 225,000.00
Apache Spark Training by Cloud Enabled Pte Ltd in Singapore and India
Course Summary

This five-day hands-on training course delivers the key concepts and expertise developers need to use Apache Spark to develop high-performance parallel applications. Participants will learn how to use Spark Core and Spark SQL to query structured data and Spark Streaming to perform real-time processing on streaming data from a variety of sources.

Developers will also practice writing applications that use core Spark to perform ETL processing and iterative algorithms. The course covers how to work with “big data” stored in a distributed file system and how to execute Spark applications on a Hadoop cluster. After taking this course, participants will be prepared to face real-world challenges and build applications that enable faster and better decisions and interactive analysis, applied to a wide variety of use cases, architectures, and industries.

Course Objectives
  • How the Apache Hadoop ecosystem fits into the data processing lifecycle
  • How data is distributed, stored, and processed in a Hadoop cluster
  • How to write, configure, and deploy Apache Spark applications on a Hadoop cluster
  • How to use the Spark shell and Spark applications to explore, process, and analyze distributed data
  • How to query data using RDDs, Spark SQL, DataFrames, and Datasets
  • How to use Spark Streaming to process a live data stream
Course Prerequisites
  • Basic understanding of distributed frameworks and any object-oriented language
Course Duration
  • 35 hours – 5 days
Course Outline

Scala primer

  • A quick introduction to Scala
  • Labs: Getting to know Scala
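A flavour of what the primer covers — immutable values, case classes, and the map/filter collection style later reused on Spark RDDs. A minimal sketch; the object and class names are illustrative, not course material:

```scala
// Scala basics: immutable vals, a case class, and collection transformations.
object ScalaPrimerSketch {
  case class Word(text: String, length: Int)

  def main(args: Array[String]): Unit = {
    val words = List("spark", "scala", "hadoop")
    // map: transform each element; filter: keep a subset
    val longWords = words.map(w => Word(w, w.length)).filter(_.length > 5)
    longWords.foreach(w => println(s"${w.text} -> ${w.length}")) // hadoop -> 6
  }
}
```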

Spark Basics

  • Background and history
  • Spark and Hadoop
  • Spark concepts and architecture
  • Spark ecosystem (Core, Spark SQL, MLlib, Streaming)
  • Labs: Installing and running Spark

First Look at Spark

  • Running Spark in local mode
  • Spark web UI
  • Spark shell
  • Analyzing a dataset – part 1
  • Inspecting RDDs
  • Labs: Spark shell exploration


Resilient Distributed Datasets (RDDs)

  • RDD concepts
  • Partitions
  • RDD Operations / transformations
  • RDD types
  • Key-Value pair RDDs
  • MapReduce on RDD
  • Caching and persistence
  • Labs: Creating and inspecting RDDs; caching RDDs
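The topics above — pair RDDs, MapReduce-style aggregation, and caching — can be sketched in a few lines. This assumes the spark-shell, where the SparkContext `sc` is predefined; the sample data is illustrative:

```scala
// Build a key-value pair RDD and aggregate it MapReduce-style.
val lines  = sc.parallelize(Seq("a b", "b c", "c c"))     // RDD from a local collection
val pairs  = lines.flatMap(_.split(" ")).map(w => (w, 1)) // key-value pair RDD
val counts = pairs.reduceByKey(_ + _)                     // reduce phase: sum per key
counts.cache()                                            // persist for re-use across actions
counts.collect().foreach(println)                         // e.g. (a,1), (b,2), (c,3)
```

Transformations like `flatMap` and `reduceByKey` are lazy; nothing runs on the cluster until an action such as `collect()` is called, which is why `cache()` matters when the same RDD feeds several actions.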

Spark API programming

  • Introduction to Spark API / RDD API
  • Submitting the first program to Spark
  • Debugging / logging
  • Configuration properties
  • Labs: Programming with the Spark API; submitting jobs

Spark SQL

  • SQL support in Spark
  • DataFrames
  • Defining tables and importing datasets
  • Querying DataFrames using SQL
  • Storage formats: JSON / Parquet
  • Labs: Creating and querying DataFrames; evaluating data formats
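The DataFrame workflow above can be sketched as follows. This is a hedged example assuming a local Spark installation; the app name, column names, and output path are illustrative assumptions:

```scala
import org.apache.spark.sql.SparkSession

// Start a local SparkSession (entry point for Spark SQL).
val spark = SparkSession.builder().appName("sql-sketch").master("local[*]").getOrCreate()
import spark.implicits._

// Define a DataFrame from a local dataset and register it as a table.
val people = Seq(("alice", 34), ("bob", 45)).toDF("name", "age")
people.createOrReplaceTempView("people")

// Query the DataFrame using SQL.
spark.sql("SELECT name FROM people WHERE age > 40").show()

// Persist in a columnar storage format (Parquet); JSON works the same way.
people.write.mode("overwrite").parquet("/tmp/people.parquet")
```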


Spark MLlib

  • MLlib introduction
  • MLlib algorithms
  • Labs: Writing MLlib applications


Spark GraphX

  • GraphX library overview
  • GraphX APIs
  • Labs: Processing graph data using Spark

Spark Streaming

  • Streaming overview
  • Evaluating Streaming platforms
  • Streaming operations
  • Sliding window operations
  • Labs: Writing Spark Streaming applications
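The sliding window operations listed above look like this in a classic DStream application. A sketch only — the socket host/port, checkpoint path, and window sizes are illustrative assumptions:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("stream-sketch").setMaster("local[2]")
val ssc  = new StreamingContext(conf, Seconds(1))  // 1-second micro-batches
ssc.checkpoint("/tmp/stream-checkpoint")           // required for windowed state

// Count words over a 30-second window that slides every 10 seconds.
val lines  = ssc.socketTextStream("localhost", 9999)
val counts = lines.flatMap(_.split(" "))
  .map(w => (w, 1))
  .reduceByKeyAndWindow(_ + _, Seconds(30), Seconds(10))

counts.print()
ssc.start()
ssc.awaitTermination()
```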

Spark and Hadoop

  • Hadoop Intro (HDFS / YARN)
  • Hadoop + Spark architecture
  • Running Spark on Hadoop YARN
  • Processing HDFS files using Spark

Spark Performance and Tuning

  • Broadcast variables
  • Accumulators
  • Memory management & caching
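Broadcast variables and accumulators, the first two tuning primitives above, can be sketched for the spark-shell (where `sc` is predefined); the lookup table and counter names are illustrative:

```scala
// Broadcast: ship one read-only copy of a lookup table to each executor,
// instead of serializing it with every task.
val lookup = sc.broadcast(Map("sg" -> "Singapore", "in" -> "India"))

// Accumulator: a counter executors write to and only the driver reads.
val badRecords = sc.longAccumulator("badRecords")

val codes = sc.parallelize(Seq("sg", "in", "xx"))
val names = codes.map { c =>
  lookup.value.get(c) match {
    case Some(n) => n
    case None    => badRecords.add(1); "unknown"
  }
}
names.collect()           // Array(Singapore, India, unknown)
println(badRecords.value) // 1 — reliable on the driver only after an action runs
```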

Spark Operations

  • Deploying Spark in production
  • Sample deployment templates
  • Configurations
  • Monitoring
  • Troubleshooting
Training Delivery Mode

Online – live, instructor-led training

Due to COVID-19, we are not conducting classroom training until conditions improve.

Got Questions?

Please email us and we will be happy to help.

This course is designed, developed, and delivered by Cloud Enabled Pte Ltd.