Overview:

Big Data needs proper tools and skills, and this workshop takes you "from zero to hero": it gives you the working knowledge of Hadoop, Spark, and NoSQL you need. With these three fundamentals, you will be able to build systems that process massive amounts of data in archival, batch, interactive, and real-time fashion. The workshop also lays the foundations for proper analytics, allowing you to extract insights from data.

What you will learn:

  • Hadoop: HDFS, MapReduce, Pig, Hive
  • Spark: Spark Core, Spark SQL, the Spark Java API, Spark Streaming (see the short example below)
  • NoSQL: Cassandra/HBase architecture, Java API, drivers, data modeling
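To give a flavor of the hands-on exercises, below is a minimal Spark word count written against the Spark Core Java API. This is an illustrative sketch only, not part of the official course materials; the HDFS input and output paths are placeholders.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

import java.util.Arrays;

public class WordCount {
    public static void main(String[] args) {
        // Run locally on all cores; on a cluster you would set the master via spark-submit instead.
        SparkConf conf = new SparkConf().setAppName("WordCount").setMaster("local[*]");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Placeholder input path -- substitute your own file or HDFS location.
            JavaRDD<String> lines = sc.textFile("hdfs:///data/sample.txt");

            JavaPairRDD<String, Integer> counts = lines
                    .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator()) // split lines into words
                    .mapToPair(word -> new Tuple2<>(word, 1))                      // pair each word with a count of 1
                    .reduceByKey(Integer::sum);                                    // sum the counts per word

            // Placeholder output path.
            counts.saveAsTextFile("hdfs:///data/wordcount-out");
        }
    }
}
```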

Prerequisites:

  • comfortable with the Java programming language (most programming exercises are in Java)
  • comfortable in a Linux environment (able to navigate the Linux command line and edit files using vi or nano)

Course Outline: Please download the PDF.

Who should attend:

Developers

Need Help Signing Up For Training?

Please complete the form below for help.

