HDPCD:Spark using Scala

Prepare for the Hortonworks HDP Certified Developer (HDPCD) exam - Spark, using Scala as the programming language


What you'll learn:

  • Learn Scala, Spark, HDFS, and related tools in preparation for the HDPCD:Spark certification

Requirements:

  • Basic programming skills
  • Hortonworks Sandbox, a valid account for IT Versity Big Data labs, or any Hadoop cluster where Hadoop, Hive, and Spark are well integrated
  • Sufficient memory for the environment you are using, along with a 64-bit operating system

Description:

The course covers the full syllabus of the HDPCD:Spark certification:

  • Scala Fundamentals - Basic Scala programming using the REPL, as required for the exam

  • Getting Started with Spark - Different setup options, setup process

  • Core Spark - Transformations and Actions to process the data (see the sketch after this list)

  • Data Frames and Spark SQL - Leverage SQL skills on top of Data Frames created from Hive tables or RDDs

  • One week of complimentary lab access

  • Exercises - A set of self-evaluated exercises to test your skills for certification purposes
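
To give a feel for what the Core Spark and Spark SQL modules cover, here is a minimal sketch (not taken from the course) of the two styles of processing: RDD transformations and actions, and a DataFrame queried through Spark SQL. It assumes a Spark 1.6-style API, which the HDPCD:Spark exam environment used; the file path, field positions, and object name are hypothetical placeholders.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object StatusCount {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("HDPCD practice")
        val sc = new SparkContext(conf)
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._

        // Core Spark: transformations (map, reduceByKey) and an action (collect)
        // on an RDD read from HDFS. The path and field positions are placeholders.
        val orders = sc.textFile("/user/training/orders")
        val statusCounts = orders
          .map(line => (line.split(",")(3), 1)) // assume the 4th field is the order status
          .reduceByKey(_ + _)
        statusCounts.collect().foreach(println)

        // Data Frames and Spark SQL: build a DataFrame from the same RDD and query it with SQL.
        val ordersDF = orders
          .map(_.split(","))
          .map(a => (a(0).toInt, a(3)))
          .toDF("order_id", "order_status")
        ordersDF.registerTempTable("orders")
        sqlContext.sql("SELECT order_status, count(1) AS cnt FROM orders GROUP BY order_status").show()

        sc.stop()
      }
    }

On newer Spark versions (2.x and later) the same flow would start from a SparkSession instead of SQLContext, and createOrReplaceTempView would replace registerTempTable.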

After completing the course, you will have the confidence to take the certification exam and pass it.

All demos are run on our state-of-the-art Big Data cluster. You can get one week of complimentary lab access by filling in the form provided as part of the welcome message.

Who this course is for:

  • Anyone preparing for the HDPCD:Spark certification using Scala as the programming language

Course Details:

  • 18 hours on-demand video
  • 2 downloadable resources
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of completion


Demo Link: https://www.udemy.com/course/hdpcd-spark-using-scala/