Master Big Data - Apache Spark/Hadoop/Sqoop/Hive/Flume/Mongo
In-depth course on Big Data - Apache Spark, Hadoop, Sqoop, Flume, Apache Hive, MongoDB & Big Data cluster setup
Description
In this course, you will start by learning what the Hadoop Distributed File System (HDFS) is and the most common Hadoop commands required to work with it.
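For example, a few of the basic HDFS shell commands you will practice look like this (the directory and file names below are placeholders, not the course's actual dataset):

    hadoop fs -mkdir /user/demo              # create a directory in HDFS
    hadoop fs -put sales.csv /user/demo/     # copy a local file into HDFS
    hadoop fs -ls /user/demo                 # list the contents of an HDFS directory
    hadoop fs -cat /user/demo/sales.csv      # print a file stored in HDFS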
Then you will be introduced to Sqoop Import (a sample import command is sketched after this list):
Understand the lifecycle of a Sqoop command.
Use the sqoop import command to migrate data from MySQL to HDFS.
Use the sqoop import command to migrate data from MySQL to Hive.
Use various file formats, compression codecs, field delimiters, where clauses and queries while importing the data.
Understand split-by and boundary queries.
Use incremental mode to migrate the data from MySQL to HDFS.
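To give a flavour of what these imports look like, here is a rough sketch of a sqoop import command; the connection string, credentials, table, column and directory names are placeholders and not the course's exact examples:

    # import a MySQL table into HDFS as compressed Avro, split across mappers by primary key
    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username sqoop_user --password '********' \
      --table orders \
      --where "order_status = 'COMPLETE'" \
      --split-by order_id \
      --target-dir /user/demo/orders \
      --as-avrodatafile \
      --compress --compression-codec org.apache.hadoop.io.compress.SnappyCodec

    # swapping --target-dir for --hive-import --hive-database demo_db --hive-table orders
    # loads the same data into a Hive table instead of a plain HDFS directory

    # incremental mode: only pull rows whose check column is greater than the last imported value
    sqoop import \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username sqoop_user --password '********' \
      --table orders \
      --target-dir /user/demo/orders \
      --incremental append --check-column order_id --last-value 0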
Further, you will learn how to use Sqoop Export to migrate data (a sample export command is sketched after this list):
What sqoop export is.
Using sqoop export, migrate data from HDFS to MySQL.
Using sqoop export, migrate data from Hive to MySQL.
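As with the imports, a sqoop export command has the same overall shape; the table, directory and delimiter below are placeholder values:

    # push the contents of an HDFS directory back into an existing MySQL table
    sqoop export \
      --connect jdbc:mysql://localhost:3306/retail_db \
      --username sqoop_user --password '********' \
      --table daily_revenue \
      --export-dir /user/demo/daily_revenue \
      --input-fields-terminated-by ','

    # exporting Hive data works the same way: point --export-dir at the Hive table's
    # warehouse directory (e.g. /user/hive/warehouse/demo_db.db/daily_revenue)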
Further, you will learn about Apache Flume (a minimal agent configuration is sketched after this list):
Understand the Flume architecture.
Using Flume, ingest data from Twitter and save it to HDFS.
Using Flume, ingest data from netcat and save it to HDFS.
Using Flume, ingest data from an exec source and show it on the console.
Describe Flume interceptors and see examples of using them.
Flume multiple agents.
Flume Consolidation.
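To make the Flume material concrete, here is a minimal single-agent configuration in the style of the Flume user guide; the agent name, port and HDFS path are arbitrary placeholders:

    # netcat source -> memory channel -> HDFS sink
    a1.sources  = r1
    a1.channels = c1
    a1.sinks    = k1

    a1.sources.r1.type = netcat
    a1.sources.r1.bind = localhost
    a1.sources.r1.port = 44444

    a1.channels.c1.type = memory

    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = /user/demo/flume/events
    a1.sinks.k1.hdfs.fileType = DataStream

    a1.sources.r1.channels = c1
    a1.sinks.k1.channel = c1

The agent would then be started with something like flume-ng agent --name a1 --conf-file netcat-hdfs.conf; the multi-agent and consolidation setups chain such agents together using Avro sinks and sources.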
In the next section, we will learn about Apache Hive (a short HiveQL sketch follows this list):
Hive Intro
External & Managed Tables
Working with Different File Formats - Parquet, Avro
Compressions
Hive Analysis
Hive String Functions
Hive Date Functions
Partitioning
Bucketing
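As a taste of the HiveQL covered in this section, the sketch below creates an external, partitioned Parquet table and runs a small query over it; the table, columns and location are placeholders rather than the course's actual examples:

    -- external table: Hive manages only the metadata, the files stay where they are
    CREATE EXTERNAL TABLE sales (
      order_id INT,
      customer STRING,
      amount   DOUBLE,
      sold_at  TIMESTAMP
    )
    PARTITIONED BY (sale_year INT)
    STORED AS PARQUET
    LOCATION '/user/demo/sales';

    -- simple analysis using a string function, a date function and the partition column
    SELECT sale_year,
           upper(customer)       AS customer,
           month(sold_at)        AS sale_month,
           round(sum(amount), 2) AS revenue
    FROM sales
    GROUP BY sale_year, upper(customer), month(sold_at);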
You will then learn about Apache Spark (a short Scala sketch follows this list):
Spark Intro
Cluster Overview
RDD
DAG/Stages/Tasks
Actions & Transformations
Transformation & Action Examples
Spark DataFrames
Spark DataFrames - working with different file formats & compression
DataFrame APIs
Spark SQL
DataFrame Examples
Spark with Cassandra Integration
Running Spark in the IntelliJ IDE
Running Spark on EMR
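To show what the Spark portion builds up to, here is a small self-contained Scala sketch, not the course's exact code; the application name, Parquet path and column names are placeholders:

    import org.apache.spark.sql.SparkSession

    object SparkDemo {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("spark-demo")
          .master("local[*]")   // local mode for IntelliJ; drop this when submitting to a cluster/EMR
          .getOrCreate()

        // RDD: transformations (map, filter) are lazy; the action collect() triggers the DAG
        val rdd     = spark.sparkContext.parallelize(Seq(1, 2, 3, 4, 5))
        val doubled = rdd.map(_ * 2).filter(_ > 4).collect()
        println(doubled.mkString(", "))

        // DataFrame: read Parquet, register a temporary view and query it with Spark SQL
        val sales = spark.read.parquet("/user/demo/sales")
        sales.createOrReplaceTempView("sales")
        spark.sql("SELECT sale_year, SUM(amount) AS revenue FROM sales GROUP BY sale_year").show()

        spark.stop()
      }
    }

The same job can be packaged with sbt or Maven and submitted with spark-submit, which is essentially what running it on EMR amounts to.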
What You Will Learn!
- Hadoop Distributed File System (HDFS) and common Hadoop commands.
- The lifecycle of a Sqoop command.
- Using the sqoop import command to migrate data from MySQL to HDFS and from MySQL to Hive.
- Working with various file formats, compression codecs, field delimiters, where clauses and queries while importing the data.
- Understanding split-by and boundary queries.
- Using incremental mode to migrate data from MySQL to HDFS.
- Using sqoop export to migrate data from HDFS to MySQL and from Hive to MySQL.
- Flume architecture.
- Using Flume to ingest data from Twitter and from netcat into HDFS, and from an exec source to the console.
- Flume interceptors.
Who Should Attend!
- Anyone who wants to learn Big Data in detail.