
 

HADOOP DEVELOPMENT CUM ADMINISTRATION


COURSE PREVIEW

Type: Advanced Training Program
Audience(s): Software Professional(s)/Manager(s)/Architect(s)
Tools/IDE: Eclipse, Cloudera
Delivery method(s): Instructor-led Classroom/Online Training
Duration: 50 Hours
Language: English

 

PRICING DETAILS

17,250/280
16,250/250

  • About the course
  • Course Curriculum
  • Assessment
  • Projects
  • FAQs

About the course

The Hadoop Development cum Administration course is primarily designed for Software Professional(s)/Manager(s)/Architect(s) who want to learn how to manage large and complex data sets and scale them up from single servers to thousands of machines. In this course, you will learn the basic and advanced in-depth concepts of Big Data and Hadoop along with their implementation. This course gives insight into Hadoop 2.0, NameNode, YARN, MapReduce, Spark, Oozie, Scala, Hadoop clusters, Hadoop administration and more.

It also teaches you how to deploy, configure, manage, monitor and secure a Hadoop cluster.

Course objectives

On completion of this course, attendees will be able to:

  1. Explore Java basics with respect to Hadoop.
  2. Explore the Hadoop 2.x architecture.
  3. Configure Hadoop and its components.
  4. Master the concepts of HDFS and the MapReduce framework.
  5. Set up a Hadoop cluster and write complex MapReduce programs.
  6. Implement HBase and MapReduce integration.
  7. Perform data analytics using Pig, Hive and YARN.
  8. Schedule jobs using Oozie.
  9. Implement the data storage layer using Hadoop.
  10. Perform deployment, backup and recovery.

Who can do this course?

All Software Professional(s)/Manager(s)/Architect(s) who are keen to learn how to manage large and complex data sets and scale them up from a single machine to thousands of machines should go for this course.

Pre-requisites

There are no pre-requisites for joining this course, but prior knowledge of Java and Linux will help.

Core Java

  1. Setup Environment
  2. Features
  3. Development Process
  4. Hello World Program
  5. Phases of a Java Program
  6. Working on Eclipse
  7. Classes
  8. Instance Variable
  9. Static variable
  10. Methods (Setter & getter)
  11. Method Overloading
  12. Constructor
  13. Access Modifier
  14. Coding Guidelines
  15. Exercise
  16. Packages & Class Path
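
To give a feel for the hands-on work in this module, here is a minimal, illustrative sketch (the Employee class and its fields are invented for the example) that ties together classes, instance and static variables, a constructor, setter/getter methods and method overloading.

    // Employee.java - classes, variables, constructors, getters/setters, overloading
    public class Employee {

        private String name;          // instance variable (one per object)
        private static int count = 0; // static variable (shared by all objects)

        // constructor
        public Employee(String name) {
            this.name = name;
            count++;
        }

        // setter and getter
        public void setName(String name) { this.name = name; }
        public String getName()          { return name; }

        // method overloading: same name, different parameter lists
        public void raiseSalary(double amount)              { /* ... */ }
        public void raiseSalary(double amount, String note) { /* ... */ }

        public static void main(String[] args) {
            Employee e = new Employee("Asha");
            System.out.println("Hello World, " + e.getName());
            System.out.println("Employees created: " + count);
        }
    }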

Core Java (Contd.)

  1. Importing classes
  2. Using classes from other packages
  3. Inheritance
  4. Final Classes
  5. Abstract Class
  6. Exception Handling
  7. Input/Output Option in Java
  8. Control Structure
  9. Java Collection
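
As a quick illustration of this module (the Shape and Circle classes are hypothetical), the sketch below combines inheritance, a final class, an abstract class, exception handling and a Java collection.

    import java.util.ArrayList;
    import java.util.List;

    // abstract class: cannot be instantiated directly
    abstract class Shape {
        abstract double area();
    }

    // inheritance: Circle extends Shape; final, so it cannot be extended further
    final class Circle extends Shape {
        private final double radius;

        Circle(double radius) {
            if (radius <= 0) {
                throw new IllegalArgumentException("radius must be positive");
            }
            this.radius = radius;
        }

        @Override
        double area() { return Math.PI * radius * radius; }
    }

    public class ShapeDemo {
        public static void main(String[] args) {
            List<Shape> shapes = new ArrayList<>();   // Java collection
            try {
                shapes.add(new Circle(2.0));
                shapes.add(new Circle(-1.0));         // triggers an exception
            } catch (IllegalArgumentException ex) {   // exception handling
                System.out.println("Bad input: " + ex.getMessage());
            }
            for (Shape s : shapes) {
                System.out.println("Area: " + s.area());
            }
        }
    }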

Unix/Linux

  1. History of Linux
  2. Key Features
  3. Installing Ubuntu
  4. Linux Structure
  5. Commands
  6. Assignment
  7. About Amazon Cloud (EC2 and S3)

Core Hadoop

  1. Introduction
  2. Case Study
  3. Storage HDFS
  4. HDFS Features
  5. Architecture
  6. NameNode function
  7. HDFS High Availability
  8. HDFS Read
  9. HDFS Write
  10. Hadoop Setup
  11. SSH Installation
  12. JAVA Installation
  13. Hadoop Installation
  14. Hadoop Configuration
  15. Starting Hadoop
  16. Pig Installation
  17. Command Line Utilities
  18. JAVA API
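
As a preview of the HDFS Read, HDFS Write and Java API topics, here is a short sketch; the NameNode URI (hdfs://localhost:9000) and the file path are placeholders for whatever your cluster uses.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Minimal HDFS write-then-read example using the FileSystem API.
    public class HdfsReadWrite {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);

            // HDFS Write: create a file and write a line into it
            Path file = new Path("/user/demo/hello.txt");
            try (FSDataOutputStream out = fs.create(file, true)) {
                out.writeBytes("Hello HDFS\n");
            }

            // HDFS Read: open the same file and print its contents
            try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(fs.open(file)))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
            fs.close();
        }
    }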

Map Reduce

  1. Introduction to Map Reduce
  2. Case Study
  3. MRV1 & MRV2
  4. YARN

Map Reduce Programming

  1. Mapper, Reducer
  2. Combiner, Sort, Shuffle, etc.
  3. Assignments
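
A good preview of these topics is the classic word count, sketched below with the org.apache.hadoop.mapreduce API: the Mapper emits (word, 1) pairs, the sort/shuffle phase groups them by key, and the Reducer sums the counts.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Mapper: tokenizes each input line and emits (word, 1)
        public static class TokenizerMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reducer: sums the counts for each word after sort/shuffle
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        // Driver: input and output paths come from the command line
        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class); // combiner pre-aggregates on the map side
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Note how the reducer class doubles as the combiner, which cuts down the data moved during the shuffle.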

Spark

  1. Big Data Analytics Methodologies
  2. Drawbacks of MapReduce
  3. Overview of Spark
  4. Different types of RDDs
  5. Understanding of Scala
  6. Different types of operations in Spark
  7. Job Submission in Spark
  8. Spark Monitoring
  9. Assignment
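
Although most Spark examples in this module use Scala, Spark also exposes a Java API. The sketch below (assuming Spark 2.x; the master setting and the input/output paths are placeholders) builds an RDD, applies lazy transformations and triggers the job with an action.

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    import scala.Tuple2;

    // Word count on an RDD using Spark's Java API.
    public class SparkWordCount {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("SparkWordCount").setMaster("local[*]");
            JavaSparkContext sc = new JavaSparkContext(conf);

            JavaRDD<String> lines = sc.textFile("hdfs://localhost:9000/user/demo/input.txt");

            // Transformations are lazy; nothing executes until an action is called.
            JavaPairRDD<String, Integer> counts = lines
                    .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                    .mapToPair(word -> new Tuple2<>(word, 1))
                    .reduceByKey((a, b) -> a + b);

            // Action: runs the job and writes the result back to HDFS.
            counts.saveAsTextFile("hdfs://localhost:9000/user/demo/output");

            sc.stop();
        }
    }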

Pig

  1. Introduction to Pig
  2. Execution Mode
  3. Pig Latin Basics
  4. Developers Alert
  5. Resources
  6. Running A Pig Script
  7. Assignment

Hive

  1. Introduction to Hive
  2. Features
  3. Hive Architecture
  4. Hive Query
  5. Internal vs. External Table
  6. Limitation of Hive
  7. Different types of execution engines
  8. Assignment
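
As a small preview of querying Hive from Java, the sketch below runs a HiveQL aggregation over JDBC. It assumes HiveServer2 is listening on localhost:10000 and uses a hypothetical sales(product STRING, amount DOUBLE) table.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Running a HiveQL query through the HiveServer2 JDBC driver.
    public class HiveQueryDemo {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection con = DriverManager.getConnection(
                         "jdbc:hive2://localhost:10000/default", "hive", "");
                 Statement stmt = con.createStatement()) {

                ResultSet rs = stmt.executeQuery(
                        "SELECT product, SUM(amount) AS total FROM sales GROUP BY product");
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getDouble(2));
                }
            }
        }
    }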

Sqoop

  1. Introduction to Sqoop
  2. Advantages of Sqoop
  3. Workflow
  4. Import with Sqoop
  5. Export with Sqoop
  6. Debugging issues with Sqoop

HBase

  1. Introduction to HBase
  2. Hello World with HBase
  3. HBase Basics
  4. Connectivity with API
  5. HBase Design Consideration
  6. Commands
  7. Assignments
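
Connectivity with the HBase API can be previewed with the sketch below (HBase 1.x+ client API). It assumes a table named users with a column family info already exists; the row key and values are made up.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    // Writing and reading one cell through the HBase client API.
    public class HBaseHello {
        public static void main(String[] args) throws Exception {
            Configuration config = HBaseConfiguration.create(); // reads hbase-site.xml
            try (Connection conn = ConnectionFactory.createConnection(config);
                 Table table = conn.getTable(TableName.valueOf("users"))) {

                // Put: row key "u1", column info:name = "Asha"
                Put put = new Put(Bytes.toBytes("u1"));
                put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes("Asha"));
                table.put(put);

                // Get: read the same row back
                Result result = table.get(new Get(Bytes.toBytes("u1")));
                byte[] name = result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"));
                System.out.println("name = " + Bytes.toString(name));
            }
        }
    }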

Impala

  1. Introduction to Impala
  2. Architecture of Impala
  3. Impala & HDFS
  4. Impala & HBase
  5. Impala & Hive
  6. Impala Engine
  7. Query Execution step
  8. Different ways of accessing Impala

Oozie

  1. Introduction to Oozie
  2. Workflow
  3. Control Flow
  4. Oozie with different tools
  5. Case Study

Scala

  1. Introduction to Scala
  2. Properties
  3. Scala Basics
  4. Installing
  5. Programming Overview
  6. Assignments

Zookeeper

  1. Introduction to Zookeeper
  2. Why Zookeeper?
  3. Installation
  4. Modes of Zookeeper
  5. Observer
  6. Follower
  7. Leader
  8. Use Case

Navigator, Falcon, Ranger

  1. Introduction
  2. Setup
  3. Practical example

Basic Analytics

  1. Introduction to R
  2. Setup RStudio
  3. Case study

Hortonworks and Ambari

  1. Introduction to Hortonworks
  2. Introduction to Ambari
  3. Hands-On Ambari

Mock-up Tests and Assignments

Dot Net Tricks' mock-up tests and assignments help professionals work on real-world projects, get an edge in their careers and make their lives better. This training program includes 4 mock-up tests and 6 assignments.

Mock-up Tests objective

  1. Help you to monitor your learning progress.
  2. Help you to evaluate yourself.
  3. Help you to crack the first round of your technical interview with objective questions.

Assignments objective

  1. Gain confidence to work on Hadoop.
  2. Help you to evaluate your development skills.
  3. Prepare yourself for real-application development.

Projects

The primary goal of the hands-on projects is to understand how to use Hadoop for real-world database management and analysis. Here we will be using Pig, Hive, HBase and MapReduce to perform data analytics.

Project #1 - Retail Data Analysis

  1. Data Analysis using Hive
  2. Demand for a given product
  3. Trends and seasonality of sales
  4. Understanding the performance impact of changes
  5. Loyal customer identification

Project #2 - Messaging Service Analysis

  1. Data Analysis using HBase
  2. Finding a particular user's followers
  3. Finding the average number of people within a given year range
  4. Comparing the output of Hive and MapReduce

Frequently Asked Questions

Q1. Do you provide any course material?

Yes, we do. You will get all relevant course material and exercises from our mentors.

Q2. Do you provide any class video?

Yes, we do. You will get the recorded sessions of your own online training classes so that you can revisit them whenever you want.

Q3. What If I miss my online training class?

All online training classes are recorded. You will get the recorded sessions so that you can watch them whenever you want. You can also join another batch to cover the classes you missed.

Q4. What If I miss my classroom training class?

You can join another ongoing classroom batch to cover the classes you missed.

Q5. Do you prepare me for the job interview?

Yes, we do. We will discuss all possible technical interview questions and answers during the training program so that you can prepare yourself for interviews.

Q6. Do you provide hands-on real application development?

Yes, we do. The training includes hands-on, real-application development.

Q7. Whom do I contact, if I have more queries?

You can give us a call at +91 113 303 4100 or email us at enquiry@dotnettricks.com.

CONTACT US

+91 11 330 34100

ABOUT MENTOR

Shubham Pandey
Author and Hadoop Evangelist


Shubham Pandey is a Hadoop Evangelist. He has vast experience in Java and Big Data technologies such as Hadoop, YARN, MapReduce, the Big Data stack, Hive, Pig, Oozie, Amazon Alexa Echo, NLP, Google Glass application development, Android application development, cloud computing and AWS.

