Course Outline
1: HDFS (17%)
- Describe the function of HDFS Daemons
- Describe the normal operation of an Apache Hadoop cluster, both in data storage and in data processing.
- Identify current features of computing systems that motivate a system like Apache Hadoop.
- Classify major goals of HDFS Design
- Given a scenario, identify an appropriate use case for HDFS Federation
- Identify the components and daemons of an HDFS HA-Quorum cluster
- Analyze the role of HDFS security (Kerberos)
- Determine the best data serialization choice for a given scenario
- Describe file read and write paths
- Identify the commands to manipulate files in the Hadoop File System Shell
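As a rough illustration of the storage arithmetic behind the HDFS objectives above, the sketch below computes block count and raw cluster footprint for a file. The 128 MB block size and 3x replication are the CDH 5 defaults; the function itself is a simplified illustration, not a Hadoop API.

```python
import math

def hdfs_storage(file_size_mb, block_size_mb=128, replication=3):
    """Estimate block count and raw storage for one file in HDFS.

    Assumes the CDH 5 defaults: 128 MB blocks, replication factor 3.
    """
    blocks = math.ceil(file_size_mb / block_size_mb)
    raw_mb = file_size_mb * replication  # every block is stored `replication` times
    return blocks, raw_mb

# A 1 GB file: 8 blocks, 3 GB of raw cluster storage.
print(hdfs_storage(1024))  # (8, 3072)
```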
2: YARN and MapReduce version 2 (MRv2) (17%)
- Understand how upgrading a cluster from Hadoop 1 to Hadoop 2 affects cluster settings
- Understand how to deploy MapReduce v2 (MRv2 / YARN), including all YARN daemons
- Understand basic design strategy for MapReduce v2 (MRv2)
- Determine how YARN handles resource allocations
- Identify the workflow of MapReduce job running on YARN
- Determine which files you must change and how in order to migrate a cluster from MapReduce version 1 (MRv1) to MapReduce version 2 (MRv2) running on YARN.
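One concrete piece of "how YARN handles resource allocations" is request normalization: the scheduler rounds a container memory request up to a multiple of `yarn.scheduler.minimum-allocation-mb` and caps it at `yarn.scheduler.maximum-allocation-mb`. A simplified sketch, using the stock default values:

```python
import math

def normalize_request(requested_mb, min_alloc_mb=1024, max_alloc_mb=8192):
    """Sketch of YARN container memory normalization (simplified):
    round the request up to a multiple of the minimum allocation,
    then clamp it into [minimum, maximum]. Defaults shown are the
    stock yarn-site.xml values."""
    rounded = math.ceil(requested_mb / min_alloc_mb) * min_alloc_mb
    return min(max(rounded, min_alloc_mb), max_alloc_mb)

print(normalize_request(1500))   # 2048: rounded up to the next 1024 MB step
print(normalize_request(300))    # 1024: floor at the minimum allocation
print(normalize_request(20000))  # 8192: capped at the maximum allocation
```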
3: Hadoop Cluster Planning (16%)
- Identify the principal points to consider in choosing the hardware and operating systems to host an Apache Hadoop cluster
- Analyze the choices in selecting an OS
- Understand kernel tuning and disk swapping
- Given a scenario and workload pattern, identify a hardware configuration appropriate to the scenario
- Given a scenario, determine the ecosystem components your cluster needs to run in order to fulfill the SLA
- Cluster sizing: given a scenario and frequency of execution, identify the specifics for the workload, including CPU, memory, storage, disk I/O
- Disk Sizing and Configuration, including JBOD versus RAID, SANs, virtualization, and disk sizing requirements in a cluster
- Network Topologies: understand network usage in Hadoop (for both HDFS and MapReduce) and propose or identify key network design components for a given scenario
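The cluster-sizing objective above comes down to simple arithmetic. The sketch below uses assumed, illustrative parameters (3x replication, 25% headroom for intermediate data, 12 x 2 TB JBOD disks per DataNode); real sizing must also account for CPU, memory, and disk I/O per the workload.

```python
import math

def datanodes_needed(data_tb, replication=3, overhead=0.25,
                     disks_per_node=12, disk_tb=2.0):
    """Rough cluster-sizing sketch (illustrative parameters, not vendor guidance):
    raw storage = data * replication * (1 + headroom for temp/intermediate data),
    divided by the usable disk per DataNode (JBOD, no RAID)."""
    raw_tb = data_tb * replication * (1 + overhead)
    usable_per_node = disks_per_node * disk_tb
    return math.ceil(raw_tb / usable_per_node)

# 100 TB of data -> 375 TB raw -> 16 nodes of 24 TB each.
print(datanodes_needed(100))  # 16
```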
4: Hadoop Cluster Installation and Administration (25%)
- Given a scenario, identify how the cluster will handle disk and machine failures
- Analyze a logging configuration and logging configuration file format
- Understand the basics of Hadoop metrics and cluster health monitoring
- Identify the function and purpose of available tools for cluster monitoring
- Be able to install all the ecosystem components in CDH 5, including (but not limited to): Impala, Flume, Oozie, Hue, Cloudera Manager, Sqoop, Hive, and Pig
- Identify the function and purpose of available tools for managing the Apache Hadoop file system
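The logging-configuration objective above refers to Hadoop's log4j-style configuration. A representative excerpt is shown below; the exact appenders, file paths, and sizes vary by release and site, so treat this as an illustration of the file format rather than a drop-in config.

```properties
# Default logger and level; overridden per-daemon via HADOOP_ROOT_LOGGER
hadoop.root.logger=INFO,console
log4j.rootLogger=${hadoop.root.logger}

# Rolling file appender for daemon logs
log4j.appender.RFA=org.apache.log4j.RollingFileAppender
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.RFA.MaxFileSize=256MB
log4j.appender.RFA.MaxBackupIndex=20
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
```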
5: Resource Management (10%)
- Understand the overall design goals of each of the Hadoop schedulers
- Given a scenario, determine how the FIFO Scheduler allocates cluster resources
- Given a scenario, determine how the Fair Scheduler allocates cluster resources under YARN
- Given a scenario, determine how the Capacity Scheduler allocates cluster resources
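The core idea behind the Fair Scheduler objective above can be sketched as a share computation: divide cluster memory evenly among active queues, and redistribute any share a queue cannot use. This is a deliberately simplified model (equal weights, no minimum shares, no preemption), not the scheduler's actual implementation.

```python
def fair_shares(total_mb, demands):
    """Simplified fair-share computation: split memory evenly among queues,
    giving fully satisfied queues only what they demand and redistributing
    the leftover among the still-hungry queues."""
    shares = {q: 0 for q in demands}
    remaining = total_mb
    pending = dict(demands)          # queue -> unmet demand
    while remaining > 0 and pending:
        even = remaining / len(pending)
        satisfied = {q: d for q, d in pending.items() if d <= even}
        if not satisfied:
            # Everyone wants more than an even split: hand out equal shares.
            for q in pending:
                shares[q] += even
            remaining = 0
        else:
            # Satisfied queues take only their demand; redistribute the rest.
            for q, d in satisfied.items():
                shares[q] += d
                remaining -= d
                del pending[q]
    return shares

# 12 GB cluster, three queues: the small queue gets its 2 GB,
# the other two split the remaining 10 GB evenly.
print(fair_shares(12000, {"etl": 2000, "adhoc": 8000, "ds": 8000}))
```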
6: Monitoring and Logging (15%)
- Understand the functions and features of Hadoop’s metric collection abilities
- Analyze the NameNode and JobTracker Web UIs
- Understand how to monitor cluster Daemons
- Identify and monitor CPU usage on master nodes
- Describe how to monitor swap and memory allocation on all nodes
- Identify how to view and manage Hadoop’s log files
- Interpret a log file
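Interpreting a log file mostly means picking apart the standard log4j line layout: timestamp, level, logger class, message. A minimal parsing sketch (the sample line is invented for illustration):

```python
import re

# A typical Hadoop log4j line: timestamp, level, logger class, message.
sample = ("2023-04-01 10:15:32,481 WARN "
          "org.apache.hadoop.hdfs.server.namenode.NameNode: "
          "Low on available disk space")

LOG_LINE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"(?P<level>[A-Z]+) (?P<logger>\S+): (?P<msg>.*)")

m = LOG_LINE.match(sample)
print(m.group("level"), m.group("logger"))
# WARN org.apache.hadoop.hdfs.server.namenode.NameNode
```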
Requirements
- Basic Linux administration skills
- Basic programming skills
Testimonials
I mostly liked the trainer giving real-life examples.
Simon Hahn
I genuinely enjoyed the trainer's broad expertise.
Grzegorz Gorski
I genuinely enjoyed the many hands-on sessions.
Jacek Pieczątka
It was very hands-on; we spent half the time actually doing things in Cloudera/Hadoop, running different commands, checking the system, and so on. The extra materials (books, websites, etc.) were really appreciated, as we will have to continue learning. The installations were quite fun and very handy, and the cluster setup from scratch was really good.
Ericsson
Lots of hands-on exercises.
Ericsson
The Ambari management tool. The ability to discuss practical Hadoop experiences from business cases other than telecom.
Ericsson