Big Data and Hadoop Self Paced Course
|Self Paced Classes|
(25% off - Early Bird) (Incl. Taxes)
|Upcoming Live Courses|
Our Big Data and Hadoop course is designed to impart the knowledge, skills and hands-on experience required to become a successful Hadoop Developer, Administrator or Tester.
The various tools and technologies in the Hadoop ecosystem will be covered in depth, with a focus on hands-on experience.
Concepts Covered: Big Data, NoSQL, Streaming, Analytics
Tools Covered: HDFS, MapReduce, Pig, Hive, HBase, ZooKeeper, Flume, Sqoop, Oozie, Spark, Mahout
Why learn Big Data and Hadoop?
Big Data is a collection of data sets so massive and complex that they are very difficult to manage and process with existing tools. Data generation keeps accelerating as our everyday devices become cheaper, more powerful, compact and connected. We generate data all the time: tweeting, sending emails, using Facebook, uploading photos and so on. Our devices, too, are connected and generating data. The result is a gargantuan mass of data that must be analysed for informed decision making.
The only way forward for organisations is to be able to store and process such large amounts of data, for which they use Big Data platforms like Hadoop. This explains the high demand for Hadoop Developers, Administrators, Testers and Data Scientists.
The other way to measure the demand for Big Data and Hadoop technologies is to look at the number of jobs being posted around the world on these technologies.
Also, Big Data features among the top three technology trends in organisations as per Forbes and Gartner.
Our classes are conducted live online by our instructors via webinar or Hangouts. These are not pre-recorded classes. The instructor delivers the class using presentations, collaborative drawing tools and screen sharing. All attendees are usually muted during the class; however, they can ask questions in the webinar or Hangouts chat window. The instructor answers any questions immediately after explaining a concept, and also asks questions during the sessions to ensure maximum student engagement.
Every class is recorded, complete with the screen and the audio, and uploaded to the Learning Management System which is accessible to our attendees for life.
At the end of each session, assignments are provided which the attendees have to submit in the LMS (Learning Management System). The assignments are continuously reviewed by our instructors and teaching assistants. In case we conclude that an attendee requires extra attention, we schedule extra one-on-one sessions with that attendee.
What makes Big Data and Hadoop course unique?
- Interactive Classes: More Questions, Fewer Lectures.
- Simple explanations to complex topics by industry experts
- Hands-on workshops and real-time projects.
- Quizzes & Assignments
- Certificate of Completion at the end of the course
- A real time project involving Hadoop
- Lifetime access to course content
CloudxLab™ - Access to cloud infrastructure for learners who don't wish to install Hadoop on their own computers
What are the prerequisites to join Big Data and Hadoop course?
To be able to take maximum benefit out of this course, you should have knowledge of the following:
- Basics of SQL. You should know the basics of SQL and databases.
- A know-how of the basics of programming. We will be providing video classes covering the basics of Java. What is expected of the attendee is the ability to create a directory and see what's inside a file from the command line, and an understanding of 'loops' in any programming language.
In addition, the attendee should have the following hardware infrastructure:
- A good internet connection. An internet speed of 2 Mbps is good enough.
- Access to a computer. Since it is an online course, you would have to install the webinar or Hangouts client on your computer.
- Nice To Have: A power backup for your router as well as computer.
- Nice To Have: Good quality headphones.
What kind of project / real time experience?
After all sessions are over, we ask for each student's preference for a project. We form teams of 3-4 members and, based on their interests, assign a project to each team. A project is usually of three weeks' duration. If a team has an idea it wants to work on as a project, we screen the idea and the team can work on it; otherwise we assign a project from the industry. Since it is not possible to provide real data from the industry, we provide anonymized data for projects. We continuously support and guide the teams during projects by conducting regularly scheduled meetings, and we also provide individual assistance.
The projects assigned can also be based on public datasets; various datasets are freely available online.
A few examples of projects are as follows:
- Understanding the trends and patterns in BitCoin transaction graphs by qualitative analysis. BitCoin is a virtual currency; the way a coin is mined is based on transaction logs. BitCoin transaction logs keep growing almost every millisecond, and therefore processing these transaction logs is a real challenge.
- Understanding the correlation between the temperature of various cities and the stock market.
- Processing Apache logs for errors. Preparing web analytics based on Apache weblogs:
- Which services are slow
- Which services have a high number of users
- What is the failure rate of each service
- Preparing recommendations based on the Apache logs.
- Using social media to compare a brand's marketing campaigns. The comparison is done using sentiment analysis.
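As an illustration of the Apache-log project above, the per-service failure rate could be computed with a short script like the following. This is a minimal sketch: the log layout, the sample lines and the idea of treating the first path segment as the "service" are assumptions for illustration, and a real project would run this logic at scale with MapReduce or Spark rather than on one machine.

```python
import re
from collections import defaultdict

# Regex for a simplified Apache access-log layout (an assumption; real logs vary).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

def failure_rates(lines):
    """Per-service failure rate, counting HTTP 5xx responses as failures."""
    totals, failures = defaultdict(int), defaultdict(int)
    for line in lines:
        m = LOG_PATTERN.match(line)
        if not m:
            continue  # skip lines that don't parse
        # Treat the first path segment as the "service" name (a simplification).
        path = m.group("path").split("?")[0]
        service = path.split("/")[1] or "(root)"
        totals[service] += 1
        if int(m.group("status")) >= 500:
            failures[service] += 1
    return {s: failures[s] / totals[s] for s in totals}

# Hypothetical sample log lines for demonstration.
sample = [
    '10.0.0.1 - - [01/Jan/2015:10:00:00 +0000] "GET /search?q=x HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Jan/2015:10:00:01 +0000] "GET /search HTTP/1.1" 500 0',
    '10.0.0.3 - - [01/Jan/2015:10:00:02 +0000] "POST /checkout HTTP/1.1" 200 128',
]
print(failure_rates(sample))
```

The same grouping-and-counting pattern extends naturally to the other weblog questions above, such as finding slow services or services with many users.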
Global Solutions Architect (LinkedIn Profile)
Mohan Das (Quora Profile)
Sukhwinder Singh (Quora Profile)
Vivek Agarwal (Quora Profile)
VP - Engineering at CommonFloor (LinkedIn Profile)
Data Scientist at Data Semantics (LinkedIn Profile)
Consultant - Smart Grid Communications at Essel Vidyut Vitaran Nigam
PhD, Wireless Mesh Networks (LinkedIn Profile)
Director Engineering, Target Technology Services (LinkedIn Profile)
Dr. Makhan Virdi
Researcher, NASA - DAAC (LinkedIn Profile)
Big Data with Spark: This is not a typical (online) classroom course. It is not just a series of videos with a one-way flow of information. Instead, it is a highly interactive setting where the instructor shares insightful details when any question or doubt is raised during the lecture. Sandeep passionately teaches complicated concepts in easy-to-understand language, supported with good analogies and effective examples. The course is well structured, covering the concepts of Big Data in width and depth. I am currently halfway through the course and I am already working on translating the concepts learned in the class to real-world problems.
See more reviews on our Facebook page.
Big Data and Hadoop Introduction Session
What Certificate do we provide?
Based on your performance in Quizzes, Assignments and Projects, we provide the certificate in the following forms:
- 1. Hard Copy
We send a hard copy of the certificate to your address.
- 2. Digitally Signed Copy
We provide the PDF of the certificate that is digitally signed by KnowBigData.com.
- 3. Share Your Success
Share your course record with employers and educational institutions through a secure permanent URL.
- 4. LinkedIn Recommendation & Endorsements
We will provide a LinkedIn recommendation based on your performance. We will also endorse your skills, such as Hadoop and Big Data.
- 5. Verifiable Certificate
We provide an online form to validate that a certificate is genuine. This helps recruiters verify any certificate we issue.
About the Team
Founder & Chief Instructor
Past Amazon.com, InMobi.com, Founder @ tBits Global, D.E.Shaw
Education Indian Institute of Technology, Roorkee
For the last 12 years, Sandeep has been building products and churning through large amounts of data for various product firms. He has all-round experience of software development and big data analysis.
Apart from digging data and technologies, Sandeep enjoys conducting interviews and explaining difficult concepts in simple ways.
Big Data and Hadoop - Frequently Asked Questions
Yes. Java is generally required for understanding MapReduce. MapReduce is a programming paradigm in which you write your logic in the form of mapper and reducer functions. We have worked hard to make it possible to understand MapReduce without knowledge of Java; Java is only required for one 90-minute class where we discuss advanced configuration of MapReduce. So, you will be fine if you meet the following three criteria:
- Basics Of SQL. You should know the basics of SQL and databases. If you know about filters in SQL, you are expected to understand the course.
- A know-how of the basics of programming. If you understand 'loops' in any programming language, and you are able to create a directory and see what's inside a file from the command line, you will be able to grasp the concepts of this course even if you have not really touched programming for the last 10 years! In addition, we will be providing video classes on the basics of Java.
- Nevertheless, we provide a self-paced 8-hour course on Java for free. As soon as you sign up, it will be available in your "My Courses" section.
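To illustrate the mapper/reducer idea mentioned above without any Java, here is a minimal word-count sketch in Python (Hadoop Streaming lets you write mappers and reducers in Python too). This runs locally and only simulates the paradigm: the sort-and-group step stands in for the shuffle phase that Hadoop performs between map and reduce across machines.

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    """Map phase: emit a (word, 1) pair for every word in an input line."""
    for word in line.lower().split():
        yield (word, 1)

def reducer(word, counts):
    """Reduce phase: sum all counts emitted for the same word."""
    return (word, sum(counts))

def run_job(lines):
    # Map: apply the mapper to every input record.
    pairs = [kv for line in lines for kv in mapper(line)]
    # Shuffle: group pairs by key, as Hadoop does between map and reduce.
    pairs.sort(key=itemgetter(0))
    # Reduce: one reducer call per distinct key.
    return [reducer(k, (v for _, v in grp))
            for k, grp in groupby(pairs, key=itemgetter(0))]

print(run_job(["big data big hadoop", "hadoop big"]))
# → [('big', 3), ('data', 1), ('hadoop', 2)]
```

The point of the paradigm is that the mapper and reducer never see the whole data set, so Hadoop can run many copies of each in parallel on different machines.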
- Using our CloudxLab: To give our candidates a real experience of big data computing, we have provided a cluster of computers with all the big data technologies running on them, since most big data technologies make sense only when run on multiple machines. You only have to use an SSH client (PuTTY on Windows) to connect to our cluster. Whether you are at home or in the office, and whether you are using a laptop or a tablet, you will be able to use Hadoop. See more details about CloudxLab here.
- Using Virtual Machines: The second, traditional way to experiment with Hadoop is to install a Virtual Machine. We will assist you in setting one up. However, most of our students are so happy with our CloudxLab that they hardly ever install a Virtual Machine.
Yes, we provide our own certification. At the end of your course, you will work on a real-time project. You will receive a problem statement along with a dataset to work on in our CloudxLab. Once you are successfully through the project (reviewed by an expert), you will be awarded a certificate with a performance-based grading. If your project is not approved in the first attempt, you can take extra assistance to understand the concepts better and reattempt the project free of cost.
Hadoop is one of the hottest career options available today for software engineers. There are around 12,000 jobs currently in the U.S. alone for Hadoop developers, and the demand for Hadoop developers far exceeds the availability. Learn more about career prospects in Hadoop at naukri.com and indeed.com.
Our cluster has all the software required for the course, plus some extra components such as Git and R. In case you require a particular piece of software on the cluster that is not already there, please let us know.