If your organization is about to enter the world of big data, you not only need to decide whether Apache Hadoop is the right platform to use, but also which of its many components are best suited to your task. This field guide makes the exercise manageable by breaking down the Hadoop ecosystem into short, digestible sections. You’ll quickly understand how Hadoop’s projects, subprojects, and related technologies work together.

Each chapter introduces a different topic—such as core technologies or data transfer—and explains why certain components may or may not be useful for particular needs. When it comes to data, Hadoop is a whole new ballgame, but with this handy reference, you’ll have a good grasp of the playing field.

Topics include:

  • Core technologies—Hadoop Distributed File System (HDFS), MapReduce, YARN, and Spark
  • Database and data management—Cassandra, HBase, MongoDB, and Hive
  • Serialization—Avro, JSON, and Parquet
  • Management and monitoring—Puppet, Chef, ZooKeeper, and Oozie
  • Analytic helpers—Pig, Mahout, and MLlib
  • Data transfer—Sqoop, Flume, distcp, and Storm
  • Security, access control, auditing—Sentry, Kerberos, and Knox
  • Cloud computing and virtualization—Serengeti, Docker, and Whirr
