Questions About This Book?
Why should I rent this book?
Renting is easy, fast, and cheap! Renting from eCampus.com can save you hundreds of dollars compared to the cost of new or used books each semester. At the end of the semester, simply ship the book back to us with a free UPS shipping label! No need to worry about selling it back.
How do rental returns work?
Returning books is as easy as possible. As your rental due date approaches, we will email you several courtesy reminders. When you are ready to return, you can print a free UPS shipping label from our website at any time. Then, just return the book to your UPS driver or any staffed UPS location. You can even use the same box we shipped it in!
What version or edition is this?
This is the edition with a publication date of 3/25/2015.
What is included with this book?
- The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any CDs, lab manuals, study guides, etc.
- The Rental copy of this book is not guaranteed to include any supplemental materials. You may receive a brand-new copy, but typically you will receive only the book itself.
Working with big data for the first time? This unique guide shows you how to use simple, fun, and elegant tools that work with Apache Hadoop. You'll learn how to break problems into efficient data transformations that meet most of your analysis needs. It's an approach that works well not only for programmers just beginning to tackle big data, but for anyone using Hadoop. Written by Philip Kromer, founder and CTO at Infochimps, this book presents real data to describe patterns found in many problem domains, such as statistical summaries and advanced queries against spatial or time-series data sets. You'll also learn how and when to integrate custom components or extend the toolkit.

- Learn from detailed example programs that apply Hadoop to interesting problems in context
- Gain advice and best practices for efficient software development
- Discover how to think at scale, with a deep understanding of how data must flow through the cluster to effect transformations
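To give a flavor of the "data transformation" style the description mentions, here is a minimal sketch (not taken from the book) of the classic map/reduce word count in Python, simulated locally. Under Hadoop Streaming, the mapper and reducer would instead read from stdin and write to stdout, with Hadoop handling the sort between phases; all function and variable names here are illustrative.

```python
# Minimal local simulation of the Hadoop map/reduce word-count pattern.
# Mapper: emit (word, 1) for every word. Reducer: sum counts per word.
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    # Emit a (word, 1) pair for each word in the input lines.
    for line in lines:
        for word in line.strip().lower().split():
            yield (word, 1)

def reducer(pairs):
    # Hadoop sorts mapper output by key before the reduce phase;
    # sorted() stands in for that shuffle/sort step here.
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

counts = dict(reducer(mapper(["the quick brown fox", "the lazy dog"])))
print(counts["the"])  # → 2
```

The same two-function shape (a stateless mapper plus a per-key reducer) underlies most of the transformations the book's approach decomposes problems into.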