


Running the Largest Hadoop DFS Cluster
by Hairong Kuang on Mar 15, 2013
Duration: 44:36

Summary
Hairong Kuang explains how Facebook uses HDFS to store and analyze over 100PB of user log data.
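For context on the kind of HDFS usage the talk covers, the sketch below shows how a client writes to and reads from HDFS through the standard Java FileSystem API. It is only an illustrative example under assumed values: the NameNode URI (hdfs://namenode:8020), the log path, and the sample record are placeholders, not details from the talk.

// Minimal sketch of writing and reading a file via the HDFS Java API.
// The cluster URI and the path are placeholder assumptions.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URI;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the cluster's NameNode (placeholder URI).
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);

        Path logPath = new Path("/logs/example/part-00000");

        // Write a small log record. HDFS splits files into blocks and
        // replicates each block across DataNodes (3 copies by default).
        try (FSDataOutputStream out = fs.create(logPath, true)) {
            out.write("2013-03-15\tuser_event\tclick\n".getBytes(StandardCharsets.UTF_8));
        }

        // Read it back; the client streams blocks from the DataNodes that hold them.
        try (FSDataInputStream in = fs.open(logPath);
             BufferedReader reader = new BufferedReader(
                     new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        fs.close();
    }
}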


Bio

Hairong Kuang currently leads the development of the Hadoop Distributed File System (HDFS) at Facebook. She has been a long-time contributor and committer to the Apache Hadoop project since she joined Yahoo!. Prior to moving to industry, she was an Assistant Professor at California State Polytechnic University, Pomona. She received her Ph.D. in Computer Science from the University of California, Irvine.


