Running the Largest Hadoop DFS Cluster
Summary
Hairong Kuang explains how Facebook uses HDFS to store and analyze over 100PB of user log data.
Bio
Hairong Kuang currently leads the development of the Hadoop Distributed File System (HDFS) at Facebook. She has been a long-time contributor and committer to the Apache Hadoop project since she joined Yahoo!. Before moving to industry, she was an Assistant Professor at California State Polytechnic University, Pomona. She received her Ph.D. in Computer Science from the University of California, Irvine.
About the conference
Cloud Tech is an ops conference for ops people, Saturday, October 6th, from 9am to 6pm. Come join us at the Computer History Museum in Mountain View, CA for a full 8 hours of learning directly from great minds sharing their secrets!
Community comments
Chinese girl (中国mm)
by Huang Alex,
Very Chinese-sounding English.
I can't understand
by Badral S.,
I can't understand her. Is she speaking Chinese or English?
Great!!
by Karl Reitschuster,
I was always unclear about how Facebook could handle such a volume of users and data growth; even though they were already built on a scalable data backend with Hadoop, there was still room for improvement. Thank you.
/Karl R.
Good
by Robin Liu,
Thank you