
Building a Data Pipeline with the Tools You Have - An Orbitz Case Study
by Steve Hoffman on Nov 22, 2014
Duration: 26:18

Summary
Steve Hoffman and Ken Dallmeyer share their experience integrating Hadoop into the existing environment at Orbitz, building a reusable data pipeline for ingesting, transporting, consuming, and storing data.

Bio

Steve Hoffman is a Senior Principal Engineer at Orbitz. Prior to Orbitz, Steve was Senior Software Architect at Cleversafe, an industry-leading object-based storage provider. Ken Dallmeyer is a Lead Software Engineer on the Machine Learning team at Orbitz. He and his team build data-oriented projects and are among the major producers and consumers of big data at Orbitz.

This one-day event focused on practical business applications of Hadoop and technologies in the Hadoop ecosystem. Industry thought leaders shared emerging trends and best practices with users and developers to deliver valuable insights from some of today's most successful Hadoop deployments.
