Building a Data Pipeline with the Tools You Have - An Orbitz Case Study
by Steve Hoffman and Ken Dallmeyer, on Nov 22, 2014
Duration: 26:18

Summary
Steve Hoffman and Ken Dallmeyer share their experience integrating Hadoop into the existing environment at Orbitz, creating a reusable data pipeline for ingesting, transporting, consuming, and storing data.

Bio

Steve Hoffman is a Senior Principal Engineer at Orbitz. Prior to Orbitz, Steve was Senior Software Architect for Cleversafe, an industry-leading object-based storage provider. Ken Dallmeyer is a Lead Software Engineer on the Machine Learning team at Orbitz. He and his team build data-oriented projects and are major producers and consumers of big data at Orbitz.

About the conference: this one-day event focused on practical business applications of Hadoop and technologies in the Hadoop ecosystem. Industry thought leaders shared emerging trends and best practices with users and developers to deliver valuable insights from some of today's most successful Hadoop deployments.
