
Top Docker Misconceptions

by Carlos Sanchez on Jul 03, 2014

Based on his experience as a systems administrator evaluating Docker, Matt Jaynes has written at the DevOps University website about the top Docker misconceptions, warning against adopting Docker at small scale or without a solid infrastructure foundation, and offering alternatives for improving the deployment process.

Benjamin Wootton, CTO of Contino, had already warned in his article "Microservices - Not a Free Lunch" about the significant operations overhead of a microservices-based architecture. Matt, whose experience focuses on server setups that deliver web applications, echoes this warning, stressing the systems-administration expertise needed to use Docker safely in production:

At the moment, you need more systems expertise to use Docker, not less. Nearly every article you will read on Docker will show you the extremely simple use-cases and will ignore the complexities of using Docker on multi-host production systems. This gives a false impression of what it takes to actually use Docker in production.

If you do not want to have to learn how to manage servers, you should use a Platform-as-a-Service (PaaS) like Heroku. Docker is not the solution.

Matt recommends starting with role-based containers (app, db, cache, ...), adopting them only for the roles where they make sense, and only if the project has a solid infrastructure foundation:

If you have critical holes in your infrastructure, you should not be considering Docker. It would be like parking a Ferrari on the edge of an unstable cliff.

He also lists optimizations that can improve performance and consistency in the deployment process without adopting Docker:

  • Using configuration management tools (Ansible, Puppet,...) to easily create and manage servers, particularly in the cloud.

  • Creating cloud images, instead of provisioning from scratch, to create new servers faster. The base image and the servers already started can still be provisioned with configuration management tools.

  • Pinning versions, using explicit versions for packages and dependencies to ensure software does not change from environment to environment or over time.

  • Deploying applications using git or rsync, so the servers can be updated with minimal downloads, similarly to Docker's image layer caching.

  • Deploying applications using packages if the application build or preparation process takes a long time. Using a pre-built package, such as a zip, rpm, or deb, speeds up the deployment across servers.

For multi-host production, Matt recommends using these optimization methods for as long as possible, until the extra benefits Docker provides are actually needed; at present, a deployment must reach considerable scale before Docker's benefits outweigh the complexity it adds. However, at the rate the project is progressing, that could quickly cease to be the case.
