Developers simply love Docker. If you're an enterprise IT leader, you've heard about Docker. Your developers have not only heard about it, but have tried it, talked about it with their social networks and bemoaned the fact that they're not already using it at work.
As most enterprise IT leaders know, getting code to production is a complex, time-consuming chore. Developers can spend days waiting for a systems engineer to prepare the infrastructure and configure the right packages -- and meanwhile, the code may have been tested on a system running yet another configuration.
With Docker, all of the packages and dependencies are included in the Docker container that the developer prepares. They can push code with zero manual hardware configurations or package updates. Essentially, Docker is a developer's dream -- as long as they're working on a platform supported by Docker.
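As a minimal sketch of what that packaging looks like, a hypothetical Python service and its dependencies might be declared in a Dockerfile like this (the file names and commands here are illustrative, not from any particular project):

```dockerfile
# Base image pins the OS and Python version everyone builds against
FROM python:2.7

# Install the application's dependencies inside the image,
# so nothing needs to be configured on the host
COPY requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt

# Ship the application code itself
COPY . /app
WORKDIR /app

# The same command runs identically on a laptop, a test server or in production
CMD ["python", "app.py"]
```

Anyone with a Docker daemon can then `docker build` and `docker run` this image and get exactly the packages and versions the developer tested against.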
System engineers don't always love Docker. There's significant debate among cloud system engineers about whether or not Docker is even necessary. Some argue that configuration management tools like Puppet and Chef -- which install packages on the fly as a cloud instance is spun up, rather than having those packages baked into a Docker image -- are easier to manage, version and make universal changes to. Puppet fits more easily into a traditional development methodology and has proven secure in enterprise environments, whereas Docker presents challenges with enterprise security.
Do you and your enterprise love Docker? Check out the following considerations and decide for yourself.
Containers are lighter than VMs
Containers and virtual machines were both created for the same purpose: to emulate multiple hardware systems that are isolated from each other. However, each virtual machine runs a full operating system (with its own kernel, memory management, daemons, etc.), while containers share the host OS. As a result, containers use fewer resources and take less time to spin up and down.
This lightness can have a big impact on application density. By some estimates, you can run twice the workload using containers as you can using Xen or KVM VMs, which is music to a CTO's ears.
Docker gives you application portability, sort of
Theoretically, Docker allows developers to move applications across multiple clouds, on-premises systems or VMs without a lot of refactoring or reengineering.
In fact, a survey of 745 IT professionals found that the top reason IT organizations are adopting Docker containers is to build a hybrid cloud. Application portability not only means less manual work for developers and systems engineers, but it also can help IT leaders minimize vendor lock-in and shorten cloud migration timetables.
This cross-cloud functionality is useful, but it's not foolproof. Docker works by "demanding" certain CPU and memory from the underlying infrastructure, but there is no built-in mechanism to reserve those resources, because Docker is not aware of everything else running on the host. In other words, you can't be sure your container is running efficiently if you don't know whether your hosts have enough capacity available.
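To illustrate the point: the per-container limits themselves are easy to request at run time (the image name below is a hypothetical placeholder), but nothing built into Docker checks whether the host can actually honor them:

```shell
# Ask for at most 512 MB of RAM and a relative CPU weight of 512.
# Docker enforces these limits for this one container, but it does not
# verify that the host has capacity to spare across all containers.
docker run -d --memory 512m --cpu-shares 512 my-web-app
```

That scheduling decision -- which host has room for which container -- is exactly what the orchestration tools discussed below aim to fill in.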
There are currently a few Docker orchestration tools on the market that will decide which resources are available on which hosts, but they are for the most part untried by enterprise IT departments. This functionality would be crucial for mission-critical production websites that have to maintain high availability. Every day there are more third-party applications specifically targeting security, resource management and load balancing, and we expect their adoption to accelerate.
Not everything can be containerized
Not everything fits in a container, and not every application is suited to containerization. A lot of common Microsoft backend apps can't be containerized, for instance.
Usually, the calculus for deciding which applications can be containerized is similar to deciding which applications to put in the cloud: Is the application dependent on custom hardware configurations? How much data needs to be shipped into and out of it?
As you might expect, the companies that will get the most value out of Docker are those that have already developed a microservices approach to software development. When application components are already isolated with zero external dependencies, dropping that microservice into a Docker container is fairly simple.
However, enterprises don't need to undergo a full microservices reorganization to start using Docker. In fact, one of the best ways to start containerizing is to drop existing, small test applications into a container. This strategy gives engineers time to get familiar with Docker, and while it may not give you all the advantages of Docker, it may allow you to increase application density on your test server.
Every cloud platform supports Docker
Despite their youth, container services are the darling of every large cloud platform. In fact, providers are racing to be the best at supporting Docker. AWS has even built a system to help organizations orchestrate Docker containers, and other platforms are providing enterprise-grade support around Docker, which is a prerequisite to enterprise adoption. This is good news for organizations moving towards a multi-cloud system.
Docker experts are hard to find
Finding an engineer who can automate your cloud environment, whether with specific tools or DevOps practices in general, is hard. Finding an engineer who can automate your cloud environment using Docker or any container service is even harder.
There were 43,000 job postings listing Docker skills in 2015 -- a 1,700% increase over 2014. Suffice it to say, Docker is such a new technology that while developers may know how to use it, there are few system engineers who know the pros and cons of the various management platforms and what will actually be best for the organization. If your enterprise is interested in Docker, growing a home-grown expert or outsourcing may be the only ways to go.
Security concerns are real
Many organizations want to use Docker, but security is a top priority. Containers can make security more difficult to monitor. Most monitoring tools on the market don't have a view of transient instances in public clouds, let alone sub-virtual-machine entities like Docker containers.
Systems staff may also find it more challenging to make emergency package updates for security. Docker currently offers no way to push an update to a base image out to the containers already running in production, so in the case of an emergency patch, developers would have to manually ensure the new version is running in each container. Some form of image inheritance is necessary for Docker to be ready for mission-critical enterprise applications.
Finally, logging is critical for most enterprises with security and compliance concerns. Setting up Docker containers to consistently and reliably ship logs to a central repository is not simple. Docker containers work best when no permanent data store is required, which is a rare situation for most enterprise applications.
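Docker does provide per-container log drivers that can forward a container's stdout and stderr to a central collector (syslog support arrived in Docker 1.6), but wiring every container up this way consistently is still left to you. A sketch, assuming a hypothetical central syslog endpoint and application image:

```shell
# Route this container's stdout/stderr to a central syslog server
# instead of the default local JSON-file driver. The address and
# image name are illustrative placeholders.
docker run -d \
  --log-driver=syslog \
  --log-opt syslog-address=udp://logs.example.com:514 \
  my-web-app
```

Note that this only captures what the process writes to stdout/stderr; applications that log to files inside the container need additional plumbing, which is part of why the existing enterprise logging procedures mentioned below don't always slot in cleanly.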
Several new tools are being released to perform these tasks, but again enterprises are already using monitoring software and governance processes in well-defined procedures. Docker therefore does not always easily "slot in" to these procedures.
Now is the time to explore
Very few enterprises plan to use Docker in production in the next six months. As much as you think you've been hearing about Docker for a long time, Docker 1.0 launched just a year ago. With hundreds of updates a year and new major features every month from third parties and open source, it is a turbulent time for Docker development. Now is not the time to boil the ocean and overturn corporate IT processes.
But despite all the security and performance challenges mentioned above, every enterprise should be exploring Docker.
Encourage your systems staff to look at it. Talk to your developers about Docker. On that new, somewhat speculative project that's already in the pipeline, discuss using containers. Now is the time to explore it so you'll be ready for when Docker's security features get beefed up and more orchestration tools emerge. Usually it takes enterprises three to five years to adopt new technologies, and most CTOs are dedicated to shortening that window. Even if Docker isn't ready for the enterprise, it's worth placing some bets on.