9 Container Fundamentals To Know

  • If you thought containers were a fad that would pass, think again. Since Docker popularized the technology, virtual containers have assumed a prominent role in the developer community and are quickly making inroads into the enterprise. By the end of 2020, containers will be a $2.69 billion market, according to 451 Research.

    Docker made it easy for developers to use containers and get software into production quickly, and now containers are steadily growing in the enterprise, Brian Gracely, director of product strategy for Red Hat OpenShift, said in an interview at Interop ITX.

    "Enterprises trying to go through digital transformations are now seeing how valuable containers can be in terms of helping them grow faster," he said.

    In a presentation on containers at Interop ITX, Stephen Foskett, organizer of Tech Field Day and proprietor of Gestalt IT, told attendees that the future is containerized. "This isn't just a fad or a trend, but an important movement in IT," he said.

    "If you're in the operations space, this becomes the next natural evolution of what you're infrastructure looks like," he said. Learning about containers is a great way to advance your career, he advised. "Your business will want its developers to use these tools."

    Given that containers aren't going anywhere anytime soon, IT infrastructure pros should get familiar with some of the fundamentals about the technology. We put together some container basics – terms, best practices, and educational resources – to help you get started.

    (Image: Red Ivory/Shutterstock)

  • What are containers?

    At the Interop ITX container workshop, Bob Familiar, national practice director at BlueMetal, an Insight company, provided this description:

    • Everything required to make a piece of software run is packaged into isolated run-time environments called containers
    • Unlike VMs, containers do not bundle a full operating system; only libraries and settings required to make the software work are needed
    • This makes for efficient, lightweight, self-contained systems and guarantees that software will always run the same, regardless of where it’s deployed (see the packaging sketch after this list)
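
    The packaging idea is easy to see with Docker's own tooling. The sketch below is illustrative only; the Python app and Dockerfile are hypothetical and not part of the article:

        # A Dockerfile in the project directory might contain, for example:
        #   FROM python:3-slim        (a slim base: libraries and a runtime, not a full OS)
        #   COPY app.py /app/app.py   (the application itself)
        #   RUN pip install flask     (the libraries the app needs)
        #   CMD ["python", "/app/app.py"]
        docker build -t myapp:1.0 .   # bake the app, its libraries, and its settings into one image
        docker run -d myapp:1.0       # that image runs the same way wherever Docker runs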

    (Image: Kevin Remde, Microsoft)

  • Container benefits

    Containers are a big deal for enterprises for a number of reasons, particularly the speed and agility their portability provides. According to Familiar, containers solve the problem of getting software to run reliably when it's moved from one computing environment to another. As more software moves into the cloud and is designed and developed using microservice architectures, and as businesses focus on quick release cycles, automating software packaging and deployment is paramount, he said.

    For system administrators, the consistent application environment means not having to worry about OS levels, patches or incompatible applications and utilities, Foskett said.

    (Image: Comaniciu Dan/Shutterstock)

  • Container history

    Containers have become popular in recent years due to the emergence of Docker in 2013, but containerization technology has been around for some time. An early form of containerization dates back to 1979 with the development of the chroot system call in Unix v7, which created early process isolation. In 2000, FreeBSD Jails built on the chroot mechanism to enable virtualization at the operating-system level. Containerization technology grew from there, with developments that included the release of Solaris containerization in 2004 and Linux containers (LXC) in 2008.

    (Image source: Foskett Services)

  • What is Docker?

    Launched as an open source project in 2013, Docker quickly became synonymous with containers. Built on the same underlying kernel mechanisms as Linux containers, Docker helped drive the container craze by providing a streamlined interface and central public repository of images (Docker Hub) that makes the technology easy to use.
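
    As a rough sketch of that workflow, a developer can pull a public image from Docker Hub and run it in seconds; the nginx image below is simply a common example, not one cited in the article:

        docker search nginx                     # look up images published on Docker Hub
        docker pull nginx:latest                # download the image layers locally
        docker run -d -p 8080:80 nginx:latest   # start a web server from that image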

    (Image source: Docker Inc.)

  • Docker terms to know

    To get familiar with Docker, it helps to understand some of the basic terminology. At Interop ITX, Mike Coleman, technology evangelist at Docker Inc., provided a list of basic Docker terms and their definitions (see above image).

    Foskett explained that an image is a file system and runtime parameters; you run an image and it becomes a container.
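
    A minimal sketch of that relationship, using the small public alpine image as a stand-in:

        docker pull alpine                           # an image: a file system plus run-time parameters
        docker images                                # list the images stored locally
        docker run --name demo alpine echo "hello"   # running the image creates a container
        docker ps -a                                 # the new container now exists, with its own ID and state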

  • Container best practices

    One of the advantages of containers is how fast they can be spun up and torn down. But Docker doesn't actually destroy the container when a user stops it, which can lead to container sprawl and data exposure, Foskett said. "If you run a container and stop it, and the image stays around, someone can easily restart the container and access what you were doing," he said.

    That's why it's important to manually delete a Docker container with the docker rm command, he said.
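
    In practice that cleanup looks something like the following (the container name "web" is just an example):

        docker stop web                # stops the container but does not remove it
        docker ps -a                   # the stopped container, and its data, are still there
        docker rm web                  # explicitly delete it so it can't simply be restarted
        docker run --rm -it alpine sh  # or pass --rm so the container is removed automatically on exit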

    Foskett advised practicing good container hygiene by keeping images simple and using external volume storage and clean-up scripts. He also recommended that companies build their own Docker images or check the Dockerfile to ensure container quality. It can be difficult to know the origin of images on Docker Hub, he said.
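
    One way to follow that advice, sketched with standard Docker commands (the image name "myapp" is hypothetical):

        docker volume create appdata                  # keep application data outside the container
        docker run -d -v appdata:/var/lib/app myapp   # the container stays disposable; the data survives its removal

        # a simple clean-up script to run periodically:
        docker container prune -f                     # remove all stopped containers
        docker image prune -f                         # remove dangling, untagged images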

    (Image: Wstockstudio/Shutterstock)

  • It's easy to get started

    From all accounts, it's not difficult to jump in and experiment with containers. The software is generally free and readily available. Docker can be run on a Windows, Mac, or Linux laptop. Docker provides a step-by-step guide to getting started. Docker Community Edition is free; the Enterprise Edition is subscription-based.
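
    The first steps after installing Docker Community Edition are short enough to show in full; this is the standard hello-world check rather than anything specific to the article:

        docker --version          # confirm the installation
        docker run hello-world    # pulls a tiny test image from Docker Hub and runs it
        docker ps -a              # see the container that test run created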

    IT operations pros need to jump in and learn the basics of containers, Gracely said. "Take 20 minutes to an hour to get your feet wet."

    (Image: geralt/Pixabay)

  • Lots of ways to learn

    There's a wealth of online training about containers, some of it free or low-cost. Here's a sample:

    Katacoda – Free Docker training as well as Kubernetes classes

    The Linux Foundation -- Containers Fundamentals course

    Lynda.com – Container basics class taught by well-known cloud expert David Linthicum

    Pluralsight – Getting Started with Docker taught by Nigel Poulton

    CBT Nuggets – Docker training

    (Image: Wokandapix/Pixabay)

  • Container management

    The rapid growth of containers has spawned a number of container management and orchestration systems. Kubernetes grew out of Google, which made the code open source two years ago. Other container management tools include Docker Swarm, Apcera, Apache Mesos, and Rancher Labs; Kubernetes appears to be leading the pack so far. A survey released earlier this year by the OpenStack Foundation showed that Kubernetes was the most deployed container management software by OpenStack users.

    "Container management is the new battleground," Foskett said in an Interop ITX Twitter chat in April. "So far, Kubernetes seems to be running away with the trophy."

    (Image: patpitchaya/Shutterstock)