
Delivering Application Environments: An Evolution

Modern application development has unequivocally embraced Agile, DevOps, and continuous delivery. This methodological shift puts new weight on the environment in which an application runs: as the breadth of consumers and automation grows, the availability and quality of application environments increasingly determine project success. The shrinking cost of environments has helped fuel this change, but the cost of delivering high-quality environments to consumers remains a key constraint.

Docker is the latest innovation to attack this constraint, and its idea of application containerization is both evolutionary and transformational. But before we can fully appreciate the value of Docker, we need to understand the history and constraints of application environment delivery.

An application environment consists of all the resources required to run an instance of the application. An environment is available when its consumers -- developers, QA, or continuous integration systems -- can get access when and where they need it. It is high quality when it closely matches production, including configuration, code, and data. The cost of delivering these environments to consumers includes:

  • Acquisition cost: The time and money necessary to secure the resources required to run the application
  • Configuration cost: The time required to get the environment in a suitable state for use by the consumer
  • Maintenance cost: The time and money required to monitor and maintain the environment and its supporting resources

Back in the early 1990s, monolithic applications ran on expensive Unix systems. Each application rarely spawned more than a handful of ancillary environments, and developers were left to fend for themselves when testing and validating their changes.

The rise of Linux in the mid-1990s dramatically lowered the acquisition cost by enabling enterprise application development on commodity x86 hardware. Though this made it financially feasible to deploy more systems, hardware still took a long time to install and configure, and most application environments sat underutilized.

As a result, the industry sought to eliminate demand for physical environments through carefully controlled application runtimes like J2EE. More than a decade later, the verdict is clear: dependencies stretch far beyond the confines of a single application runtime, and open-source systems communicating via REST are the new standard. The modern trend is toward isolated services running in environments that can be scaled horizontally and connected through centralized orchestration tools.

The advent of server virtualization at the turn of the century was the first major shift that enabled this evolution in application development. Virtualization lowered acquisition costs by breaking the dependency on hardware. It simplified configuration with the ability to snapshot and clone virtual machines, and it reduced the maintenance burden by improving utilization of physical resources.

Five years later, these savings were magnified by cloud computing, which took physical hardware out of the equation entirely and provided greater elasticity in response to demand. With acquisition and maintenance costs lowered, configuration cost became the key constraint on delivering application environments. Configuration management tools like Puppet and Chef, driven by automation tools such as Jenkins, flourished, bringing rigor and predictability to application environment configuration.

Enter Docker in 2013. It packages an application's configuration and deployment into a single container that can be quickly provisioned and, because containers share the host operating system's kernel rather than each booting their own, makes far more efficient use of system resources. This dramatically shifts the economics of application environment delivery. The technology still lacks a mature tools ecosystem, but it has gained significant traction: there are already 45,000 public Docker images, compared to 20,000 Amazon EC2 images.
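To make that concrete, the sketch below shows what a containerized environment definition looks like. It is a minimal, hypothetical Dockerfile: the base image, packages, paths, and start command are illustrative assumptions, not taken from any particular project.

    # A hypothetical Dockerfile: the entire environment is described
    # in one versionable text file. All names here are illustrative.
    FROM ubuntu:14.04

    # Configuration: install the runtime the application depends on.
    RUN apt-get update && apt-get install -y python2.7 python-pip

    # Deployment: bake the application code into the image.
    COPY app/ /opt/app/
    RUN pip install -r /opt/app/requirements.txt

    # How the container starts when provisioned.
    CMD ["python2.7", "/opt/app/server.py"]

Running docker build on this file produces an image, and docker run provisions a live container from it in seconds; the same image behaves the same way on a laptop, a CI server, or a cloud host.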

Containerization requires changing how your applications are built and packaged, and the lowered costs will create more demand for application environments. If done correctly, however, this organizational shift can simplify the development of modern service-oriented applications and drive higher quality through greater environment availability.

However, whenever one constraint is eliminated, another emerges. Docker makes it easy to deploy containers, but an application environment is incomplete without data. Nascent data management tools such as Flocker exist, but dynamically feeding fresh data into containers remains a challenge, particularly across disparate infrastructure such as public and private clouds.
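The shape of the gap shows up in the same Dockerfile terms as above; this is again a hypothetical sketch, with an assumed base image and mount path:

    # A hypothetical database image: it declares where data belongs,
    # but ships with that path empty.
    FROM postgres:9.3

    # Something outside the container (a host directory, or a
    # Flocker-managed volume) must mount fresh data here at run time.
    VOLUME /var/lib/postgresql/data

Everything above the VOLUME line is reproducible from the file alone; the data that fills it is not, and supplying that data on demand is exactly the problem tools like Flocker are beginning to tackle.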

For all the reasons we want high-quality, available environments, we need the same for our data. Synthetic, stale, or shared data leads to lower quality and reduced efficiency. Delivering real data into environments on demand may be the next transformational innovation, one that forever changes how we deliver application projects.