As cloud computing continues to make its way into the enterprise and upend traditional IT models, it's forcing IT shops to rethink old ways and learn new skills. Along with the cloud, new architectures like containers and open source models are reshaping the enterprise IT landscape.
In light of all these changes, how does one define IT infrastructure today? Keith Townsend, Interop ITX Infrastructure Chair and founder of The CTO Advisor, posed this question at a panel discussion during last week's conference. What used to be silos of networking, storage, and compute has shifted to discussions about containers and cloud, with a focus on application-layer value.
"The key is the line between infrastructure and applications is far less distinct than in years past," answered panelist Scott Lowe, a well-known author and blogger who recently joined startup Heptio as staff field engineer. "We have to keep up our skill sets with that blurring line."
Using software to manage infrastructure falls squarely into the hands of infrastructure professionals, but involves skills from the developer world such as writing configuration files and YAML, he said. Along with working with cloud resources, "all of that presents an opportunity for us as IT professionals to change how we approach things and take a more holistic view," he said.
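The configuration-file skills Lowe mentions typically mean declaring desired state in YAML rather than scripting each step by hand. A purely illustrative, Ansible-style sketch (the inventory group and package are hypothetical, not from the talk):

```yaml
# Illustrative Ansible-style playbook: declare the desired end state
# (nginx installed and running) instead of scripting individual commands.
- hosts: webservers        # hypothetical inventory group
  become: true
  tasks:
    - name: Ensure nginx is installed
      apt:
        name: nginx
        state: present
    - name: Ensure nginx is running
      service:
        name: nginx
        state: started
```

A tool reads this file and converges each host toward the declared state, which is the "holistic view" of infrastructure-as-code that blurs the line between operations and development work.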
If you missed Interop ITX, here are more key takeaways and highlights from the week's discussion and debate about the state of IT infrastructure and where it's headed.
Enterprise IT in the cloud era
The role of the IT department as cloud adoption grows was a hot topic at the conference. "My office still needs to connect to the internet, we still need to secure endpoints. There are still reasons to do things in-house, but enterprise IT will have to level up their game," Peyton Maynard-Koran, senior director of worldwide infrastructure at Whole Foods/Amazon, said during a panel on the future of networking.
Cloud is forcing networking to shift to a software focus, which provides more value to business, he said. "We needed a kick in the pants to evolve," he said.
Leslie Daigle, who has held leadership roles with the Internet Engineering Task Force and Internet Society and is now principal at ThinkingCat Enterprises, told attendees, "You want to push out to the cloud anything that you don't need to roll your own. …Retaining the expertise to know what you're getting out of these [cloud] services and doing some in-house development will still be important."
Enterprise networking experts do not become obsolete with the cloud, said Charlie Gero, CTO of Akamai's enterprise division. For example, a company will need in-house experts who can understand what's happening during a Distributed Denial of Service attack and how it impacts its cloud services. "You still need people to solve those problems, the problems are just in a different domain," Gero said.
Lowe said during Townsend's panel that he expects most workloads will end up in the public cloud, with some remaining in on-premises data centers for regulatory or other reasons. "If you're not already becoming very familiar with how your organization can leverage public cloud workloads then you need to address that sooner than later," he told attendees.
It's still early days for containers and container orchestration, but enterprises will want to figure out what benefits they offer to their business, Lowe said. "Choose the right tool for the job instead of choosing a path because it's the new shiny," he said. "Companies will continue to use some VMs…Use containers where it makes sense and is the right choice for your business."
Containers offer a lot of value by increasing developer productivity for companies where the ability to deliver software more quickly is a competitive advantage, he said.
Start by making sure your developers see some actual benefits from using containers, then ensure operations teams are ready to support them as they scale up beyond using Docker on a laptop, he advised.
In a separate session, Lowe presented an introduction to containers and container orchestration. He covered key terminology such as namespace, cgroup, runtime, and image. At a high level, a container is a tool that enables packaging and deployment of applications in a relatively self-contained way, Lowe explained. Then he provided a more in-depth definition: a collection of operating system-level features that enables companies to package and deploy applications while minimizing dependencies on the OS.
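The kernel features behind that definition can be inspected on any Linux host; a minimal sketch, assuming a Linux machine with /proc mounted (no container tooling required):

```shell
# Each entry under /proc/self/ns is a namespace this process belongs to
# (pid, mnt, net, uts, ipc, ...) -- the isolation mechanism containers build on.
ls /proc/self/ns

# The cgroup(s) the process sits in; cgroups are how a runtime
# bounds a container's CPU and memory usage.
cat /proc/self/cgroup
```

A container runtime combines these two primitives with a packaged filesystem image, which is why Lowe describes containers as OS-level features rather than a single piece of software.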
While Interop ITX put the spotlight on containers and other software trends impacting IT infrastructure, Robert Coenen, VP of business development at InterOptic, noted that serverless and other abstract architectures still run on hardware. "If you have a process, it has to be running on something, whether it's a traditional server or abstracted across multiple devices," he said. "The reality is there's always a piece of hardware somewhere."
The rise of open source
Interop ITX attendees heard a lot about how open source technologies are impacting infrastructure. Arpit Joshipura, general manager of networking and orchestration at the Linux Foundation, said open source development is accelerating and impacting every vertical market, from telecommunications to banking.
"It's about feature innovation and feature velocity," he said at the Network Transformation Summit. "End users couldn’t wait and rely on a single vendor. It's end-user innovation that really disrupts these markets."
Joshipura urged attendees to follow the lead of cloud players and telcos by adopting open networking and SDN. "Most of the strategic vendors are ones that embrace open source," he said.
Maynard-Koran of Whole Foods/Amazon said open source software and white-box technologies are driving innovation in networking and allowing enterprises to be more agile.
"That changes us from being network implementers to being network engineers again. We don't have to rely so heavily on a few vendors," he said, adding later, "We finally reached a place and time where engineers can grab control and provide direct value to a business."
For Vodafone, open source is a key component of its digital transformation and of building an infrastructure to support a highly mobile workforce. Will Hughs, head of fixed and unified communications at Vodafone Americas, presented a session on the global telco's digital initiative. He said Vodafone's software-defined networking initiative, dubbed Ocean, is designed to focus on the user experience and take advantage of the efficiencies of cloud architecture. It includes open source technologies such as OpenDaylight, ONOS, and OPNFV.
"We've embraced open source over vendor-specific solutions," Hughs said.