How Service Mesh Lets Enterprises Confidently Operate Cloud-Native Applications

As organizations increase their reliance on cloud-native architectures, a service mesh becomes essential to the operability and security of those environments.

Lori MacVittie

July 24, 2020


The adoption of cloud-native architectures continues to grow despite challenges with complexity. The architectural benefits of a modern approach to translating business functions into tightly focused services are a perfect match for organizations entering the second phase of digital transformation, which focuses on digital expansion. Rapid development and deployment are critical capabilities for scaling the business, and cloud-native architectures that rely on container technologies support both.

When it comes to implementing those cloud-native architectures, Kubernetes has emerged as the de facto standard. Whether in the public cloud or on-premises, Kubernetes has clearly won.

That's a good thing. Fragmentation of open source solutions has long been a thorn in the side of the enterprise. Internal community conflicts have often given rise to forked implementations that cause chaos and confusion. By effectively standardizing on Kubernetes, we can all focus on expanding the ecosystem to include those technologies organizations will need to operate efficiently and benefit from cloud-native architectures.

These technologies arise from the reality that core Kubernetes does not address many of the operational functions that make a cloud-native environment "enterprise-grade," namely monitoring and securing cloud-native applications. What Kubernetes does do (extremely well) is enable an ecosystem that can add those capabilities – and the value they bring. The result is a variety of offerings that address the need for visibility and security in the form of a service mesh.

A service mesh like Istio or Aspen Mesh is a highly connected set of proxies that supports secure connectivity between container instances (usually via mTLS) as well as visibility into specific business functions, i.e., an application in a container instance. This visibility gives developers and operators alike the ability to trace workflows as they traverse each individual function. That, in turn, provides greater awareness of the performance and availability issues that inevitably degrade the user experience. Practically speaking, it ensures that requests are not directed to a failing or failed instance, preserving availability and performance by routing traffic to healthy instances instead of wasting retries on unhealthy ones.
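In Istio, for example, this failure-avoidance behavior can be expressed declaratively through outlier detection, which temporarily ejects unhealthy instances from the load-balancing pool. A minimal sketch follows; the service name `reviews`, the namespace, and the specific thresholds are illustrative, not prescriptive:

```yaml
# Illustrative Istio DestinationRule: eject instances that return
# consecutive server errors so requests go only to healthy pods.
apiVersion: networking.istio.io/v1beta1
kind: DestinationRule
metadata:
  name: reviews-outlier-detection   # hypothetical name
spec:
  host: reviews.default.svc.cluster.local
  trafficPolicy:
    outlierDetection:
      consecutive5xxErrors: 5    # eject after 5 consecutive 5xx responses
      interval: 10s              # how often instances are evaluated
      baseEjectionTime: 30s      # minimum time an instance stays ejected
      maxEjectionPercent: 50     # never eject more than half the pool
```

Because the sidecar proxies apply this policy on every request, no application code changes are required to get the behavior.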

Another benefit of a service mesh is that it dynamically adapts along with the applications it is delivering and securing. As one function scales out (adds more instances), the service mesh tracks and manages connectivity to and from each one. As those functions scale down, the service mesh adapts accordingly, ensuring the user experience is never disrupted by changes in the cloud-native environment. The service mesh can likewise automatically enforce secure connections (mTLS) within the environment without requiring complex key and certificate management processes.
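With Istio, enforcing mTLS across a set of workloads is a single, small policy; the mesh handles certificate issuance and rotation behind the scenes. A minimal sketch, assuming an Istio-enabled cluster (the `production` namespace is illustrative):

```yaml
# Illustrative Istio PeerAuthentication: require mTLS for all
# workloads in a namespace; certificates are managed by the mesh.
apiVersion: security.istio.io/v1beta1
kind: PeerAuthentication
metadata:
  name: default
  namespace: production   # hypothetical namespace
spec:
  mtls:
    mode: STRICT           # reject plaintext traffic between sidecars
```

Scoping the policy to a namespace (or the whole mesh) means new instances that come and go during scaling inherit the same encryption requirements automatically.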

Lastly, the fine-grained control over the flow of traffic within a cloud-native environment means a service mesh makes it simpler to take advantage of modern deployment practices, i.e., canary and blue/green patterns. These practices are important for verifying that new deployments work as expected before they reach all users. With the frequency of deployments ramping up as businesses try to push new digital capabilities faster to meet rising customer expectations, the ability to roll out a new deployment and essentially test it with real traffic, without disrupting existing experiences, is an important capability.
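A canary rollout in Istio boils down to weighted routing between two versions of a service. The sketch below sends 10% of live traffic to a canary while 90% stays on the stable version; the service name `checkout` and the subset names are illustrative, and the subsets would be defined in a companion DestinationRule:

```yaml
# Illustrative Istio VirtualService: split traffic 90/10 between a
# stable version and a canary of the same service.
apiVersion: networking.istio.io/v1beta1
kind: VirtualService
metadata:
  name: checkout           # hypothetical service name
spec:
  hosts:
    - checkout.default.svc.cluster.local
  http:
    - route:
        - destination:
            host: checkout.default.svc.cluster.local
            subset: stable   # defined in a companion DestinationRule
          weight: 90
        - destination:
            host: checkout.default.svc.cluster.local
            subset: canary
          weight: 10
```

Shifting the weights incrementally (10 → 25 → 50 → 100) lets operators validate the canary against real traffic and roll back instantly by resetting the weights.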

A service mesh is often seen as just an "add-on" to a cloud-native environment. But without a service mesh, the process of supporting modern deployment practices, enforcing secure communications, and enabling visibility becomes much more complex. 

As organizations increase their reliance on cloud-native architectures, a service mesh becomes essential to the operability and security of those environments. By addressing key issues of complexity and visibility, a service mesh makes open-source, cloud-native environments more operable and enables enterprises to expand their use of them to meet modern application requirements.


About the Author(s)

Lori MacVittie

Principal Technical Evangelist, Office of the CTO at F5 Networks

Lori MacVittie is the principal technical evangelist for cloud computing, cloud and application security, and application delivery and is responsible for education and evangelism across F5's entire product suite. MacVittie has extensive development and technical architecture experience in both high-tech and enterprise organizations. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she authored articles on a variety of topics aimed at IT professionals. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University. She also serves on the Board of Regents for the DevOps Institute and CloudNOW, and has been named one of the top influential women in DevOps.
