Revolutionizing the Network through Edge Computing

Edge computing combines the speed of content delivery networks with the benefits of cloud to enable a new generation of networking.

Lori MacVittie

July 24, 2018

4 Min Read

Growing at the internet's onramps – where mobile and IoT devices make their way onto the backbone – is a new kind of computing that's part content delivery network and part cloud. It's called edge computing, and it's likely to be an important part of your strategy in the future.

Performance is inarguably a factor in the success of applications – and thus business – today. Users have been known to delete apps and sever their relationships with brands solely based on a single, slow user experience.

While most often associated with IoT and its high-volume, low-latency connectivity requirements, edge computing is not just for connected cars and appliances. In fact, one can argue that edge computing is attempting to solve the age-old problem of end-user performance: how to improve the speed of the last mile without being part of the last mile.

Like a CDN, edge computing attempts to improve performance by getting as close to the last mile as possible. But unlike a CDN, which focuses on simply delivering content like images and scripts, edge computing moves processing closer to the user, too.

This definition from the Open Glossary of Edge Computing does a fantastic job of explaining edge computing:

The delivery of computing capabilities to the logical extremes of a network in order to improve the performance, operating cost and reliability of applications and services. By shortening the distance between devices and the cloud resources that serve them, and also reducing network hops, edge computing mitigates the latency and bandwidth constraints of today's Internet, ushering in new classes of applications. In practical terms, this means distributing new resources and software stacks along the path between today's centralized data centers and the increasingly large number of devices in the field, concentrated, in particular, but not exclusively, in close proximity to the last mile network, on both the infrastructure and device sides. [emphasis added]

The addition of software stacks to the distribution list is what makes edge computing different from CDNs. It's not just content that's moving; it's the entire stack. You could call edge computing platform as a service, but decentralized and distributed at strategic points along the backbone of the network, where it can improve performance and enhance reliability. It also offers a compelling argument for regulatory compliance: with processing localized, local policies can be enforced more effectively without disrupting consumers who aren't governed by the regulation.

Edge computing relies on platforms

Apps and devices are the source of big data. In addition to generating all that data, most apps and devices make use of it. That is, some of the data gathered ultimately violates a threshold – proximity, temperature, condition – that necessitates action. In a centralized model – whether processing occurs in the data center or in the cloud – the notification is delayed until the data has been processed and the violation detected. In a decentralized, edge computing model, initial processing occurs close to the consumer. Thus, alerts and information can be sent back to the consumer much more quickly without eliminating the bulk processing that might occur later.
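To make that split concrete, here's a minimal sketch of edge-side threshold processing. The names (`EdgeNode`, `ALERT_THRESHOLD`) and the temperature scenario are illustrative assumptions, not part of any real edge platform's API; the point is simply that the threshold check happens locally, so the alert doesn't wait on a round trip to a central cloud, while the raw data is still retained for bulk processing later.

```python
from dataclasses import dataclass, field

ALERT_THRESHOLD = 75.0  # degrees; the condition that necessitates immediate action


@dataclass
class EdgeNode:
    """Processes readings close to the device instead of in a central cloud."""
    alerts: list = field(default_factory=list)   # sent back to the consumer right away
    buffer: list = field(default_factory=list)   # shipped upstream later for bulk processing

    def ingest(self, reading: float) -> bool:
        """Return True if the reading violated the threshold (alert raised locally)."""
        self.buffer.append(reading)              # raw data still flows upstream eventually
        if reading > ALERT_THRESHOLD:
            self.alerts.append(f"ALERT: {reading} exceeds {ALERT_THRESHOLD}")
            return True                          # local decision: no round trip to the cloud
        return False


node = EdgeNode()
for r in (70.1, 72.5, 80.3, 71.0):
    node.ingest(r)

print(len(node.alerts))   # 1 -- only the violation triggers an immediate alert
print(len(node.buffer))   # 4 -- every reading is retained for later bulk processing
```

In a centralized model, all four readings would travel to the data center before the one violation was detected; here the violation is flagged at the point of ingest.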

But doing that requires processing; thus the inclusion of application stacks, along with other resources, in edge computing. Processing is required to detect violations and variations as data flows from the consumer to the application responsible for collecting it. Edge computing providers are not just banking on speed, but on speed with a purpose. That includes taking advantage of purpose-built, performance-enhancing hardware like GPUs and specialized NICs. Platforms, built and positioned for speed, are what underpin the promise of edge computing.


Perhaps recognizing the challenges organizations face in a multi-cloud world – particularly deploying and then managing applications spread across public cloud providers – edge computing suppliers are already acknowledging that workload orchestration is a critical capability. While far from mature, this recognition means orchestration is likely to be table stakes by the time edge computing reaches mainstream adoption.

Edge computing may be in its infancy, but in reality it's been a long time coming. CDNs have existed for roughly two decades with relatively little change to their fundamental operating premises. It can be argued that cloud, too, was not revolutionary but merely evolutionary; after all, it's just someone else's computer. Edge computing is the offspring of both, bringing together the benefits of cloud and the speed of CDNs.

The data center is already in the throes of a major remodel. One way network and security professionals can prepare for edge computing is to pay attention to architecture. Monolithic infrastructure platforms are not conducive to supporting distributed services and applications. Being aware that further disruption is imminent helps avoid the architectural debt inherent in locking the data center network into a fixed architecture. Keep in mind the need to distribute across clouds and out to the edge as you evaluate new architectures and approaches to delivering the network and application services that remain critical to the success of every business.


About the Author

Lori MacVittie

Principal Technical Evangelist, Office of the CTO at F5 Networks

Lori MacVittie is the principal technical evangelist for cloud computing, cloud and application security, and application delivery and is responsible for education and evangelism across F5's entire product suite. MacVittie has extensive development and technical architecture experience in both high-tech and enterprise organizations. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she authored articles on a variety of topics aimed at IT professionals. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University. She also serves on the Board of Regents for the DevOps Institute and CloudNOW, and has been named one of the top influential women in DevOps.
