The Edge Advantage: Operate Locally to Scale Globally

The demand for real-time business interactions is driving deployments to the “edge” of the Internet where scale is high and latency is low.

Stephen Ludin

May 7, 2019


The rapid pace at which everyone and everything is becoming connected is having a profound impact on digital business. As a result, users expect their experiences to be exceptional and instant, as well as secure and reliable. The demand for more real-time business interactions, whether with people or with things, is forcing a digital transformation. Businesses must adapt by pushing development faster, creating more agile processes, favoring fast features over perfect ones, or all three. That same demand for real-time interactions is also driving deployments to the “edge” of the Internet, where scale is high and latency is low.

What is the “edge”? At its core, it simply refers to the part of a service the user first interacts with. For years, companies have been expanding the reach of their edge by building data centers around the world, bringing their services closer to their users. Many also turn to Content Delivery Networks (CDNs) and use the CDN’s edge instead, avoiding the need for expensive build-outs. A typical CDN has 40 or more points of presence (PoPs) around the world, pushing that edge even closer to users. Some CDNs today have thousands of PoPs, creating a virtual “hyperedge” that serves content closer still to the users.

So there are big edges and small edges. Why does this matter? There are critical benefits to having capacity closer to users. This article dives into those benefits and examines why migrating to the edge allows businesses to deliver the instant experiences their customers expect while enhancing connectivity and security.

Proximity: Distance matters. The farther apart two points are, the more latency there will be between them; in other words, it takes longer for data to move from one place to the other. Part of this is simply the speed of light. The other part is that a longer route means more network hardware for data to pass through, which slows delivery down even further. For example, San Francisco and London are about 8,600km apart. With the speed of light in fiber at roughly 2×10⁸ meters per second, the minimum one-way travel time is about 43ms. Because routes are not straight and network hardware adds its own delays, real-world round-trip times land in the 140ms range. That is too slow to feel instant. An edge close to the users, however, can dramatically reduce that time.
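
As a rough sanity check, that back-of-the-envelope figure can be reproduced in a few lines of Python. The distance and fiber speed below are the approximations used above, not measured values.

    # Minimum one-way propagation delay between San Francisco and London,
    # using the approximate figures from this article.
    DISTANCE_M = 8_600_000     # ~8,600 km
    SPEED_IN_FIBER = 2e8       # m/s, roughly two-thirds the speed of light in a vacuum

    one_way_ms = DISTANCE_M / SPEED_IN_FIBER * 1000
    print(f"Minimum one-way delay: {one_way_ms:.0f} ms")  # ~43 ms, before any routing overhead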

How big does the edge need to be in order to be useful? Some of that depends on the audience. For a global audience, 30-40 PoPs will put the edge an average of around 1,500km from users. Getting closer requires the number of PoPs to jump up considerably: the hyperedge described above, with thousands of locations, reduces the average distance to around 400km, almost four times closer. That proximity means lower latencies, and because there is simply less “stuff” between the edge and the users, reliability is higher and there are fewer chances of things going wrong.
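
Those average distances translate directly into minimum round-trip times. Here is a minimal sketch that reuses the fiber-speed approximation from above; the two footprint sizes are the averages quoted in this article.

    # Rough minimum round-trip propagation delay for two edge footprints.
    SPEED_IN_FIBER = 2e8  # m/s, approximate speed of light in fiber

    def min_round_trip_ms(distance_km):
        """Best-case round trip over fiber, ignoring routing and hardware overhead."""
        return 2 * (distance_km * 1000) / SPEED_IN_FIBER * 1000

    for label, avg_km in [("30-40 PoPs", 1500), ("hyperedge, thousands of PoPs", 400)]:
        print(f"{label}: ~{min_round_trip_ms(avg_km):.0f} ms")
    # 30-40 PoPs: ~15 ms; hyperedge: ~4 ms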

Connectivity: Modern mobility demands that insights be extracted from data instantly to drive actions, from personalized customer experiences to autonomous connected technologies. This is seen in the proliferation of machine-to-machine (M2M) technologies like the Internet of Things (IoT), which autonomous vehicles, smart cities and manufacturing depend on to function. A delay in processing could frustrate a user, damage your brand or, in the most drastic cases, mean the difference between life and death. Proximity and IoT connectivity mitigate the risks associated with those delays while enhancing the safety and security of your customers.

As IoT connectivity becomes increasingly prevalent, the sheer volume of generated data makes sending the entirety of it to a centralized location impractical. Processing needs to happen on the distributed compute plane that an edge provides, sending only the most important data or summaries to a computing core or feeding data back to the “thing” itself for instruction.
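
To make that concrete, here is a hypothetical sketch of edge-side processing: a node aggregates a window of raw sensor readings and forwards only a compact summary toward the core. The device name, fields and sample values are illustrative, not any particular platform’s API.

    # Hypothetical edge-side aggregation: reduce a window of raw readings to a
    # small summary before anything is sent to the central system.
    from statistics import mean

    def summarize_readings(device_id, readings):
        """Collapse raw sensor readings into one compact record for the core."""
        return {
            "device_id": device_id,
            "count": len(readings),
            "min": min(readings),
            "max": max(readings),
            "mean": round(mean(readings), 2),
        }

    # At the edge: many raw data points in...
    raw = [21.4, 21.6, 21.5, 27.9, 21.5]
    # ...one small summary out to the core.
    print(summarize_readings("sensor-042", raw))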

Security: We live in a time when attackers hold unprecedented power, and there is simply no way to summon the capacity businesses need to defend their customers, employees and reputation in a centralized data center model. Recent attacks have crested 3 Tbps, a volume unheard of even just a few years ago. Even the largest cloud data centers would be overwhelmed by an attack of that size, and even if it were physically possible to equip a cloud data center with enough capacity, the cost would be prohibitive.

The edge can be the security perimeter, providing scale in processing, resilience through redundancy, better insights into normal and anomalous traffic patterns, and low latency for “good” traffic that you let through that perimeter. This has the added benefit of reducing the load on the security layer at your origin, saving significant time and money.
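
One way to picture the edge-as-perimeter idea is a simple per-client rate check running at each PoP, so abusive traffic is dropped long before it reaches the origin. This is an illustrative sketch only; the limits and names are invented and do not describe any particular vendor’s implementation.

    # Illustrative per-client rate limiter of the kind an edge node might apply
    # before traffic ever reaches the origin. Limits and names are made up.
    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 1.0
    MAX_REQUESTS_PER_WINDOW = 100

    recent_requests = defaultdict(deque)  # client IP -> timestamps of recent requests

    def allow_request(client_ip):
        """Return True if the client is under the limit; otherwise drop at the edge."""
        now = time.monotonic()
        window = recent_requests[client_ip]
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= MAX_REQUESTS_PER_WINDOW:
            return False  # blocked at the perimeter; the origin never sees it
        window.append(now)
        return True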

These are just a few of the benefits of migrating to the edge. Real-time business interactions, fast and reliable connectivity and, most importantly, security are paramount to maintaining customer relationships and staying competitive far into the future.
