
Will the Emerging Edge Fix My Digital Gaming Experience?

(Image source: Pixabay)

There has long been a universal understanding among online video gamers that lag kills. A quick search for just that term nets more than 5.7 million results. More accurately, latency kills: your game experience, your character, and sometimes your ability to play at all. Gamers have adopted the term "ping," borrowed from the network utility of the same name, to describe rage-inducing latency. Academic papers have long sought to solve the problem, and players invest heavily in hardware that promises better network performance, which is why Netgear has seen a good return on its "pro gaming" line of wireless access points.
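The idea behind "ping" is simple enough to sketch. The snippet below is a rough Python illustration, not anything a game actually uses: it times a TCP handshake to a placeholder host as a proxy for round-trip latency (real game traffic is usually UDP, and the host name here is purely hypothetical).

    import socket
    import time

    def rtt_ms(host: str, port: int = 443, samples: int = 5) -> float:
        """Median TCP-connect round trip to a host, in milliseconds (a rough 'ping' proxy)."""
        times = []
        for _ in range(samples):
            start = time.perf_counter()
            with socket.create_connection((host, port), timeout=2):
                pass  # handshake completed; elapsed time approximates one round trip
            times.append((time.perf_counter() - start) * 1000)
        times.sort()
        return times[len(times) // 2]  # the median shrugs off one-off jitter spikes

    # "example.com" is a stand-in; point it at your game's region endpoint.
    print(f"ping ~ {rtt_ms('example.com'):.1f} ms")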

Yes, I do have one. Why do you ask?

Early on, lag was due primarily to the centralization of gaming "servers" and less-than-optimal player connectivity options. As we all know, distance matters to packet performance, and gaming companies had one and only one data center from which to operate these online games. Meanwhile, the speed and bandwidth available from your local Internet provider could be the deciding factor in whether you could play a real-time, multi-player game online at all.

The cloud brought some relief. No longer limited to a single data center, gaming companies like Epic Games launched into the cloud. With data centers available around the globe, the gaming behemoth now operates multiple game regions, which generally map to AWS data centers.

Which still leaves a whole lot of Fortnite's 250 million players trying to win a game despite "really bad ping."

One might immediately think a multi-cloud approach would provide some relief. A quick look at a global map pinpointing the locations of the major cloud providers shows much better coverage but not nearly enough distribution. Large data centers need power, people, and connectivity, which leads most cloud data centers to be built in or near very large urban centers.

That leaves huge swaths of the world suffering significant latency simply due to distance.
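A back-of-envelope calculation shows why. Light in optical fiber covers roughly 200 km per millisecond, so distance alone sets a hard floor on round-trip time before any server-side work happens; the figures below are approximations, not measurements.

    FIBER_KM_PER_MS = 200  # light in fiber: roughly 200 km per millisecond

    def min_rtt_ms(distance_km: float) -> float:
        """Lower bound on round-trip time imposed by propagation delay alone."""
        return 2 * distance_km / FIBER_KM_PER_MS

    for km in (100, 1_000, 5_000, 10_000):
        print(f"{km:>6} km to the nearest data center -> at least {min_rtt_ms(km):.0f} ms")
    # 10,000 km away means ~100 ms of round trip before queuing, routing, or game logic.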

Edge extends beyond CDN

We've watched over the past few years as Edge has begun to emerge as a more distributed, if smaller, data center strategy through which providers and enterprises can extend their presence to more of the world. That's the good news. The bad news is that most of the Edge offerings to date are extensions of existing content delivery network (CDN) offerings, whose infrastructure was designed and built to improve the delivery and security of content. Neither capability addresses the need to optimize the real-time exchange of data required to run the kind of sub-second interactions today's games demand.

Assuming that obstacle can be overcome, the issue of offering a truly multi-cloud platform remains problematic. Such a platform would need to embrace any cloud or edge data center seamlessly. That is, it would have to be able to take advantage of compute anywhere to stand up new instances of a "server" as close as possible to the players.
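To make that concrete, here is a hypothetical sketch of the placement decision such a platform would have to make, with made-up region names and RTT numbers: given measured latencies from each player in a lobby to each candidate cloud or edge region, put the "server" where the worst-off player hurts least.

    # Hypothetical RTTs (ms) from four players to three candidate regions.
    player_rtts = {
        "us-east":  [20, 35, 140, 180],
        "eu-west":  [95, 110, 30, 45],
        "ap-south": [210, 190, 160, 25],
    }

    def pick_region(rtts_by_region: dict[str, list[float]]) -> str:
        """Place the game 'server' in the region with the lowest worst-case player RTT."""
        return min(rtts_by_region, key=lambda region: max(rtts_by_region[region]))

    print(pick_region(player_rtts))  # -> eu-west for this particular lobby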

Obviously, such a multi-cloud, application-oriented platform would be ideal for more than just games. An exploding market for "smart" everything – from lights to garage doors to locks to my fish tank – requires immediate responses to the telemetry those devices report. The ability to collect, analyze, and respond in near-real-time (latency under 20ms) is increasingly important to many of the basic functions of our lives. Consider the growing responsibility we give the sensors in our cars and the movement toward enabling our cars to make decisions based on real-time data about every other car we encounter. Even 20ms in some scenarios seems like a lifetime.

Devices today have no more tolerance for slow responses than we do. Performance comes first, and we expect technology to provide it. But just as important is security. The more we delegate to applications the responsibility for managing our finances, our homes, and our businesses, the more important it becomes to secure them and the data they rely on.

Where Edge is going

The emerging Edge will converge around applications and the need for speed, portability, analytics, and security. Our answer to the challenge of distance has always been to close the physical gap between the application and its user. When apps were mostly made of content, using a distributed caching service made sense. But today's applications are not static.

The applications we need to perform and respond quickly are interactive and thus dynamic. They are the apps that dose supplements into my reef tank on a schedule. They manage the temperature in our homes and warn control systems about equipment conditions that could endanger operators. They entertain us if they perform well.

Many apps today combine cloud-hosted control applications with distributed data collection and processing applications. A growing percentage of them are containerized. All of them are built with components sourced from multiple providers, many of which must be loaded in real time.

The emerging Edge is designed to eliminate these constraints. By recognizing that this is not only a digital world but a multi-cloud one, this new Edge promises to enable true portability of applications across any environment, along with the services they need to operate safely, securely, and at speed.

And maybe, just maybe, the emerging Edge will finally eliminate the gamer's nemesis, lag, and fix my digital gaming experience.

Of course, if it did, then I'd have no excuse for losing.