How IoT Will Redefine Colocation

IoT services will drive a more data-oriented view of resource placement.

Michael Bushong

October 19, 2017

In enterprise IT, data center decisions frequently start with location. Where should companies deploy their servers and all the surrounding infrastructure required to support applications? For companies whose application users are concentrated in a few large sites, the answer is frequently on-premises. But for distributed enterprises, hosting and colocation companies provide an effective way to keep data center resources in key locations that serve broad swaths of employees, while offering diverse connectivity options designed to keep costs under control and ensure high availability.

However, the nature of colocation will change as cloud adoption continues to soar and services leveraging the internet of things (IoT) proliferate through both the enterprise and consumer markets.

Colocation today 

Hosting and colocation services today focus on two things: getting companies' data center resources close to other resources that are important to them, and helping manage those resources with on-site staff.

The most basic requirement for data centers is connectivity. Servers need an on-ramp to the internet so that distributed workforces can access their applications. Colocation companies provide physical space that is typically serviced by multiple service providers. Acting as a carrier hotel, a colocation provider offers enterprises multiple options for their connectivity needs. At a minimum, this allows for redundant links, which is good for high availability. Sometimes just as important, it allows enterprises to take advantage of service provider competition to keep their connectivity costs in check.

In this model, the whole point of colocation is to house equipment in a facility that is in close proximity to the resources companies need. The notion of resources is defined primarily by the physical world. 

IoT and a new category of services

But what if the thing you need to be close to is something different?

IoT operates under the premise that there will be lots of sensors, each of which provides some information to be used by a service. In some cases, that service might run locally, close to the sensors. You can imagine things that require real-time analysis, like monitoring mining equipment to intercept failures before they happen. In other cases, the service might run in the cloud. For instance, collecting usage information from home power meters might feed into a monthly billing process. 
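
A minimal sketch of that split might look like the following; the class and field names are hypothetical and simply illustrate the local-versus-cloud routing decision, not any particular IoT platform.

```python
# Hypothetical sketch of routing a sensor reading to local (edge) or cloud
# processing based on how quickly it must be acted upon. Class and field
# names are illustrative, not drawn from any specific IoT platform.
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str
    value: float
    requires_realtime: bool  # e.g., vibration data from mining equipment

def route(reading: SensorReading, local_queue: list, cloud_batch: list) -> None:
    if reading.requires_realtime:
        # Analyze on-site, close to the sensors, to catch failures early.
        local_queue.append(reading)
    else:
        # Defer to the cloud, e.g., meter readings rolled into monthly billing.
        cloud_batch.append(reading)
```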

As with any new technology, the initial offerings will be fairly basic: collect some information, and do something with it. But as more sensors are deployed, we will have access to a lot more data. Importantly, some of that information will be related but will come from different places. This means that services hoping to glean insight from that information will need to span multiple sources.

It’s this rise of composite services that will change how we look at colocation.

Colocating data 

Assume that most IoT deployments will stream data to a cloud. If a service needs to use specific sensor data, it makes sense to run that service on resources that are in close proximity to where the data is stored. So if, for example, data is being streamed to AWS, then it probably makes sense to run the service in AWS.

In this simple example, the decision on where to host the data center might be influenced (or even driven) by where the data resides. If the sensor provider and the application owner are one and the same, it’s not terribly difficult. But if the application owner is trying to build an over-the-top offering, deciding where to host the application could be critical.

The natural affinity that services have with data will drive a more data-oriented view of resource placement. And the more data a service uses, the stronger that affinity will be. For consumers of data, this means architectural attention will need to be paid to where data resides.
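
A rough back-of-the-envelope sketch makes the affinity concrete. The data volume and the per-gigabyte egress rate below are assumptions for illustration only, not any provider's actual pricing.

```python
# Hypothetical placement check: what does it cost to run a service far from
# the data it consumes? Figures are assumptions, not real vendor pricing.
def monthly_egress_cost(tb_per_month: float, usd_per_gb: float) -> float:
    """Cost of pulling that much data out of the cloud where it lands."""
    return tb_per_month * 1024 * usd_per_gb

# A service consuming 50 TB of sensor data per month at an assumed egress
# rate of $0.09/GB pays roughly $4,600/month just to move the data.
remote = monthly_egress_cost(50, 0.09)

# Running the service in the same cloud and region as the data avoids the
# transfer entirely, and the gap only widens as data volumes grow.
colocated = monthly_egress_cost(50, 0.0)

print(f"remote: ${remote:,.0f}/mo, colocated: ${colocated:,.0f}/mo")
```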

Enterprise considerations

The examples thus far have suggested that application developers might want to consider where they host their software to account for proximity to data. But what about enterprises looking to take advantage of specific, high-value IoT services? 

It seems inevitable that high-use, high-value IoT services will emerge. To the extent that these services become integral to success, they might influence how enterprises think about their own architecture. These services could drive a predisposition toward one cloud provider over another.

Or, if multiple high-value services are distributed across more than one cloud provider, it could mean that enterprises have to solve a data access problem, effectively architecting their data strategy with diverse applications in mind. Of course, this carries a set of connectivity and security implications that need to be accounted for as well.
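
One way to picture that data access problem is a thin abstraction over each provider's storage, so composite services can merge related data wherever it lives. The sketch below is hypothetical; the class and method names are illustrative stand-ins for whatever provider SDKs an enterprise actually uses.

```python
# Hypothetical sketch of a multi-cloud data access layer for composite IoT
# services. Class and method names are illustrative, not a real SDK.
from abc import ABC, abstractmethod

class SensorDataSource(ABC):
    @abstractmethod
    def read(self, sensor_id: str, start: str, end: str) -> list:
        """Return readings for one sensor over a time window."""

class CloudASource(SensorDataSource):
    def read(self, sensor_id, start, end):
        # In practice: query provider A's object store or time-series service.
        raise NotImplementedError

class CloudBSource(SensorDataSource):
    def read(self, sensor_id, start, end):
        # In practice: query provider B's equivalent service.
        raise NotImplementedError

def combined_view(sources, sensor_id, start, end):
    # A composite service merges related readings that live in different clouds.
    readings = []
    for source in sources:
        readings.extend(source.read(sensor_id, start, end))
    return readings
```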

The bottom line 

Much of our collective enterprise architecture has been focused on physical resources. But as data continues to exert a heavier influence on our success, we will all need to expand our architectural point of view to consider where that data resides. With IoT in particular, this could mean rethinking where we place our resources and workloads.

In many ways, this redefines the whole notion of colocation and hosting services, bringing an element of data orientation to the table. Of course, this is all still evolving, but enterprises that expect to make heavy use of IoT applications ought to at least begin asking a broader set of questions as they consider the impacts to their IT infrastructure.

About the Author

Michael Bushong

Mike Bushong is Vice President of Enterprise and Cloud Marketing at Juniper Networks. Mike spent 12 years at Juniper in a previous tour of duty, running product management, strategy, and marketing for Junos Software. In that role, he was responsible for driving Juniper's automation ambitions and incubating efforts across emerging technology spaces (notably SDN, NFV, virtualization, portable network OS, and DevOps). After the first Juniper stint, Mike joined data center switching startup Plexxi as the head of marketing. In that role, he was named a top social media personality for SDN. Most recently, Mike was responsible for Brocade's data center business as vice president of data center routing and switching, and then Brocade's software business as vice president of product management, software networking.
