David Hill
Commentary

May the Power Be With You

Companies such as Power Assure are helping data center operators uncover hidden capacity by accurately measuring energy consumption and finding ways to run systems more efficiently.

When we think of physical IT infrastructure, we usually focus on servers, storage and networks. But without electricity for running and cooling computing equipment, IT is dead in the water. Yet, over time, too little attention has been paid to problems, such as provisioning and efficiency, that plague what might be called the "shadow" physical IT infrastructure.

That is changing. Data center infrastructure management (DCIM) extends traditional system and network management thinking to the monitoring and management of all critical related data center resources in a comprehensive and integrated manner. DCIM is a relatively new term, but it provides a way of thinking and a rallying cry that is driving innovation. Large players, such as BMC, CA, HP and IBM, are among those that are taking leadership roles in DCIM, but a large number of companies, including startups, are attacking aspects of the problem.

To keep things simple and to illustrate DCIM from a data center energy perspective, our focus here is on a young company, Power Assure.

Unlocking Hidden Capacity

Here is a conundrum: Why do some data centers appear to be running out of power even though server racks or rooms are half-full? The answer is that operators believe they are running out of power and are therefore afraid to add equipment, when in fact a great deal of grid power remains available to them. Anticipated power requirements are typically collected from equipment nameplates and vendor-supplied specifications. Because that information tends to be conservative, the stated requirements overstate actual draw, often by a wide margin. The difference between what you are nominally using (the planned allocation) and what you are actually drawing is hidden capacity.
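
The arithmetic behind hidden capacity can be sketched in a few lines. The numbers below are illustrative assumptions, not figures from Power Assure:

```python
# Hypothetical illustration of "hidden capacity": nameplate ratings
# overstate real draw, so planning against them strands usable power.

def hidden_capacity_kw(nameplate_kw, measured_kw):
    """Power provisioned on paper but never actually drawn."""
    return nameplate_kw - measured_kw

# Example rack: 20 servers rated at 0.5 kW each on the nameplate,
# but measured peak draw is only 0.3 kW per server.
nameplate = 20 * 0.5   # 10.0 kW allocated in the capacity plan
measured = 20 * 0.3    # 6.0 kW actually consumed at peak
print(hidden_capacity_kw(nameplate, measured))  # 4.0 kW stranded
```

In this made-up example, 40% of the rack's allocated power is stranded by conservative planning.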

Power Assure uses its EM/4 software platform to address the issue of hidden capacity. The first step is monitoring true energy consumption. Power Assure uses a metric called PAR4, an IT energy-efficiency test validated by Underwriters Laboratories (UL). Equipment power consumption is measured at four levels: off, idle, loaded and peak. A PAR4 value is the number of transactions per second per watt at 100% load. The accumulated numbers enable accurate planning of the maximum power allocation, which in turn makes it possible to take advantage of hidden capacity.
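
A minimal sketch of the PAR4 idea as described above: power measured at four levels, with an efficiency score of transactions per second per watt at full load. The field names, units and numbers are illustrative assumptions, not Power Assure's actual data model:

```python
# Sketch of the PAR4 concept: four measured power levels plus an
# efficiency score (transactions/sec per watt at 100% load).
from dataclasses import dataclass

@dataclass
class PowerProfile:
    off_w: float              # draw when powered off (standby)
    idle_w: float             # draw at zero load
    loaded_w: float           # draw at 100% load
    peak_w: float             # short-term maximum draw
    tps_at_full_load: float   # transactions/sec at 100% load

    def par4_score(self) -> float:
        """Transactions per second per watt at 100% load."""
        return self.tps_at_full_load / self.loaded_w

server = PowerProfile(off_w=12, idle_w=150, loaded_w=300,
                      peak_w=340, tps_at_full_load=9000)
print(server.par4_score())  # 30.0 transactions/sec per watt
```

Planning maximum power allocation against the measured peak value, rather than the nameplate, is what exposes the hidden capacity.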

Note that this approach can also be used to compare efficiency across multiple generations of IT equipment--past, present and future (theoretical). For example, how much power could be saved by moving to a new generation of servers? Or, alternatively, how many additional servers could be added without exceeding the planned maximum power consumption?
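
The second question reduces to a simple budget calculation. All numbers below are invented for illustration; the point is the gap between nameplate-based and measurement-based planning:

```python
# Illustrative comparison: how many servers "fit" under a fixed power
# budget when planned against nameplate ratings vs. measured peak draw.

def servers_that_fit(budget_w, per_server_w):
    """Whole servers that fit within the power budget."""
    return int(budget_w // per_server_w)

budget = 100_000          # 100 kW available for this room
nameplate_w = 500         # vendor nameplate rating per server
measured_peak_w = 340     # measured peak draw per server

print(servers_that_fit(budget, nameplate_w))      # 200 servers
print(servers_that_fit(budget, measured_peak_w))  # 294 servers
```

Under these assumed figures, measurement-based planning frees room for 94 additional servers on the same power feed.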

Power Assure also employs other metrics, such as Power Usage Effectiveness (PUE), which expresses facility efficiency as the ratio of the total power consumed by the data center to the power consumed by the IT equipment alone. Power Assure states that, typically, 50% of the power in a data center is consumed by IT equipment (servers, storage and networking), 40% by the cooling system (a chilling fact!) and 10% by the inherent inefficiencies of the power distribution itself. So all components have to be examined to ensure a proper balance.
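
Using the typical 50/40/10 split quoted above, the PUE works out as follows (the 1,000 kW facility figure is an assumed example):

```python
# PUE = total facility power / IT equipment power. With the article's
# typical split (50% IT, 40% cooling, 10% distribution losses),
# a facility works out to a PUE of 2.0.

def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: lower is better; 1.0 is the ideal."""
    return total_facility_kw / it_equipment_kw

total = 1000.0        # kW drawn by the whole facility (assumed)
it = total * 0.50     # servers, storage and networking
print(pue(total, it))  # 2.0
```

In other words, at the typical split, every watt of useful IT work costs a second watt of overhead, which is why cooling and distribution have to be examined alongside the IT equipment.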


Note that one key to the effectiveness of this process is collecting data from a variety of sources; Power Assure's software intelligence is driven by data captured from the equipment itself. The old saying that you can't manage what you can't measure applies here: IT infrastructure management is shifting from reactive (seat-of-the-pants intuition) to proactive (data-driven decision making), at the same time that software, from server virtualization to storage management and beyond, is fundamentally altering how IT interacts with hardware resources. This is a very positive trend, and Power Assure is very much in line with it.

But monitoring is only part of the process. A second step is analysis. Prioritizing the list of possible improvements and taking the necessary corrective action (sometimes with the use of automation) seals the deal.
