May the Power Be With You
When we think of physical IT infrastructure, we usually focus on servers, storage and networks. But without electricity for running and cooling computing equipment, IT is dead in the water. Yet problems such as power provisioning and efficiency, which plague what might be called the "shadow" physical IT infrastructure, have historically received too little attention.
That is changing. Data center infrastructure management (DCIM) extends traditional system and network management thinking to the monitoring and management of all critical related data center resources in a comprehensive and integrated manner. DCIM is a relatively new term, but it provides a way of thinking and a rallying cry that is driving innovation. Large players, such as BMC, CA, HP and IBM, are among those that are taking leadership roles in DCIM, but a large number of companies, including startups, are attacking aspects of the problem.
To keep things simple and to illustrate DCIM from a data center energy perspective, our focus here is on a young company, Power Assure.
Unlocking Hidden Capacity
Here is a conundrum: Why do some data centers appear to be running out of power even though server racks or rooms are half-full? The reason is that anticipated power requirements are collected from equipment nameplates and vendor-supplied information. Because that information tends to be conservative, the stated requirements overstate actual consumption by a wide margin. Operators believe they are near capacity and are therefore afraid to add equipment, when in fact a lot of grid power is still available to them. The difference between what you nominally use and what you could use is hidden capacity.
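The arithmetic behind hidden capacity can be sketched in a few lines. This is an illustration only; the rack names and wattage figures are invented for the example, not data from Power Assure.

```python
# Hidden capacity: the gap between nameplate-based power allocation
# and measured draw. All values below are hypothetical.
racks = [
    {"name": "rack-01", "nameplate_w": 8000, "measured_w": 4200},
    {"name": "rack-02", "nameplate_w": 8000, "measured_w": 3900},
]

allocated = sum(r["nameplate_w"] for r in racks)   # what planning assumes
measured = sum(r["measured_w"] for r in racks)     # what is actually drawn
hidden_capacity_w = allocated - measured

print(f"Allocated: {allocated} W, measured: {measured} W, "
      f"hidden capacity: {hidden_capacity_w} W")
```

With the conservative nameplate figures, planning assumes 16,000 W while the racks actually draw 8,100 W, leaving 7,900 W of capacity that appears used but is not.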
Power Assure uses its EM/4 software platform to address the issue of hidden capacity. The first step is monitoring true energy consumption. Power Assure uses a metric called PAR4, an IT energy-efficiency measurement that has been tested by Underwriters Laboratories (UL). Equipment power consumption is measured at four levels: off, idle, loaded and peak. A PAR4 value is the number of transactions per second per watt at 100% load. The accumulated numbers enable accurate planning of the maximum power allocation, which makes it possible to take advantage of hidden capacity.
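The PAR4 figure described above can be expressed directly from its definition: transactions per second per watt at 100% load. The four-level power profile and the sample throughput number below are invented for illustration, not actual PAR4 test data.

```python
# Hypothetical power profile at the four measurement levels (watts).
power_profile_w = {"off": 12, "idle": 180, "loaded": 310, "peak": 345}

def par4_style_score(transactions_per_sec_at_full_load, watts_at_full_load):
    """Transactions per second per watt at 100% load, per the PAR4 definition."""
    return transactions_per_sec_at_full_load / watts_at_full_load

score = par4_style_score(15500, power_profile_w["loaded"])
print(f"Efficiency score: {score:.1f} transactions/sec/watt")
```

Computing the same score for an older and a newer server generation turns the "how much would we save by upgrading?" question into simple division.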
Note that this approach can also be used to compare efficiency across multiple generations of IT equipment--past, present and future (theoretical). For example, how much power could be saved by moving to a new generation of servers? Or, alternatively, how many additional servers could be added without exceeding the planned maximum power consumption?
Power Assure also employs other metrics, such as Power Usage Effectiveness (PUE), which calculates facility efficiency as the ratio of the total power consumed by the data center to the power consumed by the IT equipment. Power Assure states that, typically, 50% of the power in a data center is consumed by IT equipment (servers, storage and networking), 40% by the cooling system (a chilling fact!) and 10% by the inherent inefficiencies in the power distribution itself. So all components have to be examined to ensure a proper balance.
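The typical split cited above implies a concrete PUE figure, which is worth working through. If IT equipment draws 50% of total facility power, PUE is total divided by the IT share:

```python
# PUE = total facility power / IT equipment power.
# Shares of total facility power, per the typical split cited in the text.
it_share, cooling_share, distribution_share = 0.50, 0.40, 0.10

total = it_share + cooling_share + distribution_share  # 1.0 by construction
pue = total / it_share
print(f"PUE: {pue:.1f}")
```

A 50% IT share yields a PUE of 2.0: for every watt that reaches a server, another watt goes to cooling and distribution overhead. Lower PUE (closer to the ideal of 1.0) means less overhead per watt of useful IT work.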
Note that one key to the effectiveness of this process is the collection of data from a variety of sources. Power Assure's software is data-driven intelligence built on data captured from the equipment itself. The old saying that you can't manage what you can't measure applies: IT infrastructure management is shifting from reactive (seat-of-the-pants intuition) to proactive (data-driven decision making), at the same time that software, from server virtualization to storage management and beyond, is fundamentally altering how IT interacts with hardware resources. This is a very positive trend, and Power Assure is very much in line with it.
But monitoring is only part of the process. A second step is analysis. Prioritizing the list of possible improvements and taking the necessary corrective action (sometimes with the use of automation) seals the deal.