David Hill

Network Computing Blogger




IBM's STG Is On Track With Smarter Computing

But Watson is not the end of the story. Watson uses the basic von Neumann architectural concepts that have dominated the computing industry for decades. IBM is working in its labs on cognitive computing, which among other things will lead to chips that mimic the way the brain works. Stay tuned: the IT revolution will continue to transform our lives for a long time to come.

As a storage analyst, I would be remiss if I didn't touch upon the storage strategy discussed at the STG event. While not as headline-grabbing as Smarter Planet or Watson, hardware in the form of servers, storage, and networking continues to be a foundational technology for all of IBM's broader efforts. And because those efforts depend on large quantities of data, and the data explosion continues unabated, storage is front and center. The twin underpinnings of IBM's storage strategy — efficiency and optimization — are mandatory if promises to derive great value from the data explosion are to be met without going broke paying for storage along the way.

Efficiency and optimization are sometimes used interchangeably, but they differ. Efficiency, as the late management guru Peter Drucker defined it, is doing things right. Optimization is either a maximization play (getting the most that is possible from a resource) or a minimization play (expending the fewest resources). Thus, increasing storage utilization can be seen as an efficiency play. Moving data to the proper storage tier (i.e., the one that best meets the price/performance requirements of the data being managed) can be considered a maximization of available resources. Using tape instead of disk for certain applications (such as active archiving) can be a minimization play: idle tape cartridges consume no power, so energy costs fall about as low as they can go.
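The minimization play above can be made concrete: pick the cheapest tier that still meets a workload's performance requirement. The tier names, prices, and IOPS figures below are illustrative assumptions, not actual vendor numbers.

```python
# Tier selection as a minimization play: choose the lowest-cost tier
# that still satisfies a workload's performance requirement.
# All figures here are hypothetical, for illustration only.
TIERS = [
    {"name": "ssd",  "cost_per_gb": 0.50, "iops": 50000},
    {"name": "disk", "cost_per_gb": 0.05, "iops": 200},
    {"name": "tape", "cost_per_gb": 0.01, "iops": 0},  # near-zero energy when idle
]

def cheapest_tier(required_iops):
    """Return the lowest-cost tier that meets the IOPS requirement, or None."""
    candidates = [t for t in TIERS if t["iops"] >= required_iops]
    if not candidates:
        return None
    return min(candidates, key=lambda t: t["cost_per_gb"])["name"]

warm = cheapest_tier(100)  # a warm workload lands on disk
cold = cheapest_tier(0)    # an idle archive can go to tape
```

The same three-line policy generalizes to any number of tiers; only the cost and performance tables change.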

IBM illustrated its optimization and efficiency points through its Easy Tier and Active Cloud Engine solutions. Easy Tier can migrate data among up to three tiers of storage (typically SSD plus a variety of disk and/or tape systems). That provides cost efficiencies through better utilization on a cost basis: data requiring low performance is stored on more cost-effective media.
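A tiering engine of this kind can be sketched as a policy that promotes or demotes data based on how often it is accessed. This is not IBM's actual Easy Tier algorithm; the thresholds, tier names, and data shapes are assumptions for illustration.

```python
# A hedged sketch of tiered migration driven by access frequency.
# Thresholds and tier names are hypothetical, not Easy Tier's real logic.
def place(accesses_per_day):
    """Map a data extent to its ideal tier based on access frequency."""
    if accesses_per_day >= 100:
        return "ssd"   # hot data earns the fastest, costliest tier
    if accesses_per_day >= 1:
        return "disk"  # warm data stays on spinning disk
    return "tape"      # cold data minimizes cost on tape

def rebalance(extents):
    """Return (name, current_tier, target_tier) for misplaced extents."""
    return [
        (name, tier, place(hits))
        for name, (tier, hits) in extents.items()
        if place(hits) != tier
    ]

moves = rebalance({
    "db-index":   ("disk", 500),  # hot: should be promoted to SSD
    "old-backup": ("disk", 0),    # cold: should be demoted to tape
    "user-home":  ("disk", 10),   # warm: already in the right place
})
```

Real systems add hysteresis and migration-cost accounting so extents do not bounce between tiers, but the promote/demote decision reduces to a comparison like this one.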

The Active Cloud Engine provides efficiency and optimization for data in the cloud, where a single view of data from multiple geographically distributed sites is necessary. Easy Tier might be seen as an internal, vertical hierarchy within one system, and Active Cloud Engine as a horizontal approach that works across geographically dispersed sites as necessary. Buying storage is no longer a simple commodity play at one site with one tier of storage, but rather a (one hopes) rational process that examines complex requirements and determines the proper solution.
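The "single view" idea can be sketched as merging per-site catalogs into one namespace, so a file that is replicated at two sites appears once but maps to both copies. The site names and entries are hypothetical; a real system such as Active Cloud Engine also has to handle caching, consistency, and WAN latency, none of which this sketch attempts.

```python
# A minimal sketch of a single horizontal view over distributed sites:
# merge per-site catalogs {site: {path: size}} into one global namespace.
# Site names and file entries are hypothetical.
def federated_view(site_catalogs):
    """Return {path: [(site, size), ...]} merged across all sites."""
    view = {}
    for site, catalog in site_catalogs.items():
        for path, size in catalog.items():
            view.setdefault(path, []).append((site, size))
    return view

view = federated_view({
    "us-east": {"/data/a": 10, "/data/b": 20},
    "eu-west": {"/data/a": 10},  # /data/a is replicated at both sites
})
# /data/a appears once in the namespace but maps to two site copies
```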






