Upcoming Events

Where the Cloud Touches Down: Simplifying Data Center Infrastructure Management

Thursday, July 25, 2013
10:00 AM PT/1:00 PM ET

In most data centers, DCIM rests on a shaky foundation of manual record keeping and scattered documentation. OpManager replaces scattered data center documentation with a single repository for data, QR codes for asset tracking, accurate 3D mapping of asset locations, and a configuration management database (CMDB). In this webcast, sponsored by ManageEngine, you will see how a real-world data center mapping stored in RackTables is imported into OpManager, which then provides a 3D visualization of where assets actually are. You'll also see how the QR code generator links real assets to the monitoring world, and how the layered CMDB provides a single point of view for all your configuration data.

Register Now!

A Network Computing Webinar:
SDN First Steps

Thursday, August 8, 2013
11:00 AM PT / 2:00 PM ET

This webinar will help attendees understand the overall concept of SDN and its benefits, describe the different conceptual approaches to SDN, and examine the various technologies, both proprietary and open source, that are emerging. It will also help users decide whether SDN makes sense in their environment, and outline the first steps IT can take for testing SDN technologies.

Register Now!


Data Classification Tips And Technologies

As enterprise data grows by terabytes, classifying it by sensitivity has never been more important. According to a just-released InformationWeek report, 10 Steps to Effective Data Classification, an organizational classification program defines policy requirements; specific classifications (generally 'secret', 'private', 'confidential' and 'public') and their associated data types; processes and procedures; accountability metrics; and repercussions for not following the rules, writes author Erik Bataller, a senior consultant with information security consultancy Neohapsis.

In the first of a two-part series on 10 guiding principles and practical recommendations for a classification program that will help companies meet their regulatory requirements, Bataller says the starting point has to be getting buy-in, from the CEO down, to take classification seriously. “The best way to make that happen is to develop the classification program directly with key stakeholders,” he writes. “The business, not IT, owns organizational data, so establish a dialogue with the executives and staff responsible for relevant systems. They need to be the enforcers across their groups.”

The second step is to understand the drivers for classification. Regardless of whether it is subject to regulatory requirements, every company has sensitive data and needs to look at how it is mitigating risk and liability. Third, organizations should keep classification programs simple, he emphasizes. If the rules are overly difficult, they will likely be ignored; nor should they significantly impact productivity, he says.

The fourth step is to think through classification levels. The highest level is data that is critical to the core value of the organization; the second highest is data that must be kept confidential, followed by data that should not be distributed outside the company but would not cause lasting harm if disseminated. The lowest level is data that can be viewed publicly, Bataller says.
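The four levels described above can be sketched as an ordered type. This is a minimal illustration, not code from the report; the level names and the "stricter level wins" rule are assumptions layered on Bataller's description.

```python
from enum import IntEnum

class Classification(IntEnum):
    # Names are illustrative labels for the four levels described above.
    SECRET = 4        # critical to the organization's core value
    CONFIDENTIAL = 3  # must be kept confidential
    INTERNAL = 2      # not for distribution, but no lasting harm if leaked
    PUBLIC = 1        # safe for public view

def combined_level(a: Classification, b: Classification) -> Classification:
    """When data of mixed levels is combined, the stricter level governs."""
    return max(a, b)
```

Using an ordered enum makes the hierarchy explicit, so policy code can compare levels directly instead of matching on strings.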

“Classification efforts in particular are a prime opportunity to develop an availability model that can go hand in hand,” he says. Customer support systems, for example, often contain highly sensitive data and must be readily available on a 24/7 basis. “Documenting these criteria while classifying can increase the value and subsequent support of the program and provide a more comprehensive understanding of how systems must be designed and resourced.”

Choosing technology and controls is the fifth step, according to Bataller, even though there isn’t any specific data classification technology. When thinking about what to use, IT should match the value, timeliness and performance requirements of the data, as well as the applications using the data, to the performance and cost characteristics of the media, notes Kurt Marko, a regular contributor to InformationWeek and an IT industry veteran.

“Auto-tiering software, which takes a relatively simplistic approach to the problem by looking at easily measured parameters like last access time [and] frequency of access, at best does a crude job at this,” Marko says, suggesting that it is better to “consciously classify different types of data, using not only high-level parameters like file/content-type, but more granular metadata/document tags.” The better the classification, the more accurately IT can bind different data types to the best storage media.
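Marko's point, binding data types to media via explicit tags rather than access heuristics alone, can be illustrated with a short sketch. The tier names and tag keys here are hypothetical, not drawn from any product:

```python
# Hypothetical mapping from classification tags to storage tiers.
TIER_BY_TAG = {
    "secret": "encrypted-ssd",
    "confidential": "ssd",
    "internal": "nearline-disk",
    "public": "archive-cloud",
}

def pick_tier(metadata: dict) -> str:
    """Prefer a granular classification tag; fall back to the crude
    last-access heuristic that auto-tiering software relies on."""
    tag = metadata.get("classification")
    if tag in TIER_BY_TAG:
        return TIER_BY_TAG[tag]
    # Fallback: the simplistic approach Marko criticizes.
    days_idle = metadata.get("days_since_access", 0)
    return "archive-cloud" if days_idle > 180 else "nearline-disk"
```

The design point is simply that a deliberate tag carries intent the access log cannot: a rarely read file may still be secret, and a heuristic alone would happily park it in a public archive tier.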

In terms of hardware, tape is still used for archival data, but users are migrating to disk for data protection, says Deni Connor, founding analyst at Storage Strategies NOW/Systems Strategies NOW. The cloud is also becoming a useful archive for data, she says, and in some SMBs it is replacing tape.

As for governance, risk management and compliance (GRC), Connor says that users need an on-site disk or tape archive of data that can be accessed quickly and efficiently. “GRC is often driven by the courts, so the response time is critical – either disk or LTO-5 tape will suffice,” she says. “With LTO-5 you have LTFS capability, which makes it easier and quicker to recover data from tape.”


