Adam Ely

Network Computing Blogger


White List Or Black List?

I have spent my week deep in thought about how to secure connections from third-party business partners into my organization. Many of these partners work as an extension of the company, such as outsourced development and operations, and they have access to source code, business documents, and other sensitive data we would rather no one could get to. Data theft is a serious concern, as are other issues, such as a malware infection that hops from a partner's system onto our network.

When I ask my coworkers about this issue, some say to give full, open access, while others advise locking down resources as tightly as possible. This is a problem many security professionals wrestle with, and I'm not sure IT has the right solution for every situation.

Traditional theory tells us to use a whitelist: allow only specific sources and destinations, ports and protocols, and grant access only to those we believe to be safe. In dynamic environments, however, this leads to a steady stream of change requests for IT and reduced productivity for the requestors. Have you ever developed code only to find out when it moved to production that some firewall rule blocks access and IT can't make the change for a week? I have, and it sucks.
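
To make the whitelist model concrete, here is a minimal sketch in Python -- purely illustrative, not anything from my environment; every address, hostname and rule is a made-up example. It shows the defining trait of a whitelist: default deny, with access permitted only for explicitly listed source, destination, port and protocol combinations.

# A toy whitelist: default deny, with explicit allow rules. All addresses,
# hostnames and rules below are hypothetical examples, not a real policy.

ALLOW_RULES = [
    # (source prefix, destination host, port, protocol)
    ("203.0.113.", "git.internal.example.com", 22, "tcp"),     # partner devs -> source control
    ("203.0.113.", "build.internal.example.com", 443, "tcp"),  # partner devs -> build system
]

def is_allowed(src_ip, dest_host, port, proto):
    """Default deny: permit only connections matching an explicit allow rule."""
    return any(
        src_ip.startswith(prefix) and (dest_host, port, proto) == (host, p, pr)
        for prefix, host, p, pr in ALLOW_RULES
    )

print(is_allowed("203.0.113.7", "git.internal.example.com", 22, "tcp"))    # True
print(is_allowed("203.0.113.7", "wiki.internal.example.com", 443, "tcp"))  # False: cue the change ticket

Every new resource a partner needs means another entry in that list, and another ticket in IT's queue.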

Blacklisting is more effective. You identify what needs protecting and deny access to it. I'd bet there are fewer systems and data sets that need to be protected than need to be accessed. This approach may require moving some systems or blocking entire subnets, but whitelisting can lead to the same work.
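
Here is the same toy check turned inside out -- again purely illustrative, with made-up hostnames: a default-allow policy that denies only the short list of assets that actually need protecting.

# A toy blacklist: default allow, with explicit denies covering only
# the protected assets. Hostnames are hypothetical examples.

PROTECTED_HOSTS = {
    "git.internal.example.com",      # source code
    "finance.internal.example.com",  # sensitive business documents
}

def is_allowed(dest_host):
    """Default allow: block only destinations on the protected list."""
    return dest_host not in PROTECTED_HOSTS

print(is_allowed("wiki.internal.example.com"))  # True: everything else just works
print(is_allowed("git.internal.example.com"))   # False: protected asset

The difference that matters is the default: the whitelist fails closed and generates change tickets, while the blacklist fails open everywhere except the protected list.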

I am willing to admit this is not a one-size-fits-all approach. If you have a limited number of partners that need access to only a small, static set of resources, whitelisting is the way to go. But if you have integrated partners with ever-expanding responsibilities, evaluate blacklisting as a serious alternative. Once the access control method is in place, be it a whitelist or a blacklist, other protections can be layered on as needed to identify and respond to access violations or attacks.

