Network Computing is part of the Informa Tech Division of Informa PLC


Two Companies Cope With Storage And Transfer Of Large Files: Page 2 of 3

The Mill is a London-based visual effects company with developers in London, New York and Los Angeles, and clients like Nike, Turner Classic Movies and the BBC. "We actually use a private MPLS (Multiprotocol Label Switching) network to link up all of our offices," says computer system manager Jonathan Brazier.

Of course, the large file management quandary does not stop at navigating through network bandwidth constraints. The next step for companies is to craft a data center design strategy for storing the data in a way that permits rapid access and cost-effective resource utilization.

Ilion is a Spanish video production house that creates feature film animation for a worldwide market. "At the peak of production, we were using over 350 people," says Gonzalo Rueda, Ilion's CTO and director of technology. Ilion started as a 20- to 40-person operation, so the extreme ramp-up of personnel, coupled with the large animation files being produced daily, placed enormous pressure on its video rendering system.

"We hired consultants and we looked at several different options to address our large file storage needs," says Rueda. "We were interested in more than just how many IOPS (input/output operations per second) the storage system was capable of handling, or the sheer speed of throughput. We wanted a solution that was compatible with the way that we did business. For me, this meant that we had to rethink our storage infrastructure so we could move it forward to better align it with the needs of the business."

Ilion ultimately decided to upgrade storage with a two-head BlueArc Titan 2200 storage cluster. "We had 90 terabytes of data that we were working with on our active network stations and another 110 terabytes of data that we needed to access less frequently, and that we had to store in archives," says Rueda. "We also had to redefine data best practices for redundancy and hot spares."