One example of repurposing idle capacity is Hadoop. Hadoop is designed to process large data sets, or big data, in parallel for increased speed and reduced cost. To accomplish this, Hadoop spreads the processing and data across multiple nodes, from a few to upward of 6,000. Data files are fed into the Hadoop cluster, and processing functions are then run on those data sets.
By permanently dedicating only a small number of resources--primarily the NameNode and JobTracker, plus a few slave nodes--the cluster could absorb idle resources as they become available. This would allow scalable processing of data such as customer records or Web traffic.
This particular use case is most relevant in environments with longer, more predictable peaks--weekly or monthly rather than hourly--because data must be redistributed across the cluster whenever nodes are added or removed. Additionally, some fine-tuning would be required to optimize Hadoop for this architecture, but with the right data sets and resources it would allow processing of data that might otherwise be neglected due to other priorities.
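As a rough sketch of how that elastic membership works in Hadoop, HDFS controls which DataNodes may join the cluster through include/exclude host files referenced from hdfs-site.xml; the file paths below are illustrative, not required names:

```xml
<!-- hdfs-site.xml (fragment): point HDFS at host lists the admin maintains -->
<property>
  <name>dfs.hosts</name>
  <!-- hosts allowed to register as DataNodes; add idle nodes here -->
  <value>/etc/hadoop/conf/dfs.include</value>
</property>
<property>
  <name>dfs.hosts.exclude</name>
  <!-- hosts to decommission; move a node here before reclaiming it -->
  <value>/etc/hadoop/conf/dfs.exclude</value>
</property>
```

After editing the host files, an administrator would typically run `hadoop dfsadmin -refreshNodes` so the NameNode picks up the change, then `hadoop balancer` to spread blocks onto newly added nodes; that rebalancing traffic is exactly the redistribution cost described above, which is why frequent add/remove cycles are a poor fit.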
Hadoop is just one hypothetical example of services or applications that can be run using idle CPU cycles on an enterprise private cloud; many more exist. Remember, a private cloud should provide your IT staff with more time to work on strategic initiatives rather than tactical tasks, and allow them to use that time to optimize efficiencies and support IT's customer, the business.

Joe Onisick is the founder of Define the Cloud and a principal engineer for Cisco's INSBU. Onisick has 17 years of IT experience spanning a broad range of disciplines, starting with server and network administration. From 2000-2005, Onisick was a US Marine, where he served in ...