My first question for those who believe the cloud costs more is: over what time period? If the cloud allows you to avoid making a capital purchase, then it will almost always enjoy a demonstrable cost advantage in the short run. But what about longer periods? That is an argument that demands a case-by-case comparison and cannot be resolved in the general sense.
The roadblock is determining precisely how much a given IT operation costs over a three- or five-year period, versus how much it would cost in Amazon Web Services EC2 or another cloud service. While apples-to-apples comparisons are hard to achieve, what's crystal clear is what Amazon charges. That lets responsible IT admins weigh their best on-premises estimates against known cloud costs.
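To make that juxtaposition concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is an assumed placeholder rather than a real quote or a published price; the point is the shape of the comparison, not the verdict.

```python
# Back-of-the-envelope comparison of an on-premises estimate versus a cloud bill.
# All numbers below are hypothetical placeholders; substitute your own hardware
# quotes and the provider's current price list.

YEARS = 3
HOURS_PER_YEAR = 8760

# Hypothetical on-premises estimate: capex plus annual operating costs.
server_capex = 6000   # assumed purchase price
annual_opex = 1500    # assumed power, cooling, space, support, admin time per year
on_prem_total = server_capex + annual_opex * YEARS

# Hypothetical cloud estimate: a published hourly rate for a comparable instance.
hourly_rate = 0.20    # assumed $/hour
cloud_total = hourly_rate * HOURS_PER_YEAR * YEARS

print(f"On-premises estimate over {YEARS} years: ${on_prem_total:,.0f}")
print(f"Cloud estimate over {YEARS} years:       ${cloud_total:,.0f}")
```

The asymmetry the article describes shows up in the inputs: the cloud side is a known rate card, while the on-premises side is whatever the admin can honestly estimate.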
The main argument for the cloud costing more is based on Moore's Law, which says the cost of a compute cycle is halved roughly every 18 months as chip output doubles. So why doesn't cloud pricing follow a similar downward trend? It's because the cloud is a complete system, not just a standalone core or other component to which Moore's Law might apply. Furthermore, cloud computing provides services -- configuration, deployment, monitoring, chargeback and shutdown -- that an IT staff otherwise provides on premises, and it's hard to assign costs to those on-premises services.
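For reference, here is a minimal sketch of the curve that argument assumes: a cost that halves every 18 months falls to a quarter in three years and roughly a tenth in five. The function and starting value are illustrative, not figures from the article.

```python
# Sketch of the Moore's Law cost curve the argument leans on: if the cost of a
# compute cycle halves every 18 months, what does the same compute cost later?

HALVING_PERIOD_MONTHS = 18

def relative_compute_cost(months_elapsed, starting_cost=1.0):
    """Cost of the same amount of compute after a given number of months,
    assuming a clean halving every 18 months (an idealized assumption)."""
    return starting_cost * 0.5 ** (months_elapsed / HALVING_PERIOD_MONTHS)

for months in (0, 18, 36, 60):
    print(f"after {months:2d} months: {relative_compute_cost(months):.2f}x the original cost")
```

Cloud prices do fall, but not on this idealized curve, because hardware is only one line item in the service being sold.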
So I guess this debate will go on. But the cloud is automating processes that remain the charge of humans in enterprise IT. That alone ought to be a clue as to where both the short- and long-term cost advantages reside.
Image Credit: Flickr user 401(K)2012