Placement of data is not obvious. Tier 0 offers the highest performance, but at the highest per-byte cost, although it holds a cost advantage over FC in terms of dollars per I/O. Tier 2 costs the least but delivers the slowest performance. Tier 1 sits in the middle: lower performance and cost than Tier 0, but higher cost and better performance than Tier 2.
Getting the right answer to the question of data placement on storage tiers is very important. Applications that need the best performance, say, for decision-making, revenue production, or customer satisfaction reasons, can benefit from the optimal placement of selected data on higher-performance drives. Less business-critical application data can still meet quality of service (QoS) requirements by residing on more cost-effective drives. Striking the proper balance saves storage expense and reduces energy requirements. The net goal, then, is to raise performance service levels while simultaneously lowering costs. These are the kinds of tradeoffs data center managers would like to have every day.
In and of themselves, applications know nothing about tiering. They create, read, update, and delete data, but knowing where they write to and read from is beyond their purview. Manual placement of data is possible, such as for a database, but that process tends to be static or, at best, reactive. What's preferable is a dynamic approach to placing data, wherein different pieces of data in an application can move to different tiers when and as appropriate. That is what EMC's FAST accomplishes. FAST automates the movement of data to the appropriate tier, and the application is none the wiser. In a FAST environment, the location of the data at any particular time is transparent to the application, and also to the user of the application.
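To make the idea of activity-driven tiering concrete, here is a minimal sketch in Python. It is not EMC's actual FAST algorithm (whose policies and thresholds are proprietary); it simply illustrates the general pattern of promoting "hot" data extents to faster tiers and demoting "cold" ones based on recent I/O counts, with the tier names and thresholds chosen purely for illustration.

```python
# Illustrative sketch of automated tiering -- NOT EMC's FAST algorithm.
# A simple policy maps each extent's recent I/O activity to a target
# tier; the rebalancer then lists the moves, invisible to applications.

TIERS = ["tier0_flash", "tier1_fc", "tier2_sata"]  # fastest to slowest

def choose_tier(io_count, hot_threshold=1000, warm_threshold=100):
    """Map an extent's recent I/O count to a target tier (thresholds are hypothetical)."""
    if io_count >= hot_threshold:
        return "tier0_flash"
    if io_count >= warm_threshold:
        return "tier1_fc"
    return "tier2_sata"

def rebalance(extents):
    """Return (extent, from_tier, to_tier) moves for extents placed on the wrong tier."""
    moves = []
    for name, (current_tier, io_count) in extents.items():
        target = choose_tier(io_count)
        if target != current_tier:
            moves.append((name, current_tier, target))
    return moves

# Example: one hot extent to promote, one cold extent to demote.
extents = {
    "db_index": ("tier1_fc", 5000),    # hot: promote to flash
    "db_table": ("tier1_fc", 300),     # warm: already correctly placed
    "old_logs": ("tier0_flash", 4),    # cold: demote to SATA
}
print(rebalance(extents))
```

A real implementation would, of course, track I/O statistics continuously, weigh per-tier capacity limits, and move data in the background so that reads and writes proceed uninterrupted while an extent is in flight.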
Note that static placement of data is difficult enough in a non-virtualized environment, but it is totally untenable in a virtualized one. Not only do the applications on virtual machines move hither and yon as deemed necessary, but the physical array on which the data resides may shift without advance warning, for instance, in a private cloud arrangement, from an internal array to a remote third-party external array. A dynamic process requires automation in order to keep things consistent, accurate, and complete, as well as timely.