NexGen: The Next David to Battle Midrange SAN Goliaths
Small vendors, especially startups, often tend to follow baseball great Willie Keeler's advice of "hit 'em where they ain't" by targeting a market niche that has potential, but where larger vendors either don't play or have a very small footprint. NexGen, with its NexGen n5 Storage System, does just that but with a more direct twist. And that makes it very interesting.
November 29, 2011
First, NexGen is attacking the midrange SAN market. Now, since this market is a pool filled with piranhas (smaller players) as well as sharks (larger vendors), why would NexGen willingly swim there? For one thing, the midrange SAN market is easier for a smaller vendor to attack than the enterprise market: enterprise sales cycles are longer and tie up a greater proportion of a startup's resources than the same effort spread across multiple midrange prospects. Second, the company believes there is a huge unmet need for midrange SAN solutions that address an important CIO concern: the server virtualization that CIOs would like to extend to business- and mission-critical applications has slowed greatly because existing midrange SAN offerings either lack the necessary capabilities or carry too high a cost. Thus, while NexGen is going directly at the midrange SAN market, it is differentiating itself with a product that addresses what it feels is a large unmet need.
There is at least one transparent solid gel in which coins, or other similarly small but heavy items, can be suspended. When the container holding the gel is shaken with sufficient force (preferably with a tightly closed cover!), the solid gel turns into a liquid and the coins fly around to different positions within the container. Then, when the shaking stops, the gel returns to a solid state, but the coins are in a new configuration. Consider this example as analogous to a transformational trend in the IT industry, with the coins as vendors. The vendors' positions shift during the transformation; when the trend matures, the result is changed market positions for the vendors. Inevitably, every major vendor hopes that the change is favorable to it, while every startup hopes that the transformational process will beneficially alter its position.
NexGen (and it is, of course, by no means alone in this belief) feels that server virtualization is a major transformational force and believes it can benefit from that transformation. The company's basic premise is that storage is currently an impediment to the full implementation of server virtualization. For test and development in virtual machine environments, storage really hasn't been an impediment, since neither high availability nor shared storage is required in those situations. Similarly, non-critical production workloads, such as Web servers and file servers, are not really a problem, as shared storage with high availability is readily available for them.
However, as enterprises move to virtualize more important applications, typically called business-critical (such as email) and mission-critical (such as revenue-generating online transaction processing applications), the situation changes. Why is that the case? Well, the reason for cramming multiple virtualized applications, each with its own virtual machine operating system image, together in a single system is to utilize a higher percentage of the CPU and I/O resources of that physical server. This reduces or even eliminates the need for other physical servers (this is what consolidation is all about) and enables those other physical servers to be redeployed, retired or sold. The net economic result is better financials for an IT organization.
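As a rough illustration of the consolidation economics (all numbers below are assumptions for the sake of arithmetic, not figures from NexGen or this article), packing lightly loaded servers onto fewer, busier hosts frees the rest:

```python
import math

# Toy consolidation calculation; every number here is an assumption.
physical_servers = 12
avg_utilization = 0.10      # typical CPU utilization before virtualization
target_utilization = 0.60   # utilization goal after consolidation

# The total work stays the same; it is simply packed onto fewer, busier hosts.
hosts_needed = math.ceil(physical_servers * avg_utilization / target_utilization)
freed = physical_servers - hosts_needed
print(f"{physical_servers} servers consolidate onto {hosts_needed} hosts; "
      f"{freed} can be redeployed, retired or sold")
```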
But these financial benefits come with a non-monetary price. When one critical application runs on one physical server with one dedicated portion of shared physical storage on a SAN, the mix of CPU, I/O and storage resources can be tuned (probably manually) to provide the quality of service (QoS) for performance and capacity that the application requires, both during regular processing and at times of peak load. When multiple critical applications are commingled on the same physical server through virtualization, the situation changes: the demand for resources becomes essentially stochastic, meaning that it may not be possible to predict which applications will need I/O resources at what time. Storage, even with storage virtualization capabilities (think thin provisioning, which is really a capacity-utilization tool), was not designed to handle performance demands that are, for all practical purposes, unpredictable.
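To make the stochastic-demand point concrete, here is a minimal simulation (all parameters are illustrative assumptions, not measured workloads): each virtualized application usually asks for a modest number of IOPS but occasionally bursts, and an array tuned for typical aggregate load is swamped whenever a few bursts happen to coincide.

```python
import random

random.seed(42)

# Illustrative parameters only, not measurements.
APPS = 6                 # virtualized applications sharing one host
AVG_IOPS = 2_000         # typical per-app demand per interval
PEAK_IOPS = 12_000       # occasional per-app burst
PEAK_PROB = 0.05         # chance that any given app bursts in an interval
ARRAY_LIMIT = 30_000     # what the shared array was tuned to deliver

def app_demand() -> int:
    """Per-interval demand for one app: usually average, sometimes a burst."""
    return PEAK_IOPS if random.random() < PEAK_PROB else AVG_IOPS

intervals = 10_000
overloads = sum(
    1 for _ in range(intervals)
    if sum(app_demand() for _ in range(APPS)) > ARRAY_LIMIT
)
print(f"Array limit exceeded in {100 * overloads / intervals:.1f}% of intervals")
```

No single application misbehaves here; the trouble lies purely in the timing of the peaks, which is exactly what makes manual tuning ineffective.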
As a result, the performance predictability that is essential for business-critical applications cannot be guaranteed, nor can the high QoS levels, including priority allocation of resources, that mission-critical applications need to do their jobs properly. This has led to slower adoption of server virtualization for these types of applications, and lower operational benefits from it, than CIOs would like.
NexGen describes the inability to provide performance predictability and guaranteed QoS levels as "the storage industry gap" and feels that the capabilities needed to close that gap are either unavailable in existing products or too expensive. Other vendors are attempting to address these issues with, for example, scale-out architectures that provide management simplicity at scale; solid state devices that consolidate performance-hungry workloads; and high-end midrange arrays that borrow capabilities from enterprise offerings moving down market.
Still, NexGen believes that none of these approaches can really address the unpredictability of performance or the related issues of cost and complexity. Relative to storage in virtualized systems, managing capacity is conceptually easy, and tools for it already exist. Managing performance is much more difficult: there is no easy way to provision or allocate it, and everything impacts everything else. NexGen feels that it has found a way out of this problem and that, at the same time, the result is an integrated storage solution for midrange SANs.
NexGen has recently announced its NexGen n5 Storage System, which functions within what it calls an IoControl Operating Environment that enables deterministic performance across multiple tiers of storage. The n5 provides a more or less traditional active-active HDD storage array, as we will see shortly. But it also employs ever-more-popular solid state devices on a PCIe bus, for two key reasons.
The first is to fully exploit the speed of SSDs, which cannot be done if the SSDs are merely acting as HDD clones and are, therefore, held back by the limitations of storage controllers designed for HDDs. The second is to separate performance (which is delivered by SSDs) from capacity (which is delivered by HDDs). Why does this approach matter, and how does it differ from other solutions? Because it provides the means for administrators to easily and effectively manage storage performance and data placement in virtualized environments:
Performance QoS--IT administrators can provision performance as if it were capacity. That is, they can see how much of the system's performance capability has already been provisioned and how much remains, and allocate the rest accordingly (a sketch of this idea follows the list).
Dynamic Data Placement--This capability enables assigned QoS levels to be maintained over time, despite changing circumstances.
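To make the "provision performance like capacity" idea concrete, here is a minimal sketch built around a hypothetical ledger (this is not NexGen's actual API or implementation): performance is treated as a finite pool whose allocations are visible, so overcommitment can be refused up front rather than discovered at peak load.

```python
class PerformancePool:
    """Hypothetical ledger that tracks IOPS the way capacity is tracked."""

    def __init__(self, total_iops: int) -> None:
        self.total_iops = total_iops
        self.allocations: dict[str, int] = {}

    @property
    def provisioned(self) -> int:
        return sum(self.allocations.values())

    @property
    def remaining(self) -> int:
        return self.total_iops - self.provisioned

    def provision(self, volume: str, iops: int) -> None:
        """Reserve a guaranteed IOPS share for a volume, refusing overcommit."""
        if iops > self.remaining:
            raise ValueError(
                f"only {self.remaining} IOPS left; cannot guarantee {iops}")
        self.allocations[volume] = iops


pool = PerformancePool(total_iops=100_000)
pool.provision("email-db", 40_000)   # business-critical share
pool.provision("oltp-db", 50_000)    # mission-critical share
print(f"Provisioned: {pool.provisioned:,} IOPS; remaining: {pool.remaining:,}")
```

The point of the analogy is visibility: just as an administrator can see free gigabytes before creating a volume, here the free IOPS are equally explicit.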
Now back to capacity. NexGen claims to be able to provide up to 90% lower storage operating expenses (with the usual caveat that "your mileage may vary") through an approach called Phased Data Reduction. Data reduction is an apt term, as it describes exactly what is happening. This is in contrast to the currently fashionable term "data deduplication," which, while useful, tends to be associated with backup on secondary storage and is not comprehensive enough to cover everything required to squeeze down the size of primary storage. Not surprisingly, NexGen's approach offers significant cost benefits: the company says that applying its Phased Data Reduction across all storage tiers can deliver up to 58% lower cost per gigabyte compared with a leading competitive alternative.
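To put the 58% figure in perspective, a back-of-the-envelope calculation (with assumed numbers, not NexGen's published pricing) shows that a data-reduction ratio of roughly 2.4:1 is enough to produce that level of savings per logical gigabyte:

```python
# Illustrative arithmetic only; the price and reduction ratio below are
# assumptions, not NexGen's published figures.
raw_cost_per_gb = 2.00    # assumed raw (physical) $/GB for the array
reduction_ratio = 2.4     # assumed logical-to-physical data-reduction ratio

# Each physical GB now holds reduction_ratio logical GB, so the effective
# price per logical GB drops proportionally.
effective_cost = raw_cost_per_gb / reduction_ratio
savings = 1 - effective_cost / raw_cost_per_gb
print(f"Effective cost: ${effective_cost:.2f}/logical GB ({savings:.0%} lower)")
```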
Overall, NexGen makes a good case for why it is the new David to shake up the Goliaths of the midrange SAN market. The company promises that it can deliver the predictable performance that business- and mission-critical applications require from storage, allowing companies to take full advantage of server virtualization, and that it does so with increased shared storage efficiency that makes the solution competitively attractive.
Now, the original David beat Goliath, but in the real world the outcome may not be the same. The Goliaths have existing market share, well-established distribution channels, and deep and capable engineering and financial resources that startups can only dream about.
Still, NexGen provides a comprehensive integrated midrange SAN solution for virtualized environments today that will likely gain advanced software features, such as remote replication, further down the road. Given the continuing uptake of server virtualization and its growing importance in business- and mission-critical applications, enterprise customers and competing vendors need to pay close attention to NexGen.
As of this publication, NexGen is not a client of David Hill and the Mesabi Group.