Commentary
George Crump
10/4/2013 03:52 PM

Why Flash Storage Benchmark Testing Is Not Hype

Someday, your company might need the "oomph" of flash storage.

I've been briefed on or directly involved with a few lab tests in the past few months that demonstrated the performance of flash storage systems. Although the performance achieved is generally well beyond what most organizations need today, there is value in doing these tests and getting their results. Interestingly, not everyone agrees.

The most common complaint is that no one needs this kind of performance. That is not exactly accurate. Most might not need it, but some do -- right now. High-frequency trading (HFT) and high-performance computing (HPC) are two excellent examples.

Also, as virtual server and virtual desktop environments become more dense, with more virtual machines per host resulting in fewer but more storage-I/O-demanding hosts, we will see increasing performance demands. Finally, it is important to note that most of the million-plus IOPS results come from sequential read tests, not random read/write tests. On random workloads, performance often drops to the 500,000 IOPS range, a level that some heavily virtualized environments are already starting to require.
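To make the sequential-versus-random distinction concrete, here is a minimal, single-threaded Python sketch (not one of the vendor benchmarks discussed here). The file path, block size, and I/O count are illustrative assumptions; a production benchmark tool would bypass the page cache, use deep queues, and run many workers in parallel.

```python
import os
import random
import time

# Hypothetical test parameters. A real benchmark tool would use O_DIRECT,
# deep queue depths, and many parallel workers to avoid cache effects.
TEST_FILE = "/tmp/testfile.bin"   # assumed pre-created, several GB in size
BLOCK_SIZE = 4096                 # 4 KB I/O, typical for random-I/O tests
NUM_IOS = 100_000

def measure_iops(sequential: bool) -> float:
    """Issue NUM_IOS reads and return the achieved IOPS."""
    fd = os.open(TEST_FILE, os.O_RDONLY)
    max_block = os.fstat(fd).st_size // BLOCK_SIZE
    start = time.perf_counter()
    for i in range(NUM_IOS):
        if sequential:
            offset = (i % max_block) * BLOCK_SIZE
        else:
            offset = random.randrange(max_block) * BLOCK_SIZE
        os.pread(fd, BLOCK_SIZE, offset)
    elapsed = time.perf_counter() - start
    os.close(fd)
    return NUM_IOS / elapsed

if __name__ == "__main__":
    # Results here are dominated by the OS page cache; the point is only
    # to show that "IOPS" means little until the access pattern is stated.
    print(f"sequential read: {measure_iops(True):,.0f} IOPS")
    print(f"random read:     {measure_iops(False):,.0f} IOPS")
```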

[ Ignore vendor pressures and satisfy your own storage needs. Read How To Pick The Right SMB Storage System. ]

The adoption of virtualized servers and desktops, as well as clustered servers, is a significant change in the way we measure, or should measure, IOPS. No longer are we looking to meet the demand of a single application with a dedicated storage device. We are now looking to meet the demand of dozens of hosts, all driving traffic to a single storage device or a clustered set of them. The combined IOPS of the data center is now a critical factor.
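As a rough illustration of why combined IOPS matters, consider the back-of-the-envelope calculation below. Every figure in it is an assumption chosen for illustration, not a measurement.

```python
# Back-of-the-envelope aggregate demand for a consolidated environment.
# All figures below are illustrative assumptions, not measured values.
hosts = 40                 # virtualization hosts sharing one storage system
vms_per_host = 50          # density keeps climbing as hosts get larger
steady_iops_per_vm = 100   # modest per-VM steady-state demand
boot_iops_per_vm = 400     # peak demand during a boot or login storm

steady_total = hosts * vms_per_host * steady_iops_per_vm
peak_total = hosts * vms_per_host * boot_iops_per_vm

print(f"steady-state demand: {steady_total:,} IOPS")   # 200,000 IOPS
print(f"boot-storm peak:     {peak_total:,} IOPS")     # 800,000 IOPS
```

Even modest per-VM assumptions push the shared storage system well into the hundreds of thousands of IOPS once the whole data center drives it at once.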

We also now have storage systems capable of supporting a mixture of each of these workloads: virtualized servers and desktops as well as clustered applications and scale-up applications. In the past we had to allocate separate storage to each, so large, combined IOPS numbers were not needed. Now we can support mixed workloads on a single system, which simplifies storage management but requires storage performance.
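A hypothetical mixed-workload profile shows why a consolidated system has to be sized for IOPS and throughput at the same time. The workload names and numbers below are assumptions for illustration only.

```python
# Hypothetical mix of workloads consolidated onto one storage system.
# Each entry: (name, IOPS demand, average block size in KB).
workloads = [
    ("virtual desktops",      60_000,   4),   # small random I/O
    ("virtual servers",       40_000,   8),
    ("OLTP database cluster", 25_000,  16),
    ("analytics / scale-up",   5_000, 256),   # large, mostly sequential I/O
]

total_iops = sum(iops for _, iops, _ in workloads)
total_mb_s = sum(iops * kb / 1024 for _, iops, kb in workloads)

# Roughly 130,000 IOPS and 2,200 MB/s combined in this example.
print(f"combined demand: {total_iops:,} IOPS, about {total_mb_s:,.0f} MB/s")
```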

There is also value in what these tests teach us about storage system and infrastructure design. For example, in a recent test we learned the value of having multiple PCIe hubs to move data to the storage infrastructure. We also learned the advantage of using Gen 5 Fibre Channel (16 Gbps FC) instead of 8 Gbps FC. These lessons apply in any performance-constrained situation and justify investing in advanced servers and networks now instead of later.
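As a rough sketch of the 8 Gbps versus 16 Gbps point, the arithmetic below uses the commonly quoted usable throughput per Fibre Channel port (per direction) and an assumed workload; the IOPS target and block size are illustrative assumptions.

```python
import math

# Commonly quoted usable throughput per FC port, per direction, in MB/s.
usable_mb_s = {"8GFC": 800, "16GFC (Gen 5)": 1600}

target_iops = 500_000      # the random-I/O range mentioned above (assumed)
block_kb = 8               # assumed average block size
required_mb_s = target_iops * block_kb / 1024   # about 3,900 MB/s

for gen, mb_s in usable_mb_s.items():
    links = math.ceil(required_mb_s / mb_s)
    print(f"{gen}: need at least {links} ports for {required_mb_s:,.0f} MB/s")
```

Fewer, faster ports also mean fewer HBAs and PCIe slots to feed, which ties back to the point about spreading the load across multiple PCIe hubs.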

Finally, there is the reality that eventually most data centers will need this performance. A few years ago, tests that delivered 100,000 IOPS were ridiculed as unrealistic; now 100,000 IOPS is a common requirement for data centers.

You probably don't need 1 million IOPS today, but you very well might soon. The good news is that the work has already been done; you can apply those lessons to today's infrastructure and know the technology will be ready when you need it.

Comments
storman2 (User Rank: Apprentice), 10/7/2013 7:46:30 PM
re: Why Flash Storage Benchmark Testing Is Not Hype
The key is truly understanding the performance requirements of the workloads themselves and how they interact with the storage infrastructure. As George points out, relying on IOPS is a poor indicator of performance, particularly for file-based environments, which usually have very heavy metadata performance constraints. IOPS numbers don't take metadata performance into consideration. We at SwiftTest (www.swifttest.com) are directly addressing this with workload modeling and performance validation solutions that are a direct reflection of your actual workloads. With this insight into your production applications, you can determine which storage systems and which configurations are truly best for your environment before putting them into production.