George Crump

How To Maximize Your SSD Investment

Some servers benefit from solid state drives more than others. An SSD deployment strategy will help you get the biggest bang for your buck.

At least 60% of the conversations I have with end users are about solid state drives (SSDs). The focus of the conversation is no longer "should I use SSD?" but "how can I best use SSD?"

An increasing number of IT managers deal with IT performance problems, and they are willing to pay a premium to make those problems go away. Like anyone who pays a premium price for a service, they want to make sure they maximize the investment. Designing the perfect SSD architecture is an elusive goal, but having an SSD strategy is an excellent idea.

Ironically, the problem is that you can often just add an SSD to a server or storage system and see some performance improvement--no matter how the applications using it are configured. A simple example: in our recent lab report "Does The SMB Need SSD Performance?" we demonstrated that intelligently integrating SSD into a storage system dramatically improved performance even over a poorly configured 1 Gb iSCSI network running a very simple SQL database test.

In most cases, this has nothing to do with the environment taking advantage of the SSD's I/O per second; instead it's a benefit of SSD's very low latency. In other words, you gain performance because there are no drive heads that must seek and no platters that must rotate into place. While this simple performance fix is ideal for the small and midsize business (SMB) market, an enterprise should get every ounce of SSD performance it is paying for. To do that you need an SSD deployment strategy.
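The latency point is easy to quantify. A minimal sketch of the arithmetic (the SSD figure is a typical ballpark, not a measured value):

```python
# Average rotational latency for a spinning disk is half a revolution,
# because on average the head must wait half a turn for the right sector.
def avg_rotational_latency_ms(rpm: float) -> float:
    ms_per_revolution = 60_000.0 / rpm  # one full revolution, in ms
    return ms_per_revolution / 2

# A 7,200 RPM drive pays ~4.17 ms of rotational latency alone, before any
# seek time. A typical SSD responds in well under 0.1 ms, so even a
# workload that never stresses IOPS sees a large latency win.
print(round(avg_rotational_latency_ms(7200), 2))  # → 4.17
```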

The first step in an SSD strategy, even if you've started to deploy SSD without a plan, should be to understand whether you need SSD and where you need it. In our popular white paper "Visualizing SSD Readiness," we cover the old-school manual way of determining SSD viability per application. Many SSD vendors now include some type of profiling tool that will do much the same thing, but our method is free and independent. The older I get, the more I like "old school."
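Whichever profiling route you take, the output boils down to the same test: is an application both busy and latency-bound? A hypothetical sketch of that screening step (the threshold values and sample data are illustrative placeholders, not recommendations):

```python
# Hypothetical SSD-readiness screen: given sampled averages per application,
# flag the ones that are both slow (latency-bound) and busy (high IOPS).
def ssd_candidates(samples, latency_ms_threshold=10.0, iops_threshold=500):
    """samples: {app_name: (avg_read_latency_ms, avg_iops)}"""
    return [app for app, (lat, iops) in samples.items()
            if lat > latency_ms_threshold and iops > iops_threshold]

samples = {
    "oltp-db":    (18.5, 4200),  # slow and busy: strong candidate
    "file-share": (6.2, 150),    # neither threshold exceeded: skip
    "mail":       (12.0, 900),   # moderately slow and busy: candidate
}
print(ssd_candidates(samples))  # → ['oltp-db', 'mail']
```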

Our method and the vendor profiling tools have challenges when trying to gauge performance in a mixed standalone and virtual server environment. These analyses typically have to be done one at a time. As we discuss in our article "Improving Storage Performance," third-party tools can provide a more holistic view into the performance demands and capabilities of the combined storage infrastructure.

As a result of these studies, we break servers into four groups. The first group is typically a small group of servers that have such high I/O demands that they can justify an SSD investment all by themselves. These servers may qualify for their own dedicated SSD--either internal to the server or on a SAN. They are typically standalone database servers or virtual server hosts.

The second group is typically a much larger group of servers that, when their combined I/O demands are studied, justifies an SSD investment. These are often second-tier database applications, email servers, and virtual hosts with moderate virtual machine demands.

The third group consists of servers that will see some benefit from SSD but don't justify the premium cost. These systems are ideally placed on excess SSD capacity, if available; if not, we keep them on mechanical drives.

The final group consists of servers where SSD makes no sense, and we don't want their data occupying our premium SSD real estate. These servers are relegated to mechanical storage. Even in the few sites we have worked with that use flash-only arrays, this final group, for now anyway, is going on legacy hard drive systems.
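The four-group triage above can be sketched as a simple classifier. The IOPS thresholds here are hypothetical placeholders chosen for illustration; in practice the cut lines come from the profiling exercise described earlier:

```python
# Illustrative sketch of the four-group SSD triage; the thresholds are
# hypothetical and should be replaced with figures from your own profiling.
def classify(avg_iops: float) -> str:
    if avg_iops > 10_000:
        return "1: dedicated SSD"          # justifies SSD all by itself
    if avg_iops > 2_000:
        return "2: shared SSD pool"        # justified by combined demand
    if avg_iops > 500:
        return "3: SSD if spare capacity"  # some benefit, not the premium
    return "4: mechanical storage"         # no case for SSD

servers = {"db-prod": 25_000, "mail-01": 3_500, "web-03": 800, "archive": 50}
for name, iops in servers.items():
    print(name, "->", classify(iops))
```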

With this information, the IT manager is armed with the facts needed to wade through the ocean of SSD solutions. We call the next step triage: focus first on fixing the applications that users are complaining about the most. Armed with the above information, we know which servers to fix, and we have a better feel for which SSD solutions we can implement now that will fit into the long-range strategy. After triage, we start looking at which combination of SSD solutions is the right mix for the environment. We will cover these next two steps in upcoming columns.

Follow Storage Switzerland on Twitter

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Storage Switzerland's disclosure statement.
