The server market is hot again. The percentage of companies planning to increase the number of servers they own jumped six points since last year, to 31%, according to InformationWeek's latest annual State of Server Technology Survey. That jibes with IDC estimates released in August that show server sales growing almost 18% in the second quarter, with unit shipments up 8.5% compared with the same period in 2010. And of the 676 business technology pros who responded to our poll, 23% will not just buy more servers; they'll spend on devices with advanced capabilities, like multiple 10-Gbps Ethernet ports.
Our data shows this demand, for both more servers and more capable ones, is fueled mostly by server virtualization, as virtual machines displace standalone operating system instances at an explosive pace. Our August 2011 Virtualization Management Survey shows that nearly 40% of almost 400 respondents plan to have more than 75% of their production servers virtualized by the end of next year. The resulting application consolidation is placing unprecedented processing and memory demands on aging server fleets.
Virtualization is also affecting feature sets, as x86 systems now represent about two-thirds of total sales, according to IDC. Among our State of Server Technology respondents, 81% say they make extensive use of Xeon systems, compared with just 10% saying the same about the nearest non-x86 servers.
While virtualization is the most visible server market driver, it's not the only one. Private clouds, with a new breed of applications built to capitalize on machine diversity and parallelism, could become virtualization's second wave. In line with that, a trend shaping Dell's server strategy is "a different computational paradigm for reaching the next level of efficiency," says Forrest Norrod, VP and general manager of the vendor's server platforms. Norrod characterizes this as a dynamic, fine-grained scale-out approach, akin to what Google, Facebook, and Amazon have done in building public cloud services. Platforms like Hadoop (a distributed file system and job tracker) and OpenStack demand a new type of supporting infrastructure architecture, one in which IT can divorce applications from the underlying, often commodity, compute and storage systems.
For example, Facebook claims to have the world's largest Hadoop cluster, with 21 petabytes of data spanning 2,000 machines, most of which are run-of-the-mill eight-core systems with 32 GB of RAM and 12 TB of disk. In an example of how seriously venture capital investors take this "big data" trend, Opera Solutions (not to be confused with the Norwegian browser developer) recently raised $84 million to fund its "analytics as a service" product. As such data-intensive applications catch on, they require large server farms comprising many dozens or hundreds of commodity devices, linked Lego-like into distributed, resilient, self-healing clouds.
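Part of what makes this Lego-like linking possible is the MapReduce programming model underlying Hadoop: application code never names the machines it runs on, so the framework is free to spread work across any number of commodity boxes and reroute around failures. As a rough illustration, here is a minimal single-process sketch of that contract in plain Python; the function names and sample data are our own, and this is not the actual Hadoop API:

```python
from collections import defaultdict

# Single-process sketch of the MapReduce contract that a Hadoop cluster
# distributes across machines. The application supplies only map_phase and
# reduce_phase; the framework decides where each piece actually executes.

def map_phase(record):
    # Emit (key, value) pairs; here, one pair per word in a line of text.
    for word in record.split():
        yield word.lower(), 1

def reduce_phase(key, values):
    # Combine every value emitted for one key, wherever it was produced.
    return key, sum(values)

def run_job(records):
    # "Shuffle" step: group intermediate pairs by key, as the cluster would
    # before handing each key's values to a single reducer.
    groups = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())

if __name__ == "__main__":
    lines = ["the cloud scales out", "the cluster heals", "out of many one"]
    print(run_job(lines))  # word counts across all input lines
```

Because neither function holds state tied to a particular host, losing a node costs only the work in flight on it, which the job tracker simply reschedules elsewhere; that is the resiliency property the commodity scale-out model depends on.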
Vlad Rozanovich, AMD's director of Americas commercial business, segments the emerging trend highlighted by Dell's Norrod into two markets: "pure" clouds and high-performance computing (HPC) environments, although we'd argue that these are converging into a single set of infrastructure requirements characterized by highly distributed, massively scalable systems with fault tolerance and resiliency baked in.
Whatever you call it, it's making for some nice financial results for server vendors.