Sequencing tasks that used to take a month or more now only take a few days or a few hours.

Charles Babcock, Editor at Large, Cloud

August 25, 2006

Sun Microsystems has succeeded in displacing aging H-P Alpha servers with its Opteron-based line of servers in an acknowledged compute-intensive environment: The Institute for Genomic Research.

The institute runs a set of gene-sequencing applications that analyze large amounts of data from a DNA sample. Under a procedure pioneered by the institute, the sample is fractured into many small parts so that the DNA can be read in bite-sized chunks.

"The bits need to be put back together to identify the entire gene. That's the essence of the computational problem," says Vadim Sapiro, IT director at the Rockville, Md., institute. On the institute's aging servers, whose origins go back to Digital Equipment Corp.'s Alpha architecture, "it would sometimes take months to babysit one assembly to completion."

For example, by finding the parts that contain some precise nucleotide overlap, researchers can slowly build out the gene's nucleotide sequence until they've mapped its complete, unique structure. It's like matching the sequence 2, 3, 4, 5 with the sequence 3, 4, 5, 6: by finding the match, you've extended the map by one nucleotide.

It might sound easy, but the number of possibilities is mind-boggling, Sapiro says. Three billion nucleotides need to be mapped to come up with the composite genome of the 20,000-plus human genes. The same sequences are easily found in different parts of a single gene, so additional software needs to sort through the matches, looking for errors.
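The overlap-and-extend idea described above can be sketched in a few lines of Python. This is a simplified illustration of the principle, not the institute's actual assembler, and the fragment strings and the `min_len` threshold are invented for the example:

```python
def overlap(a: str, b: str, min_len: int = 3) -> int:
    """Return the length of the longest suffix of a that matches
    a prefix of b (at least min_len long), or 0 if there is none."""
    for k in range(min(len(a), len(b)), min_len - 1, -1):
        if a[-k:] == b[:k]:
            return k
    return 0

def merge(a: str, b: str, min_len: int = 3) -> str:
    """Extend fragment a with fragment b using their overlap."""
    k = overlap(a, b, min_len)
    return a + b[k:] if k else a

# Two short fragments sharing the overlap "GATT":
left = "ACGGATT"
right = "GATTACA"
print(merge(left, right))  # ACGGATTACA
```

A real assembler must do this across millions of fragments, score imperfect matches, and discard coincidental repeats, which is where the computational cost Sapiro describes comes from.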

The institute's gene-sequencing software was named to InformationWeek's Greatest Software Ever Written list on Aug. 14, ranking third among the 12 entries.

Sun's ability to place its x86-instruction-set servers, the Sun Fire V40Z, in a demanding scientific environment is one sign of why it's been able to restart server sales and renew its fortunes. By designing workstations and servers based on AMD's 64-bit Opteron chip, Sun has departed from its invented-here, UltraSparc mentality and adopted what's been winning in the marketplace.

At the end of 2004, the institute purchased three Sun Fire servers and ran them alongside its existing 15 Alpha servers. The original gene-sequencing software had been ported from Alpha to Linux in 2000, paving the way for the changeover. When the institute found its ported software produced the same results on the Sun Fires, only faster, it switched off the Alpha servers earlier this year and let the V40Zs take over.

Sequencing tasks that used to take a month or more on the Alpha servers now take "a few days or a few hours," Sapiro says. "That makes a huge difference to the institute—to get the data out faster."

The institute paid about $30,000 per server for the Sun Fires, compared with $100,000 per server for the Alphas in 1999, Sapiro says. He estimates that his cooling and electricity needs have decreased 70% with the changeover and space has opened up in his data center.

The institute had to buy 64-bit systems in 1999, before they were commonplace, because the gene-sequencing software needs huge amounts of address space. A 32-bit system can address at most 4 gigabytes of virtual memory, which wasn't enough, Sapiro says. Alpha was an early 64-bit architecture.
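The arithmetic behind that ceiling is simple: 32 address bits can distinguish only 2^32 byte locations, about 4 GB, while 64 bits raise the limit far beyond what any assembly job could consume. A quick illustration:

```python
# A 32-bit pointer can address 2**32 bytes; a 64-bit pointer, 2**64.
addr_32 = 2 ** 32
addr_64 = 2 ** 64

print(addr_32 // 2 ** 30)  # 4  -> GiB addressable with 32 bits
print(addr_64 // 2 ** 60)  # 16 -> EiB addressable with 64 bits
```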

The institute is famous because it completed the first gene sequencing of a living organism, a bacterium, in 1995, and its techniques, including the "shotgun" sequencing algorithms created by Craig Venter, led to a proliferation of gene-sequencing projects.

All of the non-profit institute's software is open source and made available to other research organizations as a free download on SourceForge. "What good is our software if the public can't afford the infrastructure to run it on?" Sapiro asks. The proliferation of lower-cost 64-bit servers will speed advances in genome research into human pathogens, hereditary diseases and other areas deemed likely to lead to better lives, he says.

About the Author(s)

Charles Babcock

Editor at Large, Cloud

Charles Babcock is an editor-at-large for InformationWeek and author of Management Strategies for the Cloud Revolution, a McGraw-Hill book. He is the former editor-in-chief of Digital News, former software editor of Computerworld and former technology editor of Interactive Week. He is a graduate of Syracuse University where he obtained a bachelor's degree in journalism. He joined the publication in 2003.
