Whatever Happened to Business Supercomputers?

When it came down to business, supercomputers headed in another direction.

John Edwards, Technology Journalist & Author

November 29, 2021

4 Min Read

It once seemed inevitable -- a surefire thing -- that supercomputers would help businesses tackle the demands imposed by massive databases, complex engineering tools, and other processor-draining challenges. Then, suddenly, both technology and businesses took a different course.

Chris Monroe, co-founder and chief scientist of quantum computing company IonQ, offers a simple explanation for the abrupt change in interest. “Supercomputers failed to catch on because, although they bring the promise of speed and ability to process large computational problems, they come with a significant physical footprint [and] energy/cooling requirements,” he notes. “When it comes to mainstream adoption, supercomputers never hit the right balance of affordability, size, access, and value-add enterprise use cases.”

New Directions

Supercomputers have traditionally been defined by the fact that they bring together a collection of parallel hardware providing a very high computational throughput and rapid interconnections. “This is in contrast to traditional parallel processing where [there are] a lot of networked servers working on a problem,” explains Scott Buchholz, government and public services CTO and national emerging technology research director for Deloitte Consulting. “Most business problems can be solved either by the latest generation of standalone processors or else by parallel servers.”

The arrival of cloud computing and easily accessible APIs, as well as the development of private clouds and SaaS software, put high-performance computing (HPC) and supercomputers in the rearview mirror, observes Chris Mattmann, chief technology and innovation officer (CTIO) at NASA's Jet Propulsion Laboratory (JPL). “Relegated to science and other boutique use, HPC/supercomputers ... never caught up to modern-day [business] standards.” 

Current Adopters

Today, while most businesses have shied away from supercomputers, scientific and engineering teams often turn to the technology to help them address a variety of highly complex tasks in areas such as weather prediction, molecular simulation, and fluid dynamics. “The sets of scientific and simulation problems that supercomputers are uniquely well suited to solving will not go away,” Buchholz states.


Supercomputers are primarily used in areas in which sizeable models are developed to make predictions involving a vast number of measurements, notes Francisco Webber, CEO at Cortical.io, a firm that specializes in extracting value from unstructured documents. 

“The same algorithm is applied over and over on many observational instances that can be computed in parallel,” says Webber, “hence the acceleration potential when run on large numbers of CPUs.” Supercomputer applications, he explains, can range from experiments in the Large Hadron Collider, which can generate up to a petabyte of data per day, to meteorology, where complex weather phenomena are broken down into the behavior of myriads of particles.
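To make Webber's point concrete, here is a minimal Python sketch of that data-parallel pattern: the same function is applied independently to a large set of observations, so the work spreads cleanly across however many CPUs are available. The simulate_particle function and the measurement values are hypothetical stand-ins for illustration, not code from any real weather or collider workload.

```python
from multiprocessing import Pool

def simulate_particle(measurement):
    # Hypothetical per-observation computation: the same algorithm is
    # applied independently to each measurement, so no instance depends
    # on the result of another.
    return measurement ** 2 + 0.5 * measurement

if __name__ == "__main__":
    measurements = [float(i) for i in range(1_000_000)]

    # Because every call is independent, the work parallelizes across all
    # local CPU cores; a supercomputer scales the same idea to thousands
    # of tightly interconnected nodes.
    with Pool() as pool:
        results = pool.map(simulate_particle, measurements, chunksize=10_000)

    print(f"Processed {len(results)} observations")
```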

There's also a growing interest in graphics processing unit (GPU)- and tensor processing unit (TPU)-based supercomputers. “These machines may be well suited to certain artificial intelligence and machine learning problems, such as training algorithms [and] analyzing large volumes of image data,” Buchholz says. “As those use cases grow, there may be more opportunities to 'rent' time via the cloud or other service providers for those who need periodic access, but don’t have a sufficient volume of use cases to warrant the outright purchase of a supercomputer.”

While mostly relegated to large academic and government laboratories, supercomputers have managed to find a foothold in a few specific industry sectors, such as petroleum, automotive, aerospace, chemical, and pharmaceutical enterprises. “While the adoption isn’t necessarily widespread in scale, it does demonstrate these organizations' capacity for investments and experimentation,” Monroe says.

Outlook

The focus moving forward will be on new types of supercomputer architectures, such as neuromorphic and quantum computing, Mattmann predicts. “This is where supercomputing companies will be investing to disrupt the traditional model powering clouds.”


Classical computing will simply reach a limit, Monroe observes. “Moore's law no longer applies, and organizations need to think beyond silicon,” he advises. “Even the best-made supercomputers … are dated the moment they are designed.” Monroe adds that he's also beginning to see calls for merging supercomputers with quantum computers, creating a hybrid computer architecture.

Eventually, however, Monroe anticipates the widespread adoption of powerful and stable quantum computers. “Their unique computational power is better suited to solve complex and wide-scale problems, like financial risk management, drug discovery, macroeconomic modeling, climate change, and more—beyond the capabilities of even the largest supercomputers,” he notes. “While supercomputers still have a large presence … the top business minds are already looking toward quantum.”

Takeaway

Buchholz doesn't expect mainstream enterprises to reverse their view of supercomputers at any point in the foreseeable future. “If the question is whether or not most organizations need a special-purpose, multi-million-dollar piece of hardware, the answer is generally ‘no,’ because most applications and systems are targeted at what can be done with commodity hardware today,” he explains.

On the other hand, Buchholz notes that technological momentum may eventually sweep many enterprises into the supercomputer market, whether they realize it or not. “It’s important to remember that today’s supercomputer is the next decade’s commodity hardware,” he states.


About the Author

John Edwards

Technology Journalist & Author

John Edwards is a veteran business technology journalist. His work has appeared in The New York Times, The Washington Post, and numerous business and technology publications, including Computerworld, CFO Magazine, IBM Data Management Magazine, RFID Journal, and Electronic Design. He has also written columns for The Economist's Business Intelligence Unit and PricewaterhouseCoopers' Communications Direct. John has authored several books on business technology topics. His work began appearing online as early as 1983. Throughout the 1980s and 90s, he wrote daily news and feature articles for both the CompuServe and Prodigy online services. His "Behind the Screens" commentaries made him the world's first known professional blogger.

