The list of priorities includes a call for "ultrascale" computing to dwarf Japanese advances in supercomputing.
The U.S. Energy Department has created a 20-year roadmap for scientific exploration that includes "ultrascale scientific computing," which would reduce from years to days the time needed to simulate complex systems, helping businesses rapidly create prototypes for new products.
Ultrascale scientific computing will involve long-term relationships between the government and U.S. computer vendors, according to the 48-page report, Facilities For The Future Of Science: A 20-Year Outlook. It will require integrated investments in ways to optimize computer performance for scientific and commercial problems by creating a better balance of memory size, processor speed, and interconnection rate.
Energy Secretary Spencer Abraham says ultrascale scientific computing could have a significant economic impact, on the order of billions of dollars, on commercial product design, development, and marketing. Today, U.S. industries are forced to build prototypes for new designs that are expensive and cause significant manufacturing delays. Ultrascale scientific computing will mean virtual prototypes, which would greatly shorten time to market and shrink investment costs, he says. "There is no real way to measure the competitive advantage this kind of computing power can give us," Abraham said Monday in remarks at the National Press Club.
The report points out that current prototyping and simulation of jet engines costs General Electric Co. several years and millions of dollars. The process could be accomplished in less than a day using ultrascale scientific computing. The report also cites General Motors Corp., which, it says, already saves hundreds of millions of dollars using its in-house computing capability. Yet GM believes it can't meet the steady demand for safer, more fuel-efficient, and cleaner cars without substantial increases in computing capabilities--increases that can't be achieved through existing information technology, according to the report.
Most supercomputers have been designed with the consumer market in mind. The proposed approach would offer new configurations that build on the accomplishments of the new Japanese supercomputer known as Earth Simulator to develop computing capability specifically designed for science and industrial applications, the Energy Department report says. Earth Simulator boasts the power of the 20 fastest U.S. computers combined and a peak speed of 40 teraflops--three times faster than the theoretical peak performance of any U.S. machine.
Ultrascale scientific computing processors envisioned by the Energy Department would be located at multiple sites and would increase by a factor of 100 the computing capability available to support public scientific research, drastically reducing computing time. The computing facilities would be available to all and subject to peer review, the Energy Department says.
Building facilities for ultrascale scientific computing and other scientific endeavors will benefit the American economy and society, Abraham says. "If we build the required facilities and equipment and support the large laboratories that accelerate discovery, America is assured of remaining the world's center of scientific research for many years to come," he says. "This is where the world will want to come to explore the future. This is where the world will want to come to do science."