Intel Leads Push To Move Graphics Chips From Niche Role To Mainstream
Based on the chipmakers' latest plans, GPUs might soon become more important than conventional multicore processors.
Intel CEO Paul Otellini hit all the obvious buttons in his keynote speech at the chipmaker's Developer Forum last week, emphasizing the rush toward more cores and smaller process technology. But one trend got little attention: the huge role graphics chips could play in a few years.
From what Intel revealed about its plans for graphics-processing units, they could soon become more important than the conventional multicore microprocessors we've all been focused on lately. Until now, GPUs have been viewed as specialty processors designed for speedy handling of tasks such as the matrix operations behind video applications, and their use has been relegated to the gaming, supercomputing, and medical-imaging corners of the industry.
But that's about to change. Based on last week's news, it's likely we'll look back on 2008 as the year graphics processing went mainstream. That's when Intel plans to ship an eight-core processor, code-named Larrabee, that will deliver high-performance graphics and support high-performance computing.
And Intel isn't alone in the graphics push. Advanced Micro Devices plans to release in 2009 a notebook platform, called Eagle, that pairs a dedicated GPU with a GPU core built into the main microprocessor. That means AMD is going full tilt to integrate the smarts of ATI, the graphics house it acquired last year, into its mainstream processors.
This "regularizing" of graphics alongside central processing has been expected for a long time. Now conditions are ripe: We have the process technology to support the transistor counts required to etch a GPU and a CPU side by side, and AMD's purchase of ATI has sent Intel into a graphics frenzy, extending the war between the two companies into that realm.
Nvidia also appears to be moving in this direction. Mostly thought of as a graphics-chip supplier to the PC industry, the chipmaker showed up early this month at the High Performance on Wall Street conference to pitch its graphics processors to the developers who run high-end clusters for advanced stock-trading systems. Nvidia wasn't showcasing them as graphics engines per se (though that was certainly part of the sell) but as processors to which significant general-purpose tasks could be assigned, to maintain and even boost system throughput.
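To make that offload model concrete, here is a minimal sketch, not anything Nvidia presented, of the kind of general-purpose job its CUDA toolkit (released earlier this year) lets developers hand to the GPU. The kernel and variable names are illustrative; the example simply scales a million-element array in parallel on the graphics chip.

    // Illustrative sketch, not vendor code: offloading a data-parallel
    // task to the GPU using CUDA.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Each GPU thread scales one array element; thousands run in parallel.
    __global__ void scale(float *data, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= factor;
    }

    int main() {
        const int n = 1 << 20;                  // 1 million floats
        size_t bytes = n * sizeof(float);
        float *host = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) host[i] = 1.0f;

        float *dev;
        cudaMalloc(&dev, bytes);                // allocate GPU memory
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
        cudaDeviceSynchronize();

        cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
        printf("host[0] = %f\n", host[0]);      // expect 2.0

        cudaFree(dev);
        free(host);
        return 0;
    }

The pattern is the same for the matrix and financial-modeling workloads in question: copy data to the GPU, let thousands of lightweight threads chew through it in parallel, and copy the results back. The transfer overhead is why the approach pays off mainly for large, data-parallel jobs.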
Pushing Pixels

Intel's Larrabee
Eight-core processor that will deliver high-performance graphics and support high-performance computing. Expected to be unveiled next year.
AMD's Eagle
Notebook platform with a dedicated GPU and a separate GPU core in the main microprocessor. Expected in 2009.
Nvidia's Expansion
Chipmaker pitches its graphics processors for high-end clusters in advanced stock-trading systems.
SILICON STRATEGIES
Besides its GPU plans, Intel last week showed a prototype silicon wafer built using 32-nanometer technology and laid out a road map for its 45-nm Penryn processors, which it says will be available Nov. 12. The chipmaker also emphasized that it will expand availability of WiMAX on laptops.
Intel executives presented the company's next-generation microarchitecture for business computers, code-named Nehalem, and outlined new hardware-based security features and improved, more manageable data encryption.
Intel also announced system-on-a-chip technology that it expects to launch next year. Intel says the technology, aimed at the embedded and communication markets, will deliver a 20% reduction in power and as much as a 45% smaller footprint compared with previous multicomponent security products. On the competitive front, AMD fired back at Intel earlier this month when it launched its quad-core Opteron server processor, code-named Barcelona. And last week it announced plans for a desktop chip with three cores.
Intel's forum also highlighted co-founder Gordon Moore, who observed in 1965 that the number of transistors on a chip would double at a steady cadence, an observation now commonly stated as a doubling roughly every two years. Moore said he expects his axiom to hold true for only 10 to 15 more years.