IBM's eDRAM Helps AMD More Than It Hurts Intel

While the next-generation DRAM will boost gaming and embedded systems, don't expect Big Blue to burst market leader Intel's sales bubble anytime soon.

Michael Singer, Contributor

February 16, 2007

IBM on Tuesday unveiled a smaller, faster, and cooler type of computer memory that it says will improve the performance of graphics and other embedded systems over traditional SRAM, or static random access memory.

But beyond the technical advances, the announcement is expected to do more to bolster IBM's relationship with Advanced Micro Devices than to deliver a sharp stick in the eye to Intel.

Named eDRAM -- for embedded dynamic random access memory -- the technology will be a key feature of IBM's Cell processor road map starting sometime in 2008. IBM's Cell chip, which it co-produces with Sony Electronics and Toshiba, is the core CPU in Sony's PlayStation 3. IBM also supplies the core processor architecture in Microsoft's Xbox 360 and Nintendo's Wii consoles.

With the advent of multicore chips, memory has become an increasingly critical aspect of microprocessor performance. IBM said its eDRAM combines more than 12 million transistors with high-performance logic, while taking up about one-third the space and consuming one-fifth the standby power of conventional SRAM.

"What IBM is offering is pretty amazing in terms of how fast it is," says Alan Niebel, CEO and president of Web-Feet Research, an analyst firm that specializes in computer system memory. "With 2-nanosecond read access speeds, eDRAM will help render graphics in real time, making them so realistic that you can see the beads of sweat on a character's face."

Intel and other chipmakers combine memory and logic differently, relying on pooled computing resources or stacking double data rate synchronous dynamic random access memory (DDR SDRAM) near the CPU. Those methods have had tremendous success, especially with multicore processors, and Intel is expected to announce speed and design improvements later this year.

However, Niebel says IBM's eDRAM is a little different from simply putting 1 Gbyte of DRAM next to a central processor. "It's a lengthy process because you are writing the memory into the same chip. Traditionally, chipmakers can make the memory better and the logic suffers, or the logic improves and the memory suffers. ... So it's a question of how long before eDRAM can be productized before we can make true comparisons."

"The big thing now is to take the technology from eDRAM and embed it into one of these standalone processors such as an Athlon or Opteron," Niebel says. Given IBM's and AMD's past collaborations, he says, it's only a matter of time before eDRAM technology makes its way into an AMD chip.

Either way, it is unlikely that IBM will use its eDRAM for its own Cell processor to power desktops or laptops anytime soon, Niebel says, because currently the Cell is not optimized to run Microsoft's Windows Vista or other Windows products.

And despite its application benefits, Niebel doesn't expect IBM to shatter any sales records soon. The worldwide market sales leader for DRAM is Samsung, followed by Micron and then South Korea's Hynix.

So why spend the R&D on improving memory? IBM says it has always sought to push the envelope with its system-on-a-chip designs. Other IBM breakthroughs include high-k dielectrics, which enhance a transistor's function; copper on-chip wiring; silicon-on-insulator and silicon germanium transistors; strained silicon; and eFuse, a technology that lets computer chips slow down and speed up depending on changing conditions.

This article was modified on Feb. 20 to clarify IBM's contribution to the top three selling gaming consoles.
