Moore's Law: The Future Is Energy Efficiency

As the industry continues to shrink chips, energy efficiency is one way to expand Moore's Law beyond its current form. Let's explore how.

Mark Papermaster, SVP & CTO, AMD

August 20, 2014

(Source: <a href="http://commons.wikimedia.org/wiki/File:Southeast_Steam_Plant-University_of_Minnesota_1.jpg" target="_blank">Karl Frankowski</a>)

The future of energy-efficient IT
Going forward, the power efficiency of IT is expected to continue to improve but in very different ways.

For example, at my company, AMD, we recently announced an ambitious goal to increase the "typical use" energy efficiency of our entire line of mobile processors by 25 times from 2014 to 2020. We plan to achieve this through a combination of accelerated performance and reductions in energy consumption. If we reach our goal, computers running on AMD technology in 2020 could accomplish a computing task in one-fifth the time of today's PCs while consuming less than one-fifth the power on average.

To put this in perspective, try to envision getting the same improvement in the car you drive. Applying a similar performance-to-energy ratio: if you drive a 100 horsepower car that gets 30 miles per gallon today and were able to increase its efficiency 25-fold in six years, you would be driving a 500 horsepower car that gets 150 miles per gallon in 2020.
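The arithmetic behind the 25x figure is worth making explicit: efficiency is performance per unit of energy, so a 5x speedup combined with one-fifth the energy compounds to 25x. A minimal sketch of that calculation (the processor numbers are normalized illustrations, not AMD measurements):

```python
# Efficiency = performance / energy, so improving both multiplies:
# 5x the performance at 1/5 the energy yields a 25x efficiency gain.
# Values below are normalized illustrations, not product data.

def efficiency(performance, energy):
    """Work accomplished per unit of energy consumed."""
    return performance / energy

baseline = efficiency(performance=1.0, energy=1.0)       # 2014 baseline
target = efficiency(performance=5.0, energy=1.0 / 5.0)   # 2020 goal

print(target / baseline)  # → 25.0

# The car analogy follows the same math:
# 100 hp at 30 mpg vs. 500 hp at 150 mpg is a 5 x 5 = 25x gain.
car_gain = (500 / 100) * (150 / 30)
print(car_gain)  # → 25.0
```

The same multiplicative logic explains why gains on both axes at once, rather than raw speed alone, are the target.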

This ambitious goal comes on the heels of a 10-fold energy efficiency improvement in this product line since 2008. The difference going forward is that many of the gains will come from outside the traditional silicon "shrink" cycle or what industry insiders call the "race to the next process node."

Rather than waiting for the next generation of silicon technology to come online, our approach is to aggressively design for energy efficiency through processor architecture and intelligent power management. The energy efficiency gains made by achieving this goal would outpace the Moore's Law efficiency trend by at least 70% between 2014 and 2020.

Here are a few of the key design innovations that AMD believes will help propel the future of energy-efficient IT:

Heterogeneous computing and power optimization: AMD's accelerated processing units (APUs) include both central processing units (CPUs) and graphics processing units (GPUs) on the same piece of silicon. Combining CPUs and GPUs on the same chip saves energy by eliminating connections between discrete chips. AMD extracts even more energy savings by enabling APUs to seamlessly shift computing workloads between the CPU and GPU to optimize efficiency, part of the Heterogeneous System Architecture that is now being widely adopted in the industry.

Intelligent, dynamic power management: This could be retitled a "race to idle," because the energy advantages are primarily derived from finishing a job quickly and efficiently to enable a faster return to the ultra-low-power idle state.

Future innovations in power-efficient design: Inter-frame power gating, per-part adaptive voltage, voltage islands, further integration of system components, and other techniques still in the development stage will continue to yield accelerated energy efficiency gains going forward. AMD has implemented an "ambidextrous" product offering to cover both ARM and x86 instruction sets, so the same power management approach can be applied to the vast majority of IT use cases.
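The "race to idle" idea above can be made concrete with a toy energy model: over a fixed window, total energy is active power times active time plus idle power times the remaining time. A sketch with hypothetical wattages (none of these are AMD specifications):

```python
# Toy "race to idle" model. Over a fixed window, total energy is
# active_power * active_time + idle_power * remaining_time.
# Finishing faster, even at higher active power, can cost less
# energy overall because the ultra-low-power idle state dominates
# the remainder of the window. All values are hypothetical.

def energy_joules(active_w, active_s, idle_w, window_s):
    """Energy consumed over a fixed window, in joules."""
    return active_w * active_s + idle_w * (window_s - active_s)

WINDOW_S = 10.0   # fixed observation window, seconds
IDLE_W = 0.5      # hypothetical ultra-low-power idle draw, watts

# Slow and steady: 5 W active for 8 s, then idle.
slow = energy_joules(active_w=5.0, active_s=8.0,
                     idle_w=IDLE_W, window_s=WINDOW_S)

# Race to idle: 8 W active for only 3 s, then idle for the rest.
fast = energy_joules(active_w=8.0, active_s=3.0,
                     idle_w=IDLE_W, window_s=WINDOW_S)

print(slow, fast)  # → 41.0 27.5 — racing to idle wins here
```

Whether racing to idle actually saves energy depends on how low the idle power is relative to the extra active power; the lower the idle state, the stronger the incentive to finish fast.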

Why energy efficiency matters
With their lives hanging in the balance, the brave Apollo 13 astronauts struggled to conserve power. While our quest for energy-efficient IT is not as acute, the stakes are high. By 2020, the number of connected devices is estimated to be nearly five times larger than the Earth's population, driving a corresponding increase in energy demand. It follows that energy-efficient technology is essential to achieve the promise of an IT-enabled society.

And, with the massive projected increases in connected devices, there is a strong environmental motive to pursue energy-efficient IT. The International Energy Agency (IEA) referred to energy efficiency as "the world's first fuel." Similarly, the Alliance to Save Energy stated that "energy efficiency is one of the most important tools for avoiding climate change by reducing use of fossil fuels." While power-efficient IT alone cannot fully address climate change, it is an important part of the solution.

Improvement in the power efficiency of IT devices is just part of the story. IT-enabled devices can also help make other systems more energy efficient. A recent study by the Global e-Sustainability Initiative (GeSI) projected that IT-enabled devices could cut global greenhouse gas (GHG) emissions by 16.5% in 2020. These gains will come through many different applications, ranging from smart power grids and sophisticated HVAC systems to sensor-driven intelligent traffic management. The study predicted that savings from IT-enabled devices would amount to $1.9 trillion in gross energy and fuel savings and a reduction of 9.1 gigatonnes of carbon dioxide-equivalent greenhouse gases by 2020.

As someone who has worked in the high-technology industry for my entire career, I am proud of the gains we have made in conserving power and the benefits these technologies provide to the world around us. I am even more excited about the innovations to come.

About the Author

Mark Papermaster

SVP & CTO, AMD

Mark Papermaster is chief technology officer and senior vice president at AMD, responsible for corporate technical direction, and AMD's intellectual property and system-on-chip product research and development. His more than 30 years of engineering experience includes significant leadership roles managing the development of a wide range of products spanning mobile devices to high-performance servers.
Before joining AMD in October 2011, Papermaster was the leader of Cisco's Silicon Engineering Group, the organization responsible for silicon strategy, architecture, and development for the company's switching and routing businesses.

In prior roles, Papermaster served as Apple senior vice president of Devices Hardware Engineering, where he was responsible for the iPod products and iPhone hardware development. He also held a number of senior leadership positions at IBM, serving on the company's Technical Leadership Team and overseeing development of the company's key microprocessor and server technologies.
Papermaster received his bachelor's degree in electrical engineering from the University of Texas at Austin and a master's degree in electrical engineering from the University of Vermont. He is a member of the University of Texas Cockrell School of Engineering Advisory Board, Olin College Presidents Council, and the Juvenile Diabetes Research Foundation.
