Last month MIT’s Technology Review reported on a new finding: the energy efficiency of computers doubles roughly every 18 months:

The conclusion, backed up by six decades of data, mirrors Moore’s law, the observation from Intel founder Gordon Moore that computer processing power doubles about every 18 months. But the power-consumption trend might have even greater relevance than Moore’s law as battery-powered devices—phones, tablets, and sensors—proliferate.

The finding, by Dr. Jonathan Koomey, lead researcher and Consulting Professor of Civil and Environmental Engineering at Stanford University, sparked a lot of excitement across tech blogs. My favorite is this thought experiment by Alexis Madrigal over at The Atlantic:

Imagine you’ve got a shiny computer that is identical to a MacBook Air, except that it has the energy efficiency of a machine from 20 years ago. That computer would use so much power that you’d get a mere 2.5 seconds of battery life out of the Air’s 50 watt-hour battery instead of the seven hours that the Air actually gets. That is to say, you’d need 10,000 Air batteries to run our hypothetical machine for seven hours. There’s no way you’d fit a beast like that into a slim mailing envelope.
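The arithmetic behind that thought experiment is easy to check. A minimal sketch, assuming (as Koomey's law does) that efficiency doubles every 18 months:

```python
# Reproduce the Atlantic thought experiment's arithmetic.
# Assumption: energy efficiency doubles every 18 months (Koomey's law),
# so 20 years of doublings gives the efficiency gap between the machines.
doubling_period_years = 1.5
years_back = 20

# Efficiency multiplier accumulated over 20 years of doublings.
efficiency_factor = 2 ** (years_back / doubling_period_years)

# The Air's seven hours of battery life, shrunk by that factor.
air_runtime_seconds = 7.0 * 3600
old_machine_runtime = air_runtime_seconds / efficiency_factor

print(f"Efficiency gain over {years_back} years: {efficiency_factor:,.0f}x")
print(f"Battery life at 1991 efficiency: {old_machine_runtime:.1f} seconds")
```

Running this gives a factor of roughly 10,000 and a runtime of about 2.4 seconds, matching the article's "10,000 batteries" and "2.5 seconds" figures to within rounding.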

Fun. I think things get more interesting if you look forward to where the computer industry is headed. With devices like iPhones and iPads, computing has now moved off the desk and into our pockets, enabled largely by improvements in energy efficiency. And the move to cloud-based computing and data services (think Wolfram Alpha or Netflix) is opening up new opportunities for energy efficiency.

Recently, I emailed Dr. Koomey about his findings (informally referred to as Koomey’s law) and how they factor into future innovations.

In your paper, you write that the main driver of increased performance has been reducing transistor sizes. Are there other technologies or innovations that have contributed to increased performance? I am thinking of things like systems on a chip, or improved energy management functions in software. Or are the hardware improvements the main driver?

The biggest recent innovation affecting performance has been the shift to multiple cores, which allowed computers to increase performance within the same power envelope. Of course, that meant that software designers had to redesign their software to take full advantage, which was a departure from the recent practice of relying on hardware improvements alone to improve performance.

Integration techniques (like system on a chip) certainly help performance. In fact, integration of more and more functionality (replacing separate components with silicon on the chip) has had a non-trivial impact on both power efficiency and performance. There’s a general trend towards more integration as technology matures, and that’s all to the good.

Improving the software that manages energy (i.e. sleep and other low-power modes) has been important in affecting power use when a computer is idle (which for most systems is most of the time), but we didn’t focus on that in the paper.

You also mention that the EPA’s Energy Star program has had a substantial impact on electricity consumed by office equipment because low-power innovations in laptops made their way into desktop machines. With rapid innovation taking place in mobile devices/phones and tablets, do you see a similar transfer of knowledge from mobile devices back to traditional computers like laptops and desktops?

It will be interesting to see if the knowledge from low-power mobile devices filters back to more traditional computers. You’re already starting to see laptops (like the MacBook Air) that use solid state drives to replace standard hard drives. Some even use flash RAM instead of regular RAM, just like phones and tablets. We’ll just have to see how fast this transition happens, but I think consumers will grow to expect computers to function more like touch-based tablets and phones, and less like traditional computers, and that will drive vendors toward more power-efficient technologies that also enable that increased functionality.

With consumers increasingly relying on data centers for many services (Netflix streaming, Apple iCloud, Facebook, Google search, etc.), how important is energy efficiency in a data center, and have there been any major breakthroughs or innovations in data center energy efficiency?

Cloud computing represents an immense leap forward in data center efficiency. I wrote about four reasons why that’s the case here:

The cloud providers have economies of scale, flexibility, diversity, and the ability to easily allow users to shift to the cloud without facing the difficult institutional problems they confront with their internal IT organizations. As an example, Google’s data centers have an “overhead” of about 0.15 kWh per kWh of computing load, while typical “in-house” data centers for companies for whom computing is not their core business are closer to 0.8 or 0.9 kWh per kWh of computing load. That’s a huge difference in both energy use and costs.
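To put those overhead figures in perspective, they can be restated as power usage effectiveness (PUE), the standard data-center metric: total facility energy divided by energy delivered to computing. A minimal sketch, assuming the midpoint of the quoted 0.8–0.9 range for in-house facilities:

```python
# Convert the quoted overhead figures (kWh of overhead per kWh of
# computing load) into PUE, and estimate the total-energy savings.
google_overhead = 0.15    # per the figures quoted above
in_house_overhead = 0.85  # assumed midpoint of the quoted 0.8-0.9 range

# PUE = total energy / computing energy = 1 + overhead ratio.
google_pue = 1 + google_overhead      # 1.15
in_house_pue = 1 + in_house_overhead  # 1.85

# Fraction of total energy saved for the same computing load.
savings = 1 - google_pue / in_house_pue
print(f"Google PUE: {google_pue:.2f}, in-house PUE: {in_house_pue:.2f}")
print(f"Total energy saved per unit of computing: {savings:.0%}")
```

Under these figures, moving a workload from a typical in-house facility to a Google-class one cuts total energy use for the same computing by roughly 38%, before counting any gains in server utilization.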

Are there any technologies or innovations you are excited about or looking forward to? Is there anything else you would like to add?

The most exciting developments to me are in the area of mobile sensors and controls. As it becomes possible to do small amounts of computing with ever decreasing amounts of power, more and more such applications will become feasible, and we’ll be able to more finely tailor products to actual consumer demands. We’ll also see an explosion of data the likes of which we’re mostly unprepared for, and there will be opportunities in helping companies sift through those data. And this all relates to what Professor Erik Brynjolfsson at MIT calls the emergence of “the Internet of Things”.

You can follow Dr. Koomey’s work by visiting his website.