Energy pessimists and fossil fuel optimists love to talk about how much juice our electric grid needs to run. Their point is that solar and other renewable sources could never meet demand, and that demand is only going to grow. Thus we must grow grid capacity in big chunks, as Southern Co. (SO) and SCANA Corp. (SCG) are doing with commitments to nuclear power plants. Present trends in computing threaten the assumptions on which those investments are based.
Many areas of grid consumption are declining. We know we can reduce the amount of electricity used for lighting. We know appliances are becoming more efficient, and we have simple devices for cutting power to unused sockets – all you have to do is buy them.
Ah, say the pessimists, but what about your computers? Even when you get a laptop, as I have, it's still usually plugged into a wall, or has to be recharged from the wall every few hours. And then there are all those huge enterprise server farms to contend with.
In 2007 David Sarokin estimated that all computers – servers and clients – were consuming 9.4% of grid energy, or 350 billion kWh per year. Nicholas Carr adds that this may not include peripherals and handheld devices like the iPhone.
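Those two figures imply a total grid consumption you can back out with simple division. A minimal check, using only the numbers from the article (the implied grid total is my own derivation, not from the source):

```python
# Back-of-envelope check of Sarokin's 2007 estimate.
computing_bkwh = 350   # billion kWh/year consumed by computing, per Sarokin
share = 0.094          # computing's 9.4% share of grid energy

# Implied total grid consumption (billion kWh/year)
implied_grid_total = computing_bkwh / share
print(f"Implied total grid consumption: {implied_grid_total:,.0f} billion kWh/year")
# → roughly 3,723 billion kWh/year
```

That is broadly in line with total U.S. electricity consumption in that era, which lends the estimate some plausibility.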
That may be changing. The two biggest trends in computing are all about reducing that load, about reducing the industry's reliance on grid power.
Tablets and smartphones are built around power-sipping. That's what makes ARM Holdings (ARMH) so powerful in the market. The longer these devices can run between charges, the better customers like them. And the big trend is giving them a small enough load that they can become solar powered. Thin clients, virtual desktops and apps naturally require less energy than PC-based networks.
Intel's (INTC) recent demonstration of a solar powered CPU was sort of phony. The solar panel was running only the central processor, not the whole computer – certainly not the monitor. But if Intel can keep making progress on energy consumption, it figures Apple (AAPL) may look at its chips again for the iPad.
Another thing that's bringing the day of solar powered PCs closer is flash memory. Replacing hard drives with flash, as is done in netbooks, dramatically cuts energy requirements while speeding system response. The further flash prices decline, the lower your PC's power bill goes.
Then there's the cloud: When Google (GOOG) revealed its energy use statistics recently, most of the headlines were about how much power the company was using. It draws about 260 megawatts continuously – as much as a city like Richmond, VA.
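Note that watts measure power, not energy, so "260 million watts per year" needs converting before it can be compared with annual consumption figures. Reading Google's number as a continuous 260 MW draw (my assumption), the annual energy works out as follows:

```python
# Convert a continuous power draw into annual energy consumption.
power_mw = 260                    # Google's reported continuous draw, in megawatts
hours_per_year = 24 * 365         # 8,760 hours

annual_gwh = power_mw * hours_per_year / 1000   # MWh -> GWh
print(f"Annual energy: {annual_gwh:,.0f} GWh")
# → about 2,278 GWh, i.e. roughly 2.3 billion kWh per year
```

That's a large number, but still well under 1% of the 350 billion kWh Sarokin attributed to computing overall.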
But the headline Google wanted you to read was that it has been carbon neutral for four years – that it has been buying renewable power to run its data centers while concentrating hard on reducing its use of electricity. Power is a cost of doing business, and the lower Google can drive that cost – by using the equivalent of laptop boards in server farms, or by simply opening windows instead of running air conditioners – the more it saves.
Clouds are a hot technology right now because they let companies handle a huge spike in traffic or a sudden compute-heavy job – and because they let them consolidate resources. But clouds also, by their nature, reduce power requirements. A lot of CIOs don't think about power when they're thinking cloud, but that is starting to change. And when you consolidate server rooms you're retiring the older gear – the gear that uses the most power.
The cheapest renewable energy remains the energy you don't use. If the computing industry can use clouds, flash and solar-powered devices to cut its own energy use by just 10%, and that seems fairly easy to foresee, that will have a major impact on utility industry planning.
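To put that 10% in perspective, here is a quick sketch of the arithmetic, using the article's 350 billion kWh figure. The nuclear-plant comparison is my own addition: I assume a 1 GW reactor running at a 90% capacity factor, a typical figure that is not from the article.

```python
# Scale of a 10% cut in computing's electricity use.
computing_bkwh = 350                      # billion kWh/year, per Sarokin
savings_bkwh = computing_bkwh * 0.10      # a 10% cut = 35 billion kWh/year

# Annual output of one 1 GW reactor at 90% capacity factor (assumed, not sourced)
reactor_bkwh = 1.0 * 0.90 * 8760 / 1000   # ~7.9 billion kWh/year

print(f"Savings: {savings_bkwh:.0f} billion kWh/year, "
      f"about {savings_bkwh / reactor_bkwh:.1f} large reactors' worth of output")
# → about 4.4 reactors' worth
```

That is the kind of number that should give pause to anyone planning grid capacity around ever-rising computing demand.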
Disclosure: I am long GOOG.