Wednesday, January 11, 2012

Landauer vs. the Fan

Darn that !@#$% laptop fan. Why is it running again?

Maybe because I took Stanford's free online courses in artificial intelligence and machine learning last quarter, and I've been running experiments with neural networks on and off since then.

Neural networks are fun and flexible ways of coming up with a function when you really have no clue how the outputs relate to the inputs, but it sure takes a lot of time and electricity to train them. I'm basically turning coal (or whatever the local power plant burns) into functions.

I suppose the fact that we recently put photovoltaic panels on our roof should salve my conscience a little, but it still seems as if it's going to take an awful lot of expensive electrons to train enough AIs to provide us all with robot maids and butlers. Can't we do any better?

A recent Scientific American article suggests that Fujitsu's K computer has around 4 times as much computational power as a human brain, but uses about half a million times as much energy. There's plenty of room to argue that the K computer and the brain do types of processing that aren't comparable, and even if you ignore that, I imagine there's plenty of fudging in the figure the article cites for the brain. Still, we can probably say that the brain is several decimal orders of magnitude more efficient than a modern electronic computer.
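
For the record, here's that back-of-the-envelope comparison as a short Python sketch. The 4x and half-a-million numbers are the article's; the conclusion just falls out of dividing one by the other.

    # Back-of-the-envelope comparison using the article's figures:
    # the K computer has ~4x the brain's computational power,
    # but draws ~500,000x as much power.
    k_compute_ratio = 4.0
    k_energy_ratio = 500000.0

    # Efficiency is computation per unit of energy, so the brain's
    # advantage is the energy ratio divided by the compute ratio.
    brain_advantage = k_energy_ratio / k_compute_ratio
    print("Brain is roughly %.0e times more energy-efficient" % brain_advantage)
    # -> about 1e+05, i.e. five decimal orders of magnitude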

What about the brain itself? Is it actually as efficient as possible, or does physics permit computational systems that use even less energy? Well, it was only recently that I discovered the Landauer limit, which describes the minimum energy required to change one bit of information. The Wikipedia article says that at room temperature, the Landauer limit requires at least 2.85 picowatts to record a billion bits of information per second. I'm not sure what that translates to in flops—maybe a single floating point operation requires recording a thousand bits of intermediate information? If so, a billion bits per second is a megaflop, or less than a billionth of a human brain, per the Scientific American article. If I've done the math right, that leaves the brain only a few orders of magnitude less efficient than the room-temperature limit (but fortunately, Landauer says colder computers would do better).
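
And here's the Landauer arithmetic spelled out, in case I've botched it. The 1,000-bits-per-flop conversion is my own guess from above, and the 20-watt, 2.5-petaflop figures for the brain are ballpark estimates I'm assuming, not numbers from the article.

    import math

    # Landauer limit: erasing one bit costs at least k*T*ln(2) joules.
    k_B = 1.380649e-23        # Boltzmann constant, J/K
    T = 298.0                 # roughly room temperature, K
    joules_per_bit = k_B * T * math.log(2)

    # Power needed to record a billion bits per second at that limit.
    print("%.2f pW per 10^9 bits/s" % (joules_per_bit * 1e9 * 1e12))
    # -> about 2.85 pW, matching the Wikipedia figure

    # My guess from above: ~1,000 bits of intermediate state per flop.
    landauer_joules_per_flop = joules_per_bit * 1000

    # Ballpark brain figures (assumed, not from the article): ~20 W,
    # ~2.5 petaflops (a quarter of the K computer's ~10 petaflops).
    brain_joules_per_flop = 20.0 / 2.5e15

    print("Brain is ~%.0f times above the limit" %
          (brain_joules_per_flop / landauer_joules_per_flop))
    # -> a few thousand, i.e. a few orders of magnitude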

At any rate, physics does leave some room for cooler computers. Maybe I won't have to listen to fan noise every time I hang out with Rosey and Mac after all.

2 comments:

  1. Don't worry. By my math and Moore's law, in just 25.4 years, a computer 25% as powerful as the K computer should consume the same amount of power as the human brain.

    By then, surely, the robots won't find us useful anymore (except perhaps to keep up the forces driving Moore's law).

  2. I hadn't been aware that the energy computers need per computation was falling about as quickly as computational power was rising, but apparently that's the trend—as documented, for example, in this paper.

    I'm sure the robots will keep a few of us around. What could be more fashionable than a human butler?
