Friday, July 13, 2012

I Can See Right Through You

I've been doing some reading and thinking about contracts and hybrid type checking, and there's something I haven't seen mentioned yet.

Those of us who use languages with decent support for abstraction are used to thinking of operations that can have different implementations—depending, for example, on the particular class associated with an object.

But there are some operations for which multiple implementations don't really make sense. For example, consider an interface for integers that supports isOdd and isEven operations. In a language with nominal typing, you can't get very specific about what these methods return:

  trait Integer {
    def isOdd: Boolean
    def isEven: Boolean
  }
But in a language where your contracts can be more specific, you could say something like:
  trait Integer {
    def isOdd: { _ % 2 == 1 }
    def isEven: { _ % 2 == 0 }
  }
That is, the contract of isOdd specifies that it returns whether its receiver is odd, and similarly for isEven.
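Since the syntax above is hypothetical, here is a minimal runtime sketch of the same idea in Python (the `contract` decorator and all names are my own invention): a postcondition relates the receiver to the result, and is checked on every call.

```python
def contract(post):
    """Wrap a method so that post(self, result) must hold after each call."""
    def wrap(method):
        def checked(self, *args, **kwargs):
            result = method(self, *args, **kwargs)
            assert post(self, result), "contract violated"
            return result
        return checked
    return wrap

class Integer:
    def __init__(self, n):
        self.n = n

    # The contract says: the result is exactly "whether n is odd".
    @contract(lambda self, r: r == (self.n % 2 == 1))
    def is_odd(self):
        return self.n % 2 == 1

    @contract(lambda self, r: r == (self.n % 2 == 0))
    def is_even(self):
        return self.n % 2 == 0
```

As in the trait version, the postcondition and the method body end up saying the same thing, which is exactly the redundancy discussed below.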

Assuming that you don't mind putting code in your interface, you can make sure that no override of these methods violates the contract by insisting on the implementation:

  trait Integer {
    final def isOdd: { _ % 2 == 1 } = this % 2 == 1
    final def isEven: { _ % 2 == 0 } = this % 2 == 0
  }
Holy redundancy, Batman! The type is the implementation! In fact, if these are pure functions, the types specify the only reasonable implementation of these operations (modulo optimizations, such as using a shift instead of a division operation).

If, like me, you tend to assume that everything you encounter is just a special case of some more general phenomenon, you might want to give this sort of function a name. I'm inclined to use the word transparent, because the function's type acts like a window that lets you see the whole inside of the function—although I'm not sure that term wouldn't lead to confusion with referential transparency, which is something else.

But if this phenomenon is indeed sufficiently common to receive a name, you might want to support it directly in a programming language, with something like:

  trait Integer {
    def isOdd transparently this % 2 == 1
    def isEven transparently this % 2 == 0
  }
On a related note, if a contract is a function, then that function might have a sort of transparency of its own—that is, a reflection facility that lets you inspect a contract might not just let you test whether the contract holds of a particular value, but might also let you look directly at the code that specifies the contract. An alternative is to hide the contents of the contract and reveal only whether one contract implies another; but since implication between arbitrary contracts is undecidable in general, the reflection system would probably serve you better by giving you the raw code and letting you make of it what you will.
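To make that concrete, here is one way (a sketch of my own, not an existing library) to reify a contract as a value that carries both an executable predicate and its raw source text, so reflection can hand back the code:

```python
class Contract:
    """A contract over a single value, spelled with `_` as the bound variable."""
    def __init__(self, source):
        self.source = source                      # the raw code of the contract
        self.pred = eval("lambda _: " + source)   # its executable meaning

    def holds(self, value):
        return self.pred(value)

is_odd = Contract("_ % 2 == 1")
print(is_odd.holds(7))    # you can test the contract on a value...
print(is_odd.source)      # ...or read its raw code and analyze it yourself
```

With the source in hand, a client could attempt its own implication check with a theorem prover, fall back to testing, or just display the code—none of which a holds-only interface would allow.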

Tuesday, June 19, 2012

The Y Conspiracy

Some phenomena are so outlandish that to explain them seems to require a conspiracy by the powerful.

For example, consider the modern computer chip. Is it really believable that a tiny piece of slightly adulterated silicon can, when an electric current is passed through it, compute millions of digits of pi? Well, maybe you can swallow the pi, but doesn't it burst the bounds of credulity to accept that it can also show you videos of cute kittens dancing?

Surely there's a more plausible explanation for the technology we all enjoy, preferably an explanation involving wickedness and shadowy forces. Is it possible that the computer companies of the world have made a deal with Satan, and imprisoned damned souls inside their machines, forcing them to do billions of calculations per second under threat of intensified torment?

Alas, my theory doesn't stand up to examination. There are conspiracies in the world, but they don't explain all hard-to-believe phenomena. For a conspiracy theory to be plausible, the conspiracy has to have a small enough number of participants that its details could reasonably remain secret—or at any rate, that has to be more reasonable than the alternatives. Given how much difficulty most people I know have in keeping a secret, my minions-of-Satan hypothesis falls apart if more than about three people are involved in the computer industry. Lacking a better explanation, then, I'll just have to accept the information-processing abilities of dirty sand.

Even more mind-bending are the properties of fixpoint combinators. No matter how many times I step through a demonstration of the Y combinator turning a nonsensical up-by-its-own-bootstraps lambda expression into a tidy well-behaved function, I can't quite convince myself that it could ever really work.

The problem with disbelief in stuff that can be demonstrated mathematically is that you have no freedom to invoke a conspiracy. Not even the Trilateral Commission with a fleet of black helicopters could fool us into putting our faith in the Y combinator if it didn't actually perform as advertised.

I'll just have to fall back on the idea that The Lambda Calculus Is Magic.

Wednesday, January 11, 2012

Landauer vs. the Fan

Darn that !@#$% laptop fan. Why is it running again?

Maybe because I took Stanford's free online courses in artificial intelligence and machine learning last quarter, and I've been running experiments with neural networks on and off since then.

Neural networks are fun and flexible ways of coming up with a function when you really have no clue how the outputs relate to the inputs, but it sure takes a lot of time and electricity to train them. I'm basically turning coal (or whatever the local power plant burns) into functions.

I suppose the fact that we recently put photovoltaic panels on our roof should salve my conscience a little, but it still seems as if it's going to take an awful lot of expensive electrons to train enough AIs to provide us all with robot maids and butlers. Can't we do any better?

A recent Scientific American article suggests that Fujitsu's K Computer has around 4 times as much computational power as a human brain, but uses about half a million times as much energy. There's plenty of room to argue that the K computer and the brain do types of processing that aren't comparable, and even if you ignored that, I imagine there's plenty of fudging in the figure the article cites for the brain. Still, we can probably say that the brain is several decimal orders of magnitude more efficient than a modern electronic computer.

What about the brain itself? Is it actually as efficient as possible, or does physics permit computational systems that use even less energy? Well, it was only recently that I discovered the Landauer limit, which describes the minimum energy required to change one bit of information. The Wikipedia article says that at room temperature, the Landauer limit requires at least 2.85 picowatts to record a billion bits of information per second. I'm not sure what that translates to in flops—maybe a single floating point operation requires recording a thousand bits of intermediate information? If so, a billion bits per second is a megaflop, or less than a billionth of a human brain, per the Scientific American article. If I've done the math right, that leaves the brain only a few orders of magnitude less efficient than the room-temperature limit (but fortunately, Landauer says colder computers would do better).
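A quick back-of-the-envelope check of that 2.85-picowatt figure (the constants are standard; the thousand-bits-per-flop guess above is mine):

```python
from math import log

k = 1.380649e-23          # Boltzmann constant, J/K
T = 298.0                 # room temperature, K
E_bit = k * T * log(2)    # Landauer limit: minimum energy to erase one bit, J

power = E_bit * 1e9       # watts needed to record 1e9 bits per second
print(power)              # ~2.85e-12 W, i.e. about 2.85 picowatts
```

The limit scales linearly with temperature, which is why colder computers could, in principle, do better.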

At any rate, physics does leave some room for cooler computers. Maybe I won't have to listen to fan noise every time I hang out with Rosey and Mac after all.