## Wednesday, October 13, 2010

### Algorithmic Entropy and Thermodynamics

A very nice way that Physics, Mathematics, and Computer Science all fit together is a field known as Algorithmic Information Theory, which was launched in 1960 by Ray Solomonoff (1926-2009) and later developed independently by Andrey Kolmogorov in 1965 and Gregory Chaitin around 1966.

The wonderful John Baez of the Azimuth blog, wonderful because he takes the "boring" out of Mathematics and replaces it with "fun and exciting", has a nice take on it in a new paper written with computer scientist Mike Stay, which John explains here.

Briefly, and to whet your appetite:

Which of the following strings of data has the most entropy (degree of randomness)?

00000000000000000000000

or

01010101010101010101010

or

11001000011000011101111

or

01101010001010001010001

Well, the first one has the least entropy, because a program to print it (PRINT 0, Repeat) is the simplest; it therefore has the most order.

The second is almost as simple (PRINT 01, Repeat), but not quite. So it has slightly more entropy, because it's one digit longer and therefore a touch more random. It could have been 02 to 99, whereas the first could have been only 0-9.

It turns out the third one is truly random, so it has the greatest entropy. There is no shorter program that produces it: the only way to describe it is to just list it, digit by digit.
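True Kolmogorov complexity is uncomputable, but a general-purpose compressor gives a rough proxy for the same idea: the more a string compresses, the shorter a program that reproduces it, and the lower its algorithmic entropy. Here's a minimal Python sketch of that intuition (my own illustration, not from the Baez-Stay paper; the strings are scaled up by repetition so the compressor's fixed overhead doesn't swamp the comparison):

```python
import random
import zlib

# The kinds of strings from the post, scaled up to 100,000 characters.
zeros     = "0" * 100_000                        # 000...  (most order)
alternate = "01" * 50_000                        # 0101... (almost as orderly)
rng = random.Random(42)
noise = "".join(rng.choice("01") for _ in range(100_000))  # random bits

def proxy_entropy(s: str) -> int:
    """Compressed length in bytes: a crude stand-in for program length."""
    return len(zlib.compress(s.encode()))

for name, s in [("zeros", zeros), ("alternating", alternate), ("random", noise)]:
    print(name, proxy_entropy(s))
```

The two patterned strings shrink to a few hundred bytes, while the random one stays orders of magnitude larger: no compressor can find a description much shorter than the string itself.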

The last one has great order if you recognize it as the indicator string of the prime numbers: a 1 in each prime-numbered position (2, 3, 5, 7, 11, ...) and a 0 everywhere else.
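You can check that reading with a few lines of Python (a sketch; the helper name is mine). A short program like this regenerates the whole fourth string, which is exactly why it has low algorithmic entropy despite looking random:

```python
def is_prime(n: int) -> bool:
    """Trial division; fine for tiny n."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

# 1 in each prime-numbered position (2, 3, 5, 7, ...), 0 elsewhere.
prime_indicator = "".join("1" if is_prime(k) else "0" for k in range(1, 24))
print(prime_indicator)  # → 01101010001010001010001
```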

The upshot is that the physics of Thermodynamics (in this case, dE = T dS − P dV + μ dN) is helping Mathematics for a change, rather than the other way around. It will be interesting to see where this goes.

If you're wondering about applications, one important one would be in the emerging field of Artificial Intelligence, which we may achieve even before locating any natural intelligence here on our home planet. :-) From the Wikipedia entry for Ray Solomonoff:

> Although he is best known for algorithmic probability and his general theory of inductive inference, he made many other important discoveries throughout his life, most of them directed toward his goal in artificial intelligence: to develop a machine that could solve hard problems using probabilistic methods.

Further reading: The Algorithmic Information Theory resource page and Scholarpedia's Algorithmic Information Theory page.
