John Baez originally shared this post:

How much a message tells you depends on what you were expecting. We can understand this very precisely using the concept of 'relative information', also known as relative entropy or Kullback–Leibler divergence. Today I'll explain this and give an incredibly cool application to biology. Suppose a population of organisms has an evolutionarily stable state. Then as time passes, the information in this stable state relative to the population's current state always decreases! In short: *the population keeps learning through natural selection, so it has less 'left to learn'.*
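You can watch this happen numerically. Below is a minimal sketch (not from the blog article) that simulates replicator dynamics for three species with an assumed payoff matrix A = -I, whose evolutionarily stable state is the uniform distribution, and checks that the relative information D(q‖x(t)) of the stable state q relative to the current state x(t) never increases. The step size, payoff matrix, and initial population are all illustrative choices.

```python
import math

DT = 0.01  # Euler time step (illustrative choice)

def replicator_step(x, dt):
    """One Euler step of replicator dynamics with payoff matrix A = -I,
    so the fitness of species i is f_i = -x_i."""
    f = [-xi for xi in x]
    fbar = sum(xi * fi for xi, fi in zip(x, f))  # mean fitness
    return [xi + dt * xi * (fi - fbar) for xi, fi in zip(x, f)]

def relative_information(q, x):
    """D(q || x) = sum_i q_i log(q_i / x_i), the information of q relative to x."""
    return sum(qi * math.log(qi / xi) for qi, xi in zip(q, x))

q = [1/3, 1/3, 1/3]    # evolutionarily stable state for A = -I
x = [0.7, 0.2, 0.1]    # arbitrary initial population distribution

kls = []
for _ in range(2000):
    kls.append(relative_information(q, x))
    x = replicator_step(x, DT)

# D(q || x(t)) decreases monotonically as the population evolves
assert all(a >= b - 1e-12 for a, b in zip(kls, kls[1:]))
assert kls[-1] < kls[0]
```

With this payoff matrix the population flows toward the uniform distribution, and the relative information shrinks toward zero, just as the theorem predicts under its hypotheses.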

(This is a theorem proved by +Marc Harper and others. Like all theorems, it has assumptions… and these assumptions don't fit nicely into a G+ post. So please read the blog article before you argue.)

Information Geometry (Part 11)

Last time we saw that given a bunch of different species of self-replicating entities, the entropy of their population distribution can go either up or down as time passes. This is true even in the…