Here's another interesting insight on thermodynamics and information theory to add to my previous post: I realized why "joules per kelvin" is a measure of entropy. Not exciting? Wait, you'll see.
In the previous post on this topic, I mentioned all the parallels between entropy in information theory and entropy in thermodynamics. Some properties can even be calculated either by their information-theoretic definition or by their thermodynamic definition: thermodynamic availability, for example, can be calculated as the Kullback-Leibler divergence, a measure from information theory. But what's interesting is that this value can be expressed in terms of bits, or in terms of joules per kelvin (units of energy over temperature), with a simple constant multiplier for conversion.
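To make that conversion concrete, here's a minimal Python sketch. The distributions are made up for illustration; the conversion uses the standard relation that an entropy-like quantity in bits, multiplied by Boltzmann's constant times ln 2, gives the same quantity in joules per kelvin:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def kl_divergence_bits(p, q):
    """Kullback-Leibler divergence D(p || q) in bits (log base 2)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative example: an "actual" distribution over 4 states
# versus a uniform "equilibrium" distribution.
p = [0.7, 0.1, 0.1, 0.1]
q = [0.25, 0.25, 0.25, 0.25]

d_bits = kl_divergence_bits(p, q)
# The same quantity in thermodynamic units: multiply by k_B * ln(2).
d_joules_per_kelvin = d_bits * K_B * math.log(2)

print(d_bits)               # the divergence measured in bits
print(d_joules_per_kelvin)  # the very same divergence measured in J/K
```

The point is just the last two lines: the bits figure and the J/K figure are one number expressed in two units.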
You see, here's the hard part: why on earth would bits -- which measure how much memory your computer has -- refer to the same property as joules per kelvin, the way that inches and meters refer to the same property?
And that's where we get to the interesting part. First of all, what is temperature? It's not how much internal energy something has; rather, it's proportional to internal energy per degree of freedom. In this context, a "degree of freedom" is a distinct way that something can be modified at the molecular level. A monatomic gas molecule may be viewed as having three degrees of freedom, since it can translate in three dimensions. Once the molecule has shape, however, it can rotate in addition to translating. So two different substances at the same temperature can have different internal energy, because one of them may be stuffing that energy into more degrees of freedom.
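A hedged sketch of that idea, using the classical equipartition rule (each quadratic degree of freedom holds k_B*T/2 of energy on average; the particular molecules are illustrative):

```python
K_B = 1.380649e-23  # Boltzmann's constant, J/K

def energy_per_molecule(dof, temperature_k):
    """Average internal energy per molecule (joules), by equipartition."""
    return dof * K_B * temperature_k / 2

T = 300.0  # room temperature, kelvin

monatomic = energy_per_molecule(3, T)  # translation only (helium-like)
diatomic = energy_per_molecule(5, T)   # translation + 2 rotations (N2-like)

# Same temperature, but the diatomic molecule stores 5/3 as much energy,
# because it has more degrees of freedom to stuff the energy into.
print(monatomic, diatomic)
```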
So where does that get us with joules per kelvin, i.e. energy per unit temperature? Well, watch what happens when you expand out temperature in that expression:

entropy ~ energy / temperature
= energy / (energy per degree-of-freedom)
= energy * degree-of-freedom/energy
= degree-of-freedom (!)
So there you have it! Once you expand it out, energy per unit temperature is simply a roundabout way of saying "degrees of freedom".
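As a sanity check on the units, you can take a real thermodynamic entropy in J/K and divide by k_B ln 2 to count it in bits. A rough illustration (the entropy-of-fusion figure for water is approximate, and the "bits per molecule" reading is my own illustrative framing):

```python
import math

K_B = 1.380649e-23       # Boltzmann's constant, J/K
AVOGADRO = 6.02214076e23  # molecules per mole

# The entropy of fusion of water is roughly 22 J/(K*mol).
delta_s_molar = 22.0  # J/(K*mol), approximate
delta_s_per_molecule = delta_s_molar / AVOGADRO  # J/K per molecule

# Dividing out k_B * ln(2) leaves a pure, dimensionless count in bits.
bits_per_molecule = delta_s_per_molecule / (K_B * math.log(2))
print(bits_per_molecule)  # a few bits gained per molecule on melting
```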
Now you may ask, "Nice, but that still doesn't explain what that has to do with bits." But then, what is a bit but a binary degree of freedom? When you have n bits of memory, there are n slots that you can independently set to one of two possible values, making memory likewise a measure of degrees of freedom. (Note that this capability allows you to store 2^n possible states.) And informational entropy, in turn -- also expressed in bits -- is the logarithm of the number of possible states a system can be in, making it proportional to the number of degrees of freedom as well.
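A tiny Python sketch of that correspondence: enumerate every state that n binary degrees of freedom allow, then take the log to recover the count of degrees of freedom:

```python
import math
from itertools import product

n = 3

# Enumerate every state that n bits of memory can hold.
states = list(product([0, 1], repeat=n))
assert len(states) == 2 ** n  # n binary degrees of freedom -> 2^n states

# Informational entropy of a uniform distribution over those states,
# in bits: the log (base 2) of the number of possible states.
entropy_bits = math.log2(len(states))
print(entropy_bits)  # equals n, the number of degrees of freedom
```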
The two lessons to take away are that:
1) The number of degrees of freedom a system has depends on the arbitrary choice of what you count as one degree of freedom, just as the number of "units of length" something spans depends on the arbitrary choice of unit.
2) Whichever consistent method of counting degrees of freedom you use, the count is proportional to the logarithm of the number of possible states.
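In other words, switching your counting convention only rescales the count by a fixed constant, just like a unit conversion. A quick Python illustration (the state count is arbitrary):

```python
import math

# Count the "degrees of freedom" of a system with W possible states
# under two different conventions: binary digits vs. decimal digits.
W = 1_000_000

in_bits = math.log2(W)     # binary degrees of freedom
in_digits = math.log10(W)  # decimal degrees of freedom

# Both are proportional to log(W); the ratio is a fixed conversion
# constant, log2(10), just like inches per meter.
ratio = in_bits / in_digits
print(ratio)
```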
Mystery solved! (No, I don't know if this discussion is given in any textbook treatment of the issue.)
Oh, and: Happy Saint Paddy's Day!