Sunday, March 7, 2010

When you can't go back to sleep: thermodynamics

Did you miss my posting? Well, it's been one of those days when you wake up early and can't go back to sleep. My mind's running wild this morning, and I figured I'd make some good use out of it. (It is no longer early as of completing this post because of interruptions from a playful kitty.)

I'm going to continue the lesson about thermodynamics that left off about a year ago by discussing some more interesting implications of the idea that "energy per unit temperature" is a measure of degrees of freedom. You see, in the time since then, I read John S. Avery's book Information Theory and Evolution, which, as you might have inferred, discusses life from the perspective of that ever-so-useful field of information theory. He also applies it to cultural (often called "memetic") evolution.

The first interesting insight that this book alerted me to is about molar entropy. Some background: in your chemistry class, you might have learned about the Gibbs free energy of a reaction, ΔG, which is calculated as ΔG = ΔH - TΔS, where H is the molar enthalpy (internal energy plus pressure-volume "flow" energy per mole), T is absolute temperature, and S is the molar entropy. For a chemical reaction, you look up the molar enthalpies of the products and subtract off the enthalpies of the reactants to get ΔH. Then you do the same for molar entropies to get ΔS, multiply it by the absolute temperature at which the reaction takes place, and subtract that from ΔH. A negative sign for ΔG means the reaction happens spontaneously (well, as long as there is an available pathway).
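To make that bookkeeping concrete, here's a minimal Python sketch; the ΔH and ΔS values are placeholder numbers I made up for illustration, not looked-up table data.

```python
# Minimal sketch of the Gibbs free energy bookkeeping described above.
def gibbs_free_energy(delta_H, T, delta_S):
    """Return dG = dH - T*dS, in J/mol."""
    return delta_H - T * delta_S

# Placeholder numbers, not real table values:
delta_H = -50_000.0  # J/mol: products' enthalpies minus reactants'
delta_S = -100.0     # J/(K*mol): products' entropies minus reactants'
T = 298.15           # K

dG = gibbs_free_energy(delta_H, T, delta_S)
print(f"dG = {dG:.0f} J/mol ({'spontaneous' if dG < 0 else 'not spontaneous'})")
```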

With that out of the way, what are the units for S? Most tables give them as J/(K·mol) (joules per kelvin per mole, or energy per unit temperature per quantity of molecules). But, as the last post in this series showed, energy per unit temperature measures degrees of freedom, which can also be expressed in bits. So, as Avery neatly derives on pages 81-82, you can also express molar entropy in bits per molecule. (The conversion factor is 1 J/(K·mol) = 0.1735 bits/molecule.) I find this a much more intuitive way to think about it, because it connects the concept of molar entropy to the underlying dynamic: how many bits of information (on average) do you need to specify a molecule's current state, beyond what you already know from the temperature?
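If you want to check that factor yourself, it falls right out of dividing by the gas constant R = N_A·k_B (turning J/(K·mol) into nats per molecule) and then by ln 2 (turning nats into bits):

```python
import math

# Converting molar entropy from J/(K*mol) to bits per molecule.
# Dividing by the gas constant R = N_A * k_B gives nats per molecule;
# dividing by ln(2) converts nats to bits.
R = 8.314  # gas constant, J/(K*mol)

def molar_entropy_to_bits(S):
    """Convert molar entropy S [J/(K*mol)] to bits per molecule."""
    return S / (R * math.log(2))

print(molar_entropy_to_bits(1.0))  # ~0.1735, the conversion factor above
```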

Also, rather than having to derive this value empirically (either from reaction data or by integrating the substance's heat capacity divided by temperature from 0 K up to the temperature of interest), it can be inferred from the known properties of the molecule: its shape, size, and bond strengths. The stronger ("stiffer") its bonds are, the lower the entropy of the molecule, because large deviations from its equilibrium configuration are less probable. (Diamond, with its very strong covalent bonds, has the incredibly low molar entropy of 0.24 bits per carbon atom at STP, meaning you need less than one bit of information to specify every four atoms.)
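For the curious, here's a rough numerical sketch of what that 0 K integration looks like; the heat-capacity curve is a made-up Debye-ish placeholder (T-cubed at low temperature, then leveling off), not measurements of any real substance.

```python
# Rough numerical sketch of the third-law entropy integral
#   S(T) = integral from 0 to T of Cp(T')/T' dT'
# The heat-capacity curve below is a hypothetical placeholder, NOT data
# for any real substance.
def Cp(T):
    return min(1.0e-4 * T**3, 25.0)  # J/(K*mol), made up

def third_law_entropy(T_final, steps=100_000):
    """Trapezoid-rule integral of Cp(T)/T from just above 0 K to T_final."""
    T = 1e-6                          # start a hair above 0 K to avoid 0/0
    dT = (T_final - T) / steps
    S, f_prev = 0.0, Cp(T) / T
    for _ in range(steps):
        T += dT
        f = Cp(T) / T
        S += 0.5 * (f_prev + f) * dT
        f_prev = f
    return S                          # J/(K*mol)

S = third_law_entropy(298.15)
print(S, "J/(K*mol) =", S * 0.1735, "bits/molecule")
```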

[ADDENDUM: Avery also adds that if you divide the Gibbs equation through by T, you can describe a reaction in terms of the "information lost", i.e., the greater number of degrees of freedom you have permitted by letting the reaction take place.]
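To spell that out (my gloss, not a quote from Avery): dividing ΔG = ΔH - TΔS through by -T gives

-ΔG/T = ΔS - ΔH/T = ΔS(system) + ΔS(surroundings),

since -ΔH/T is the entropy dumped into the surroundings as heat at constant temperature. A spontaneous reaction (ΔG < 0) is then exactly one where the total entropy goes up, i.e., where (by the 0.1735 conversion above) more bits are needed to pin down the combined state afterward than before.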

By recognizing this interconnection between molecular properties and complexity (needing more information to fully specify = more complex), one sees more unity ("consilience") in the science as a whole: entropy and bond properties aren't off in their own separate domains, but have a lawful relationship. Unfortunately, I haven't worked out how to derive the entropy from the stiffness of a degree of freedom, and I haven't found a text that does it either.

Next in the series: A discussion of Eric J. Chaisson's Cosmic Evolution: The Rise of Complexity in Nature, which proposes specific energy flux (energy flow through a system per unit mass) as a measure of complexity that is applicable to everything from stars to planets to life to vehicles to computer chips to culture.

4 comments:

Jason Cooper said...

You'll find information on the calculation of various contributions to molecular entropy here:

http://www.gaussian.com/g_whitepap/thermo.htm

Silas Barta said...

Thanks Jason! Very interesting, and it looks like I'll be able to satisfy my curiosity by reading the paper and the references.

jsalvati said...

I think you mean dG = dH - T dS, unless I have misunderstood something.

Silas Barta said...

Oops, looks like I goofed on that. Good catch, I'll fix it.