My ramblings about thermodynamics aren't so off-base, it turns out!
Remember this one? From 2009? Where I explained how Joules per Kelvin (energy per unit temperature) is a valid measure of information (or entropy, effectively "missing information"), which is normally measured in bits?
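(For concreteness -- and this is just the standard unit conversion, not anything new from the paper -- the exchange rate between the two units is Boltzmann's constant times ln 2:

\[ S_{\mathrm{J/K}} = (k_B \ln 2)\, S_{\mathrm{bits}} \approx 9.57\times10^{-24}\ \mathrm{J/K\ per\ bit} \]

so one bit of missing information corresponds to roughly 10^-23 J/K of entropy.)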
Well, now there's a paper that formalizes that idea and related ones. As the title ("Temperature as Joules per Bit") indicates, it looks at a rearranged version of the same insight (mine was "bits as Joules per temperature"). But it also goes a lot deeper and derives thermodynamics starting from entropy to understand temperature, rather than the other way around, as is conventionally done.
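(If I'm reading the framing right, the rearrangement is just the textbook relation between temperature, energy, and entropy:

\[ \frac{1}{T} = \frac{\partial S}{\partial E} \quad\Longleftrightarrow\quad T = \frac{\partial E}{\partial S} \]

that is, temperature is the energy it costs to add one unit of entropy -- and if you count entropy in bits instead of J/K, T comes out in Joules per bit, hence the title.)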
Related thought: I remember back in that 2009 thermodynamics/info-theory frenzy, one of my goals was to rederive the Carnot limit from information-theoretic considerations -- that is, to show it as a simple consequence of how much knowledge you have about a system when all you know is the temperature difference. (Naturally, I assumed someone had already done this and tried to find it, but it was very hard to google for.)
Background: The Carnot limit tells you the maximum amount of mechanical work ("useful energy") you can extract from heat -- like, through an engine -- and, as it turns out, it's a function of the ratio of the absolute temperatures you're working between. You don't face this limit when extracting work from a flywheel (a spinning disc with grooves). Inspired by a counterintuitive insight in an Eliezer Yudkowsky LessWrong post, and my thoughts about it, I figured you could draw a more direct line from "knowledge of a temperature difference" to "how much energy is extractable".
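For reference, the limit in question: an engine running between a hot reservoir at temperature $T_{\mathrm{hot}}$ and a cold one at $T_{\mathrm{cold}}$ (both absolute) can convert at most

\[ \eta_{\max} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}} \]

of the heat it draws into work -- e.g., between 600 K and 300 K, at most half.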
Now I'll give it a go with ChatGPT, and post my findings!