Temperature and Entropy

2005-05-05T17:05:00Z

I have been trying to understand temperature and entropy. I never had a good grasp of these concepts before. A good explanation of them has been elusive.

I have gone back to basics to try to understand temperature, and this is what I have got. Objects have this internal energy called heat. Heat is where mechanical energy goes during friction. When two objects are touching (or even when they are just facing each other), sometimes this internal energy is transferred from one to the other. If no internal energy is transferred, the two objects are said to be in thermodynamic equilibrium.

It turns out that if objects A and B are in thermodynamic equilibrium and objects B and C are in thermodynamic equilibrium, then objects A and C are in thermodynamic equilibrium. This means we can design a calibrated object, called a thermometer, such that if my thermometer is in thermodynamic equilibrium with my object, and your identical thermometer is in thermodynamic equilibrium with your object, and the two thermometers read the same value, then we know that our objects are in thermodynamic equilibrium with each other. Furthermore, we can mark our thermometers in such a way that if your thermometer reads larger than my thermometer, then we will know that if we allow our objects to touch, internal energy will flow from your object to my object.

So far this does not tell us exactly how to mark our thermometer. Any monotone rearrangement of our marking would preserve this property. We choose a scale with the additional property that if your object is at temperature T1 and my object is at temperature T2, then the maximum efficiency of a heat engine operating between these two objects is 1 − T2/T1 (taking your object, at T1, to be the hotter one, as above).
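
As a concrete check, here is a tiny sketch (my own illustration; the function name and the example temperatures are made up, and temperatures are in kelvin):

    -- Maximum (Carnot) efficiency of a heat engine run between a cold and a
    -- hot reservoir, per the formula above.
    carnotEfficiency :: Double -> Double -> Double
    carnotEfficiency tCold tHot = 1 - tCold / tHot

    -- For example, between your object at 500 K and mine at 300 K:
    --   carnotEfficiency 300 500  ≈  0.4, i.e. at most 40% efficient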

Steve Baum emphasizes:

It should be emphasized that the mercury-in-glass scale of temperature is not simply a linear transformation of the Kelvin temperature. […] It is apparent then that if the mercury-in-glass temperature is to yield the correct thermodynamic temperature […] it must be constructed with slightly nonuniform subdivisions with the 50℃ mark being not at the half way position between the ice point and the boiling point.

This is a good start for defining temperature, but it does leave a lot of questions. What is thermodynamic equilibrium? Why does heat only flow from hotter objects to colder objects? Why is a heat engine’s efficiency limited?

It seems I have a much better understanding of entropy, or at least Shannon entropy. If you have a set of distinguishable elements {s1, …, sn} and you have a random distribution where pi is the probability of selecting si, then the entropy of this distribution is the average number of bits needed to store the information that a particular selection was made. This value is −∑i=1…n pi lg(pi) and is measured in bits. This value is always less than or equal to lg(n), and is equal to lg(n) for a uniform probability distribution.
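
Here is a minimal sketch of that formula (my own illustration; entropy and uniform are just names I picked):

    -- Shannon entropy, in bits, of a probability distribution given as a list.
    -- Terms with probability 0 are skipped (the usual 0·lg(0) = 0 convention).
    entropy :: [Double] -> Double
    entropy ps = negate (sum [p * logBase 2 p | p <- ps, p > 0])

    -- The uniform distribution over n elements.
    uniform :: Int -> [Double]
    uniform n = replicate n (1 / fromIntegral n)

    -- In GHCi:
    --   entropy (uniform 8)        ≈  3.0   -- equals lg(8)
    --   entropy [0.5, 0.25, 0.25]  ≈  1.5   -- less than lg(3) ≈ 1.58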

Thermodynamic entropy is not a measurement of disorder. Thank god, because that always confused me. However, this author’s definition, that entropy change measures energy’s dispersion at a stated temperature, still confuses me.

As far as I can tell, entropy is a measurement of your uncertainty of which state a system is in. So if I place five coins in a row under a sheet of paper, and tell you that the coins are all heads up, then the entropy of this system is 0 because you know exactly which way they are all facing. If I tell you that two are heads up, then the system has lg(10) bits of entropy, because you don’t know exactly the state of the system. If I tell you nothing, the coins have lg(32) = 5 bits of entropy. If I tell you from left to right the coins are tails, tails, heads, heads, tails, then the coins have 0 bits of entropy, because you know exactly the state of the coins, even if that state is “disorderly”.

In the last three examples, the coins may have had the same configuration; it is your knowledge of the state that determines the entropy. This value corresponds to the Shannon entropy where all allowed states of the coins are considered equally likely to occur.
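
To spell out where the numbers lg(10) and lg(32) come from, here is a back-of-the-envelope sketch (my own; choose and lg are helpers I wrote, not anything standard):

    -- Counting coin configurations.  Five coins have 2^5 = 32 possible states;
    -- knowing that exactly two are heads up narrows that to C(5,2) = 10.
    choose :: Integer -> Integer -> Integer
    choose n k = product [n - k + 1 .. n] `div` product [1 .. k]

    lg :: Integer -> Double
    lg = logBase 2 . fromIntegral

    -- In GHCi:
    --   lg (choose 5 2)  ≈  3.32   -- bits, knowing only that two are heads up
    --   lg (2 ^ 5)       ≈  5.0    -- bits, knowing nothing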

If I give you a box of gas with volume V, containing n molecules at a pressure of P, there is some large number of states these molecules could be in. It is my understanding that the lg of this number is the entropy of the gas.

If I give you the same box and tell you the state of every molecule, then the same gas has 0 entropy. This makes sense to me, because if you know the state of every molecule, you would not even need Maxwell’s demon to separate the gas into hot and cold molecules. You could do it yourself.

But wait, entropy is measured in energy per temperature, and I have been measuring it in bits. Apparently there are kB/lg(e) joules per kelvin per bit. I would very much like to know where this comes from. Actually, since I still have not got a good grasp of temperature, I should think of one kelvin as equal to kB/lg(e) joules per bit. I am not exactly sure what a joule per bit means. (The term lg(e) is an artifact that comes about because physicists use the natural logarithm (ln) and computer scientists, like myself, use the logarithm base two (lg). If Boltzmann had been a computer scientist, he would have defined kB in such a way that the term lg(e) would disappear.)
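
To make that conversion factor concrete, here is a sketch under my own assumptions (kB ≈ 1.38×10⁻²³ J/K, and kB/lg(e) is the same number as kB·ln(2)):

    -- Converting entropy measured in bits to thermodynamic entropy in J/K.
    kB :: Double
    kB = 1.380649e-23                      -- Boltzmann's constant, in J/K

    joulesPerKelvinPerBit :: Double
    joulesPerKelvinPerBit = kB * log 2     -- = kB / lg(e), about 9.57e-24

    bitsToJK :: Double -> Double
    bitsToJK bits = bits * joulesPerKelvinPerBit

    -- In GHCi: the five coins' 5 bits amount to a thoroughly negligible
    --   bitsToJK 5  ≈  4.8e-23   -- J/K
    -- and, going the other way, one J/K is about 1.0e23 bits.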

Anyhow, I have this feeling that temperature is not really real. What I suspect you actually have is some distribution of particles at various energy levels. This distribution leads to entropy in accordance with Shannon’s formula. A common family of distributions is the Boltzmann distributions, and this family is indexed by temperature. Then somehow this temperature relates to thermal equilibrium and heat engine efficiencies. But I imagine it is possible to have a distribution of energies that is not one of the Boltzmann distributions. This is all just speculation.
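
For what it’s worth, here is a sketch of the kind of distribution I mean (my own illustration; the energy levels in the example are made up): the probability of a level with energy E is proportional to exp(−E/(kB·T)).

    -- The Boltzmann distribution over a finite list of energy levels (in J)
    -- at temperature t (in K): p_i is proportional to exp(-E_i / (kB * t)).
    kB :: Double
    kB = 1.380649e-23

    boltzmann :: Double -> [Double] -> [Double]
    boltzmann t es = [w / z | w <- ws]
      where ws = [exp (negate e / (kB * t)) | e <- es]
            z  = sum ws

    -- Shannon entropy in bits, as before.
    entropy :: [Double] -> Double
    entropy ps = negate (sum [p * logBase 2 p | p <- ps, p > 0])

    -- In GHCi, with some made-up energy levels, raising the temperature
    -- flattens the distribution and raises the entropy:
    --   entropy (boltzmann  100 [0, 1e-21, 2e-21])  -- fewer bits
    --   entropy (boltzmann 1000 [0, 1e-21, 2e-21])  -- more bits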

The fact that the entropy of a closed system never decreases just says that you cannot magically get information about a system from nowhere. But since state evolution in quantum mechanics is reversible, it seems to me that means the entropy of a closed system can never increase either! I kinda have this feeling that the entropy of a system increasing is analogous to energy being lost to friction; if you look closely enough you find the quantity is actually preserved.

Sorry if you are still reading this. My purpose in writing this was mostly so that I could look back on it later. I still have a lot to learn about this subject.

