Entropy II: Fundamentals

Entropy began as a puzzle. If heat was energy, why was some of it always wasted when one tried to turn it into mechanical energy? The heat engine was perhaps the master invention of the industrial revolution, but it posed both theoretical and practical problems – I will only worry here about the former. The realization that heat was a form of energy, together with the computation of the mechanical equivalent of heat, extended the principle of conservation of energy to include heat; this was codified in the First Law of Thermodynamics. The inevitable waste heat in every heat engine showed that the First Law could not be the whole story, though, and investigation of the properties of that waste heat led to the concept of entropy, and to the Second Law.

Like pressure and temperature, entropy was a state variable, but its interpretation was obscure. Its fundamental property was that the total entropy never decreased: it stayed constant in a reversible (quasi-equilibrium) and adiabatic (no heat transfer) process, but increased in any irreversible process. For a reversible process, the change in a system's entropy was the ratio of the heat transferred to it to the temperature at which the transfer occurred.
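
Written out in symbols (standard thermodynamic notation, not spelled out in the post), that last statement is:

```latex
% Entropy change of a system for a reversible transfer of a small amount
% of heat \delta Q_{rev} at absolute temperature T:
\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T}
\]
```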

Once the interpretation of temperature in terms of the motion of the microscopic components of matter became popular, it became natural to search for a microscopic foundation of entropy as well. That discovery, due mostly to Boltzmann, is one of the deepest in physics. Entropy, said Boltzmann, is a logarithmic measure of the number of microscopic states available to a system in a given macroscopic state. With this definition, the law of entropy increase can be seen as a matter of probability.
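
In modern notation (the post states this only in words), Boltzmann's definition reads:

```latex
% Boltzmann's entropy: k_B is Boltzmann's constant and W is the number of
% microstates consistent with the given macrostate.
\[
  S = k_B \ln W
\]
```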

Consider an experiment where we flip a number of coins, say 10. If they are fair coins, each has an equal probability of coming up heads or tails. If we flip the coins all at once, what is the probability that, say, exactly 3 will be heads? The way to do this calculation is to imagine the universe of all possible outcomes and count what fraction of that universe has exactly three heads. There are ten coins, each of which can be either H or T, so there are 2^10 = 1024 possible outcomes. If we remember the binomial coefficients, we can see that there are 10!/((10-3)!3!) = 10*9*8/6 = 120 ways this can happen, so the probability is P(3 heads) = 120/1024 = 15/128.
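
For readers who want to check the arithmetic, here is a minimal Python sketch of the same count, using the standard-library `math.comb` for the binomial coefficient:

```python
from math import comb
from fractions import Fraction

n_coins = 10
n_heads = 3

ways = comb(n_coins, n_heads)    # 10! / (7! 3!) = 120 arrangements with exactly 3 heads
total = 2 ** n_coins             # 1024 equally likely outcomes
prob = Fraction(ways, total)     # exact probability

print(ways, total, prob)         # 120 1024 15/128
```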

An analogous idea is used in calculating entropy. The macrostate is the analog of the total number of heads, and the microstates are analogous to the individual sequences of heads and tails. Consider a box with ten particles moving randomly about in it. Suppose that we quickly insert a partition down the middle, so that each particle has an even chance of ending up on either side of the box. Call those that wind up on the right side “heads” and those that wind up on the left “tails.” We won’t calculate all the way to probabilities here, but we will count the number of microstates corresponding to a given macroscopic outcome. The entropy is then the logarithm of that number of states.

Suppose all the particles end up on the right-hand side (ten “heads”). There is only one way that can happen, so the entropy of that state is log(1) = 0. If three wind up on the RHS (3 “heads”), we have already seen that this can happen in 120 ways, so the entropy is log(120) = 4.79 (using the natural logarithm, which measures entropy in units of Boltzmann’s constant). In this way we can calculate the entropy of each possible macrostate, and we would find that the highest entropies correspond to the most probable results. The law of entropy increase becomes the probabilistic rule that the system is likely to end up in one of the most probable states. For large numbers of particles, the distribution over macrostates becomes much more sharply peaked around the most probable one.
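
Here is a sketch of the same counting in Python, with entropy taken as the natural log of the multiplicity (i.e., in units of Boltzmann’s constant); the tail end of the example hints at how much sharper the peak gets for larger N:

```python
from math import comb, log

def macrostate_table(n_particles):
    """Multiplicity and entropy (in units of k_B) of each macrostate,
    labelled by the number of particles on the right-hand side."""
    rows = []
    for n_right in range(n_particles + 1):
        multiplicity = comb(n_particles, n_right)   # number of microstates
        entropy = log(multiplicity)                 # natural log -> units of k_B
        rows.append((n_right, multiplicity, round(entropy, 2)))
    return rows

for n_right, multiplicity, entropy in macrostate_table(10):
    print(n_right, multiplicity, entropy)
# 0 on the right:  1 way,    S = 0.0
# 3 on the right:  120 ways, S = 4.79
# 5 on the right:  252 ways, S = 5.53   <- most probable, highest entropy

# With more particles the distribution sharpens: for N = 100, the macrostates
# within a few particles of the 50/50 split already contain most of the
# 2**100 microstates.
near_middle = sum(comb(100, k) for k in range(45, 56)) / 2**100
print(near_middle)   # roughly 0.73
```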

An elementary book that gets right to the point of calculating entropy and its consequences is Kittel and Kroemer's Thermal Physics.

Next: Gravity

For the advanced student: and link therein. Mikey likes it. So far.
