The subject matter over the last few months has touched upon thermodynamics in a variety of guises.  For example, the concepts of enthalpy and isentropic flow played a key role in compressible fluid flow.  In the posts discussing the Maxwell relations, the thermodynamic square and the classic relationships between second-order partial derivatives were the main tools used to eliminate pesky terms involving the entropy in favor of quantities easier to measure in the lab.

It seems that it is now prudent to put down a few notions about entropy itself.  No other physical quantity, with the possible exception of energy, is as ubiquitously used as entropy, and none is as poorly understood.  Indeed, in his 2011 article entitled How physicists disagree on the meaning of entropy, Robert Swendsen starts with the quotation from von Neumann that “nobody understands entropy”.  Chemists use entropy to determine the direction of chemical reactions, physicists use it when looking at matter in motion (e.g. a compressible gas within a cylinder), electrical engineers use it when characterizing information loss on a channel, the amount by which software can compress a file depends on its entropy, and so on.

Entropy seems to be a Swiss army knife concept with lots of different built-in gadgets that can be pulled out and used on a moment’s notice.  It’s no wonder that such a multi-faceted idea is not only poorly understood but also gives rise to radically contradictory notions.  For example, Swendsen starts his article with the following list of 18 statements that he has seen or heard attributed to entropy:

  • The theory of probability has nothing to do with statistical mechanics.
  • The theory of probability is the basis of statistical mechanics.
  • The entropy of an ideal classical gas of distinguishable particles is not extensive.
  • The entropy of an ideal classical gas of distinguishable particles is extensive.
  • The properties of macroscopic classical systems with distinguishable and indistinguishable particles are different.
  • The properties of macroscopic classical systems with distinguishable and indistinguishable particles are the same.
  • The entropy of a classical ideal gas of distinguishable particles is not additive.
  • The entropy of a classical ideal gas of distinguishable particles is additive.
  • Boltzmann defined the entropy of a classical system by the logarithm of a volume in phase space.
  • Boltzmann did not define the entropy by the logarithm of a volume in phase space.
  • The symbol W in the equation S=k log W, which is inscribed on Boltzmann’s tombstone, refers to a volume in phase space.
  • The symbol W in the equation S=k log W, which is inscribed on Boltzmann’s tombstone, refers to the German word “Wahrscheinlichkeit” (probability).
  • The entropy should be defined in terms of the properties of an isolated system.
  • The entropy should be defined in terms of the properties of a composite system.
  • Thermodynamics is only valid in the “thermodynamic limit,” that is, in the limit of infinite system size.
  • Thermodynamics is valid for finite systems.
  • Extensivity is essential to thermodynamics.
  • Extensivity is not essential to thermodynamics.

This list, which is really a list of 9 pairs of contradictory statements about entropy, goes out of its way to show just how many diverging ideas scientists have about entropy.  And since it is trendy to have one’s own pet idea(s) about this fundamental concept, it seems about time that I develop my own, and doing so is the aim of this blog and the ones that follow.  As a warm-up to a deeper dive, I decided to return to the basic ideas introduced in Halliday and Resnick’s physics text.

The most intriguing aspect of the textbook discussion of entropy is that it is a state variable, that is to say, its value depends only on what the system is doing at any given time and not on how the system got there.  This is a key concept because it means that we are relieved of having to find the particular path through which the system evolved.

What is particularly remarkable about this discovery is that it came about in the 19th century.  This was a time when the idea of smooth distributions of matter held the day, when the primary concept was that of a field, continuous in every way.  It was a time well before the concept of discrete, microscopic states emerged from an understanding of the quantum mechanics of atoms, molecules, and other substances.

The thermodynamic relationship for entropy reads

\[ S_f - S_i = \int_{i}^{f} \frac{dQ}{T} \; , \]

where any path connecting the initial state (denoted by $$i$$) with the final state (denoted by $$f$$) will do.  Nowhere in this definition can one find any clear signpost to indicate lumpy matter or the concept of the discrete.  In addition, nothing in this definition even hints at a particular substance or class of them; nor is a particular phase of matter required.  A breathtaking sweep of generality is hidden behind a few simple glyphs on a page. 

As an example of the universality of the fundamental statement, consider a familiar household system, say a glass of milk.  If we do something prosaic like warm it by 10 degrees Celsius, we arrive at the same entropy change as we would have if we had boiled the milk off into a vapor, melted the glass down, reconstituted the latter, and recondensed the former, so long as the bizarre journey ends in the same final state: the original glass of milk, 10 degrees warmer.  No matter what bizarre journey we subject a material to, the resulting change in entropy will depend only on the initial and final configurations and not on the details connecting one to the other.
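To make this path independence concrete, here is a minimal numerical sketch of my own (with assumed illustrative values: one mole of a monatomic ideal gas carried from 300 K and 1 liter to 350 K and 2 liters) that integrates $$dQ/T$$ along two very different reversible paths, using the same ideal gas relations invoked later in this post.  The heat absorbed differs from path to path, but the entropy change does not.

```python
import numpy as np

# Minimal sketch (illustrative values assumed, not from the original text):
# one mole of a monatomic ideal gas is carried from (300 K, 1 L) to (350 K, 2 L)
# along two different reversible paths.  The heat absorbed, the integral of dQ,
# depends on the path; the entropy change, the integral of dQ/T, does not.

R   = 8.314        # gas constant, J/(mol K)
n   = 1.0          # amount of gas, mol (assumed)
C_V = 1.5 * R      # molar heat capacity of a monatomic ideal gas, J/(mol K)

T_i, V_i = 300.0, 1.0e-3   # initial state: 300 K, 1 liter (in m^3)
T_f, V_f = 350.0, 2.0e-3   # final state:   350 K, 2 liters

def heat_and_entropy(T_of_s, V_of_s, steps=200_000):
    """Sum dQ = n*C_V*dT + (n*R*T/V)*dV and dS = dQ/T along a reversible
    path (T(s), V(s)) parameterized by s in [0, 1]."""
    s = np.linspace(0.0, 1.0, steps + 1)
    T, V = T_of_s(s), V_of_s(s)
    dT, dV = np.diff(T), np.diff(V)
    T_mid = 0.5 * (T[:-1] + T[1:])
    V_mid = 0.5 * (V[:-1] + V[1:])
    dQ = n * C_V * dT + (n * R * T_mid / V_mid) * dV
    return dQ.sum(), (dQ / T_mid).sum()

# Path A: a straight line in the (T, V) plane
Q_A, S_A = heat_and_entropy(lambda s: T_i + s * (T_f - T_i),
                            lambda s: V_i + s * (V_f - V_i))

# Path B: a deliberately crooked path between the same two endpoints
Q_B, S_B = heat_and_entropy(lambda s: T_i + s**3 * (T_f - T_i),
                            lambda s: V_i + np.sqrt(s) * (V_f - V_i))

print(f"heat absorbed:  Q_A = {Q_A:7.1f} J,    Q_B = {Q_B:7.1f} J    (path dependent)")
print(f"entropy change: S_A = {S_A:7.4f} J/K, S_B = {S_B:7.4f} J/K (path independent)")
```

Both paths return an entropy change of roughly 7.69 J/K for these assumed endpoints, matching the closed-form expression derived below, even though the heat absorbed along the two paths differs by nearly 100 J.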

The usual playground for first thinking about entropy is the ideal gas, and the usual example given to the student is the computation of the entropy change of the free expansion of a gas.  The context of this discussion usually follows upon the heels of an introduction to the kinetic theory of gases – a theory that presupposes the existence of atoms.  The free expansion of a gas is, perhaps, the most radical of all irreversible processes.  There is no orderly flow and the very concept of a continuum fails to apply; every atom goes its own way, and no macroscopic evolution of the thermodynamic state can even be imagined.

And yet, almost blithely, textbooks argue for the ease with which the entropy change in such a process can be calculated.  The argument goes as follows.  From the kinetic theory of gases, one can show that during a free expansion the internal energy does not change.  The reason for this is that the gas does no work (that is what ‘free’ really means) and the process happens fast enough that no heat is transferred in or out.  Since the change in internal energy is given by

\[ \Delta U = n C_V \Delta T \; , \]

any ideal gas process that doesn’t change the internal energy also leaves the temperature unchanged.  The matching thermodynamic process, where reversibility and equilibrium are maintained at all times, is the isothermal expansion.

The first law

\[ dU = dQ - dW \; \]

can be specialized to any reversible ideal gas process, to yield

\[ n C_V dT = dQ - p dV = dQ - \frac{n R T}{V} dV \; .\]

Solving for $$dQ/T$$ gives

\[ \frac{dQ}{T} = n R \frac{dV}{V} + n C_V \frac{dT}{T} \; .\]

Integrating both sides from the initial to final state gives

\[ S_f - S_i = \int_i^{f} \frac{dQ}{T} = n R \ln \left( \frac{V_f}{V_i} \right) + n C_V \ln \left( \frac{T_f}{T_i} \right) \; .\]

For an isothermal process, this simplifies to

\[ S_f - S_i = n R \ln \left( \frac{V_f}{V_i} \right) \; .\]
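For a rough sense of scale (with illustrative numbers of my own choosing), one mole of an ideal gas that freely expands to twice its initial volume picks up an entropy change of

\[ S_f - S_i = n R \ln 2 \approx (1 \, \mathrm{mol}) \left( 8.314 \, \frac{\mathrm{J}}{\mathrm{mol} \, \mathrm{K}} \right) (0.693) \approx 5.8 \, \mathrm{J/K} \; , \]

however violent and disorderly the expansion itself may be.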

So, the change in entropy for a free expansion must be exactly equal to the isothermal expression above even though, as the free expansion occurs, there is a complete absence of anything resembling a well-defined volume.  This is a subtle result that gets only more subtle when one reflects on the fact that statistical mechanics wasn’t available when the concept of entropy first appeared.

It is for this reason that the next several blogs will be looking at entropy.