If you’ve ever done a foundation course in classical thermodynamics, you will no doubt be familiar with the thought-experiment performed with the piece of thought-apparatus I have attempted to draw below.
Two vessels connected by a stopcock, the one filled with an ideal gas, the other evacuated. The experiment consists of opening the stopcock and allowing the gas to fill the vacuum. A simple procedure, accompanied by some not-so-simple considerations regarding the nature of spontaneous processes.
An ideal gas is a wonderfully convenient thing to conduct this mental experiment with, since the Joule coefficient of an ideal gas, which measures the temperature change on expansion into a vacuum, is zero at any temperature. There is no absorption or liberation of heat when an ideal gas expands into a vacuum. At the same time, no work is done, since there is no external pressure to expand against. It follows from the familiar first law equation that ΔU will then also be zero: the isothermal expansion of an ideal gas under these conditions is accompanied by no change in the internal energy. Since U is a state function, this result can be generalised: for an ideal gas, ΔU is zero for any isothermal process, regardless of how it is carried out (ΔU = 0, q = w). From this result, it is a relatively straightforward matter to show that the accompanying molar entropy change is
ΔST = R log(V2/V1)
where R is the gas constant, and V1 and V2 are the initial and final volumes. In my student days of long ago, I was struck by the simplicity of this expression, which implied a state function relation
S = R log V + constant = NA k log V + constant
where NA is the Avogadro constant (≈ 6 × 10^23 mol^-1) and k is the Boltzmann constant. It seemed to me that the above relation bore more than a passing resemblance to the statistically derived expression
S = k log P
where P is the number of microstates (n) corresponding to a particular macrostate, expressed as a fraction of the total number of microstates (N) in which a system can exist, i.e. P = n/N, and S is the entropy of that macrostate. The similarity set me wondering whether it was possible to reconcile the expressions for entropy in its statistical (model-based) and classical (model-free) modes by constructing a logic path from one equation to the other. As things turned out back then, I never got further than wondering. A campus that seemed to teem with perfectly lovely lovelies, a noisy occupation with progressive rock, and a youthful fascination with the products of yeast-mediated fermentation got in the way of such rarefied contemplative pursuits.
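Before picking up that thread, the "relatively straightforward matter" asserted above deserves one line of detail. Since S is a state function, ΔS for the irreversible free expansion can be evaluated along any reversible path connecting the same end states, most conveniently a reversible isothermal expansion of one mole of ideal gas (PV = RT), for which

```latex
\Delta U = 0 \;\Rightarrow\; q_{\mathrm{rev}} = w_{\mathrm{rev}}
  = \int_{V_1}^{V_2} P \, \mathrm{d}V
  = \int_{V_1}^{V_2} \frac{RT}{V} \, \mathrm{d}V
  = RT \log \frac{V_2}{V_1},
\qquad
\Delta S_T = \frac{q_{\mathrm{rev}}}{T} = R \log \frac{V_2}{V_1}
```

where log, here and throughout, denotes the natural logarithm.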
Now, all these years later, the time seems right to seek to conclude this unfinished business.
It is reasonable to assume that the probability of a single molecule being found in a certain volume of space V is proportional to that volume, which allows us to write
P(1) = cV (1)
On this assumption, and since the molecules of an ideal gas move independently of one another, the probability that all N molecules will simultaneously be found in the volume V is the product of the individual probabilities:
P(N) = (cV)^N (2)
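By way of a sanity check on (2), here is a minimal Python sketch (the function name and parameters are mine, purely illustrative) that estimates by Monte Carlo sampling the probability of finding all N independently placed molecules inside a given fraction of the total volume. With c identified as 1/Vtotal, relation (2) predicts this probability to be fraction^N:

```python
import random

def prob_all_in_subvolume(n_molecules, fraction, trials=100_000, seed=42):
    """Estimate the probability that all n_molecules, each placed
    uniformly at random in the box, fall inside the stated fraction
    of the total volume. Relation (2), with c = 1/V_total, predicts
    the answer to be fraction ** n_molecules."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # One trial: place every molecule independently and check
        # whether all of them landed in the sub-volume.
        if all(rng.random() < fraction for _ in range(n_molecules)):
            hits += 1
    return hits / trials

estimate = prob_all_in_subvolume(5, 0.5)   # predicted value: 0.5**5 = 0.03125
```

For 5 molecules in half the box the estimate settles close to 1/32. For a mole of gas, 0.5 raised to the power of NA is so fantastically small that the expanded gas never gathers itself back into one vessel, which is the statistical content of the irreversibility of the free expansion.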
Expressing (2) in logarithms
log P(N) = N (log c + log V) (3)
The corresponding statistical expression for entropy is
S = k log P(N) (4)
Substituting (3) in (4) gives
S = k N (log c + log V) (5)
For a closed system comprising one mole of ideal gas, N in equation (5) becomes the Avogadro constant NA, and for an initial state (S1, V1)
S1 = k NA (log c + log V1) (6)
and for another state (S2,V2)
S2 = k NA (log c + log V2) (7)
The entropy change in an isothermal transition between these states is then
S2 – S1 = k NA (log c + log V2) – k NA (log c + log V1)
ΔST = k NA (log V2 – log V1)
ΔST = R log(V2/V1) (8)
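The chain from (4) to (8) can also be checked numerically. A minimal Python sketch, using the exact SI values of k and NA and giving the constant c an arbitrary value to show that it cancels:

```python
import math

k_B = 1.380649e-23    # Boltzmann constant k, J/K (exact SI value)
N_A = 6.02214076e23   # Avogadro constant NA, 1/mol (exact SI value)
R = k_B * N_A         # gas constant, approximately 8.314 J/(K mol)

def molar_entropy(volume, c):
    """Equations (6)/(7): S = k NA (log c + log V), natural logarithms."""
    return k_B * N_A * (math.log(c) + math.log(volume))

V1, V2 = 1.0, 2.0
c = 7.3  # arbitrary: any positive value cancels in the difference
delta_S = molar_entropy(V2, c) - molar_entropy(V1, c)
# delta_S equals R log(V2/V1), here R log 2, about 5.76 J/(K mol)
```

Whatever value is chosen for c, the difference reproduces equation (8), and k NA reproduces R, which is the whole point of the exercise.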
On this assumptive analysis, it does appear possible to navigate between model-based (4) and model-free (8) expressions for entropy, from which one can infer as much or as little as one pleases. If I had done this in my student days, I would probably have ended up thinking that entropy had something akin to the wave-particle duality about it – two manifestations of one tertium quid. Whether that thought would have afforded me any access to deeper meaning is another matter altogether.
Header photo credit: 123rf.com