Tucked away at the back of Volume One of *The Scientific Papers of J. Willard Gibbs* is a brief chapter headed ‘Unpublished Fragments’. It contains a list of nine subject headings for a supplement that Professor Gibbs was planning to write to his famous paper *“On the Equilibrium of Heterogeneous Substances”*. Sadly, he completed his notes for only two subjects before his death in April 1903, so we will never know what he had in mind to write about the sixth subject in the list: On entropy as mixed-up-ness.

Mixed-up-ness. It’s a catchy phrase, with an easy-to-grasp quality that brings entropy within the compass of minds less given to abstraction. That’s no bad thing, but without Gibbs’ guidance as to exactly what he meant by it, mixed-up-ness has taken on a life of its own and has led to entropy acquiring the derivative associations of chaos and disorder – again, easy-to-grasp ideas since they are fairly familiar occurrences in the lives of just about all of us.

Freed from connexion with more esoteric notions such as spontaneity, entropy has become very easy to recognise in the world around us as a purportedly scientific explanation of all sorts of mixed-up-ness, from unmade beds and untidy piles of paperwork to dysfunctional personal relationships, horse meat in the food chain and the ultimate breakdown of civilization as we know it.

This freely-associated understanding of entropy is now well-entrenched in popular culture and is unlikely to be modified. But in the parallel universe occupied by students of classical thermodynamics, chaotic bed linen and disordered documentation are not seen as entropy-driven manifestations. Sure, how these things come about may defy rational explanation, but they do not happen by themselves. Some external agency, human or otherwise, is always involved.

To physical chemists of the old school like myself, entropy has always been seen as the driver of spontaneously occurring thermodynamic processes, in which the combined entropy of system and surroundings increases to a maximum at equilibrium. This view of entropy partly explains why many of us had difficulty in absorbing the notion of entropy as chaos, since equilibrium always seemed to us a very calm and peaceful thing, quite the opposite of chaos.

Furthermore, we were quite sure that entropy was an extensive property, i.e. one that is dependent on the amount of substance or substances present in a system. But disorder didn’t at all have the feeling of an extensive property. If one (theoretically) divided a thermodynamically disordered system into two equal parts, would each part be half as disordered as the whole? We didn’t think so. To us, there were serious conceptual obstacles to accepting the notion of entropy as disorder.
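The extensivity point can be made concrete with the Sackur–Tetrode equation for a monatomic ideal gas. The sketch below is my own illustration, not something from Gibbs or from the discussion above: it compares the entropy of one mole of helium with that of each half after a notional division at the same temperature and density (the volume and temperature are merely plausible assumed values). Entropy behaves exactly as an extensive property should – each half carries precisely half the entropy of the whole.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J s
N_A = 6.02214076e23  # Avogadro constant, 1/mol

def sackur_tetrode(N, V, T, m):
    """Entropy (J/K) of N monatomic ideal-gas particles of mass m in volume V at temperature T."""
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal de Broglie wavelength, m
    return N * k_B * (math.log(V / (N * lam ** 3)) + 2.5)

# One mole of helium at 298 K in about 0.0245 m^3 (roughly atmospheric pressure)
m_He = 4.0026e-3 / N_A  # mass of one helium atom, kg
S_whole = sackur_tetrode(N_A, 0.0245, 298.0, m_He)
S_half = sackur_tetrode(N_A / 2, 0.01225, 298.0, m_He)
print(S_whole, 2 * S_half)  # the two halves together carry the same entropy as the whole
```

Whether each half is also “half as disordered” is, of course, exactly the question the disorder picture struggles to answer.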

But while our fundamental understanding of entropy was grounded in the thermal theories of Rankine and Clausius, we did give a statistical nod in the direction of Boltzmann when seeking to explain spontaneous isothermal phenomena. We accepted the notion of aggregation and dispersal as arbiters of entropy change, which we viewed (rightly or wrongly) as separate and distinct from changes in thermal entropy. We even had a name for it – configurational entropy.
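What we called configurational entropy has a standard textbook expression for the ideal case: the entropy of mixing, ΔS = −R Σ nᵢ ln xᵢ. As a minimal sketch (mine, using that standard formula rather than anything stated in the post), mixing one mole each of two ideal gases gives the familiar 2R ln 2:

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def mixing_entropy(moles):
    """Ideal configurational entropy of mixing, -R * sum(n_i * ln(x_i)), in J/K."""
    n_total = sum(moles)
    return -R * sum(n * math.log(n / n_total) for n in moles)

dS = mixing_entropy([1.0, 1.0])  # one mole each of two ideal gases
print(round(dS, 2))  # about 11.53 J/K, i.e. 2R ln 2
```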

Having not one but two different kinds of entropy to play with turned out to be quite useful at times. For example, it helped to explain counter-intuitive spontaneous happenings such as the following:

This is an experiment I remember well from my college days. The diagram shows a sealed Dewar flask containing a supercooled, saturated solution of sodium thiosulphate (aka thiosulfate). A tiny seeding crystal is dropped through a hole in the lid. Crystallization immediately occurs, with an apparent increase in organisation as piles of highly regular crystals form in the solution. It’s an awesome sight to behold.

The experiment provides an unequivocal demonstration that visually-assessed disorganisation and entropy cannot be regarded as synonymous, for while the former unquestionably decreases, the latter must surely increase because the process is spontaneous.

And in overall terms, indeed it does. Although the configurational entropy of the system decreases due to the aggregation of Na⁺ and S₂O₃²⁻ ions into crystals, the other kind of entropy – thermal entropy – more than compensates as the heat of crystallization causes the temperature of the system to rise. For the whole process ΔS(system) > 0, and therefore ΔS(universe) > 0 since the system is isolated from its surroundings.
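The bookkeeping can be sketched numerically. The heat capacity, temperatures and configurational entropy loss below are invented for illustration – they are not measured values for the thiosulphate experiment. Only the structure of the argument follows the text: a thermal gain of roughly ΔS = C ln(T₂/T₁) outweighing the configurational loss.

```python
import math

def thermal_entropy_change(C, T1, T2):
    """Entropy change C * ln(T2/T1), in J/K, for a temperature change at roughly constant heat capacity C."""
    return C * math.log(T2 / T1)

# Illustrative numbers only (assumed, not measured):
C_contents = 800.0     # heat capacity of the flask contents, J/K
T1, T2 = 293.0, 308.0  # supercooled start and post-crystallization temperatures, K
dS_thermal = thermal_entropy_change(C_contents, T1, T2)  # roughly +40 J/K
dS_config = -25.0      # entropy lost as ions aggregate into crystals, J/K (assumed)

dS_system = dS_thermal + dS_config
print(dS_system > 0)  # True: the thermal gain outweighs the configurational loss
```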

As I said, having two kinds of entropy to play with can be useful in explaining things that are otherwise counter-intuitive. The above experiment also serves to show that the fashion in popular culture to interpret entropy simply as mixed-up-ness can end up being more than mildly misleading.

We cannot really know what Gibbs was going to write in “On entropy as mixed-up-ness”, but it is not too hard to extrapolate from his other writings. Given Gibbs’ intelligence and familiarity with these matters, I think he would have written something very like this blog post: that the view of entropy as mixed-up-ness can be misleading, supported by examples much like the one above. Why do I say this? In 1902, the year before he died, Gibbs published a book on statistical mechanics that gave a very thorough and deep account of statistical mechanical entropy as the underlying mechanism for thermodynamic entropy. He had by then been studying statistical mechanics for nearly twenty years, as evidenced by an abstract of his from 1884. His 1902 remarks on spontaneous processes (and spontaneous entropy increase) concern the mixing up of statistical ensembles under Hamiltonian flow, a mathematical notion closely related to chaos theory, and in the end his arguments about entropy were very closely related to the later-developed information theory. Moreover, given the stress he placed on careful definitions and mathematical rigour, it is doubtful that he would have allowed vague notions of disorder to creep into his arguments.

Reblogged this on nebusresearch and commented:

The blog CarnotCycle, which tries to explain thermodynamics — a noble goal, since thermodynamics is a really big, really important, and criminally unpopularized part of science and mathematics — here starts from an “Unpublished Fragment” by the great Josiah Willard Gibbs to talk about entropy.

Gibbs — a strong candidate for the greatest scientist the United States ever produced, complete with fascinating biographical quirks to make him seem accessibly peculiar — gave to statistical mechanics much of the mathematical form and power that it now has. Gibbs had planned to write something about “On entropy as mixed-up-ness”, which certainly puts in one word what people think of entropy as being. The concept is more powerful and more subtle than that, though, and CarnotCycle talks about some of the subtleties.
