Entropy Demystified (2nd ed.)

Having reviewed The Briefest History of Time by this author, in which he discusses the relation between the concept of entropy, the second law of thermodynamics, and the arrow of time, I got the occasion to read (and write this review of) the current book. It is the second edition of the book that started the series Ben-Naim has been writing since 2007 about his insight into the concept of entropy and related subjects. In fact it is almost a reprint of the 2008 revision; the main addition is an extra section in chapter 7 in which he links the concept of entropy that he has introduced to Shannon's information theory.

The concept of entropy was also explained in The Briefest History of Time, but I was still rather confused about the exact definitions, so I was eager to read this book hoping to understand it better. I now realize, however, that the explanation in The Briefest History of Time is basically a summary of what is in this book, recycling much of the material and the illustrations.

Let me tell you from the start that this is an attempt to explain these concepts to the lay person. For a more mathematical treatment and precise definitions the reader is referred to two of his other books, A Farewell to Entropy (2008) and Discover Entropy and the Second Law of Thermodynamics: A Playful Way of Discovering a Law of Nature (2010). The book opens with the history of the word entropy and the origin of the second law of thermodynamics. It starts with Sadi Carnot and Lord Kelvin and arrives at Clausius, who introduced the word entropy, meaning change or transformation. Ben-Naim finds this name rather unfortunate, since it refers to the second law as expressing an indefinite increase of entropy, while his claim is that for an isolated system entropy does not change in time. The atomistic view of Boltzmann led to a statistical interpretation of entropy in the kinetic theory of gases: he defined a quantity H that decreases towards a minimum reached at equilibrium. This caused controversy because it seemed to contradict conservation laws and all the physical laws that allow time reversibility. It is, however, the embryo of the current definition.

Since the entropy concept relies heavily on probability and statistics, Ben-Naim continues with a chapter introducing the lay reader to probability and information theory. No formulas, no formal definitions, no probability space whatsoever, but a lot of dice rolling and examples. The main thing to remember (but this becomes clear only in the subsequent chapter) is that if you roll 2 dice, the sum is a number between 2 and 12, but the probability of obtaining a sum of 7 is larger than for any other value, because there are more ways to obtain a sum of 7 with 2 dice than any other sum. Conditional and joint probability are also explained, but I guess that is not really relevant for the rest of the book. Concerning information theory, the following needs to be remembered. Suppose you have $N$ boxes, one of which contains a coin, and that you know the probability distribution of its position. If this distribution is uniform, then you need at least $\log_2 N$ binary questions to find it. The number of questions you need to find the coin is a measure of missing information. With other distributions, for example when you know that the coin is in one of the first two boxes, you need only one question. Thus the amount of missing information is maximal for the uniform distribution. In general, if the probability that the coin is in box $i$ is $p_i$, then Shannon proved that the minimal (expected) number of questions is $H=-\sum_i p_i\log_2 p_i$, which is Shannon's measure of information (SMI).
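Both points are easy to check numerically. The following is a small sketch of my own (not taken from the book) that computes the probability of a 7-sum with two dice and the SMI of two coin-in-a-box distributions:

```python
# Illustration of the two examples above: the distribution of the sum of two
# dice, and Shannon's measure of information for the coin-in-a-box game.
from collections import Counter
from math import log2

# Sum of two dice: 7 can be obtained in 6 of the 36 equally likely outcomes.
sums = Counter(a + b for a in range(1, 7) for b in range(1, 7))
print(sums[7] / 36)                     # 0.1667, larger than for any other sum

def smi(p):
    """Shannon's measure of information, in bits (number of binary questions)."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

N = 8
uniform = [1 / N] * N                   # coin equally likely in any of N boxes
print(smi(uniform))                     # 3.0 = log2(8) questions needed
two_boxes = [0.5, 0.5] + [0] * (N - 2)  # coin known to be in the first two boxes
print(smi(two_boxes))                   # 1.0, a single question suffices
```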

Chapter 4 is an introduction to the second law. Consider $N$ binary dice (3 faces show 1 and the other 3 show 0) arranged in a row. You start with all of them in state 0, which gives an overall sum of 0. Pick one of the dice at random and throw it, resulting in a 0 or a 1 at that position. Keep picking and re-throwing a random die and keep track of the total sum. Since in the beginning the probability of picking a die in state 0 is larger than that of picking one in state 1, the sum will initially (probably) increase until there are as many dice in state 1 as there are in state 0. From that moment on the sum will fluctuate around the value $N/2$ (the steady state or equilibrium of the system). Of course there is a chance that the sum drifts away from $N/2$, but the chance of arriving back at the initial state of all zeros is really small, and the larger $N$, the smaller it is.
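The game is easy to simulate. Here is a minimal sketch of my own (not one of the programs the author provides) of the experiment just described:

```python
# N binary dice all start in state 0; at each step a die is picked at random
# and re-thrown, and the running sum of all dice is recorded.
import random

N, steps = 1000, 20000
dice, total, history = [0] * N, 0, []
for _ in range(steps):
    i = random.randrange(N)
    new = random.randint(0, 1)   # re-throw: 0 or 1 with equal probability
    total += new - dice[i]
    dice[i] = new
    history.append(total)

print(history[0], history[-1])   # the sum climbs from (near) 0 and ends up
                                 # fluctuating around N/2 = 500
```

With $N=1000$ the sum, once it has reached the neighbourhood of $N/2$, essentially never returns to 0, which is exactly the point of the chapter.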

The next two chapters repeat this idea, saying that instead of 0-1 you could call it red-blue, cold-hot, sour-sweet or whatever, and you may expect an averaged mixture as the equilibrium. So why not use the amount of missing information (AMI), which is of course minimal in the initial state (all dice are 0, or red, or ...) and maximal at equilibrium (the steady state). This explains the second law: the AMI (measured by Shannon's measure of information) increases while the system evolves to its equilibrium, where it reaches its maximum. This natural evolution toward equilibrium is thus not a matter of order or disorder, but just a matter of probability: the probability of evolving towards equilibrium is larger than the probability of moving away from it. In real life, $N$ is incredibly large, since it does not concern a coin in a box but a very large number of particles, each with its own state, i.e., with a position and a velocity, in an (isolated) box.
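In the binary dice picture this is a one-line computation. The sketch below (my notation, not the book's) shows that the missing information per die, as a function of the fraction $p$ of dice in state 1, grows from 0 bits in the initial all-zero state to its maximum of 1 bit at equilibrium ($p=1/2$):

```python
from math import log2

def missing_info(p):
    """SMI in bits for a single binary die with P(state 1) = p."""
    return -sum(q * log2(q) for q in (p, 1 - p) if q > 0)

for p in (0.0, 0.1, 0.25, 0.5):
    print(p, round(missing_info(p), 3))   # 0.0, 0.469, 0.811, 1.0
```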

Although the purpose of the book is to demystify entropy, I do not find a clear definition of entropy anywhere. The best description I find in this book is on page 180 (that is, in the new section added in this second edition), where I read that you need to consider the equilibrium distribution of the system, compute its Shannon measure of information and then "[...] you get a quantity which is up to a multiplicative constant identical with the thermodynamic entropy". On page 182, only two pages later, I find a formula $\Delta S(\text{expansion})=k_B N\ln 2$ which claims to link entropy with SMI. Here $k_B$ is the Boltzmann constant and the formula says how entropy changes when $N$ gas particles, originally confined to half the volume, are released into the whole volume. A particle can then be in the left half or the right half (one question) and there are $N$ particles, which explains the factor $N\ln 2$. That $\ln 2$ is used rather than $\log_2 2 = 1$ (there is only one question) is not important, since it is just another constant factor. But is $\Delta S$ then the change in entropy or is it the entropy? What if the volume of the box changes in time? Does entropy then also change with time? Of course, I can rely on what I learned from reading The Briefest History of Time, but if you read this as your first introduction to entropy, I believe you are not really instructed, and the questions I had in my review of The Briefest History of Time are not clarified either.

Unless I missed it, the formula on the cover of the book, $S=k\log W$ (Boltzmann's equation for entropy, engraved on his tombstone), is not explained, neither in the historical part nor in the text. On page 153 I find a $W$ denoting the number of states, but is that the same $W$? The colourful "formula" on page 182, $WHY=2^{N\times WHAT}$, which is supposed to illustrate "what is entropy" and "why entropy always increases", is not much more enlightening. It is probably not to be understood in a strict mathematical sense, but whatever it is meant to say is not explained here; it is copied from his other book Entropy and the Second Law. Moreover, since it is stressed in The Briefest History of Time that entropy is only defined at equilibrium and hence does not depend on time, how can this formula illustrate that "entropy always increases"? And why is it written on page 124 that "[...] we know that (real) entropy always increases" if it is independent of time? (Here 'always' means with probability 1.) Unfortunately, the book does not really demystify entropy for me.
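For what it is worth, the arithmetic behind the expansion formula is easy to reconstruct. This is my own back-of-the-envelope sketch, assuming (as I read page 182) that the change in SMI is simply converted to thermodynamic units by the factor $k_B\ln 2$:

$$\Delta(\mathrm{SMI}) = N\log_2 2 = N \ \text{bits}, \qquad \Delta S = (k_B\ln 2)\,\Delta(\mathrm{SMI}) = k_B N\ln 2 .$$

For one mole of gas ($N=N_A\approx 6.022\times 10^{23}$) this gives $\Delta S = R\ln 2\approx 5.76\ \mathrm{J\,K^{-1}}$. Note that this is a difference between two equilibrium states; it says nothing about how entropy evolves in time, which is exactly the question that remains open for me.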

In a final chapter he discusses some related topics like order and disorder, the arrow of time, and why and how entropy and the second law have been misunderstood and by whom.

I have the impression that this book is written for people who know (or who think they know) what entropy is and need to be brainwashed. If you are an outsider, you may have much fun playing with dice, and you will learn some pitfalls and mechanisms of statistics, but I doubt that this is a clear introduction to entropy for the lay person. I should mention that the book starts with a list of programs (to which he also refers in the text) that can be used to play these dice games online. They can be found on his website ariehbennaim.com (go to Books>Entropy Demystified>FMI). Certainly these games give you some insight into the statistics that play a role in the mechanisms behind the second law and into how SMI relates to entropy. However, as a mathematician I need more precision and exactness to understand the correct meaning, and the time dependency is still a mystery to me. I probably need to read the more mathematical books that I mentioned at the beginning of this review.

Reviewer: 
Adhemar Bultheel
Book details

This is the second edition of the book originally published in 2007 and revised in 2008. Ben-Naim explains for the lay person his vision of how entropy should be defined as an instance of the amount of information that is present in the system and how this influences the interpretation of the second law of thermodynamics. 

Author: Arieh Ben-Naim
Publisher: World Scientific
Published: 2016
ISBN: 978-981-3100-12-1 (pbk)
Price: £21.00 (pbk)
Pages: 262