The Briefest History of Time

Arieh Ben-Naim is a professor of physical chemistry who retired in 2003, at the age of 70, from the Hebrew University of Jerusalem. While reading popular science books, he grew to dislike authors who treated entropy as something mysterious and unclear. So he wrote the first of his own popular science books, Entropy Demystified: The Second Law Reduced to Plain Common Sense (2008), in which he started spreading the gospel that entropy, classically considered a measure of disorder, should instead be understood as missing information; it is just a special case of Shannon's measure of information (SMI). The same message is given in a second book, A Farewell to Entropy (2008), and in Discover Entropy and the Second Law of Thermodynamics: A Playful Way of Discovering a Law of Nature (2010). He also claims that the second law of thermodynamics is misunderstood and that the entropy of an isolated system does not increase with time. His colleague Frank L. Lambert advocates replacing the classical notion of entropy by a concept of dispersion of energy. Lambert wrote a critical review of Ben-Naim's Entropy Demystified on the Amazon site, which launched an online discussion between the two in which sentences were used like "...your proposal about Shannon information and S - is nothing but another word that begins with S and refers to human excrement". Ben-Naim responded with yet another book, Entropy and the Second Law: Interpretation and Misss-Interpretationsss (2012), in which he mainly defends his view against Lambert's. In 2015 he wrote yet another book, Information, Entropy, Life and the Universe. What We Know and What We Do Not Know, in which he applies his ideas to explain life and the evolution of the universe. And I may have missed a few of his publications.

In the current book the same issues are brought up again. However, since the "arrow of time" is, again in the classical viewpoint, associated with a unidirectional increase of entropy, his targets for criticism are the authors of popular science books who explain (the arrow of) time (erroneously, of course) for the layman. The title of his book obviously refers to Hawking's books A Brief History of Time (1988) and A Briefer History of Time (2005). In short, Ben-Naim's argument is that if we assume that time started at the Big Bang and will end at the Big Crunch (both events he doubts, even if one could come up with some meta-time axis on which these Big events could be placed), then time just existed in between and did not change, so it has no proper history, and that is all there is to say about the history of time. The briefest history indeed.

Clearly this requires scrutinizing the precise definitions of time and history. So that is what the book starts with. There is the colloquially used concept of time, which has properties like good or bad; it can be killed, saved or wasted, and it can fly. On the other hand there is Time with a capital, which is the real physical time. History, on the other hand, is a sequence of events, each taking place at a point in Time and space. That is a rather clear concept, and he illustrates it with a number of brief histories: the history of mankind, his own biography, the history of writing his book Entropy Demystified, etc. A history of an abstract concept like Time or space makes no sense. Events do take place in Time and space, but Time or space itself remains untouched by this. Time and space do not have a history. Next question: does time (or Time) flow and is there an arrow of time? Perhaps a psychological arrow, a thermodynamic one, or a cosmological one? Ben-Naim answers these questions, but he refrains from giving a clear definition of what time really is. That is quite understandable if it concerns the fuzzy time (lower case t) concept. But what about Time (upper case T)? Well, Ben-Naim seems to assume that Time is "obvious". It orders the sequence of events in experiments, but that is only an implicit definition. Hence a clear definition of Time is missing, in my opinion. Nevertheless, on page 172 Ben-Naim finds the three arrows mentioned above to be "figments of Hawking's imagination", although on page 16 he believes most animals have some kind of sense of time (or is this Time?), but each individual has several such psychological arrows, depending on one's mood. Hence this does not allow a proper, generally accepted definition of the psychological arrow of time. Neither do any of the other "arrows" exist: "There is no thermodynamic arrow of time. Thermodynamics does not have any arrows!" (p. 172). If the arrow is connected with the expansion of the universe, will time then reverse when the universe starts contracting? "This is of course a preposterous idea" (p. 173).

Then follows a longer chapter on his pet topic: entropy and the second law of thermodynamics. He explains again that there is no connection between entropy and the arrow of Time. Does entropy (of the universe or an isolated system) always increase, as stated in the second law? That is a senseless question, says Ben-Naim. If the universe is infinite, then its entropy is not even defined (p. 197). How about a finite isolated system? To answer that, he explains his vision of entropy as a special case of Shannon's measure of information (SMI). If molecules are clustered in a small part of a box, it takes fewer binary questions to locate a particular one than when the molecules are uniformly distributed. For each state of the system (for each spatial distribution of the particles; for simplicity we forget about velocity), there is some SMI. And entropy is proportional to the SMI at equilibrium. Thus, says Ben-Naim, entropy is a function of the state of the system, not a function of time. But wait a minute. Unfortunately, in his effort to go step by step, Ben-Naim repeats himself over and over again with different situations, which creates a labyrinth in which the lay reader (as I am) will be more confused and mystified than ever before.

Let me stay with the example of the particles in a box. I am not a specialist on entropy or thermodynamics, but this is what I understand from this book. A certain spatial configuration of the particles is a realization, i.e., a sample of the distribution defining the state of the system. If the distribution is uniform (maximal SMI), the probability that the particles are clustered is small. If the distribution is peaked in some zone (smaller SMI), then the probability of finding a cluster of particles there is high. Now there seems to be a thermodynamic principle that makes the system evolve to higher SMI, and the system reaches its maximal SMI at equilibrium. The evolution of the state of the system (thus of the SMI) to its maximal value can be quick, but it is not instantaneous. So the distribution, hence the state, and thus the SMI, depend on time. So far so good, but where is entropy? Entropy is proportional to the SMI at equilibrium. But when is this equilibrium reached? At time infinity? Then entropy does not exist before the end of time. However, says Ben-Naim, it can be shown that the equilibrium distribution (in this case) is the uniform distribution, and there is a formula that allows us to define an entropy instantly for any given distribution $\{p_i\}$: $S=-k_B\sum_i p_i\ln p_i$. I think that here is the crux of the (my?) confusion: one can define an entropy for each distribution, but an (isolated) system can only have one well-defined equilibrium distribution. Thus an (isolated) system can have a time-varying SMI but only one entropy, at the end of time. Sentences like the following ones do not make this much clearer for an unprepared reader: "Does this SMI depend on time? Of course not!" (p. 106) (the given distribution defines the SMI, but the problem is that the distribution changes with time until it reaches equilibrium). Here is another one: "Clearly, one cannot say that the SMI is a function of time. One must first examine whether the distribution on which the SMI is defined changes with time, and how it changes with time. The same is true of the entropy." (p. 115) (if the distribution changes with time then so does the SMI, but not the entropy, since that is for infinite time).
Moreover, everyone experiences that systems are constantly changing in time, hence so is their equilibrium state, and thus the entropy. So does entropy depend on time after all? And to what system should this time-varying entropy be assigned? Not the universe, since its entropy is not defined (p. 197). This is directly related to the problem with the second law. At least, that is what I guess is going on. Perhaps if I read some of his other books I would get a better idea of what Ben-Naim's definition of entropy is and how it depends on time.
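The binary-question picture can at least be made numerically concrete. The following sketch is my own and is not taken from the book; the eight-cell box and the use of base-2 logarithms are my assumptions. The SMI of a uniform distribution over eight cells is three bits (three yes/no questions to locate the particle), while a distribution clustered in two cells needs only one, and the quoted formula evaluated at the uniform (equilibrium) distribution gives the familiar $k_B\ln 8$.

```python
import math

def smi_bits(p):
    """Shannon's measure of information (SMI), in bits, of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Toy example: one particle in a box divided into 8 cells.
uniform   = [1/8] * 8                      # particle equally likely to be in any cell
clustered = [1/2, 1/2, 0, 0, 0, 0, 0, 0]   # particle confined to two cells

print(smi_bits(uniform))    # 3.0 -> three yes/no questions needed to locate the particle
print(smi_bits(clustered))  # 1.0 -> a single question suffices

# Entropy as Ben-Naim relates it to the SMI (my reading, not a quote): the formula
#   S = -k_B * sum_i p_i * ln(p_i)
# evaluated at the equilibrium (here: uniform) distribution.
k_B = 1.380649e-23                      # Boltzmann constant, J/K
S = -k_B * sum(pi * math.log(pi) for pi in uniform)
print(S, k_B * math.log(8))             # both equal k_B * ln(8)
```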

In the context of the second law, Ben-Naim discusses the question: does time automatically lead to a uniform distribution of the molecules (maximal entropy)? Again his answer is no. In a theoretical ideal communist society, wealth will be redistributed until everybody has the same amount of money. In a theoretical ideal capitalist society, however, one individual organizes a lottery promising an exorbitant amount of money to anybody who pays a small fee and guesses the winning number between zero and $10^{10^{10}}$. The probability of winning is so small that all the money will eventually accumulate with the organizer of the lottery. In practice, assuming that the equilibrium distribution in thermodynamics is indeed uniform, the system will converge to it almost certainly (with probability 1), but during convergence there is always a possibility that it deviates (temporarily). This I can understand. It means that there is no conflict between the reversibility of the equations and the irreversibility of thermodynamics. Thermodynamics is reversible too, but the probability of observing this is practically zero.
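This convergence-with-fluctuations behaviour can be illustrated with a toy simulation; the sketch below is my own, assuming a standard Ehrenfest-style model (it is not an example from the book). Particles hop at random between the two halves of a box; starting with all of them on one side, the occupation of that side drifts towards half of the total and then keeps fluctuating around it, and a return to the initial state is not forbidden, merely astronomically improbable when the number of particles is large.

```python
import random

random.seed(0)
N = 100                      # number of labelled particles
left = set(range(N))         # all particles start in the left half of the box

for step in range(1, 5001):
    particle = random.randrange(N)   # pick a particle at random...
    if particle in left:             # ...and move it to the other half
        left.remove(particle)
    else:
        left.add(particle)
    if step % 1000 == 0:
        print(f"step {step:5d}: {len(left)} of {N} particles in the left half")
```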

The next chapter is called "The history of the histories of time". Here Ben-Naim writes rather extensive reviews of a number of books that gave (or failed to give, according to Ben-Naim) the history of time. There are the two books by Hawking that I mentioned above, A Brief History of Time (1988) [B-N: Over 90% is irrelevant to the history of Time. This part is poorly written and most of it is incomprehensible for the lay reader. The remaining 10% relevant to Time is mostly meaningless and nonsensical.]; A Briefer History of Time (2005) [B-N: This is slightly better [...] mainly because it was cleared of most of the gibberish [...]. Specifically, the 10% [...] most "relevant to time" was eliminated rendering the book totally irrelevant to the history of Time.]; the book From Eternity to Here: The Quest for the Ultimate Theory of Time (2010) by S. Carroll [B-N: I have never heard of, seen, or read any other book with such a high density of meaninglessness, silly, and nonsensical statements which are repeated again and again, from here to eternity...]; and Did Time Begin? Will Time End? (2010) by P.H. Frampton [B-N: Most of the book is not about these questions. [...]. In my view, science will never have answers to these questions. And if it will have answers, they will have zero effect on our lives.]. Ben-Naim goes through the chapters of these books and comments on many quotes that he does not agree with. I have already given some examples of the quotations Ben-Naim uses to express his disagreement. Here are some more to give an idea of how the book continues: "I got a total blackout from reading this paragraph" (p. 164); "The rest of the chapter is packed with meaningless, incomprehensible baloney" (p. 167); "I will not bother the reader with detailed discussion and criticism of all the nonsense written in this chapter" (p. 171); "Here the author goes from silly to sillier to the silliest statements" (p. 177); "Perhaps the most absurd of all the absurdities is found on page..." (p. 203).

From reviews of the other books by Ben-Naim (which I did not read) it may become clearer how he defines entropy, but the summary he gives here (he refers to his other books for more details on about every page) does not make it very clear to the layman. As a mathematician I am used to much sharper definitions. After reading this book I do not feel that I now have a clear idea of what time (or Time) is. I do understand what is meant by history. What I do not understand is why he has to give his own brief biography and the history of the pen he uses to write in order to make clear what a history can be. He may have a point in his critique of some of the statements about time or, more particularly, as mentioned in the subtitle, the misconstrued association between entropy and time, but that does not justify everything in the first half of the book, and it does not justify the language used in the second half. That second part is just bashing of "the authors of popular science books" and other jerks who do not really know what they are talking about. Proper scientific arguments are not really given, except for the fuzzy definitions and arguments given for the layman in the first part. If you are looking for mathematical aspects of entropy, this is not the place to look.

Reviewer: 
Adhemar Bultheel
Book details

It has become A. Ben-Naim's hobby horse in several of his recent books to emphasize that entropy as a measure of disorder makes no sense and that entropy should properly be defined as a special case of Shannon's measure of information (SMI). The basic idea of his vision is recalled, and then he makes it clear that connecting entropy with the arrow of time is senseless. This culminates in a critical review of Hawking's books A Brief History of Time and its sequel A Briefer History of Time (to which his own title refers, of course), along with two other popular science books, by S. Carroll and by P.H. Frampton, that also deal with time.

Author:
Arieh Ben-Naim
Publisher:
World Scientific
Published: 
2016
ISBN: 
978-981-4749-65-5 (pbk)
Price: 
£18.00 (pbk)
Pages: 
264