European Mathematical Society - Arieh Ben-Naim
https://euro-math-soc.eu/author/arieh-ben-naim
Entropy Demystified (2nd ed.)
https://euro-math-soc.eu/review/entropy-demystified-2nd-ed
<div class="field field-name-field-review-review field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><div class="tex2jax"><p>
Since I reviewed <a href="/review/briefest-history-time"><em>The Briefest History of Time</em></a> by this author, in which he discusses the relation between the concept of entropy, the second law of thermodynamics, and the arrow of time, I took the occasion to read (and review) the current book, which is the second edition of the book that started the series Ben-Naim has been writing since 2007 about his insight into the concept of entropy and related subjects. In fact it is almost a reprint of the 2008 revision. There is an extra section in chapter 7 in which he links the concept of entropy that he has introduced with Shannon's information theory.</p>
<p>
The concept of entropy was also explained in <em>The Briefest History of Time</em>, but I was still rather confused about the exact definitions, so I was eager to read this book hoping to understand it better. I now realize that the explanation in <em>The Briefest History of Time</em> is basically a summary of what is in this book, recycling much of the material and the illustrations.</p>
<p>
Let me say from the start that this is an attempt to explain these concepts to the lay person. For a more mathematical treatment and precise definitions the reader is referred to two of his other books, <em>A Farewell to Entropy</em> (2008) and <em>Discover Entropy and the Second Law of Thermodynamics: A Playful Way of Discovering a Law of Nature</em> (2010). The book opens with the history of the word entropy and the origin of the second law of thermodynamics. It starts with Sadi Carnot and Lord Kelvin and arrives at Clausius, who coined the word entropy, meaning change or transformation. Ben-Naim finds this naming rather unfortunate, since it refers to the second law's indefinite increase of entropy, while his claim is that for an isolated system entropy does not change in time. The atomistic view of Boltzmann led to a statistical interpretation of entropy in gas kinetics: he defined a quantity H that decreases towards a minimum reached at equilibrium. This caused controversy because it seemed to contradict conservation laws as well as the time reversibility of all the physical laws. It is, however, the embryo of the current definition.</p>
<p>
Since the entropy concept relies heavily on probability and statistics, Ben-Naim continues with a chapter introducing the lay reader to probability and information theory. No formulas, no formal definitions, no probability space whatsoever, but a lot of dice rolling and examples. The main thing to remember (though this only becomes clear in the subsequent chapter) is that if you roll 2 dice, the sum is a result between 2 and 12, but a sum of 7 is more probable than any other value because there are more ways to obtain it with 2 dice. Conditional and joint probability are also explained, but I guess that is not really relevant for the rest of the book. Concerning information theory, the following needs to be remembered. Suppose you have $N$ boxes, one of which contains a coin, and that you know the probability distribution of its position. If this distribution is uniform, then you need at least $\log_2 N$ binary questions to find it. The number of questions you need to find the coin is a measure of missing information. With other distributions, for example when you know that the coin is in one of the first two boxes, you need only one question. Thus the amount of missing information is maximal for the uniform distribution. In general, if the probability that the coin is in box $i$ is $p_i$, then Shannon showed that the number of questions needed is $H=-\sum p_i\log_2 p_i$, which is Shannon's measure of information (SMI).</p>
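The coin-and-boxes game is easy to verify numerically. The following is my own minimal sketch (not taken from the book; the function name `smi` is my choice):

```python
import math

def smi(p):
    """Shannon's measure of information, H = -sum p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

N = 8
uniform = [1 / N] * N                      # coin equally likely in every box
two_boxes = [0.5, 0.5] + [0.0] * (N - 2)   # coin known to be in box 1 or 2

print(smi(uniform))    # 3.0 bits: log2(8) = 3 binary questions are needed
print(smi(two_boxes))  # 1.0 bit: a single question suffices
```

As the review states, the uniform distribution maximizes the missing information, and any extra knowledge about the coin's position reduces the number of questions.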
<p>
Chapter 4 is an introduction to the second law. Consider $N$ binary dice (3 faces show 1 and the other 3 show 0) arranged in a row. You start with all of them in state 0, which gives an overall sum of 0. Pick one of the dice at random and throw it, resulting in a 0 or a 1 at that position. Repeat picking and throwing a die in the row and sum the total value. Since initially the probability of picking a die in state 0 is larger than that of picking one in state 1, the sum will probably increase until there are as many dice in state 1 as in state 0. From that moment on the sum will fluctuate around the value $N/2$ (the steady state or equilibrium of the system). Of course there is a chance that the sum drifts away from $N/2$, but the chance of returning to the initial state of all zeros is really small, and the larger $N$, the smaller it is.</p>
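This dice experiment is straightforward to simulate. Here is a rough sketch of my own (not one of Ben-Naim's online programs), assuming a fair re-throw of the picked die:

```python
import random

def evolve(N, steps, seed=42):
    """Start with N binary dice all showing 0; repeatedly pick a random die,
    re-throw it (0 or 1 with equal probability), and record the running sum."""
    rng = random.Random(seed)
    dice = [0] * N
    sums = []
    for _ in range(steps):
        dice[rng.randrange(N)] = rng.randint(0, 1)
        sums.append(sum(dice))
    return sums

sums = evolve(N=1000, steps=20000)
# The sum climbs away from 0 and then fluctuates around the equilibrium N/2 = 500.
print(sums[0], sums[-1])
```

Running this shows exactly the behaviour the chapter describes: a steady climb towards $N/2$, followed by small fluctuations around it, with a return to the all-zeros state being overwhelmingly improbable for large $N$.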
<p>
The next two chapters repeat this idea: instead of 0-1 you could call it red-blue, cold-hot, sour-sweet, or whatever, and you may expect an averaging mixture as the equilibrium. So why not use the amount of missing information (AMI), which is of course minimal in the initial state (all dice are 0, or red, ...) and maximal at equilibrium (the steady state). This explains the second law: the AMI (measured by Shannon's measure of information) increases while the system evolves to its equilibrium, where it reaches its maximum. This natural evolution toward equilibrium is thus not a matter of order or disorder, but just a matter of probability: the probability of evolving towards equilibrium is larger than the probability of moving away from it. In real life $N$ is incredibly large, since it does not concern a coin in a box: one has to deal with a very large number of particles, each with its own state, i.e., a position and a velocity, in an (isolated) box.</p>
<p>
Although the purpose of the book is to demystify entropy, I do not find anywhere a section with a clear definition of entropy. The best description I find is on page 180 (in the new section added in the second edition), where I read that you need to consider the equilibrium distribution of the system, compute its Shannon measure of information, and then "[...] you get a quantity which is up to a multiplicative constant identical with the <em>thermodynamic entropy</em>". On page 182, only two pages later, I find a formula $\Delta S(\textrm{expansion})=k_B N\ln 2$ which claims to link entropy with SMI. Here $k_B$ is the Boltzmann constant, and the formula says how entropy changes when $N$ gas particles, originally confined to half the volume, are released into the whole volume. A particle can be in the left half or the right half (1 question) and there are $N$ particles, which explains the $N\ln 2$. The fact that $\ln 2$ is used and not $\log_2 2 = 1$ (there is only one question) is not important, since it is just another constant factor. But is $\Delta S$ then the change in entropy, or is it the entropy? What if the volume of the box changes in time? Does entropy then also change with time? Of course I can rely on what I learned from reading <em>The Briefest History of Time</em>, but if you read this as your first introduction to entropy, I believe you are not really instructed, and the questions I had in my review of <em>The Briefest History of Time</em> are not clarified either. Unless I missed it, the formula on the cover of the book, $S=k\log W$ (Boltzmann's equation for entropy, engraved on his tombstone), isn't explained, neither in the historical part nor in the text. On page 153 I find a $W$ denoting the number of states; but is that the same $W$? The colourful "formula" on page 182, $WHY=2^{N\times WHAT}$, which is supposed to illustrate "what is entropy" and "why entropy always increases", isn't very enlightening either.
It is probably not to be understood in a strict mathematical sense, but whatever it is meant to say is not explained here. It is copied from his other book <em>Entropy and the Second Law</em>. Moreover, since it is stressed in <em>The Briefest History of Time</em> that entropy is only defined at equilibrium and hence does not depend on time, how can this formula illustrate that "entropy always increases"? And why is it written on page 124 that "[...] we know that (real) entropy always increases" if it is independent of time? (Here 'always' means with probability 1.) Unfortunately, the book does not really demystify entropy for me.</p>
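<p>
For what it is worth, the link between the two formulas can be reconstructed from the SMI formula quoted elsewhere in the book (this is my own back-of-the-envelope reasoning, not a quote). Apply $S=-k_B\sum p_i\ln p_i$ to a single particle. Confined to the left half, its distribution is $(1,0)$, so $S=0$; released into the whole box, the equilibrium distribution is $(\tfrac12,\tfrac12)$, so $S=-k_B(\tfrac12\ln\tfrac12+\tfrac12\ln\tfrac12)=k_B\ln 2$. For $N$ independent particles these contributions add, giving $\Delta S=k_B N\ln 2$: the SMI of $N$ one-bit questions, converted to thermodynamic units at $k_B\ln 2$ per bit.</p>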
<p>
In a final chapter he discusses some related topics like order and disorder, the arrow of time, and why and how entropy and the second law have been misunderstood and by whom.</p>
<p>
I have the impression that this book is written for people who know (or who think they know) what entropy is and need to be brainwashed. If you are an outsider, you may have much fun playing with dice, and you will learn some pitfalls and mechanisms of statistics, but I doubt that this is a clear introduction to entropy for the lay person. I should mention that the book starts with a list of programs (to which he also refers in his text) that can be used to play these dice games online. They can be found on his website <a href="http://ariehbennaim.com/">ariehbennaim.com</a> (go to Books>Entropy Demystified>FMI). Certainly these games give you some insight into the statistics behind the mechanisms of the second law and into how SMI relates to entropy. However, as a mathematician I need more precision and exactness to understand the correct meaning, and the time dependency is still a mystery to me. I probably need to read the more mathematical books that I mentioned at the beginning of this review.</p>
</div></div></div></div><div class="field field-name-field-review-reviewer field-type-text field-label-inline clearfix"><div class="field-label">Reviewer: </div><div class="field-items"><div class="field-item even">Adhemar Bultheel</div></div></div><div class="field field-name-field-review-desc field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><div class="tex2jax"><p>
This is the second edition of the book originally published in 2007 and revised in 2008. Ben-Naim explains for the lay person his vision of how entropy should be defined as an instance of the amount of missing information in the system and how this influences the interpretation of the second law of thermodynamics. </p>
</div></div></div></div><span class="vocabulary field field-name-field-review-author field-type-taxonomy-term-reference field-label-inline clearfix"><h2 class="field-label">Author: </h2><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/author/arieh-ben-naim" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Arieh Ben-Naim</a></li></ul></span><span class="vocabulary field field-name-field-review-publisher field-type-taxonomy-term-reference field-label-inline clearfix"><h2 class="field-label">Publisher: </h2><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/publisher/world-scientific" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">world scientific</a></li></ul></span><div class="field field-name-field-review-pub field-type-number-integer field-label-inline clearfix"><div class="field-label">Published: </div><div class="field-items"><div class="field-item even">2016</div></div></div><div class="field field-name-field-review-isbn field-type-text field-label-inline clearfix"><div class="field-label">ISBN: </div><div class="field-items"><div class="field-item even">978-981-3100-12-1 (pbk)</div></div></div><div class="field field-name-field-review-price field-type-text field-label-inline clearfix"><div class="field-label">Price: </div><div class="field-items"><div class="field-item even">£21.00 (pbk)</div></div></div><div class="field field-name-field-review-pages field-type-number-integer field-label-inline clearfix"><div class="field-label">Pages: </div><div class="field-items"><div class="field-item even">262</div></div></div><span class="vocabulary field field-name-field-review-class field-type-taxonomy-term-reference field-label-hidden"><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/imu/mathematics-education-and-popularization-mathematics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Mathematics 
Education and Popularization of Mathematics</a></li><li class="vocabulary-links field-item odd"><a href="/imu/mathematics-science-and-technology" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Mathematics in Science and Technology</a></li><li class="vocabulary-links field-item even"><a href="/imu/probability-and-statistics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Probability and Statistics</a></li></ul></span><div class="field field-name-field-review-website field-type-text field-label-hidden"><div class="field-items"><div class="field-item even"><a href="http://www.worldscientific.com/worldscibooks/10.1142/6916" title="Link to web page">http://www.worldscientific.com/worldscibooks/10.1142/6916</a></div></div></div><span class="vocabulary field field-name-field-review-msc field-type-taxonomy-term-reference field-label-hidden"><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/msc/94-information-and-communication-circuits" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">94 Information and communication, circuits</a></li></ul></span><span class="vocabulary field field-name-field-review-msc-full field-type-taxonomy-term-reference field-label-hidden"><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/msc-full/94a17" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">94A17</a></li></ul></span><span class="vocabulary field field-name-field-review-msc-other field-type-taxonomy-term-reference field-label-hidden"><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/msc-full/80-01" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">80-01</a></li></ul></span>
Sun, 12 Jun 2016 07:15:36 +0000 | Adhemar Bultheel | https://euro-math-soc.eu/review/entropy-demystified-2nd-ed#comments

The Briefest History of Time
https://euro-math-soc.eu/review/briefest-history-time
<div class="field field-name-field-review-review field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><div class="tex2jax"><p>
Arieh Ben-Naim is a professor of physical chemistry who retired in 2003, at the age of 70, from the Hebrew University of Jerusalem. While reading popular science books, he disliked the authors who considered entropy as something mysterious and unclear. And so he wrote the first of his own popular science books, <em>Entropy Demystified: The Second Law Reduced to Plain Common Sense</em> (2008), in which he started spreading the gospel that entropy, classically considered a measure of disorder, should be replaced by a concept of missing information; it is just a special case of Shannon's measure of information (SMI). The same message is given in a second book, <em>A Farewell to Entropy</em> (2008), and in <em>Discover Entropy and the Second Law of Thermodynamics: A Playful Way of Discovering a Law of Nature</em> (2010). He also claims that the second law of thermodynamics is misunderstood and that the entropy of an isolated system does not increase with time. His colleague Frank L. Lambert advocates replacing the classical notion of entropy by a concept of dispersion of energy. Lambert wrote a critical review of Ben-Naim's <em>Entropy Demystified</em> on the Amazon site, which launched an online discussion between the two in which sentences were used like "...your proposal about Shannon information and S - is nothing but another word that begins with S and refers to human excrement". Ben-Naim responded with yet another book, <em>Entropy and the Second Law: Interpretation and Misss-Interpretationsss</em> (2012), in which he mainly defends his view against Lambert's. In 2015 he wrote yet another book, <em>Information, Entropy, Life and the Universe. What We Know and What We Do Not Know</em>, in which he applies his ideas to explain life and the evolution of the universe. And I may have missed a few of his publications.</p>
<p>
In the current book the same issues are brought up again. However, since the "arrow of time" is, again in the classical viewpoint, associated with a unidirectional increase of entropy, his targets of criticism are the authors of popular science books who (erroneously, of course) explain (the arrow of) time for the layman. The title of his book obviously refers to the Hawking books <em>A Brief History of Time</em> (1988) and <em>A Briefer History of Time</em> (2005). In short, Ben-Naim's argument is that if we assume that time started at the Big Bang and will end at the Big Crunch (both events he doubts, even if one could come up with some meta-time axis on which these Big events could be placed), then time just existed in between and did not change, so it has no proper history, and that is all there is to say about the history of time. The briefest history indeed.</p>
<p>
Clearly this requires scrutinizing the precise definitions of time and history. So that is what the book starts with. There is the colloquially used concept of time, which has properties: it can be good or bad, it can be killed, saved or wasted, and it can fly. On the other hand there is Time with a capital, which is the real physical time. History, in turn, is a sequence of events taking place at points in Time and space. That is a rather clear concept, and he illustrates it with a number of brief histories: the history of mankind, his own biography, the history of writing his book <em>Entropy Demystified</em>, etc. A history of an abstract concept like Time or space makes no sense. Events do take place in Time and space, but Time and space themselves remain untouched by this: Time and space do not have a history. Next question: does time (or Time) flow, and is there an arrow of time? Perhaps a psychological arrow, a thermodynamic one, or a cosmological one? Ben-Naim answers these questions, but he refrains from giving a clear definition of what time really is. That is quite understandable for the fuzzy time (lower case t) concept. But what about Time (upper case T)? Well, Ben-Naim seems to take Time as "obvious". It defines the sequence of events in experiments, but this is an implied definition; a clear definition of Time is, in my opinion, missing. Nevertheless, on page 172 Ben-Naim finds the three arrows mentioned above to be "figments of Hawking's imagination", although on page 16 he believes most animals have some kind of sense of time (or is this Time?); but each individual has several such psychological arrows, depending on one's mood. Hence this does not allow a proper, generally accepted definition of the psychological arrow of time. Neither do any of the other "arrows" exist: "There is no thermodynamic arrow of time. Thermodynamics does not have any arrows!" (p. 172). If the arrow is connected with the expansion of the universe, will time then reverse when the universe starts contracting? "This is of course a preposterous idea" (p. 173).</p>
<p>
Then follows a longer chapter on his pet topic: entropy and the second law of thermodynamics. He explains again that there is no connection between entropy and the arrow of Time. Does entropy (of the universe or of an isolated system) always increase, as stated in the second law? That is a senseless question, says Ben-Naim. If the universe is infinite, then its entropy is not even defined (p. 197). How about a finite isolated system? For this he explains his vision of entropy as a special form of Shannon's measure of information (SMI). If molecules are clustered in a small part of a box, it takes fewer binary questions to locate a particular one than when the molecules are uniformly distributed. For each state of the system (for each spatial distribution of the particles; for simplicity we forget about velocity), there is some SMI. And entropy is proportional to the SMI at equilibrium. Thus, says Ben-Naim, entropy is a function of the state of the system, not a function of time. But wait a minute. Unfortunately, in his effort to go step by step, Ben-Naim repeats himself over and over again with different situations, which creates a labyrinth in which the lay reader (as I am) will be more confused and mystified than ever before. Let me stay with the example of the particles in a box. I am not a specialist in entropy or thermodynamics, but this is what I understand from this book. A certain spatial configuration of the particles is a realization, i.e., a sample of the distribution defining the state of the system. If the distribution is uniform (maximal SMI), the probability that the particles are clustered is small. If the distribution peaks in some zone (smaller SMI), then the probability of finding a cluster of particles there is high. Now there seems to be a thermodynamic principle that makes the system evolve to higher SMI, and the system reaches its maximal SMI at equilibrium. 
The evolution of the state of the system (and thus of the SMI) to its maximal value can be quick, but it is not instantaneous. So the distribution, hence the state, and thus the SMI, depend on time. So far so good, but where is entropy? Entropy is proportional to the SMI <em>at equilibrium</em>. But when is this equilibrium reached? At time infinity? Then entropy does not exist before the end of time. However, says Ben-Naim, <em>it can be shown that</em> the equilibrium distribution (in this case) is the uniform distribution, and there is a formula that allows us to <em>define</em> an entropy instantly for any given distribution $\{p_i\}$: $S=-k_B\sum p_i\ln p_i$. I think that here is the crux of the (my?) confusion: one can define an entropy for each distribution, but an (isolated) system can only have one well defined equilibrium distribution. Thus an (isolated) system can have a time-varying SMI but only one entropy, at the end of time. Sentences like the following do not make this much clearer for an unprepared reader: "Does this SMI depend on time? Of course not!" (p. 106) (the given distribution defines the SMI, but the problem is that the distribution changes with time until it reaches equilibrium). Here is another: "Clearly, one cannot say that the SMI is a <em>function</em> of time. One must first examine whether the distribution on which the SMI is defined changes with time, and how it changes with time. The same is true of the entropy." (p. 115) (if the distribution changes with time then so does the SMI, but not the entropy, since that is for infinite time). Moreover, everyone experiences that systems are constantly changing in time, hence also their equilibrium state, and thus their entropy. So does entropy depend on time after all? And to what system should this time-varying entropy be assigned? Not the universe, since its entropy is not defined (p. 197). This is directly related to the problem with the second law. At least, that is what I guess is going on. 
Perhaps if I read some of his other books I will get a better idea of what Ben-Naim's definition of entropy is and how it depends on time.</p>
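To make the distinction concrete, here is a small sketch of my own (not from the book): for the dice system described earlier, the per-die SMI as a function of the fraction $f$ of dice showing 1 changes as the system evolves, and only its maximum, reached at the 50/50 equilibrium, corresponds (up to constants) to the entropy.

```python
import math

def smi_per_die(f):
    """Missing information (bits) per die when a fraction f of the dice show 1."""
    if f in (0.0, 1.0):
        return 0.0  # state fully known: no missing information
    return -(f * math.log2(f) + (1 - f) * math.log2(1 - f))

# As the system evolves from "all zeros" (f = 0) toward equilibrium (f = 1/2),
# the SMI changes with the distribution; it peaks at 1 bit per die at f = 1/2.
for f in (0.0, 0.1, 0.25, 0.5):
    print(f, smi_per_die(f))
```

This is exactly the reviewer's puzzle in miniature: the SMI is a function of the (time-varying) distribution, while the entropy is tied to the single equilibrium distribution that maximizes it.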
<p>
In the context of the second law Ben-Naim discusses the question: does time automatically lead to a uniform distribution of the molecules (maximal entropy)? Again his answer is no. In a theoretical ideal communist society, wealth will be redistributed until everybody has the same amount of money. However, consider a theoretical ideal capitalist society where one individual organizes a lottery, promising an exuberant amount of money to anybody who pays a small price and guesses the winning number between zero and $10^{10^{10}}$. The probability of winning is so small that all the money will eventually accumulate with the organizer of the lottery. In practice, assuming that the equilibrium distribution in thermodynamics is indeed uniform, the system will converge to it almost surely (with probability 1), but during convergence there is always a possibility that it deviates (temporarily). This I can understand. It means that there is no conflict between the reversibility of the equations and the irreversibility of thermodynamics. Thermodynamics is reversible too, but the probability of observing this is practically zero.</p>
<p>
The next chapter is called "The history of the histories of time". Here Ben-Naim writes rather extensive reviews of a number of books that gave (or, according to Ben-Naim, failed to give) the history of time. There are the two books by Hawking that I mentioned above, <em>A Brief History of Time</em> (1988) [B-N: Over 90% is irrelevant to the history of Time. This part is poorly written and most of it is incomprehensible for the lay reader. The remaining 10% relevant to Time is mostly meaningless and nonsensical.]; <em>A Briefer History of Time</em> (2005) [B-N: This is slightly better [...] mainly because it was cleared of most of the gibberish [...]. Specifically, the 10% [...] most "relevant to time" was eliminated, rendering the book totally irrelevant to the history of Time.]; the book <em>From Eternity to Here: The Quest for the Ultimate Theory of Time</em> (2010) by S. Carroll [B-N: I have never heard of, seen, or read any other book with such a high density of meaningless, silly, and nonsensical statements which are repeated again and again, from here to eternity...]; and <em>Did Time Begin? Will Time End?</em> (2010) by P.H. Frampton [B-N: Most of the book is not about these questions. [...]. In my view, science will never have answers to these questions. And if it will have answers, they will have <em>zero</em> effect on our lives.]. Ben-Naim goes through the chapters of these books and comments on many quotes that he disagrees with. I already gave some examples of quotations that Ben-Naim uses to express his disagreement. 
Here are some more to give an idea of how the book continues: "I got a total blackout from reading this paragraph" (p. 164); "The rest of the chapter is packed with meaningless, incomprehensible baloney" (p. 167); "I will not bother the reader with detailed discussion and criticism of all the nonsense written in this chapter" (p. 171); "Here the author goes from silly to sillier to the silliest statements" (p. 177); "Perhaps the most absurd of all the absurdities is found on page..." (p. 203).</p>
<p>
From reviews of the other books by Ben-Naim (which I did not read) it may be clearer how he defines entropy, but the summary he gives here (he refers to his other books for more details on about every page) does not make it very clear to the layman. As a mathematician I am used to much clearer definitions. After reading this book I do not feel that I now have a clear idea of what time (or Time) is. I do understand what is meant by history. What I do not understand is why he has to give his own brief biography and the history of the pen he uses to write in order to make clear what a history can be. He may have a point in his critique of some of the statements about time, or more particularly, as mentioned in the subtitle, <em>the misconstrued association between entropy and time</em>, but that does not justify everything in the first half of the book, and it does not justify the language used in the second half. That second part is just a bashing of "the authors of popular science books" and other jerks who do not really know what they are talking about. Proper scientific arguments are not really given, except for the fuzzy definitions and arguments offered for the layman in the first part. If you are looking for mathematical aspects of entropy, this is not the place to look.</p>
</div></div></div></div><div class="field field-name-field-review-reviewer field-type-text field-label-inline clearfix"><div class="field-label">Reviewer: </div><div class="field-items"><div class="field-item even">Adhemar Bultheel</div></div></div><div class="field field-name-field-review-desc field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even"><div class="tex2jax"><p>
It has become A. Ben-Naim's hobby horse in several of his recent books to emphasize that entropy as a measure of disorder makes no sense and that entropy should properly be defined as a special case of Shannon's measure of information (SMI). The basic idea of his vision is recalled, and then he makes clear that connecting entropy with the arrow of time is senseless. This culminates in a critical review of Hawking's book <em>A Brief History of Time</em> and its sequel <em>A Briefer History of Time</em> (to which his own title of course refers), along with two other popular science books, by S. Carroll and by P.H. Frampton, that also deal with time. </p>
</div></div></div></div><span class="vocabulary field field-name-field-review-author field-type-taxonomy-term-reference field-label-inline clearfix"><h2 class="field-label">Author: </h2><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/author/arieh-ben-naim" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Arieh Ben-Naim</a></li></ul></span><span class="vocabulary field field-name-field-review-publisher field-type-taxonomy-term-reference field-label-inline clearfix"><h2 class="field-label">Publisher: </h2><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/publisher/world-scientific" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">world scientific</a></li></ul></span><div class="field field-name-field-review-pub field-type-number-integer field-label-inline clearfix"><div class="field-label">Published: </div><div class="field-items"><div class="field-item even">2016</div></div></div><div class="field field-name-field-review-isbn field-type-text field-label-inline clearfix"><div class="field-label">ISBN: </div><div class="field-items"><div class="field-item even"> 978-981-4749-65-5 (pbk)</div></div></div><div class="field field-name-field-review-price field-type-text field-label-inline clearfix"><div class="field-label">Price: </div><div class="field-items"><div class="field-item even">£18.00 (pbk)</div></div></div><div class="field field-name-field-review-pages field-type-number-integer field-label-inline clearfix"><div class="field-label">Pages: </div><div class="field-items"><div class="field-item even">264</div></div></div><span class="vocabulary field field-name-field-review-class field-type-taxonomy-term-reference field-label-hidden"><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/imu/mathematical-physics" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">Mathematical 
Physics</a></li></ul></span><div class="field field-name-field-review-website field-type-text field-label-hidden"><div class="field-items"><div class="field-item even"><a href="http://www.worldscientific.com/worldscibooks/10.1142/9943" title="Link to web page">http://www.worldscientific.com/worldscibooks/10.1142/9943</a></div></div></div><span class="vocabulary field field-name-field-review-msc field-type-taxonomy-term-reference field-label-hidden"><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/msc/94-information-and-communication-circuits" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">94 Information and communication, circuits</a></li></ul></span><span class="vocabulary field field-name-field-review-msc-full field-type-taxonomy-term-reference field-label-hidden"><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/msc-full/94a17" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">94A17</a></li></ul></span><span class="vocabulary field field-name-field-review-msc-other field-type-taxonomy-term-reference field-label-hidden"><ul class="vocabulary-list"><li class="vocabulary-links field-item even"><a href="/msc-full/80-01" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">80-01</a></li><li class="vocabulary-links field-item odd"><a href="/msc-full/82b30" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">82B30</a></li><li class="vocabulary-links field-item even"><a href="/msc-full/82b35" typeof="skos:Concept" property="rdfs:label skos:prefLabel" datatype="">82B35</a></li></ul></span>
Sun, 29 May 2016 08:31:35 +0000 | Adhemar Bultheel | https://euro-math-soc.eu/review/briefest-history-time#comments