Ever since Zeno formulated his well-known paradoxes, there has been a tug of war among mathematicians and philosophers about whether continuous mathematics is an approximation of a discrete reality, or whether reality is continuous and hence properly described by continuous mathematics, which in practice can only be approached computationally by discretized approximations. Does quantum physics not suggest that spacetime consists of quanta and hence is not continuous? Is it possible, or will it ever be possible, to prove or disprove the discrete or continuous nature of reality? So far we have only been able to verify experimentally down to scales of order $10^{-16}$ that we can still subdivide, but that is still infinitely far away from true continuity. We have built the Large Hadron Collider to probe the smallest particles, but it would take many orders of magnitude more energy to reach the Planck length of $1.6\times10^{-33}$ cm. And that is beyond reach unless we can capture all the energy of a few galaxies. The only alternative is to hope to extract information from cosmological observations echoing states of the universe at a very early stage, when such amounts of energy were still available in condensed form.
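For reference (this is textbook physics, not spelled out in the review), the Planck length is the unique combination of the fundamental constants $\hbar$, $G$ and $c$ with the dimension of a length:

```latex
% Planck length from dimensional analysis of hbar, G, c
\ell_P = \sqrt{\frac{\hbar G}{c^3}}
       \approx 1.6\times 10^{-35}\ \text{m}
       = 1.6\times 10^{-33}\ \text{cm}.
```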

So Amit Hagar sets out on a quest for the fundamental length in physics, if there is one. Because the description of spacetime has so far worked perfectly well with the continuous model, there is, quite understandably, still great resistance against a discrete alternative, with only a few exceptions. See for example Max Tegmark's plea for a mathematical universe in Our Mathematical Universe. Hagar too is convinced that there is some fundamental length; a discrete model has, for example, the great advantage of avoiding the singularities of the current continuous worldview.

The book starts with a historical survey of the mathematical arguments used for and against spatial discreteness. The arguments assume that the two visions are mutually exclusive and that only one can lead to a consistent geometry. An analysis of Zeno's paradoxes by Adolf Grünbaum, who criticized the views of Whitehead and Russell, leads to the conclusion that if we want to keep countable additivity, then a line segment must have $\aleph_1$ elements; if we go for the discrete model, i.e. allow line segments with $\aleph_0$ elements, then we have to give up countable additivity. However, Hagar objects that such arguments deal with the mathematics itself, not with the applicability of mathematics to the real world. Other attempts have been made to construct a geometry on a discrete line segment. For example, Weyl argued that if the shortest length is the distance between neighboring points, then in a 4 by 4 square the diagonal would have the same length as the side, since both cross 4 tiles, and that violates Euclidean geometry. Again, the objection against this argument is that it only shows that a discrete geometry cannot be Euclidean. In the information age, the Church–Turing thesis relied on a discrete Turing machine that could approximate a continuum to any desired accuracy, so it is generally accepted that a physical Turing machine would be able to describe the real world with any desired accuracy. In fact, that is what applied mathematicians do: they compute solutions good enough for any practical situation, even if pure mathematics predicts only a continuous solution.
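Weyl's tile argument can be sketched as follows (a standard reconstruction, not taken verbatim from the book). In an $n\times n$ tiling with tile side $\ell$, the diagonal crosses exactly $n$ tiles, so counting tiles assigns it the length

```latex
% tile-counting length of the diagonal vs. the Euclidean length
\ell_{\text{diag}} = n\ell = \ell_{\text{side}},
\qquad\text{whereas Euclid requires}\qquad
\ell_{\text{diag}} = \sqrt{2}\,n\ell .
```

The discrepancy factor $\sqrt{2}$ does not shrink as $n\to\infty$, which is why the argument is taken to rule out a discrete Euclidean geometry rather than discreteness as such.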

In this way Hagar continues defusing all the mathematical arguments used against a discrete universe, and he goes on to dismantle the arguments used at a more general philosophical level. A finitist viewpoint would, for example, downgrade metaphysics to epistemology, i.e., what is depends on what we know. That would simplify many discussions. However, this viewpoint creates a problem if one wants to distinguish classical from quantum probabilities. This can be resolved by defining an appropriate measure, which brings a new, more natural interpretation to the thought experiment that motivated quantum physics.

From here Hagar turns to (quantum) physics. Almost by definition, quantum physics is discrete, and the renormalization program is a collection of techniques to deal with the infinities and singularities that inevitably arise when discrete quantum physics describes dynamics assumed to be continuous, i.e., dynamics in a continuous spacetime. Originally renormalization was used in quantum electrodynamics (QED), in which relativity theory and quantum mechanics were integrated. In this context we also see the notion of a fundamental length appear for the first time; it was assumed to be of the order of the electron radius. Heisenberg's uncertainty relation also imposed a finite resolution on simultaneous measurements of position and momentum, which even led Heisenberg to speculate about a lattice world. The more general quantum field theory (QFT) considers particles as excited states of an underlying field; QED, for example, considers only one electron field and one photon field. However, QED is not a candidate to describe full reality and shed light on the problem at hand, since it does not include gravity. Thus one must move a step further up and put one's hope in the theory of quantum gravity (QG), which is still under construction.
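The two scales alluded to above are standard physics, quoted here for orientation (the book itself stays largely formula-free): the uncertainty relation bounds the joint resolution of position and momentum, and the classical electron radius sets the scale at which early QED expected a fundamental length:

```latex
% Heisenberg uncertainty relation and classical electron radius
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
r_e = \frac{e^2}{4\pi\varepsilon_0\, m_e c^2}
    \approx 2.8\times 10^{-13}\ \text{cm}.
```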

So Hagar continues by sketching the history of quantum gravity, in which mathematicians and physicists join forces to incorporate gravity into quantum field theory. Discretization of gravity may lead to a fundamental length. Completing a theory of quantum gravity is still a matter of searching in the dark in the absence of experimental data. However, accepting that there is some fundamental length, hence a discrete reality, may help the development of the theory. This can be explained as follows. Hagar's historical description of quantum gravity includes a correspondence between Einstein and W.F.G. Swann, from which Hagar gets the inspiration for his "thesis L". By this thesis he means that it is possible for the dynamics of some postulated (discrete) building blocks to be consistent with observable spacetime. In fact, this thesis can serve to guide the design of these building blocks, their symmetry groups, and the proper metric. Hagar continues by proving his thesis under the assumption that some discrete system with a fundamental length (or area, or volume) exists. However, the problems caused by conflicts with relativistic causality, locality, unitarity, and Lorentz invariance are not yet resolved. So this is the current challenge for quantum gravity theorists: to construct a model that solves these problems at the small scale while remaining consistent with what the classical approach can do at other scales. In a final chapter, the whole argumentation is summarized in the form of questions and answers.

This book is rather philosophical and definitely concerns a metaphysical problem, but the discussion operates more at a metascience level: how can science, and in particular quantum gravity with all the mathematics it involves, be advanced by a discrete worldview, and how can this theory eventually lead to an invitation for experimental verification of the existence of a fundamental length? It should be clear, however, that the book is not about mathematics or quantum physics; the text is only occasionally interrupted by a formula. It is certainly of interest to mathematicians and physicists working on quantum gravity theory, but it is not the place to learn about that topic. Although of general interest, it is not easy reading at all: it is a philosophical discussion of the foundations of modern physics placed in a historical context. Prior training in advanced mathematics, and certainly in quantum physics, is required.