# The Great Formal Machinery Works

People sometimes identify computers with their hardware: chips ever decreasing in size, down to the level of nanostructures. But computer science is much more about the software that runs on this technology. Since the first digital computers emerged around the middle of the twentieth century, computer science has grown into a mature science requiring rigorous and rather abstract approaches to solve highly challenging problems in, for example, computer graphics, computational complexity, formal languages, artificial intelligence, program verification, cyber security, and many more. Maturity shows when a theoretical level exists above the more experimental approach. In many of the applications just mentioned, one recognizes in their scientific analysis the same deductive formalism that mathematicians are used to, and in these respects computer science can certainly be considered a branch of (applied) mathematics. It is moreover almost certain that von Neumann, Turing, and the other founders of the early computer age would not have developed their fundamental ideas if the time had not been ripe for them. The way scientists were then thinking about the foundations of mathematics and logic, and their use of formalism in deducing conclusions, were essential elements that triggered mathematicians and engineers to become the computer pioneers who made it all happen.

Indeed, computers do not think; they need a strict formalism to describe the algorithms that allow us to design compilers, to extract information from big data, to search the web, to render computer-animated movies, and much more. The title of the book is therefore most appropriate: the great formal machinery works because of the preparatory work on the theories of deduction, logic, and the foundations of mathematics as they stood at the start of the digital age. In this book the philosopher von Plato sketches in some detail how the concept of a formal proof emerged after a long history of evolution and competing opinions. The transition from the deductive logic and syllogistic proofs of Aristotle's time to the logicism of Frege and Russell, and the subsequent natural deduction, did not come overnight but followed a meandering path, sometimes marked by great ideological controversies.

After introducing the Aristotelian model, the book immediately jumps to the nineteenth century. Two events then started to revolutionize the proof techniques from Greek antiquity that had dominated mathematics for centuries. One is the discovery of non-Euclidean geometry (E. Beltrami, 1868), and the other is the algebraic technique used to formalize calculus and the definition of the natural numbers (H. Grassmann, 1861). Grassmann gave recursive definitions of arithmetic on the natural numbers, which later led to the Peano axioms (1889). Still later, Dedekind added the real numbers to the system. Hilbert was the one to champion axiomatic geometry and push it to the forefront. With the formalization of mathematics came the formalization of logical deduction. Boole's algebra of logic (1847), which would later reduce the description of complex circuitry to simple algebraic computations, corresponds to the valuation calculus of a propositional system. E. Schröder (1890) introduced an order relation, and this was picked up by T. Skolem, who brought lattice theory to logic. But the true father of formal logic is Gottlob Frege. Von Plato makes it very clear that no exegesis of Frege's work can be taken seriously if it does not recognize that his greatest contribution is the inference to generality (he essentially introduced quantifiers), although this was only properly recognized by B. Russell in the *Principia Mathematica* (1910). Russell's paradox, on the other hand, revealed the inconsistency of Frege's foundational system for mathematics. The *Principia* is of course a summit of logical formalism and a direct consequence of Frege's work, adding quantifiers to Peano's logic.
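Grassmann's recursion equations, later absorbed into the Peano axioms, can be sketched in a few lines of Python. This is a modern illustration of mine, not anything taken from the book: addition and multiplication are defined purely by recursion on the successor structure of the naturals.

```python
def add(m, n):
    """Grassmann/Peano-style addition:
    m + 0 = m,  m + succ(n) = succ(m + n)."""
    if n == 0:
        return m
    return add(m, n - 1) + 1  # read n as succ(n - 1)

def mul(m, n):
    """Multiplication reduced to addition:
    m * 0 = 0,  m * succ(n) = (m * n) + m."""
    if n == 0:
        return 0
    return add(mul(m, n - 1), m)
```

Every value is computed in a finite number of successor steps, which is exactly the finitist flavour that Skolem would later insist on.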

Following Kronecker, Skolem was the twentieth-century promoter of finitism: something only exists if it can be decided in a finite number of steps. In this spirit, Skolem rejected the use of quantifiers and introduced recursion instead. Intuitionism was introduced into mathematics and logic by Brouwer and his student Heyting in Holland. But David Hilbert in Göttingen was a leading mathematical figure with many followers in his slipstream. Göttingen had a research tradition in the foundations of mathematics and logic, and Hilbert was strongly opposed to Brouwer's constructivism. The German approach was explained in several books by Hilbert and Ackermann and by Hilbert and Bernays, of which von Plato gives a thorough discussion, even more thorough than his treatment in previous chapters of other publications that were important for the swings of the historical flow. Similarly, he discusses the work of Gödel, who, among other things, proved the completeness of predicate logic and of course his famous incompleteness theorems (1930–31), which had a tremendous impact on the community at the time. In the remaining chapters von Plato discusses the work of Gentzen, who introduced a system of natural deduction and whose sequent calculus formed the basis of modern proof analysis and proof search. Both of these contributions are again explained in detail. A final chapter is an in-depth discussion of an intuitionistic consistency proof of classical arithmetic as originally conceived by Gentzen (1934).
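To give a flavour of Gentzen's systems (my illustration, not an example from the book): in natural deduction the theorem $A \to (B \to A)$ is derived by two applications of implication introduction, each discharging a bracketed assumption,

$$
\frac{\dfrac{[A]^{2}}{B \to A}\;{\to}I^{1}}{A \to (B \to A)}\;{\to}I^{2}
$$

where ${\to}I^{1}$ discharges the (vacuous) assumption $B$ and ${\to}I^{2}$ discharges $A$. In the sequent calculus the same theorem appears as the trivially derivable sequent $A, B \vdash A$ followed by two right implication rules, giving $A \vdash B \to A$ and then $\vdash A \to (B \to A)$. It is this explicit rule-by-rule bookkeeping that makes Gentzen's calculi suitable for proof analysis and proof search.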

In this review, I have spent quite a few lines on the historical events that led up to the work of Gödel, but I do not want to leave the wrong impression that this is the main part of the book. On the contrary, von Plato only arrives at the ideas of Hilbert and his co-workers about halfway through. It is clear that with this book, which is based on his lecture notes, he wants to push students towards a study of the primary sources. He discusses them in depth, lifting out those elements that are important for this historical line of thought, with references to pages, paragraphs, and formulas. He also uses the formal notation of the sources themselves, rather than translating everything into a uniform modern standard notation. Readers should therefore have a strong interest in, and preferably a certain education in, formal logic and/or the foundations of mathematics, and perhaps some philosophical interest as well. I am not aware of any publication that covers this particular subject in the depth it is given here; the book definitely fills a gap. Even though it is based mostly on primary sources, it is still quite readable, never letting the quotations dominate and obscure the historical trail that leads to the consistency proof in the final chapter. On the other hand, it is a research text focusing on the foundations of mathematics and logic. The sources used are historic, but it is not just a history book either: we are not informed about biographical data of the mathematicians or the unfortunate fate of mathematics in Göttingen during the Nazi regime. The fact that Gentzen collaborated is utterly irrelevant to his scientific achievements. So you probably will not want to read this if you are looking for easy reading such as a popular history or science book.
I should also warn computer scientists that this book is about the foundations that led to computer science and that are still relevant in some of the more abstract branches of theoretical computer science; it is not about computer science in an engineering or applied sense. There is no cryptography, computer graphics, or networking, and not even the P vs NP problem. Turing, for example, is present, but only as a bit player while others take the main stage.

**Submitted by Adhemar Bultheel**

**20 / Sep / 2017**