This is not really a mathematics book. Its subtitle is Ten Sketches of Computer Science, so it is more about algorithms and computers than about mathematics. The idea is the same as in many books popularizing mathematics: the author explains, with elementary examples, what computer scientists do. Just as the mathematical analogues often take numbers or history as a theme to hang the mathematics on, the theme of this book is (book) publication and text processing in the digital era.
Even though computer science is not mathematics, it is an applied science, and like most applied sciences it relies heavily on mathematics. Sometimes the mathematics is really simple (as in this book), but in cryptography, the analysis of big data, artificial intelligence, numerical analysis, and other computer science problems, it is not always so simple and requires answering deep and challenging mathematical questions. Turing, after all, worked on the foundations of mathematics and Gödel's theorems, was involved in the hardware of the electro-mechanical machine that decrypted Enigma traffic, and with his Turing machine laid at least part of the foundations of computers and computer science.
What mathematics can be involved in printing and publishing? Well, more than you might think. For example, positioning text on a page (or a screen) requires describing the location of every point (or pixel). Such a coordinate system is essentially what Descartes and Fermat introduced when they married geometry and algebra to create analytic geometry. No matter how high your printing resolution is, in practice there will always be a finite number of points that do or do not receive printing ink (or pixels on a screen that are colored). Thus, when you zoom in deep enough, you will notice that a straight line is made up of a collection of dots, and these form only a jagged approximation of what is perceived on a macro scale as a straight line. Finding out which dots need ink to best form this approximation is not completely trivial. The same of course holds for characters, or for any edge in a black-and-white image, whether it represents text or an illustration.
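The review does not name the algorithm, but the classic way to decide which dots best approximate a straight line is Bresenham's line algorithm, which needs only integer arithmetic. A minimal sketch (my illustration, not code from the book):

```python
def bresenham(x0, y0, x1, y1):
    """Return the grid points approximating the segment (x0,y0)-(x1,y1).

    Integer-only error accumulation decides, at each step, whether
    to advance in x, in y, or in both.
    """
    dx = abs(x1 - x0)
    dy = -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    points = []
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:  # step in x
            err += dy
            x0 += sx
        if e2 <= dx:  # step in y
            err += dx
            y0 += sy
    return points
```

For a shallow segment such as (0, 0) to (3, 1) this yields the jagged staircase of dots described above: (0, 0), (1, 0), (2, 1), (3, 1).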
Characters come in many different shapes and can be defined by giving the boundaries between black and white in the form of piecewise smooth curves. Donald Knuth not only gave TeX to the mathematical and scientific community, but also designed Metafont, which allows one to design fonts (Computer Modern is the standard font family used in TeX). Did you know that successive releases of TeX are numbered using the decimal expansion of pi (3.14159...) and releases of Metafont using the expansion of Euler's number e (2.7182...)? Metafont is a programming language, and in principle it is perfectly possible to solve a system of linear equations in its syntax. Metafont uses cubic splines to outline the closed boundaries of a character, hence defining an `inside' and an `outside', the `inside' being the black region forming the character that we should be able to recognize. In this book Whitington explains how a character outline can be described using Bézier curves hooked onto control points. The mathematical formulas are not given, but the idea is easily explained graphically. The chapters at the end of the book also discuss the different font families, with their carefully designed sizes, faces, and types, as well as the algorithms needed to put lines, paragraphs, and pages together: taking care of the spacing between letters and words, splitting words at the end of a line, and avoiding dangling lines at the bottom (orphans) or at the top (widows) of a page. All these issues are dealt with in TeX (although they are discussed here independently of that particular system).
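To make the graphical idea concrete: a cubic Bézier curve is determined by four control points, and a point on it can be computed by repeated linear interpolation (de Casteljau's algorithm). The following is my own minimal sketch, not code from the book:

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]
    by de Casteljau's repeated linear interpolation."""
    def lerp(a, b, t):
        return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
    # First level: interpolate between consecutive control points.
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    # Second level, then the final point on the curve.
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)
```

The curve starts at p0 (t = 0), ends at p3 (t = 1), and is pulled toward the two interior control points in between; a font outline is a closed chain of such segments.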
Other chapters require less, or at least much more elementary, mathematics. For example, it is explained how characters can be represented by numbers, as in the ASCII standard of the 1960s. A (simple) algorithm is described to look up a word in a text (the search or find function of your favorite text editor), and there is a description of input instruments such as the typewriter and the computer keyboard. (Do you know how to input Chinese characters on a Western keyboard?) Somewhat more mathematical (although given a very elementary explanation here) is the compression of data using Huffman coding. Of course this is only the last step in a compression protocol such as JPEG, which makes use of Fourier analysis, wavelets, or other advanced techniques; given the level of this book, these are not discussed. Parsing algorithms and context-free grammars are introduced and illustrated with the evaluation of a mathematical expression, together with an elementary sorting algorithm. There is also a chapter on techniques for obtaining gray-scale images, from etching and engraving to the dithering variants that are their modern digital analogues.
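The Huffman idea can be shown in a few lines (again my own sketch, not the book's code): count symbol frequencies, repeatedly merge the two least frequent subtrees, and read the codes off the resulting tree, so that frequent symbols get short bit strings:

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Return a dict mapping each symbol of text to its Huffman bit string."""
    freq = Counter(text)
    # Heap entries: (frequency, tiebreak counter, tree);
    # a tree is either a symbol or a (left, right) pair.
    heap = [(f, i, s) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate one-symbol text
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes
```

For the text "aaaabbc", for instance, the frequent symbol `a` receives a one-bit code while `b` and `c` each receive two bits; because the codes label leaves of one tree, no code is a prefix of another, so the bit stream can be decoded unambiguously.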
So this brief summary illustrates that the book is more about introducing the reader to computer science concepts (and even these at a fairly elementary level) than about the mathematics underlying them. The chapters typically end with some exercises at the same elementary level as the text; solutions are given at the end of the book. One may get an idea of what an algorithm is, or of what kinds of algorithms are at work when one edits a text, but the discussion of most topics is too shallow to bring the uninitiated reader beyond a vague idea. It might be an idea for an author who wants to popularize mathematics to write a book on the silent mathematical aspects of what is discussed in this one.