Today nearly everybody can handle a computer, yet many have no idea what computer science is about, just as some think that a mathematician must be good at mental arithmetic and vice versa. So if the grandson is a computer whiz who is very good at solving grandma's computer problems, she concludes he should become a computer scientist. To judge whether this is true, you should know what computer science really is about. This book gives some insight.
A disrespectful and somewhat inaccurate alternative title for this book could be Computer science for dummies, and MacCormick is very good at explaining to grandma what computer scientists have done for her, even if she did not care too much before. Everything is kept at a very elementary level: no mathematics, no unexplained technical terms, and every concept is introduced by analogy with a non-technical everyday situation. The text reads as if you are listening to MacCormick giving a Christmas lecture.
MacCormick has chosen nine algorithms (although nine-ish is more appropriate, since the precise number nine is disputable) to bring his message. Why these algorithms? This is explained in the introduction: they are algorithms that everybody uses, that solve concrete problems, and that realize some abstract idea. I believe these arguments are a bit shaky, because in my view they apply to many other algorithms as well. I suspect the criteria were written ad hoc, as an introduction most often is. Of course he also has to explain what an algorithm is and what makes an algorithm `great'. The latter is simple: the great ones gave MacCormick a personal `aha'-experience when the key idea was exposed (which he calls a `trick', i.e. the thing-that-does-the-trick). That does explain the selection, because there is no arguing against personal preference. But whatever the reason, the selected algorithms solve problems that everyone is familiar with, even though most people have never thought about the mechanisms behind them, and MacCormick is good at explaining the basics in a very accessible way.
So here is what has been selected. Chapters 2 and 3 deal with search engines for the WWW: first the indexing of web pages that made AltaVista great, and then the ranking mechanism used by Google. Mind you, although PageRank basically solves a huge eigenvalue problem, nothing of that sort is ever mentioned; instead the basics of hyperlinks, authorities and the random surfer are neatly explained with a toy problem in which people refer to a page giving a recipe for scrambled eggs. This kind of remark also applies to the other chapters. For example, the next one explains public key cryptography, and although there is pretty hard mathematics behind it, involving prime numbers, elliptic curves, etc., the idea is explained by publicly mixing paints, which cannot be unmixed: two people can agree on a common colour, each keeping one ingredient private, without observers being able to find out what their common colour is. Paints become numbers only at a later stage of the chapter.

Other mechanisms are introduced similarly in the subsequent chapters: error correcting codes, pattern recognition (e.g., automatic reading of ZIP codes) via decision trees and neural nets, and data compression (lossless encoding and JPEG are discussed, but words like `Fourier' or `wavelet' are carefully avoided). A somewhat unexpected subject for me is the one on databases. That chapter explains how databases are protected against crashes or unfinished operations and how replicas are kept synchronized, avoiding inconsistencies at all cost. Digital signatures, on the other hand, are a more obvious choice.
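The paint-mixing story has a direct numeric counterpart, which the book reaches later in the chapter. As a minimal sketch (the tiny numbers and variable names are my own illustration, not the book's), mixing paints corresponds to modular exponentiation: easy to perform, but practically impossible to undo.

```python
# Toy Diffie-Hellman key exchange, the numeric version of paint mixing.
# These small numbers are illustrative only; real systems use huge primes.
p, g = 23, 5          # the public "base paint": a prime modulus and a generator

a = 6                 # Alice's private "secret ingredient"
b = 15                # Bob's private "secret ingredient"

A = pow(g, a, p)      # Alice's public mixture: g^a mod p
B = pow(g, b, p)      # Bob's public mixture: g^b mod p

# Each adds their own secret ingredient to the other's public mixture:
shared_alice = pow(B, a, p)   # (g^b)^a mod p
shared_bob   = pow(A, b, p)   # (g^a)^b mod p

assert shared_alice == shared_bob   # the same "common colour"
print(shared_alice)                 # -> 2
```

An eavesdropper sees p, g, A and B, but recovering a or b from them is the discrete logarithm problem, the numeric analogue of unmixing paint.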
All these algorithms are tacitly used on a daily basis by millions of people whenever they touch a keyboard or a screen, often without realizing what is happening on the other side of that screen. The next chapter, however, is somewhat off the main track. What MacCormick tries to do is explain undecidability: the impossibility of constructing an algorithm that always gives a correct yes-no answer. He starts by convincing the reader that any program on a computer can run on any file, and strongly encourages the reader to try it out. I hope that the experiments of the grandmas following this advice will not damage their computers so badly that their grandsons cannot restore the original operational mode. Anyway, the author hypothetically constructs a sequence of programs that read programs as input and give a yes-no answer (explaining along the way what a proof by contradiction is). He can thus show the reader that it is impossible to write a crash-detecting program, since it produces a contradiction when it uses itself as input. This brings the reader who manages to finish the chapter up to the halting problem, the Church-Turing thesis and the related philosophical considerations.
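The contradiction at the heart of the chapter can be sketched in a few lines of code. This is my own illustration of the proof-by-contradiction style, not the book's text: all names here are hypothetical. Given any claimed crash detector, we build a program that does the opposite of whatever the detector predicts about it, so no detector can be right about every program.

```python
def troublemaker(detector):
    """Ask the claimed crash detector about troublemaker run on its own
    source, then deliberately do the opposite of what it predicts."""
    if detector(troublemaker, troublemaker):
        return "runs fine"                # detector said "crashes" -> don't
    raise RuntimeError("crash!")          # detector said "runs fine" -> crash

# Two candidate detectors; both are proved wrong by troublemaker:
def optimist(program, data):
    return False     # claims no program ever crashes

def pessimist(program, data):
    return True      # claims every program crashes

print(troublemaker(pessimist))   # pessimist predicted a crash; it runs fine
# troublemaker(optimist) would raise, refuting the optimist's prediction
```

Whatever answer a candidate detector gives, `troublemaker` behaves the other way, which is exactly the self-referential trap the chapter sets up before heading to the halting problem.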
Whether or not you agree with the choice of algorithms, they form a good set of samples of what made some ICT applications a success. The world would be different if they had not been there. MacCormick did an excellent job in explaining the basic ingredients without requiring any programming skills. Anyone who is used to working with a computer is able to grasp the ideas. His afterthoughts about what the great algorithms of the next generation or century will be are a bit thin, but nobody really knows what the future holds. If we ever get quantum computing to work, we shall have to rethink almost everything. This brings us back to the title of the book. The algorithms discussed have certainly influenced current computing, but whether they will change the future is not certain.
Thus be warned: no mathematics, no Computer Science with capitals, but easy reading for everyone from 9 to 99. If you are a computer scientist yourself, you might find ideas about how to explain things, or you might find this book an excellent present for grandma, so that you don't have to do the explaining yourself. You could hardly do better.