The Computing Universe: A Journey through a Revolution
Although several mathematicians contributed to the birth of computers and computer science, the field has since grown into a mature discipline, the result of a very productive collaboration between scientists developing the hardware and the software. Nowadays, computers, and hence computer science, have a global impact on society, so it is important to have an idea of what computer science really stands for. Many, however, have a fuzzy or even wrong idea of what it means. It is not a matter of sitting behind a computer all day long, nor is a smart hacker automatically a good computer scientist. Computer science is so much more than skillful programming. Add to this that the science draws from many scientific disciplines, and it is obvious that it is impossible to cover every aspect in just one book. The authors nevertheless did a marvelous job of sketching how, in such a short time, the discipline has grown from some abstract ideas and electromechanical devices into a science developing the software that masters global social networks, running on hardware that nowadays relies on nanotechnology. Many of the ICT buzzwords are placed in their historical context and their meaning is clearly explained, avoiding all the technical ballast.
Certainly in the beginning (and the history is not that old) one was a computer scientist only as a secondary activity. Educated as a mathematician, an engineer, or a physicist, you could shift your career towards computer science. So we find in this book many mathematicians, engineers, and physicists who contributed to the field (the name index has over 650 names). I therefore somewhat disagree with the statement in the introduction that "It is curious that schoolchildren are taught the names and achievements of great mathematicians, physicists, chemists, and biologists but not about the great computer pioneers". I believe that names like Steve Jobs, Mark Zuckerberg, Bill Gates, and Larry Page are nowadays pretty well known. You may think of them more as smart businessmen than as computer scientists, but they certainly helped develop the science. Names like Alan Turing, John von Neumann, Konrad Zuse, Ada Lovelace, and other pioneers who were computer scientists avant la lettre may also be pretty well known. It is one of the purposes of this book to introduce many more names beyond these.
Because the book is so broad, giving a detailed summary of the contents is almost impossible. The first chapters more or less follow the historical development. In about 150 pages you are brought from 1930 to the first personal computers of the late seventies. Along the way you meet the pioneers, the first electronic computers, Boolean algebra, circuits, programming languages, algorithms, Moore's law, etc. The second half of the book is organized more by topic, the chronology becoming somewhat fuzzy because many things happened simultaneously. Such topics include computer games, networks, the World Wide Web, hacking and cryptography, artificial intelligence, neural nets, and natural language processing. The last three chapters give a sneak preview of the expected future. Miniaturization is hitting the wall of nanoscience, and while quantum computing is still in the running, DNA computing is emerging. On the applications side, we have assimilated the simulation age of the 1950s and the communication era of the 1980s and have now entered the embodiment age. The latter entails dealing with uncertainties, robots, the Internet of Things, the mind-body problem, etc. Here we are on the edge of science fiction, which is the subject of the last chapter: the computer, robotics, and advanced artificial intelligence, as described in many science fiction books.
The structure of the book is not linear. There are the successive chapters, where the text is interrupted by many colourful illustrations, but the pages also have a wide margin (making the book somewhat wider than a usual textbook) with extra illustrations, sometimes accompanied by brief additional information. Each chapter ends with a list of the concepts that were discussed. Besides these chapters there are also several additional "frames". These discuss topics somehow related to the chapter, but they can be read independently. Their length varies from half a page to seven or eight pages. In that sense it is a bit of a coffee-table book that you can pick up for the pictures or to read some of the shorter comments that come with the illustrations. Of course, you then lose the context and the fabric. To go more systematically through a certain topic (for example early history, the Internet, artificial intelligence, ...), there is an appendix explaining what to read if you are interested in only one of these. There is also a list of books for further reading, organized by chapter. The name index is impressive, as mentioned before. The separate subject index is essential to recover a specific item in this sea of information.
The book is really a pleasure to read. It is the broadest possible introduction to computer science for the layman one can imagine. The average computer user is not interested in the top-notch technical details, but rather in what these nerdy terms really mean and why he or she is annoyed by certain features. Placing them in their historical context is the best way to explain and understand how they came about. You learn about Gödel's incompleteness theorems, yes, but you also learn about the origin of the Ctrl-Alt-Del key combination and of the 'Error 404 File not found' message, and you are told who invented emoticons. Besides the many illustrations, including cartoons and even plain jokes, the text is interlaced with many historical quotes, some of which are quite amusing. I cannot resist repeating here the following quotes by the Dutch computer scientist Edsger Dijkstra:
"The use of COBOL cripples the mind; its teaching should therefore be regarded as a criminal offense."
"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."