A Mind at Play: How Claude Shannon Invented the Information Age

According to Yuval Harari, dataism will be the new religion of Homo Deus, the successor of the current Homo sapiens. This Homo Deus will be only a small subsystem in a gigantic data-processing network. Whether this is where evolution will bring us by the end of the 21st century, well beyond the technological singularity, remains speculative. Nevertheless, it is blatantly clear that already today information and communication have become essential elements of our society. Whoever controls information and how it is communicated is in control of society. One of the people, if not the one, who made information and communication the subject of a whole new science is Claude Shannon (1916-2001). Surprisingly enough, until now no proper full biography of him existed. His collected works were published, and shorter biographies and obituaries did appear, but not a full-size biography like the current book.

Shannon had a master's degree in electrical engineering from MIT. At that time Boolean algebra was known and taught, but it was just formal logic belonging to a philosophy course, detached from any practical consideration. It was Shannon, working on Vannevar Bush's differential analyser at MIT, who made the link in his master's thesis. The machine had hundreds of switches, and Shannon saw the connection and used Boolean algebra to simplify the circuits. This connection may be obvious to us, but in those days circuit design was more an art than a science. His thesis was a big success. Bush immediately recognized Shannon's qualities and sent him to the Eugenics Record Office, where for his PhD he designed an algebra to describe the formation of chromosome pairs. This, however, turned out not to be very practical, and Shannon gave up all research in this direction.
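
Shannon's thesis insight can be illustrated with a small Python sketch (my own illustrative example, not one from the book or the thesis): switches in series act like a logical AND, switches in parallel like an OR, so an algebraic identity such as xy + xz = x(y + z) translates directly into a circuit with fewer switches, and a brute-force truth table confirms the two circuits behave identically.

```python
from itertools import product

def circuit_a(x, y, z):
    # Two parallel branches, each of two switches in series:
    # (x AND y) OR (x AND z) -- four switches in total.
    return (x and y) or (x and z)

def circuit_b(x, y, z):
    # The algebraically factored form: x AND (y OR z) -- three switches.
    return x and (y or z)

# Check equivalence over all 2^3 switch settings.
equivalent = all(
    circuit_a(x, y, z) == circuit_b(x, y, z)
    for x, y, z in product([False, True], repeat=3)
)
print(equivalent)  # True: the three-switch circuit does the same job
```

Enumerating all inputs works here because a circuit with n switches has only 2^n states; Shannon's contribution was that the simplification can be done symbolically, by algebra, without such enumeration.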

Shannon spent a summer visiting Bell Labs and got a grant for the Institute for Advanced Study in Princeton, an accumulation point for top-notch scientists from all over the world: Einstein, Weyl, von Neumann, Gödel, Oppenheimer,... Shannon was more a tinkering engineer than an abstract mathematician and did not feel at home in Princeton. That is why he later preferred the engineering spirit of AT&T Bell Labs, and the freedom to choose his own research projects, over a more secure but also more sterile academic career. In early 1940, while finalising his PhD, he married Norma Levor, but only for a short while: they divorced in late summer 1941, close to the end of his stay in Princeton. Norma claimed he suffered from depression. Then World War II reached the US and science switched to a war footing. Shannon got a position at Bell Labs, where he worked for the National Defense Research Committee, computing ballistic trajectories and doing cryptographic research in the SIGSALY project (a secure speech system based on Bell Labs' vocoder).

Although he made long days at Bell Labs, his true work was done at home after working hours. It was, of course, related to elements that showed up in his work at the Labs. Nyquist and Hartley had been the first to try to capture the information content of a message. It required the genius of Shannon, however, to add statistics to their ideas. Messages are often redundant. If a part (e.g. a letter or a word) can be predicted with high probability, its information content is low, while an unexpected element has a high information value. This idea is captured in the formula $H=-\sum_i p_i \log_2 p_i$ that Shannon came up with. In this formula $p_i$ is the probability of the $i$th symbol occurring. A new unit of information had to be invented, which in the binary case became the bit. When it was recognized that this formula corresponded to what is called entropy in physics, its realm became even bigger. Indeed, it did not only describe the information content of a message; it governed the whole physical reality we live in. Once the information content of the parts of a message is known, it becomes clear which parts can be removed without much harm to the information content of the message. Hence it is the basis for encoding: a message can be represented in a more compact form by removing redundancy, and this applies to any form of information that can be transformed into a string of bits: written text, audio, images, whatever. This is what made today's Internet possible. His paper was published in 1948, shortly followed by another one dealing with channel capacity, i.e., the maximal number of bits per second that can be reliably sent over a noisy channel: $C=B\log_2(1+S)$ ($B$ is the bandwidth and $S$ is the signal-to-noise ratio). Together with Warren Weaver, Shannon also published his results in book form as The Mathematical Theory of Communication in 1949, which gradually conquered the world and became a big success.
Shannon was not the best of marketeers, which is where Weaver came in. Although the essence is Shannon's finding, people referred (and sometimes still do) to the Shannon-Weaver theory.
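
The two formulas can be made concrete with a small Python sketch (the numerical examples are my own illustrations, not taken from the book):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: one full bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is partly predictable, so it carries less information;
# the difference is redundancy that a compressor can squeeze out.
print(entropy([0.9, 0.1]))   # ~0.469
# A certain outcome carries no information at all.
print(entropy([1.0]))        # 0.0

def capacity(bandwidth_hz, snr):
    """Channel capacity C = B * log2(1 + S), in bits per second,
    with B the bandwidth in Hz and S the (linear) signal-to-noise ratio."""
    return bandwidth_hz * math.log2(1 + snr)

# E.g. a 3 kHz telephone channel with a linear SNR of 1000 (about 30 dB):
print(capacity(3000, 1000))  # ~29900 bits per second
```

The entropy of a symbol source is the theoretical lower bound, in bits per symbol, that any lossless compression scheme can reach; the capacity formula gives the ceiling on reliable transmission rate over the noisy channel.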

These papers and his book established Shannon's fame, although not immediately. He had just turned 32. Doob, in those days the pope of probability theory, wrote a bad review of the book in Mathematical Reviews, reproaching the authors for a lack of mathematical rigour. Wiener also claimed to have developed the theory earlier in his book Cybernetics. Both were eventually proved wrong. Part of the delay was also due to the reticence of Shannon, the tinkerer par excellence, who was more interested in new challenges than in publishing papers. Anyway, Shannon was eventually recognized for his contributions and received invitations, prizes and solicitations. He preferred to stay at Bell Labs and did what he liked to do most: tinker, do some freewheeling research, and occasionally turn his tinkering results into papers. Most famous are Theseus, the mechanical mouse that could learn to find its way out of a maze, and his useless machine (a box with a switch on top; when the switch was turned on, a mechanical hand appeared from the box to turn the switch off again and disappeared back into the box). Nevertheless, such playful experiments led to meaningful research and papers on artificial intelligence.

Meanwhile, Shannon remarried in 1949. With his wife Betty he formed a happy couple for the rest of his life; they had three children. In 1959 MIT made him an offer that could not be refused, and Shannon became, after all, a university professor, but with great freedom in teaching and research. He also loved juggling and riding a unicycle. He even prepared a paper on juggling, though it never got published. During the 1980s the first signs of Alzheimer's disease showed. He died in 2001.

The authors of this book are not mathematicians or engineers, as they admit in their acknowledgements. This shows a bit, because Shannon's work after his breakthrough of 1948-49 is covered only superficially. They did, however, do a very good job of explaining what Shannon did before then and how it related to his main achievements, and they explain quite well the meaning of the two formulas I mentioned above. Of course, exposing the roots of information theory is the most important reason why someone would care to write a biography of Claude Shannon at all. They are, moreover, good biographers, and so we get a short biographical sketch of about everyone who is introduced as being related, and hence possibly influential, to Shannon. They conducted interviews with first-hand witnesses and family members still alive. They may also have some literary aspirations. I liked the account of the telegraph cable across the Atlantic connecting the two continents in 1858. Another example: at the end of the book there is a set of pictures. One of them shows the young Shannon next to a Piper Cub during his student days at MIT, when he was trained as a pilot. His instructor didn't want him at first "because his brain was too valuable to risk", but the president allowed him to take the lessons. The authors write in this context that his flights with "cheap propeller crafts, blades buzzing like an overgrown wasp" always brought him down safely. This description of the propellers doesn't add much to the biography of Shannon, but it is these small additions that make the book anything but a dull enumeration of facts and events. This is clearly a biography written for the general public. This is also how professional mathematicians, engineers or historians should read it: not for the mathematics, and not to acquire additional precise historical facts and dates.
A somewhat more technical exposition of the interplay between Boole's and Shannon's work can be found in Paul Nahin's The Logician and the Engineer: How George Boole and Claude Shannon Created the Information Age (2012).

What we learn most from this biography is how Shannon was as a person: a tinkerer and a loner who preferred to work with his door closed, but kind and patient if one cared to enter. These are the descriptions that prevail throughout the book. Clearly, looking for the solution of a puzzle was an inquisitive game for Shannon, a game he preferred to play on his own and that he liked as much as playing the clarinet. He was not the only one. Feynman was similar, although much less of a loner, and he played the bongos instead. And there is John Conway, who liked to hop from one mathematical topic to another as if in a toy shop. He constructed mathematical polytopes that hung from the ceiling of his office, and just like Shannon's, his administration was hopeless, with incoming correspondence disappearing into a black hole. So the title of this book is well chosen. The title of Siobhan Roberts' biography of Conway sounds similar: Genius at Play: The Curious Mind of John Horton Conway (2015). However, not so much is said about Shannon's family, except that after his father died, he broke with his mother and kept only some contact with his older sister. Not much is said about family life with Betty and the children either, except that Betty was his sounding board and actually corrected his papers. There is an extensive list of notes and a bibliography, but perhaps a timeline would have helped. Sometimes the account of his work obscures a bit the precise sequence of events. This is a recurring problem in biographies: keeping coherence in explaining a scientific idea requires spanning several phases of the subject's life, which may force the author to give up the exact sequence of events. Many things happen at the same time in a lifetime. All in all, a very readable and human biography that I enjoyed reading very much.

Adhemar Bultheel
Book details

This is the first full-size biography of Claude Shannon, who with his two seminal papers, published in 1948, founded information theory. The book is written for a general public; no mathematical knowledge is required. Some of the work is explained though: Boolean algebra, some elements of cryptography and, of course, Shannon's entropy formula. The emphasis lies on Shannon as a tinkerer and a loner. He was an (electrical) engineer (his master's degree) much more than he was a mathematician (his PhD).

978-1476766683 (hbk)
$27.00 (hbk)