This book is a nice introductory text for a modern course on the basic facts of the theory of probability and information. The author always bears in mind that many students, especially those specializing in informatics and/or the technical sciences, often do not have a firm background in traditional mathematics. He therefore attempts to keep the development of the material gently paced and user-friendly. The volume presents a systematic development of probability and information, much of which would be suitable for a two-semester undergraduate course. This would then provide a background for further courses in probability and statistics, from both a pure and an applied point of view.

Some aspects of the subject that are particularly stressed in the volume are: (i) a thorough discussion of the foundations of probability, which are polarized between the “Bayesian” and “frequentist” views; (ii) the classical approach, in which equally likely events are automatically assigned equal probabilities, together with a principle of maximum entropy originating in statistical mechanics; (iii) the use of Boolean algebras (quite natural nowadays for all students of informatics) instead of Kolmogorov's σ-algebras; notice that this restriction to finite additivity is made for purely pedagogical, not ideological, reasons; and (iv) the use of Bernoulli (alternative) random variables as basic building blocks, the “generators” of many other important discrete probability distributions. The author presents herein what he sees as the core of probability and information. To prevent the book from becoming too large, the development of some concepts has been shifted to the exercises, in particular when they have only marginal application in other parts of the book. Fully worked solutions to the exercises can be found on the author's web page.
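As a small illustration of point (iv) above (a sketch of my own, not taken from the book; the function names are hypothetical), a Bernoulli trial can generate other discrete distributions by simple composition, e.g. the binomial as a sum of independent trials and the geometric as a count of trials until the first success:

```python
import random

def bernoulli(p):
    """One Bernoulli(p) trial: 1 with probability p, else 0."""
    return 1 if random.random() < p else 0

def binomial(n, p):
    """Binomial(n, p) built as a sum of n independent Bernoulli(p) trials."""
    return sum(bernoulli(p) for _ in range(n))

def geometric(p):
    """Geometric(p): number of Bernoulli(p) trials up to and including
    the first success."""
    k = 1
    while bernoulli(p) == 0:
        k += 1
    return k

# Empirical check: the sample mean of Binomial(10, 0.3) should be near n*p = 3.
random.seed(0)
samples = [binomial(10, 0.3) for _ in range(100_000)]
print(sum(samples) / len(samples))
```

The point, as the review of the book's approach suggests, is that a single primitive random variable suffices to construct a whole family of discrete distributions.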