Stochastic processes such as Brownian motion $B(t)$ and its white noise derivative $\omega(t)=\frac{dB(t)}{dt}$ are well known. The processes considered in this book are of the form $x(\tau)=x_0+\int_0^\tau f(\tau-t)\,g(t)\,\omega(t)\,dt$, where $x_0$ is the initial condition, $f(\tau-t)$ is a function characterizing the memory, and $g(t)$ provides some additional modulation. A memoryless Markov process corresponds to $f=1$; if also $g=1$, we recover Brownian motion.
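Such a process is easy to simulate: replacing $\omega(t)\,dt$ by independent Brownian increments turns the integral into a weighted sum. The following sketch is my own illustration (the exponential memory kernel is an arbitrary choice, not one of the book's examples); it checks the sample variance against $\int_0^\tau [f(\tau-t)g(t)]^2\,dt$.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_x(tau, f, g, x0=0.0, n_steps=1000, n_paths=20000):
    """Discretize x(tau) = x0 + int_0^tau f(tau-t) g(t) dB(t):
    the white noise integral becomes a weighted sum of independent
    Brownian increments dB_i ~ N(0, dt)."""
    dt = tau / n_steps
    t = (np.arange(n_steps) + 0.5) * dt      # midpoint of each interval
    weights = f(tau - t) * g(t)              # deterministic integrand
    dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    return x0 + dB @ weights

# Illustrative choice: exponential memory kernel, no extra modulation.
f = lambda s: np.exp(-s)
g = lambda t: np.ones_like(t)

xs = sample_x(1.0, f, g)
# Variance should match int_0^1 e^{-2(1-t)} dt = (1 - e^{-2}) / 2 ≈ 0.432
print(xs.mean(), xs.var())
```

The sample mean is close to $x_0=0$ and the sample variance matches the deterministic integral, which is exactly the quantity entering the diffusion equation below.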

For different possible choices of $f$ and $g$, one may compute the probability that the process $x$ starts at $x_0$ at time $t=0$ and ends at $x_{\tau}$ at time $t=\tau$. If we denote this probability by $P(x,\tau)$, then it satisfies a diffusion equation, a PDE of the form $\frac{\partial}{\partial \tau}P(x,\tau)=\left[\frac{1}{2}\frac{\partial}{\partial \tau}\int_0^\tau [f(\tau-t)g(t)]^2dt\right]\frac{\partial^2}{\partial x^2}P(x,\tau)$, which can be solved with different (e.g. periodic) boundary conditions.
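For the memoryless case $f=g=1$ the bracketed diffusion coefficient reduces to $1/2$ and the PDE is the heat equation. A minimal finite-difference sketch (my own, with periodic boundary conditions as mentioned above) reproduces the spreading Gaussian:

```python
import numpy as np

def evolve_P(D, x, P0, tau, n_steps=4000):
    """Explicit finite-difference solver for
    dP/dtau = D(tau) d^2P/dx^2,
    where D(tau) = (1/2) d/dtau int_0^tau [f g]^2 dt."""
    dx = x[1] - x[0]
    dt = tau / n_steps
    P = P0.copy()
    for k in range(n_steps):
        t = (k + 0.5) * dt
        # periodic boundary conditions via np.roll
        lap = (np.roll(P, -1) - 2 * P + np.roll(P, 1)) / dx**2
        P = P + dt * D(t) * lap
    return P

# Brownian motion case f = g = 1: D = 1/2, variance grows by tau.
x = np.linspace(-10, 10, 401)
sigma0 = 0.5
P0 = np.exp(-x**2 / (2 * sigma0**2)) / np.sqrt(2 * np.pi * sigma0**2)
P1 = evolve_P(lambda t: 0.5, x, P0, tau=1.0)
sigma1 = np.sqrt(sigma0**2 + 1.0)
exact = np.exp(-x**2 / (2 * sigma1**2)) / np.sqrt(2 * np.pi * sigma1**2)
print(np.max(np.abs(P1 - exact)))  # small discretization error
```

The explicit scheme is stable here because $D\,\Delta\tau/\Delta x^2 = 0.05 \le 1/2$; for time-dependent coefficients the same loop applies with $D(t)$ evaluated at each step.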

Different applications are discussed. For example, consider a dynamical system $h(t)$ with relative logarithmic change $x(t)=\ln[h(t+\Delta t)/h(t)]$. One might then be interested in $P(x,\tau)$ when $h$ is the number of times a word is used in year $t$, or when it represents the growth of a company, etc. When considering $x(t)$ as a time series, one may compute $P(x,\tau)$ for the variance of $x$, which is of interest in applications such as typhoon tracking or tracking a particle in microrheology.

Of course many things simplify if there is no memory (a Markov process), as for the Brownian motion of a particle in a fluid. For some such applications $x$ may become a vector. The probability $P(x,\tau)$ then solves the Fokker-Planck equation for the process satisfying a stochastic differential equation $dx(t)=a(t,x(t))\,dt+b(t,x(t))\,dB(t)$, and the theory developed so far can be used to solve it. This idea is elaborated in the context of neurophysics, first for a single neuron, then for multiple neurons, with complications such as correlations and memory effects. For applications to biopolymers, the Fokker-Planck equation is again solved to describe diffusion, and the helical structure of DNA is investigated in cylindrical coordinates. This results in a probability $W(n,L)$ of having $n$ windings over a length $L$.

The remaining chapters deal with quantum mechanics. The evolution here involves a state $\Psi(\mathbf{r},t)$ for position $\mathbf{r}$ at time $t$. The paths followed from $\Psi(\mathbf{r}_0,t_0)$ to $\Psi(\mathbf{r}_1,t_1)$ are stochastic and are described by a Markov process as introduced above. In the non-relativistic theory the transition is described by a propagator, which plays a role similar to the probability $P(x,t)$ in the previous applications. Free particles as well as various potentials are analyzed; periodic boundary conditions are considered, as well as an infinite wall potential and a particle confined in a box. For the relativistic theory one has to solve different equations, but the Feynman path integral technique of the previous case can be generalized to derive the Green's function for the Dirac equation or to obtain the propagator in a uniform magnetic field.
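As a small sanity check of my own (not taken from the book), one can verify symbolically that the well-known free-particle propagator $K(x,t;x_0,0)=\sqrt{m/(2\pi i\hbar t)}\,e^{im(x-x_0)^2/(2\hbar t)}$ indeed satisfies the free Schrödinger equation:

```python
import sympy as sp

x, x0 = sp.symbols('x x0', real=True)
t, m, hbar = sp.symbols('t m hbar', positive=True)

# Free-particle propagator of the non-relativistic theory
K = sp.sqrt(m / (2 * sp.pi * sp.I * hbar * t)) \
    * sp.exp(sp.I * m * (x - x0)**2 / (2 * hbar * t))

# i hbar dK/dt should equal -(hbar^2 / 2m) d^2K/dx^2
lhs = sp.I * hbar * sp.diff(K, t)
rhs = -hbar**2 / (2 * m) * sp.diff(K, x, 2)
diff = sp.simplify(lhs - rhs)
print(diff)  # 0
```

The analogy with the diffusion case is formally a replacement of real by imaginary time, which is what makes the path-integral machinery of the earlier chapters applicable.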

The book can be divided into two parts. The first six chapters are rather general, with only briefly elaborated applications. Three applications are worked out in more detail in the second half of the book: neurophysics, biopolymers, and quantum mechanics. Hence the book will be most attractive to researchers familiar with at least one of these applications. A mathematician merely familiar with stochastic processes may have a harder time appreciating the second half.

The strong, but at the same time also the weak, point of this book is that it brings together in one concise volume all these white noise techniques, which are applied in widespread fields with different notation and vocabulary, and to applications that seem totally unrelated. The effect is that one finds here a condensation of the common central idea, but also a diversity that may easily confuse readers. With all my sympathy for the authors' courageous enterprise of bringing all this together in one volume, I think the uniformization could have been better. For example, the different chapters do not always approach the problem from the same starting point. The notation is also not always consistent: $E$, $\mathbb{E}$, or $\langle\,\rangle$ are used for expectation while $E$ denotes energy in the quantum chapters; sometimes $g$ and sometimes $h$ is used for the modifying function; sometimes $\tau$ and sometimes $T$ for the end value of $t$; $f$ is used for the memory function but acquires many other meanings on other occasions; conditional probability is denoted $P(x,t;x_0,t_0)$ in one section and $P(x,t|x_0,t_0)$ in the next; etc. All this may seem unimportant nitpicking, but it adds to the confusion of readers who know only one of the applications or who are simply interested in learning the idea of white noise analysis. My advice for the latter is therefore to first acquire sufficient sophistication in stochastic diffusion processes before embarking on this book. The book is a nice summary that widens the specialist's view of the applications, but making it a systematic introduction for the average mathematician would require a serious inflation of its number of pages.