Description : The general theory of stochastic processes and the more specialized theory of Markov processes evolved enormously in the second half of the last century. In parallel, the theory of controlled Markov chains (or Markov decision processes) was being pioneered by control engineers and operations researchers. Researchers in Markov processes and controlled Markov chains have long been aware of the synergies between these two subject areas. However, this may be the first volume dedicated to highlighting these synergies and, almost certainly, it is the first volume that emphasizes the contributions of the vibrant and growing Chinese school of probability. The chapters that appear in this book reflect both the maturity and the vitality of modern-day Markov processes and controlled Markov chains. They will also provide an opportunity to trace the connections that have emerged between the work done by members of the Chinese school of probability and the work done by European, US, Central and South American, and Asian scholars.
Description : Markov processes are among the most important stochastic processes for both theory and applications. This book develops the general theory of these processes and applies this theory to various special examples. The initial chapter is devoted to the most important classical example: one-dimensional Brownian motion. This, together with a chapter on continuous-time Markov chains, provides the motivation for the general setup based on semigroups and generators. Chapters on stochastic calculus and probabilistic potential theory give an introduction to some of the key areas of application of Brownian motion and its relatives. A chapter on interacting particle systems treats a more recently developed class of Markov processes that have their origin in problems of physics and biology. This is a textbook for a graduate course that can follow one covering basic probabilistic limit theorems and discrete-time processes.
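As a minimal illustration of the classical example this description mentions, a one-dimensional Brownian path can be simulated by summing independent Gaussian increments. This is a generic sketch, not code from the book; the function name and parameters are assumptions.

```python
import math
import random

def brownian_path(T=1.0, n=1000, seed=0):
    """Simulate standard one-dimensional Brownian motion on [0, T]
    by accumulating n independent N(0, T/n) increments."""
    rng = random.Random(seed)
    dt = T / n
    w = [0.0]                      # W(0) = 0 by definition
    for _ in range(n):
        w.append(w[-1] + rng.gauss(0.0, math.sqrt(dt)))
    return w

path = brownian_path()             # 1001 points on the grid k*T/n
```

The increments are independent and have variance proportional to the time step, which is exactly the defining property of Brownian motion restricted to a grid.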
Description : The modern theory of Markov processes has its origins in the studies of A. A. MARKOV (1906-1907) on sequences of experiments "connected in a chain" and in the attempts to describe mathematically the physical phenomenon known as Brownian motion (L. BACHELIER 1900, A. EINSTEIN 1905). The first correct mathematical construction of a Markov process with continuous trajectories was given by N. WIENER in 1923. (This process is often called the Wiener process.) The general theory of Markov processes was developed in the 1930's and 1940's by A. N. KOLMOGOROV, W. FELLER, W. DOEBLIN, P. LÉVY, J. L. DOOB, and others. During the past ten years the theory of Markov processes has entered a new period of intensive development. The methods of the theory of semigroups of linear operators made possible further progress in the classification of Markov processes by their infinitesimal characteristics. The broad classes of Markov processes with continuous trajectories became the main object of study. The connections between Markov processes and classical analysis were further developed. It has become possible not only to apply the results and methods of analysis to the problems of probability theory, but also to investigate analytic problems using probabilistic methods. Remarkable new connections between Markov processes and potential theory were revealed. The foundations of the theory were reviewed critically: the new concept of the strong Markov process acquired great importance for the whole theory of Markov processes.
Description : This graduate-level text explores the relationship between Markov processes and potential theory, in addition to aspects of the theory of additive functionals. Topics include Markov processes, excessive functions, multiplicative functionals and subprocesses, and additive functionals and their potentials. A concluding chapter examines dual processes and potential theory. 1968 edition.
Description : This book provides a rigorous but elementary introduction to the theory of Markov processes on a countable state space. It should be accessible to students with a solid undergraduate background in mathematics, including students from engineering, economics, physics, and biology. Topics covered are: Doeblin's theory, general ergodic properties, and continuous-time processes. Applications are dispersed throughout the book. In addition, a whole chapter is devoted to reversible processes and the use of their associated Dirichlet forms to estimate the rate of convergence to equilibrium. These results are then applied to the analysis of the Metropolis (a.k.a. simulated annealing) algorithm. The corrected and enlarged 2nd edition contains a new chapter in which the author develops computational methods for Markov chains on a finite state space. Most intriguing is the section with a new technique for computing stationary measures, which is applied to derivations of Wilson's algorithm and Kirchhoff's formula for spanning trees in a connected graph.
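The finite-state computational theme mentioned above can be illustrated with a basic power-iteration sketch for a stationary measure. This is a generic textbook method, not the book's own technique (which involves Wilson's algorithm and Kirchhoff's formula); all names here are illustrative assumptions.

```python
def stationary_distribution(P, tol=1e-12, max_iter=100_000):
    """Approximate the stationary distribution pi (solving pi = pi P)
    of an irreducible, aperiodic row-stochastic matrix P, given as a
    list of rows, by iterating pi <- pi P until convergence."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(max_iter):
        new = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(pi, new)) < tol:
            return new
        pi = new
    return pi

# Two-state chain: detailed balance pi_0 * 0.1 = pi_1 * 0.15
# gives pi = (0.6, 0.4).
P = [[0.90, 0.10], [0.15, 0.85]]
pi = stationary_distribution(P)
```

For a reversible chain of the kind the book's Dirichlet-form chapter studies, the convergence rate of this iteration is governed by the spectral gap of P, which is exactly the quantity those techniques estimate.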
Description : Measure-valued branching processes arise as high density limits of branching particle systems. The Dawson-Watanabe superprocesses form a special class of these. The author constructs superprocesses with Borel right underlying motions and general branching mechanisms and shows the existence of their Borel right realizations. He then uses transformations to derive the existence and regularity of several different forms of the superprocesses. This treatment simplifies the constructions and gives useful perspectives. Martingale problems of superprocesses are discussed under Feller type assumptions. The most important feature of the book is the systematic treatment of immigration superprocesses and generalized Ornstein-Uhlenbeck processes based on skew convolution semigroups. The volume addresses researchers in measure-valued processes, branching processes, stochastic analysis, biological and genetic models, and graduate students in probability theory and stochastic processes.
Description : A considerable number of problems in the statistics of random processes are formulated within the following scheme. On a certain probability space (Ω, ℱ, P) a partially observable random process (θ, ξ) = (θ_t, ξ_t), t ≥ 0, is given, with only the second component ξ = (ξ_t), t ≥ 0, observed. At any time t it is required, based on ξ_0^t = {ξ_s, 0 ≤ s ≤ t}, to estimate the unobservable state θ_t. This problem of estimating θ_t from ξ_0^t (in other words, the filtering problem) will be discussed in this book. It is well known that if M(θ_t²)
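The filtering scheme described above can be made concrete in its simplest discrete-time form (the book itself treats the continuous-time theory). The sketch below is a scalar Kalman filter for a hypothetical linear-Gaussian model; the model, parameter names, and defaults are illustrative assumptions, not the book's formulation.

```python
def kalman_filter_1d(ys, a=1.0, q=0.0, r=0.5, m0=0.0, p0=1.0):
    """Discrete-time scalar Kalman filter for the hypothetical model
        theta_{t+1} = a * theta_t + w_t,   w_t ~ N(0, q)
        y_t         = theta_t + v_t,       v_t ~ N(0, r)
    Returns the filtered means, i.e. the mean-square-optimal
    estimates of theta_t given the observations y_1, ..., y_t."""
    m, p = m0, p0
    means = []
    for y in ys:
        # predict one step ahead
        m, p = a * m, a * a * p + q
        # update with the new observation
        k = p / (p + r)              # Kalman gain
        m += k * (y - m)
        p *= (1.0 - k)
        means.append(m)
    return means

# With constant observations the estimate settles near the observed value.
est = kalman_filter_1d([1.0] * 100)
```

The conditional mean computed here is the finite-dimensional analogue of the conditional expectation E(θ_t | ξ_0^t), which is the optimal mean-square estimate in the general theory.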