Catalogue Search | MBRL
Explore the vast range of titles available.
58,544 result(s) for "Markov analysis"
Ergodicity of Markov Processes via Nonstandard Analysis
by Duanmu, Haosui; Weiss, William; Rosenthal, Jeffrey S.
in Ergodic theory; Markov processes; Nonstandard mathematical analysis
2021
The Markov chain ergodic theorem is well understood when either the time line or the state space is discrete. However, no comparably clear result exists for continuous-time Markov processes on general state spaces. Using methods from mathematical logic and nonstandard analysis, we introduce a class of hyperfinite Markov processes: general Markov processes that behave like finite-state-space discrete-time Markov processes. We show that, under moderate conditions, the transition probabilities of hyperfinite Markov processes align with those of standard Markov processes. The Markov chain ergodic theorem for hyperfinite Markov processes then implies the Markov chain ergodic theorem for continuous-time Markov processes on general state spaces.
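The finite-state, discrete-time case that the abstract takes as its starting point can be illustrated numerically: the time average of a function along one trajectory converges to its stationary expectation. A minimal sketch in Python/NumPy, using a hypothetical 3-state transition matrix (not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical irreducible, aperiodic 3-state transition matrix.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
f = np.array([1.0, -2.0, 5.0])  # arbitrary function on the state space

# Stationary distribution: normalized left eigenvector for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Ergodic theorem: (1/n) * sum_i f(X_i) -> sum_x pi(x) f(x).
n, state, total = 100_000, 0, 0.0
for _ in range(n):
    total += f[state]
    state = rng.choice(3, p=P[state])
print(total / n, float(pi @ f))  # the two numbers nearly agree
```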
Conformal Graph Directed Markov Systems on Carnot Groups
by Tyson, Jeremy; Chousionis, Vasilis; Urbański, Mariusz
in Conformal mapping; Hausdorff measures; Markov processes
2020
We develop a comprehensive theory of conformal graph directed Markov systems in the non-Riemannian setting of Carnot groups equipped
with a sub-Riemannian metric. In particular, we develop the thermodynamic formalism and show that, under natural hypotheses, the limit
set of a Carnot conformal GDMS has Hausdorff dimension given by Bowen’s parameter. We illustrate our results for a variety of examples
of both linear and nonlinear iterated function systems and graph directed Markov systems in such sub-Riemannian spaces. These include
the Heisenberg continued fractions introduced by Lukyanenko and Vandehey as well as Kleinian and Schottky groups associated to the
non-real classical rank one hyperbolic spaces.
Symmetric Markov processes, time change, and boundary theory
by Masatoshi Fukushima; Zhen-Qing Chen
in Absolute continuity; Bilinear form; Borel right process
2011, 2012
This book gives a comprehensive and self-contained introduction to the theory of symmetric Markov processes and symmetric quasi-regular Dirichlet forms. In a detailed and accessible manner, Zhen-Qing Chen and Masatoshi Fukushima cover the essential elements and applications of the theory of symmetric Markov processes, including recurrence/transience criteria, probabilistic potential theory, additive functional theory, and time change theory. The authors develop the theory in a general framework of symmetric quasi-regular Dirichlet forms in a unified manner with that of regular Dirichlet forms, emphasizing the role of extended Dirichlet spaces and the rich interplay between the probabilistic and analytic aspects of the theory. Chen and Fukushima then address the latest advances in the theory, presented here for the first time in any book. Topics include the characterization of time-changed Markov processes in terms of Douglas integrals and a systematic account of reflected Dirichlet spaces, and the important roles such advances play in the boundary theory of symmetric Markov processes.
This volume is an ideal resource for researchers and practitioners, and can also serve as a textbook for advanced graduate students. It includes examples, appendixes, and exercises with solutions.
Fractional Dynamics on Networks and Lattices
This book analyzes stochastic processes on networks and regular structures such as lattices by employing the Markovian random walk approach. Part 1 is devoted to the study of local and non-local random walks.
Large deviations for additive functionals of Markov chains
2013
For a Markov chain {X_n} with general state space S and f : S → R^d, the large deviation principle for {n^{-1} ∑_{j=1}^{n} f(X_j)} is proved under a condition on the chain which is weaker than uniform recurrence but stronger than geometric recurrence and an integrability condition on f, for a broad class of initial distributions. This result is extended to the case when f takes values in a separable Banach space. Assuming only geometric ergodicity and under a non-degeneracy condition, a local large deviation result is proved for bounded f. A central analytical tool is the transform kernel, whose required properties, including new results, are established. The rate function in the large deviation results is expressed in terms of the convergence parameter of the transform kernel.
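For a finite chain the role of the transform kernel can be sketched concretely: its convergence parameter is the Perron eigenvalue of a tilted transition matrix, and the rate function is the Legendre transform of its logarithm. A hypothetical two-state example (an illustration only, not the paper's general-state-space setting):

```python
import numpy as np

# Hypothetical two-state chain with f(0) = 0, f(1) = 1.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
f = np.array([0.0, 1.0])

def log_lambda(t):
    """Log spectral radius of the transform kernel
    P_t(x, y) = P(x, y) * exp(t * f(y))."""
    Pt = P * np.exp(t * f)[None, :]
    return np.log(np.max(np.abs(np.linalg.eigvals(Pt))))

def rate(a, ts=np.linspace(-20, 20, 4001)):
    """Rate function as Legendre transform: I(a) = sup_t (t*a - log lambda(t))."""
    return max(t * a - log_lambda(t) for t in ts)

# Stationary mean of f; the rate function vanishes there.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()
mean = float(pi @ f)
print(rate(mean))  # approximately 0
print(rate(0.9))   # > 0: deviations far above the mean are exponentially rare
```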
Markov chains : from theory to implementation and experimentation
2017
A fascinating and instructive guide to Markov chains for experienced users and newcomers alike
This unique guide to Markov chains approaches the subject along the four convergent lines of mathematics, implementation, simulation, and experimentation. It introduces readers to the art of stochastic modeling, shows how to design computer implementations, and provides extensive worked examples with case studies.
Markov Chains: From Theory to Implementation and Experimentation begins with a general introduction to the history of probability theory in which the author uses quantifiable examples to illustrate how probability theory arrived at the concept of discrete-time and the Markov model from experiments involving independent variables. An introduction to simple stochastic matrices and transition probabilities is followed by a simulation of a two-state Markov chain. The notion of steady state is explored in connection with the long-run distribution behavior of the Markov chain. Predictions based on Markov chains with more than two states are examined, followed by a discussion of the notion of absorbing Markov chains. Also covered in detail are topics relating to the average time spent in a state, various chain configurations, and n-state Markov chain simulations used for verifying experiments involving various diagram configurations.
• Fascinating historical notes shed light on the key ideas that led to the development of the Markov model and its variants
• Various configurations of Markov Chains and their limitations are explored at length
• Numerous examples—from basic to complex—are presented in a comparative manner using a variety of color graphics
• All algorithms presented can be analyzed in either Visual Basic, JavaScript, or PHP
• Designed to be useful to professional statisticians as well as readers without extensive knowledge of probability theory
Covering both the theory underlying the Markov model and an array of Markov chain implementations, within a common conceptual framework, Markov Chains: From Theory to Implementation and Experimentation is a stimulating introduction to and a valuable reference for those wishing to deepen their understanding of this extremely valuable statistical tool.
Paul A. Gagniuc, PhD, is Associate Professor at Polytechnic University of Bucharest, Romania. He obtained his MS and his PhD in genetics at the University of Bucharest. Dr. Gagniuc's work has been published in numerous high profile scientific journals, ranging from the Public Library of Science to BioMed Central and Nature journals. He is the recipient of several awards for exceptional scientific results and a highly active figure in the review process for different scientific areas.
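The book's opening two-state simulation and its steady-state computation can be sketched as follows (a generic Python illustration, not code from the book, which works in Visual Basic, JavaScript, and PHP):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical two-state weather chain: 0 = sunny, 1 = rainy.
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

# Steady-state distribution: solve pi P = pi with sum(pi) = 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)[0]

# Simulate the chain and compare occupation frequencies with pi.
n, state = 100_000, 0
visits = np.zeros(2)
for _ in range(n):
    visits[state] += 1
    state = rng.choice(2, p=P[state])
print(pi)          # close to [2/3, 1/3]
print(visits / n)  # approximately the same
```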
Handbook of Markov Chain Monte Carlo
by Gelman, Andrew; Meng, Xiao-Li; Brooks, Steve
in Markov processes; Markov processes -- Case studies; Monte Carlo method
2011
Handbook of Markov Chain Monte Carlo brings together the major advances that have occurred in recent years while incorporating enough introductory material for new users of MCMC. Along with thorough coverage of the theoretical foundations and algorithmic and computational methodology, this comprehensive handbook includes substantial realistic case studies from a variety of disciplines. These case studies demonstrate the application of MCMC methods and serve as a series of templates for the construction, implementation, and choice of MCMC methodology.
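As a bare-bones illustration of what MCMC does (a generic random-walk Metropolis sketch, not an algorithm taken from the handbook): draw correlated samples whose long-run distribution is a target density, here a standard normal.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Unnormalized log-density of the standard normal target.
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, x0=0.0):
    """Random-walk Metropolis: propose x + step*N(0,1), accept with
    probability min(1, pi(proposal) / pi(current))."""
    x, out = x0, np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.normal()
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        out[i] = x
    return out

samples = metropolis(100_000)
print(samples.mean(), samples.std())  # near 0 and 1
```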
Labelled Markov processes
by Panangaden, Prakash
in COMPUTERS; Formal Specification (Software Engineering, Mathematical Logic); Markov processes
2009
Labelled Markov processes are probabilistic versions of labelled transition systems with continuous state spaces. The book covers basic probability and measure theory on continuous state spaces and then develops the theory of LMPs. The main topics covered are bisimulation, the logical characterization of bisimulation, metrics and approximation theory. An unusual feature of the book is the connection made with categorical and domain theoretic concepts.
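For finite labelled Markov processes, bisimilarity (the central notion the book develops on continuous state spaces) can be computed by signature-based partition refinement: two states are lumped together iff, for every label, they assign equal probability to each current block. A small Python sketch with a hypothetical four-state LMP:

```python
def bisimulation(states, kernels):
    """Coarsest probabilistic bisimulation of a finite LMP.
    kernels: label -> state -> {successor state: probability}."""
    partition = {s: 0 for s in states}  # start with a single block
    while True:
        # Signature of s: per label, probability mass sent to each block.
        sigs = {}
        for s in states:
            sig = []
            for label in sorted(kernels):
                mass = {}
                for t, p in kernels[label].get(s, {}).items():
                    mass[partition[t]] = mass.get(partition[t], 0.0) + p
                sig.append((label, tuple(sorted(mass.items()))))
            sigs[s] = tuple(sig)
        # Refine: states with equal signatures share a block.
        ids, new = {}, {}
        for s in states:
            new[s] = ids.setdefault(sigs[s], len(ids))
        if new == partition:  # partition is stable -> done
            return partition
        partition = new

# States 2 and 3 are both deadlocked; 0 and 1 each jump into that
# class with probability 1, so 0 ~ 1 and 2 ~ 3.
states = [0, 1, 2, 3]
kernels = {"a": {0: {2: 1.0}, 1: {3: 1.0}}}
part = bisimulation(states, kernels)
print(part)
```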