Catalogue Search | MBRL
Explore the vast range of titles available.
3,905 result(s) for "Random measures"
Spatially independent martingales, intersections, and applications
by Suomala, Ville; Shmerkin, Pablo
in Intersection theory (Mathematics); Martingales (Mathematics); Random measures
2018
We define a class of random measures, spatially independent martingales, which we view as a natural generalization of the canonical random discrete set, and which includes as special cases many variants of fractal percolation and Poissonian cut-outs. We pair the random measures with deterministic families of parametrized measures
One-dimensional empirical measures, order statistics, and Kantorovich transport distances
2019
This work is devoted to the study of rates of convergence of the empirical measures \mu_n = \frac{1}{n} \sum_{k=1}^n \delta_{X_k}, n \geq 1, over a sample (X_k)_{k \geq 1} of independent identically distributed real-valued random variables towards the common distribution \mu in Kantorovich transport distances W_p. The focus is on finite-range bounds on the expected Kantorovich distances \mathbb{E}(W_p(\mu_n,\mu)) or \big[\mathbb{E}(W_p^p(\mu_n,\mu))\big]^{1/p} in terms of moments and analytic conditions on the measure \mu and its distribution function. The study describes a variety of rates, from the standard one \frac{1}{\sqrt{n}} to slower rates, and both lower and upper bounds on \mathbb{E}(W_p(\mu_n,\mu)) for fixed n in various instances. Order statistics, reduction to uniform samples and analysis of beta distributions, inverse distribution functions, and log-concavity are the main tools in the investigation. Two detailed appendices collect classical and some new facts on inverse distribution functions and beta distributions and their densities necessary to the investigation.
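The order-statistics reduction described in this abstract can be made concrete for W_1 against the uniform distribution, where the distance has a closed form piecewise in the sorted sample. A minimal sketch (the function name and the restriction to Uniform(0,1) are illustrative choices, not the paper's):

```python
import numpy as np

def w1_to_uniform(x):
    """Exact W_1 distance between the empirical measure of a sample x with
    values in [0, 1] and the Uniform(0, 1) law, via order statistics:
    W_1 = sum_k  integral over ((k-1)/n, k/n] of |x_(k) - t| dt,
    since the empirical quantile function is the step function t -> x_(k)."""
    xs = np.sort(np.asarray(x, dtype=float))
    n = len(xs)
    total = 0.0
    for k, c in enumerate(xs, start=1):
        a, b = (k - 1) / n, k / n
        if c <= a:
            total += 0.5 * ((b - c) ** 2 - (a - c) ** 2)
        elif c >= b:
            total += 0.5 * ((c - a) ** 2 - (c - b) ** 2)
        else:  # the kink of t -> |c - t| falls inside (a, b): split there
            total += 0.5 * ((c - a) ** 2 + (b - c) ** 2)
    return total
```

Averaging this quantity over repeated samples gives a direct empirical view of the \frac{1}{\sqrt{n}} rate the abstract discusses.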
Truncated random measures
2019
Completely random measures (CRMs) and their normalizations are a rich source of Bayesian nonparametric priors. Examples include the beta, gamma, and Dirichlet processes. In this paper, we detail two major classes of sequential CRM representations—series representations and superposition representations—within which we organize both novel and existing sequential representations that can be used for simulation and posterior inference. These two classes and their constituent representations subsume existing ones that have previously been developed in an ad hoc manner for specific processes. Since a complete infinite-dimensional CRM cannot be used explicitly for computation, sequential representations are often truncated for tractability. We provide truncation error analyses for each type of sequential representation, as well as their normalized versions, thereby generalizing and improving upon existing truncation error bounds in the literature. We analyze the computational complexity of the sequential representations, which in conjunction with our error bounds allows us to directly compare representations and discuss their relative efficiency. We include numerous applications of our theoretical results to commonly-used (normalized) CRMs, demonstrating that our results enable a straightforward representation and analysis of CRMs that has not previously been available in a Bayesian nonparametric context.
Journal Article
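The paper above treats truncation for general sequential CRM representations; as a familiar concrete instance, a normalized sequential representation can be truncated via stick-breaking for the Dirichlet process. A minimal sketch (the closing-the-stick convention and names are illustrative, not the paper's constructions):

```python
import numpy as np

def truncated_dp_weights(alpha, K, rng):
    """Stick-breaking (Sethuraman) weights of a Dirichlet process with
    concentration alpha, truncated at K atoms. Setting the last stick
    fraction to 1 closes the stick, so the K weights sum exactly to one."""
    v = rng.beta(1.0, alpha, size=K)   # independent Beta(1, alpha) fractions
    v[-1] = 1.0                        # absorb all remaining mass at level K
    stick_left = np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return v * stick_left              # w_k = v_k * prod_{j<k} (1 - v_j)
```

The truncation error of this scheme decays geometrically in K, which is the kind of bound the paper generalizes to other sequential representations.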
MCMC for Normalized Random Measure Mixture Models
2013
This paper concerns the use of Markov chain Monte Carlo methods for posterior sampling in Bayesian nonparametric mixture models with normalized random measure priors. Making use of some recent posterior characterizations for the class of normalized random measures, we propose novel Markov chain Monte Carlo methods of both marginal type and conditional type. The proposed marginal samplers are generalizations of Neal's well-regarded Algorithm 8 for Dirichlet process mixture models, whereas the conditional sampler is a variation of those recently introduced in the literature. For both the marginal and conditional methods, we consider as a running example a mixture model with an underlying normalized generalized Gamma process prior, and describe comparative simulation results demonstrating the efficacies of the proposed methods.
Journal Article
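Since the paper's marginal samplers generalize Neal's Algorithm 8, it may help to see the shape of that baseline. A teaching sketch of one label-update sweep for a DP mixture of normal kernels (the model choices, names, and the omission of the usual resampling of occupied-cluster means are mine, not the paper's generalized samplers):

```python
import numpy as np

def normal_logpdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)

def algorithm8_sweep(x, z, phi, alpha, sigma, tau, rng, m=3):
    """One Gibbs sweep in the spirit of Neal's Algorithm 8 for a DP mixture
    of N(mu, sigma^2) kernels with base measure N(0, tau^2). Each point
    chooses among occupied clusters (weight = size * likelihood) and m
    auxiliary clusters freshly drawn from the base measure (weight =
    (alpha/m) * likelihood), which stand in for the empty clusters."""
    n = len(x)
    for i in range(n):
        counts = {}                       # cluster sizes excluding point i
        for j in range(n):
            if j != i:
                counts[z[j]] = counts.get(z[j], 0) + 1
        phi = {c: mu for c, mu in phi.items() if c in counts}  # drop empties
        aux = {("aux", i, k): rng.normal(0.0, tau) for k in range(m)}
        cands = list(counts) + list(aux)
        logw = np.array(
            [np.log(counts[c]) + normal_logpdf(x[i], phi[c], sigma)
             if c in counts
             else np.log(alpha / m) + normal_logpdf(x[i], aux[c], sigma)
             for c in cands]
        )
        w = np.exp(logw - logw.max())     # stabilized softmax over clusters
        c = cands[rng.choice(len(cands), p=w / w.sum())]
        if c in aux:
            phi[c] = aux[c]               # an auxiliary component is born
        z[i] = c
    return z, phi
```

A full sampler would alternate such sweeps with conjugate updates of the occupied-cluster means phi.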
Modeling with Normalized Random Measure Mixture Models
by Nieto-Barajas, Luis E.; Barrios, Ernesto; Prünster, Igor
in A priori knowledge; Barrios; Bayesian nonparametrics
2013
The Dirichlet process mixture model and more general mixtures based on discrete random probability measures have been shown to be flexible and accurate models for density estimation and clustering. The goal of this paper is to illustrate the use of normalized random measures as mixing measures in nonparametric hierarchical mixture models and point out how possible computational issues can be successfully addressed. To this end, we first provide a concise and accessible introduction to normalized random measures with independent increments. Then, we explain in detail a particular way of sampling from the posterior using the Ferguson-Klass representation. We develop a thorough comparative analysis for location-scale mixtures that considers a set of alternatives for the mixture kernel and for the nonparametric component. Simulation results indicate that normalized random measure mixtures potentially represent a valid default choice for density estimation problems. As a byproduct of this study, an R package to fit these models was produced and is available in the Comprehensive R Archive Network (CRAN).
Journal Article
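The Ferguson-Klass representation the paper samples from builds a CRM's jumps by inverting the Lévy tail at the arrival times of a unit Poisson process, which automatically yields the jumps in decreasing order. A minimal sketch for the gamma CRM (the intensity a·s⁻¹e⁻ˢ ds and the numerical inversion are standard choices, not the paper's code):

```python
import numpy as np
from scipy.special import exp1      # exponential integral E1
from scipy.optimize import brentq

def ferguson_klass_gamma(a, K, rng):
    """First K (decreasing) jumps of a gamma CRM with Lévy intensity
    nu(ds) = a * s**-1 * exp(-s) ds, via Ferguson-Klass inversion
    J_k = N^{-1}(Gamma_k), where N(x) = a * E1(x) is the Lévy tail mass
    and Gamma_1 < Gamma_2 < ... are unit-rate Poisson arrival times."""
    gammas = np.cumsum(rng.exponential(size=K))
    jumps = []
    for g in gammas:
        f = lambda s, g=g: a * exp1(s) - g   # decreasing in s: one root
        lo, hi = 1e-300, 1.0
        while f(hi) > 0:                     # grow the bracket until sign flips
            hi *= 2.0
        jumps.append(brentq(f, lo, hi))
    return np.array(jumps)
```

Normalizing a (truncated) vector of such jumps gives draws from the corresponding normalized random measure, which is how the representation feeds the posterior sampler described above.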
Bayesian nonparametric inference beyond the Gibbs-type framework
by Prünster, Igor; Camerlenghi, Federico; Lijoi, Antonio
in Bayesian analysis; Bayesian nonparametrics; completely random measure
2018
The definition and investigation of general classes of nonparametric priors has recently been an active research line in Bayesian statistics. Among the various proposals, the Gibbs-type family, which includes the Dirichlet process as a special case, stands out as the most tractable class of nonparametric priors for exchangeable sequences of observations. This is the consequence of a key simplifying assumption on the learning mechanism, which, however, has no justification except that of ensuring mathematical tractability. In this paper, we remove such an assumption and investigate a general class of random probability measures going beyond the Gibbs-type framework. More specifically, we present a nonparametric hierarchical structure based on transformations of completely random measures, which extends the popular hierarchical Dirichlet process. This class of priors preserves a good degree of tractability, given that we are able to determine the fundamental quantities for Bayesian inference. In particular, we derive the induced partition structure and the prediction rules and characterize the posterior distribution. These theoretical results are also crucial to devise both a marginal and a conditional algorithm for posterior inference. An illustration concerning prediction in genomic sequencing is also provided.
Journal Article
Physics of stochastic processes
by Mahnke, Reinhard; Lubashevsky, Ihor; Kaupuzs, Jevgenijs
in Mathematical & Computational Physics; Problems, exercises, etc; Random measures
2009
Based on lectures given by one of the authors with many years of experience in teaching stochastic processes, this textbook is unique in combining basic mathematical and physical theory with numerous simple and sophisticated examples as well as detailed calculations. In addition, applications from different fields are included so as to strengthen the background learned in the first part of the book. With its exercises at the end of each chapter (and solutions only available to lecturers) this book will benefit students and researchers at different educational levels. Solutions manual available for lecturers on www.wiley-vch.de.
Posterior Analysis for Normalized Random Measures with Independent Increments
by JAMES, LANCELOT F.; PRÜNSTER, IGOR; LIJOI, ANTONIO
in Bayesian Nonparametrics; Density; Density estimation
2009
One of the main research areas in Bayesian Nonparametrics is the proposal and study of priors which generalize the Dirichlet process. In this paper, we provide a comprehensive Bayesian non-parametric analysis of random probabilities which are obtained by normalizing random measures with independent increments (NRMI). Special cases of these priors have already been shown to be useful for statistical applications such as mixture models and species sampling problems. However, in order to fully exploit these priors, the derivation of the posterior distribution of NRMIs is crucial: here we achieve this goal and, indeed, provide explicit and tractable expressions suitable for practical implementation. The posterior distribution of an NRMI turns out to be a mixture with respect to the distribution of a specific latent variable. The analysis is completed by the derivation of the corresponding predictive distributions and by a thorough investigation of the marginal structure. These results allow one to derive a generalized Blackwell-MacQueen sampling scheme, which is then adapted to also cover mixture models driven by general NRMIs.
Journal Article
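The Blackwell-MacQueen scheme that this paper generalizes is, in its classical Dirichlet-process form, a simple Pólya urn. A minimal sketch of that special case (the function signature is an illustrative choice; the paper's generalized scheme for NRMIs is more involved):

```python
import numpy as np

def blackwell_macqueen(n, alpha, base_draw, rng):
    """Sample X_1..X_n from the Blackwell-MacQueen urn for DP(alpha, G0):
    X_{i+1} is a fresh draw from the base measure G0 with probability
    alpha / (alpha + i), and otherwise a copy of a uniformly chosen
    earlier value, which is what creates ties (clusters) in the sample."""
    xs = []
    for i in range(n):
        if rng.random() < alpha / (alpha + i):
            xs.append(base_draw(rng))        # new value from G0
        else:
            xs.append(xs[rng.integers(i)])   # tie with a previous observation
    return xs
```

The number of distinct values grows only logarithmically in n, which is the clustering behaviour exploited by the mixture models discussed above.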
A Class of Normalized Random Measures with an Exact Predictive Sampling Scheme
by FAVARO, STEFANO; TRIPPA, LORENZO
in Bayesian analysis; Bayesian non-parametrics; completely random measures
2012
In this article, we define and investigate a novel class of non-parametric prior distributions, termed the class C. This class of priors is dense with respect to the homogeneous normalized random measures with independent increments and it is characterized by a richer predictive structure than those arising from other widely used priors. Our interest in the class C is mainly motivated by Bayesian non-parametric analysis of some species sampling problems concerning the evaluation of the species relative abundances in a population. We study both the probability distribution of the number of species present in a sample and the probability of discovering a new species conditionally on an observed sample. Finally, by using the coupling from the past method, we provide an exact sampling scheme for the system of predictive distributions characterizing the class C.
Journal Article
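Coupling from the past, as used in the paper above, delivers exact (not approximate) draws from a chain's stationary distribution by rerunning from ever-deeper start times with shared randomness until all start states coalesce. A toy sketch for a 2-state chain (the monotone update and names are illustrative; the paper applies CFTP to its system of predictive distributions):

```python
import numpy as np

def step(s, u, p00, p10):
    """Inverse-CDF update: move to state 0 iff u falls below the current
    state's probability of transitioning to 0."""
    return 0 if u < (p00 if s == 0 else p10) else 1

def cftp_two_state(p00, p10, rng, max_past=1 << 20):
    """Coupling from the past for the 2-state chain with P(0->0)=p00 and
    P(1->0)=p10. The SAME uniforms are reused as the start time recedes;
    once both start states agree at time 0, that common state is an exact
    draw from the stationary distribution."""
    us = []                      # us[k] drives the step into time -k
    T = 1
    while T <= max_past:
        while len(us) < T:       # extend the shared randomness further back
            us.append(rng.random())
        s0, s1 = 0, 1
        for t in range(T):       # evolve both chains from time -T to 0
            u = us[T - 1 - t]
            s0, s1 = step(s0, u, p00, p10), step(s1, u, p00, p10)
        if s0 == s1:
            return s0
        T *= 2
    raise RuntimeError("no coalescence before max_past")
```

With p00 = 0.7 and p10 = 0.4 the stationary probability of state 0 is p10 / (1 - p00 + p10) = 4/7, and the empirical frequency of the CFTP output matches it without any burn-in.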