Asset Details
High-dimensional Bayesian inference via the unadjusted Langevin algorithm
by MOULINES, ÉRIC; DURMUS, ALAIN
in Mathematics
2019
Journal Article
Overview
We consider in this paper the problem of sampling a high-dimensional probability distribution π having a density with respect to the Lebesgue measure on ℝ^d, known up to a normalization constant, $x \mapsto \pi(x) = \mathrm{e}^{-U(x)} \big/ \int_{\mathbb{R}^d} \mathrm{e}^{-U(y)} \, \mathrm{d}y$. Such a problem naturally occurs, for example, in Bayesian inference and machine learning. Under the assumption that U is continuously differentiable, ∇U is globally Lipschitz and U is strongly convex, we obtain non-asymptotic bounds for the convergence to stationarity, in Wasserstein distance of order 2 and in total variation distance, of the sampling method based on the Euler discretization of the Langevin stochastic differential equation, for both constant and decreasing step sizes. The dependence of these bounds on the dimension of the state space is explicit. The convergence of an appropriately weighted empirical measure is also investigated, and bounds for the mean square error together with an exponential deviation inequality are reported for functions which are measurable and bounded. An illustration to Bayesian inference for binary regression is presented to support our claims.
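The sampling method the abstract refers to is the unadjusted Langevin algorithm (ULA): starting from X_0, iterate X_{k+1} = X_k − γ∇U(X_k) + √(2γ) Z_{k+1} with i.i.d. standard Gaussian vectors Z_{k+1}. The following is a minimal sketch in Python/NumPy, assuming a constant step size and, purely for illustration, the strongly convex potential U(x) = ‖x‖²/2 (standard Gaussian target); the helper names ula_sample and grad_U are hypothetical and not taken from the paper.

```python
import numpy as np


def ula_sample(grad_U, x0, step, n_steps, rng=None):
    """Unadjusted Langevin algorithm (Euler discretization of the Langevin SDE).

    Iterates X_{k+1} = X_k - step * grad_U(X_k) + sqrt(2 * step) * Z_{k+1},
    where Z_{k+1} is a standard Gaussian vector and step is a constant step size.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.size)
        samples[k] = x
    return samples


# Illustrative assumption: U(x) = ||x||^2 / 2, so pi is the standard Gaussian
# on R^d and grad_U(x) = x.
if __name__ == "__main__":
    d = 10
    chain = ula_sample(grad_U=lambda x: x, x0=np.zeros(d), step=0.05, n_steps=20000)
    print(chain[5000:].mean(axis=0))  # roughly 0 in every coordinate
```

With a constant step size the discretized chain is in general biased relative to π; the paper's non-asymptotic bounds in Wasserstein distance of order 2 and total variation quantify the convergence with explicit dependence on the dimension, and decreasing step sizes are covered as well.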
Publisher
International Statistical Institute (ISI), Bernoulli Society for Mathematical Statistics and Probability
Subject
Mathematics
Related Items