Asset Details
Large-Scale Bayesian Optimal Experimental Design with Derivative-Informed Projected Neural Network
Journal Article
by O’Leary-Roseberry, Thomas; Ghattas, Omar; Wu, Keyi; Chen, Peng
in Algorithms / Approximation / Bayesian analysis / Computational Mathematics and Numerical Analysis / Design of experiments / Design optimization / Error analysis / Greedy algorithms / Inverse problems / Inverse scattering / Mathematical and Computational Engineering / Mathematical and Computational Physics / Mathematical functions / Mathematics / Mathematics and Statistics / Monte Carlo simulation / Neural networks / Parameter uncertainty / Partial differential equations / Sensors / Smoothness / Special Issue on Machine Learning on Scientific Computing / Theoretical
2023
Overview
We address the solution of large-scale Bayesian optimal experimental design (OED) problems governed by partial differential equations (PDEs) with infinite-dimensional parameter fields. The OED problem seeks sensor locations that maximize the expected information gain (EIG) in the solution of the underlying Bayesian inverse problem. Computing the EIG is usually prohibitive for PDE-based OED problems. To make the evaluation of the EIG tractable, we approximate the (PDE-based) parameter-to-observable map with a derivative-informed projected neural network (DIPNet) surrogate, which exploits the geometry, smoothness, and intrinsic low-dimensionality of the map using a small and dimension-independent number of PDE solves. The surrogate is then deployed within a greedy-algorithm-based solution of the OED problem so that no further PDE solves are required. We analyze the EIG approximation error in terms of the generalization error of the DIPNet and show that the two are of the same order. Finally, the efficiency and accuracy of the method are demonstrated via numerical experiments on OED problems governed by inverse scattering and inverse reactive transport, with up to 16,641 uncertain parameters and 100 experimental design variables, where we observe a speedup of up to three orders of magnitude relative to a reference double-loop Monte Carlo method.
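For orientation, the expected information gain referred to in the abstract is conventionally defined as the expected Kullback–Leibler divergence from the prior to the posterior, averaged over the observation distribution; the notation below is a standard formulation and our own assumption, not reproduced from the article. With parameter m, design d, and observations y,

\[
\Psi(d) \;=\; \mathbb{E}_{y \mid d}\!\left[ D_{\mathrm{KL}}\!\big( \pi(m \mid y, d) \,\|\, \pi(m) \big) \right]
\;=\; \iint \log \frac{\pi(y \mid m, d)}{\pi(y \mid d)} \, \pi(y \mid m, d)\, \pi(m) \,\mathrm{d}y \,\mathrm{d}m .
\]

The reference double-loop Monte Carlo estimator mentioned in the abstract takes, in its standard nested form,

\[
\widehat{\Psi}_{\mathrm{DLMC}}(d) \;=\; \frac{1}{N} \sum_{i=1}^{N} \left[ \log \pi\big( y^{(i)} \mid m^{(i)}, d \big) \;-\; \log \frac{1}{M} \sum_{j=1}^{M} \pi\big( y^{(i)} \mid m^{(i,j)}, d \big) \right],
\]

where the m^{(i)} and m^{(i,j)} are drawn from the prior and y^{(i)} is drawn from \pi(y \mid m^{(i)}, d). Each likelihood evaluation requires a forward solve of the parameter-to-observable map, which is the cost the DIPNet surrogate is intended to remove.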
Publisher
Springer US; Springer Nature B.V.; Springer