Asset Details
Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators
by Karniadakis, George Em; Lu, Lu; Zhang, Zhongqiang; Pang, Guofei; Jin, Pengzhan
Subjects: 639/705/1041 / 639/705/1042 / Accuracy / Approximation / Artificial neural networks / Banach spaces / Complex systems / Computer Science / Continuity (mathematics) / Deep learning / Differential equations / Dynamical systems / Engineering / Errors / Function space / Machine learning / Mathematical analysis / Neural networks / Operators (mathematics) / Optimization / Ordinary differential equations / Partial differential equations / Robotics / Theorems
2021
Journal Article
Overview
It is widely known that neural networks (NNs) are universal approximators of continuous functions. However, a less known but powerful result is that an NN with a single hidden layer can accurately approximate any nonlinear continuous operator. This universal approximation theorem of operators is suggestive of the structure and potential of deep neural networks (DNNs) in learning continuous operators or complex systems from streams of scattered data. Here, we thus extend this theorem to DNNs. We design a new network with small generalization error, the deep operator network (DeepONet), which consists of a DNN for encoding the discrete input function space (branch net) and another DNN for encoding the domain of the output functions (trunk net). We demonstrate that DeepONet can learn various explicit operators, such as integrals and fractional Laplacians, as well as implicit operators that represent deterministic and stochastic differential equations. We study different formulations of the input function space and their effect on the generalization error for 16 diverse applications.
Neural networks are known as universal approximators of continuous functions, but they can also approximate any mathematical operator (mapping a function to another function), which is an important capability for complex systems such as robotics control. A new deep neural network called DeepONet can learn various mathematical operators with small generalization error.
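The branch/trunk structure described in the overview can be illustrated with a minimal NumPy sketch. This is a hypothetical, untrained forward pass only, not the authors' implementation: the branch net encodes an input function sampled at m fixed sensor locations, the trunk net encodes a query location y, and the network's prediction G(u)(y) is the inner product of the two latent vectors. All layer sizes and names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    # Randomly initialized weights, for structural illustration only (no training).
    return [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    # Plain fully connected net with tanh on all but the last layer.
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

m, p = 100, 32              # number of sensors, shared latent width
branch = mlp([m, 64, p])    # encodes the sampled input function u(x_1), ..., u(x_m)
trunk = mlp([1, 64, p])     # encodes the query location y in the output domain

def deeponet(u_sensors, y):
    b = forward(branch, u_sensors)         # latent code of the input function
    t = forward(trunk, np.atleast_1d(y))   # latent code of the query point
    return float(b @ t)                    # G(u)(y) approximated as an inner product

xs = np.linspace(0.0, 1.0, m)
u = np.sin(2 * np.pi * xs)    # one sample input function, evaluated at the sensors
prediction = deeponet(u, 0.5)
```

In a trained DeepONet both subnetworks are fit jointly so that this inner product matches the target operator's output at many (u, y) pairs; the sketch only shows how the two encodings combine.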
Publisher
Nature Publishing Group UK; Nature Publishing Group; The Author(s), Springer Nature