Asset Details
Physics-informed neural networks with hybrid Kolmogorov-Arnold network and augmented Lagrangian function for solving partial differential equations
by Shen, Tao; Zhang, Zhaoyang; Zhang, Yinxing; Zhang, Weiyi; Wang, Qingwang
Subjects: 639/705/1041; 639/705/117; Augmented Lagrangian function; Boundary conditions; Deep learning; Differential equations; Humanities and Social Sciences; Kolmogorov-Arnold network; Mathematics; multidisciplinary; Neural networks; Partial differential equations; Physics; Physics-informed neural networks; Science; Science (multidisciplinary)
2025
Journal Article
Overview
Physics-informed neural networks (PINNs) have emerged as a fundamental deep-learning approach for solving partial differential equations (PDEs). Nevertheless, the conventional multilayer perceptrons (MLPs) used as approximators in most PINN variants lack interpretability and suffer from spectral bias, which limits their accuracy. Moreover, these methods are susceptible to over-inflated penalty factors during optimization, which can lead to pathological training in which the various constraint terms are poorly balanced. In this study, inspired by the Kolmogorov-Arnold network (KAN) for mathematical physics problems, we introduce a hybrid encoder-decoder model, termed AL-PKAN, to address these challenges. Specifically, the model first encodes the interdependencies of the input sequence into a high-dimensional latent space through a gated recurrent unit (GRU) module. A KAN module then decomposes the multivariate function in the latent space into a set of trainable univariate activation functions, formulated as linear combinations of B-spline basis functions that spline-interpolate the estimated function. Furthermore, we reformulate the loss function as an augmented Lagrangian that incorporates the initial and boundary conditions into Lagrangian multiplier terms, making both the penalty factors and the multipliers learnable parameters so that the balance among the constraint terms can be adjusted dynamically during training. The proposed model achieves strong accuracy and generalizability on a series of benchmark experiments, highlighting the promising capabilities and application horizons of KAN within PINNs.
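The loss construction described above, a PDE residual plus initial/boundary-condition terms folded into an augmented Lagrangian with learnable multipliers and penalty factors, can be made concrete with a small sketch. (The KAN module itself follows the Kolmogorov-Arnold representation, f(x) = sum_q Phi_q( sum_p phi_{q,p}(x_p) ), with each univariate phi parameterized by B-splines.) The PyTorch snippet below is an illustrative assumption rather than the authors' implementation: the class name, the log-parameterized penalty factors, and the use of per-constraint mean-squared residuals are choices made here for clarity only.

# Minimal sketch (assumed, not the paper's code) of an augmented Lagrangian PINN loss:
#   L(theta, lam, mu) = L_pde + sum_i [ lam_i * c_i + (mu_i / 2) * c_i^2 ],
# where c_i is the mean-squared residual of the i-th initial/boundary condition,
# lam_i a learnable Lagrange multiplier and mu_i a learnable penalty factor.
import torch
import torch.nn as nn

class AugmentedLagrangianLoss(nn.Module):
    def __init__(self, n_constraints: int):
        super().__init__()
        self.lam = nn.Parameter(torch.zeros(n_constraints))     # Lagrange multipliers
        self.log_mu = nn.Parameter(torch.zeros(n_constraints))  # mu = exp(log_mu) stays positive

    def forward(self, pde_residual: torch.Tensor,
                constraint_residuals: list[torch.Tensor]) -> torch.Tensor:
        loss = pde_residual.pow(2).mean()                # interior PDE residual term
        for i, r in enumerate(constraint_residuals):     # IC/BC residual terms
            c = r.pow(2).mean()
            mu = torch.exp(self.log_mu[i])
            loss = loss + self.lam[i] * c + 0.5 * mu * c * c
        return loss

# Note: if lam and mu were simply minimized together with the network weights,
# gradient descent would push them toward ignoring the constraints; augmented
# Lagrangian schemes therefore update the multipliers in the ascent direction
# (e.g. by flipping the sign of their gradient or using a separate dual step).
# How AL-PKAN schedules these updates is specified in the paper, not here.

A training step would compute the PDE and condition residuals from the GRU-KAN network's outputs, evaluate this loss, and then apply the chosen primal (descent) and dual (ascent) updates.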
Publisher
Nature Publishing Group UK, Nature Publishing Group, Nature Portfolio