Asset Details
Regularisation of neural networks by enforcing Lipschitz continuity
by Pfahringer, Bernhard; Cree, Michael J.; Frank, Eibe; Gouk, Henry
in Computation / Mathematical models / Neural networks / Norms / Optimization / Regularization / Training / Upper bounds
2021
Journal Article
Overview
We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with respect to their inputs. To this end, we provide a simple technique for computing an upper bound to the Lipschitz constant—for multiple p-norms—of a feed-forward neural network composed of commonly used layer types. Our technique is then used to formulate training a neural network with a bounded Lipschitz constant as a constrained optimisation problem that can be solved using projected stochastic gradient methods. Our evaluation study shows that the performance of the resulting models exceeds that of models trained with other common regularisers. We also provide evidence that the hyperparameters are intuitive to tune, demonstrate how the choice of norm for computing the Lipschitz constant impacts the resulting model, and show that the performance gains provided by our method are particularly noticeable when only a small amount of training data is available.
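The abstract describes two computational ingredients: a per-layer upper bound on the Lipschitz constant, and a projection step used during constrained training. The sketch below, in Python with NumPy, illustrates how such a bound and projection could look for dense layers only; the function names, the rescaling projection, and the power-iteration estimate for the 2-norm are illustrative assumptions based on the abstract, not the paper's exact implementation.

```python
import numpy as np

def operator_norm(W, p):
    """Upper bound on the operator norm of a dense weight matrix W
    for p in {1, 2, inf} (illustrative; other layer types need their
    own bounds)."""
    if p == 1:
        return np.abs(W).sum(axis=0).max()   # max absolute column sum
    if p == np.inf:
        return np.abs(W).sum(axis=1).max()   # max absolute row sum
    # p == 2: estimate the spectral norm with a few power-iteration steps.
    x = np.random.default_rng(0).standard_normal(W.shape[1])
    for _ in range(50):
        x = W.T @ (W @ x)
        x /= np.linalg.norm(x)
    return np.linalg.norm(W @ x)

def lipschitz_upper_bound(weights, p):
    """Product of per-layer operator norms: an upper bound on the
    network's Lipschitz constant, assuming 1-Lipschitz activations
    (e.g. ReLU) between layers."""
    bound = 1.0
    for W in weights:
        bound *= operator_norm(W, p)
    return bound

def project(W, p, lam):
    """Hypothetical projection step: rescale W so its operator norm
    is at most lam."""
    n = operator_norm(W, p)
    return W if n <= lam else W * (lam / n)
```

In a projected stochastic gradient loop, `project` would be applied to every weight matrix after each optimiser update; keeping each layer's norm at most lam then keeps the product bound, and hence the network's Lipschitz constant, below lam raised to the number of layers.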
Publisher
Springer Nature B.V.