Asset Details
Shrinkage Initialization for Smooth Learning of Neural Networks
by Wang, Limin; Cheng, Miao; Zou, Hongwei; Zhou, Feiyan
in Learning / Neural networks
2025
Overview
The success of intelligent systems has relied heavily on learning from data, which has led to the broad adoption of neural learning solutions. It is well known that the training of neural networks can be substantially improved by well-designed initialization, layer structure, and activation functions. Although sequential, layer-wise initialization schemes are available, a generalized solution for the initial stage of learning is still desired. This work presents an improved approach to the initialization of neural networks, which adopts a shrinkage approach to initialize the transformation of each layer. It can be applied universally to networks of any structure with arbitrary layers, while attaining stable performance. Furthermore, the smooth learning of networks is considered in this work, owing to its diverse influence on neural learning. Experimental results on several artificial data sets demonstrate that the proposed method yields robust results under shrinkage initialization and is competent for the smooth learning of neural networks.
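The abstract does not give the paper's exact estimator, so the following is only an illustrative sketch of the general idea of layer-wise shrinkage initialization: each layer's random weight matrix is shrunk toward a structured target (here a scaled identity-like matrix, one common shrinkage target), with the same rule applied to every layer of an arbitrarily deep stack. The function name, the `alpha` coefficient, and the choice of target are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def shrinkage_init(fan_in, fan_out, alpha=0.5, rng=None):
    """Illustrative shrinkage-style initializer (hypothetical form).

    A Gaussian matrix with He-style scale is linearly shrunk toward a
    scaled identity-like target; alpha controls the shrinkage strength.
    """
    rng = np.random.default_rng(rng)
    scale = np.sqrt(2.0 / fan_in)
    W = rng.normal(0.0, scale, size=(fan_out, fan_in))   # random part
    target = np.eye(fan_out, fan_in) * scale             # shrinkage target
    return (1.0 - alpha) * W + alpha * target

# The same rule initializes every layer of a network of arbitrary depth.
layer_sizes = [64, 128, 32, 10]
weights = [shrinkage_init(a, b, alpha=0.3, rng=0)
           for a, b in zip(layer_sizes[:-1], layer_sizes[1:])]
```

Because the target is deterministic, larger `alpha` values reduce the variance of the initial weights, which is what makes shrinkage-style initializers insensitive to the particular random draw.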
Publisher
Cornell University Library, arXiv.org