Asset Details
Constraints on Hyper-parameters in Deep Learning Convolutional Neural Networks
by Botalb, Abdelaziz; Al-Saggaf, Ubaid M.; Alfakeh, Sulhi Ali; Alsaggaf, Abdulrahman U.; Moinuddin, Muhammad; Faisal, Muhammad
in Algorithms / Artificial neural networks / Back propagation networks / Deep learning / Machine learning / Neural networks / Optimization / Parameters / Training
2022
Journal Article
Overview
A Convolutional Neural Network (CNN), a type of deep learning model, has a very large number of hyper-parameters compared with an Artificial Neural Network (ANN), which makes CNN training more demanding. Hyper-parameter optimization is difficult in CNNs because the search space is huge, comprising many hyper-parameters such as the number of layers, number of neurons, number of kernels, stride, padding, row or column truncation, and the parameters of the backpropagation algorithm. Moreover, most existing techniques in the literature for selecting these parameters rely on ad hoc practice developed for specific datasets. In this work, we empirically investigated and showed that CNN performance is linked not only to choosing the right hyper-parameters but also to how they are implemented. More specifically, performance also depends on how the implementation handles cases in which the CNN operations require hyper-parameter settings that do not fit the input volume symmetrically. We demonstrate two different implementations: cropping or padding the input volume so that it fits. Our analysis shows that padding outperforms cropping in prediction accuracy (85.58% versus 82.62%) while requiring less training time (8 minutes less).
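To make the crop-versus-pad distinction concrete, the following is a minimal sketch (not the authors' code) of how the output size along one spatial dimension of a convolutional layer differs when the kernel and stride do not evenly cover the input. The function name and the example numbers (28-pixel input, 5x5 kernel, stride 2) are illustrative assumptions, not values taken from the paper.

```python
# Sketch: output length of one conv dimension under "crop" vs. "pad" handling
# when (input_size - kernel) is not a multiple of the stride.

def conv_output_size(input_size: int, kernel: int, stride: int, mode: str) -> int:
    """Return the output length along one spatial dimension.

    mode="crop": truncate leftover rows/columns so the kernel fits evenly.
    mode="pad":  zero-pad the input so no rows/columns are discarded.
    """
    leftover = (input_size - kernel) % stride
    if mode == "crop":
        # Discard the rows/columns the last kernel window cannot reach.
        effective = input_size - leftover
    elif mode == "pad":
        # Add just enough zeros so the last kernel window fits.
        effective = input_size + (stride - leftover) % stride
    else:
        raise ValueError("mode must be 'crop' or 'pad'")
    return (effective - kernel) // stride + 1

# Example: 28-pixel dimension, 5x5 kernel, stride 2.
# (28 - 5) % 2 = 1, so cropping drops one row while padding keeps it.
print(conv_output_size(28, 5, 2, "crop"))  # 12
print(conv_output_size(28, 5, 2, "pad"))   # 13
```

Cropping discards part of the input volume, whereas padding preserves all of it at the cost of extra (zero-valued) positions; the abstract above reports that the padded variant gave higher accuracy and shorter training time in the authors' experiments.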
Publisher
Science and Information (SAI) Organization Limited