Asset Details
GradFreeBits: Gradient-Free Bit Allocation for Mixed-Precision Neural Networks
by Bodner, Benjamin Jacob; Ben-Shalom, Gil; Treister, Eran
in Algorithms / Benchmarking / gradient-free optimization / Methods / mixed-precision quantization / neural network compression / Neural networks / Neural Networks, Computer / Normal distribution / Optimization / quantization / Training
2022
Journal Article
Overview
Quantized neural networks (QNNs) are among the main approaches for deploying deep neural networks on low-resource edge devices. Training QNNs with different levels of precision throughout the network (mixed-precision quantization) typically achieves superior trade-offs between performance and computational load. However, optimizing the precision levels of a QNN is complicated, as the bit allocations are discrete values that are difficult to differentiate with respect to. Moreover, adequately accounting for the dependencies between the bit allocations of different layers is not straightforward. To meet these challenges, in this work we propose GradFreeBits: a novel joint optimization scheme for training mixed-precision QNNs, which alternates between gradient-based optimization of the weights and gradient-free optimization of the bit allocation. Our method achieves performance better than or on par with current state-of-the-art low-precision classification networks on CIFAR10/100 and ImageNet, semantic segmentation networks on Cityscapes, and several graph neural network benchmarks. Furthermore, our approach can be extended to a variety of other applications involving neural networks used in conjunction with parameters that are difficult to optimize for.
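To make the overview concrete, the following is a minimal, self-contained Python sketch of the alternating idea described above: gradient-based training of the weights at a fixed per-layer bit allocation, interleaved with a gradient-free search over the discrete bit allocation. It is not the authors' implementation; the names (train_weights, score, propose), the toy objective, and the simple random-perturbation search are hypothetical stand-ins for the paper's actual gradient-free optimizer and cost model.

import random

# Candidate bit widths per layer and a toy network depth (illustrative only).
BIT_CHOICES = (2, 3, 4, 6, 8)
NUM_LAYERS = 6

def train_weights(bits, epochs=2):
    """Placeholder for gradient-based, quantization-aware training of the
    network weights at the fixed per-layer bit allocation `bits`."""
    pass  # e.g. a few epochs of SGD would go here

def score(bits):
    """Toy objective standing in for validation accuracy minus a size penalty."""
    accuracy_proxy = -sum((b - 4) ** 2 for b in bits)  # fake accuracy surrogate
    size_penalty = 0.1 * sum(bits)                     # penalize larger models
    return accuracy_proxy - size_penalty

def propose(bits):
    """Gradient-free move: randomly change one layer's bit width."""
    new = list(bits)
    i = random.randrange(len(new))
    new[i] = random.choice(BIT_CHOICES)
    return tuple(new)

def gradfree_alternation(rounds=50):
    """Alternate gradient-based weight updates with gradient-free bit search."""
    bits = tuple(random.choice(BIT_CHOICES) for _ in range(NUM_LAYERS))
    for _ in range(rounds):
        train_weights(bits)                  # gradient-based phase (weights)
        candidate = propose(bits)            # gradient-free phase (bit allocation)
        if score(candidate) >= score(bits):  # keep the candidate if it does not hurt the objective
            bits = candidate
    return bits

if __name__ == "__main__":
    print("final bit allocation:", gradfree_alternation())

The point of the sketch is only the structure of the loop: the discrete bit allocation is never differentiated; it is updated by evaluating candidate allocations, while the continuous weights are updated by ordinary gradient descent in between.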
Publisher
MDPI AG