Asset Details
Weight-Entanglement Meets Gradient-Based Neural Architecture Search
by Hutter, Frank; Safari, Mahmoud; Krishnakumar, Arjun; Sukthanker, Rhea Sanjay
in Black boxes / Compatibility / Entanglement / Neural architecture search / Performance enhancement / Training
2023
Overview
Weight sharing is a fundamental concept in neural architecture search (NAS), enabling gradient-based methods to explore cell-based architecture spaces significantly faster than traditional blackbox approaches. In parallel, weight entanglement has emerged as a technique for intricate parameter sharing among architectures within macro-level search spaces. Since weight entanglement poses compatibility challenges for gradient-based NAS methods, these two paradigms have largely developed independently in parallel sub-communities. This paper aims to bridge the gap between these sub-communities by proposing a novel scheme to adapt gradient-based methods for weight-entangled spaces. This enables us to conduct an in-depth comparative assessment and analysis of the performance of gradient-based NAS in weight-entangled search spaces. Our findings reveal that this integration of weight entanglement and gradient-based NAS brings forth the various benefits of gradient-based methods (enhanced performance, improved supernet training properties, and superior any-time performance), while preserving the memory efficiency of weight-entangled spaces. The code for our work is openly accessible at https://anonymous.4open.science/r/TangleNAS-527C.
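To make the idea in the abstract concrete, below is a minimal, hypothetical sketch in PyTorch (not the authors' TangleNAS implementation) of how weight entanglement can be combined with a gradient-based, DARTS-style continuous relaxation: candidate operations of different sizes share slices of a single weight tensor, and a softmax over architecture parameters mixes their outputs so that both the entangled weights and the architecture choice receive gradients. The class name `EntangledLinear`, the candidate widths, and the zero-padding-based mixing are all illustrative assumptions.

```python
# Hypothetical sketch, not the paper's implementation: a linear layer with a
# searchable output width. The candidate widths are "entangled" because they
# all reuse slices of one shared weight tensor, and a DARTS-style softmax
# over architecture parameters makes the choice differentiable.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntangledLinear(nn.Module):
    def __init__(self, in_features, candidate_widths):
        super().__init__()
        self.candidate_widths = candidate_widths
        self.max_width = max(candidate_widths)
        # Shared (entangled) parameters: every candidate uses a leading slice.
        self.weight = nn.Parameter(torch.randn(self.max_width, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(self.max_width))
        # Continuous architecture parameters, one logit per candidate width.
        self.alpha = nn.Parameter(torch.zeros(len(candidate_widths)))

    def forward(self, x):
        probs = F.softmax(self.alpha, dim=0)
        mixed = []
        for p, w in zip(probs, self.candidate_widths):
            y = F.linear(x, self.weight[:w], self.bias[:w])  # slice = candidate op
            # Zero-pad narrower candidates so all outputs share one shape.
            mixed.append(p * F.pad(y, (0, self.max_width - w)))
        return torch.stack(mixed).sum(dim=0)

# Both the entangled weights and the architecture logits receive gradients,
# which is what lets a gradient-based NAS method search this space directly.
layer = EntangledLinear(in_features=16, candidate_widths=[4, 8, 16])
loss = layer(torch.randn(8, 16)).pow(2).mean()
loss.backward()
assert layer.alpha.grad is not None and layer.weight.grad is not None
```

Note the memory property the abstract highlights: because the candidates share one weight tensor rather than each holding their own copy, the supernet's parameter count equals that of the largest candidate alone.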
Publisher
Cornell University Library, arXiv.org