Asset Details
SGNet: Efficient Snow Removal Deep Network with a Global Windowing Transformer
by Shan, Lie; Zhang, Haoxiang; Cheng, Bodong
in Computational efficiency / Computer vision / Computing costs / Decomposition / deep learning / efficient network / Electric transformers / Image resolution / Image restoration / Image segmentation / image snow removal / Machine vision / Methods / Neural networks / Object recognition / Semantic segmentation / Snow removal / Telecommunication systems / transformer / Transformers / Wavelet transforms / Weather
2024
Journal Article
Overview
Image restoration under adverse weather conditions poses a challenging task. Previous research efforts have predominantly focused on eliminating rain and fog from images. However, snow, another common atmospheric phenomenon, also significantly impacts advanced computer vision tasks such as object detection and semantic segmentation. Recently, there has been a surge of methods specifically targeting snow removal, the majority of which employ vision Transformers as the backbone network to improve restoration quality. Nevertheless, the quadratic computation Transformers require to model long-range dependencies significantly escalates the time and space consumption of deep learning models. To address this issue, this paper proposes SGNet, an efficient snow removal deep network with a global windowing Transformer. The method forgoes the localized windowing strategy of previous vision Transformers, instead using wavelet sampling to partition the image into multiple low-resolution sub-images that each retain global information, thereby ensuring higher performance while reducing computational overhead. Extensive experimentation demonstrates that the approach achieves outstanding performance across a wide range of benchmark datasets and rivals CNN-based methods in computational cost.
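The wavelet-sampling idea described above can be illustrated with a one-level 2D Haar transform: it splits an H×W image into four H/2×W/2 sub-images, each of which still spans the full spatial extent of the original, so window attention computed within a sub-image sees global context at a quarter of the token count. The sketch below is a minimal, self-contained illustration of that decomposition and its exact inverse; the function names and normalization are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def haar_downsample(x):
    # One level of 2D Haar wavelet sampling on an (H, W) array.
    # Yields four (H/2, W/2) sub-images: approximation LL and
    # detail bands LH, HL, HH. Each sub-image covers the whole
    # image, so it carries global rather than local information.
    a = x[0::2, 0::2]
    b = x[0::2, 1::2]
    c = x[1::2, 0::2]
    d = x[1::2, 1::2]
    ll = (a + b + c + d) / 2.0
    lh = (a - b + c - d) / 2.0
    hl = (a + b - c - d) / 2.0
    hh = (a - b - c + d) / 2.0
    return ll, lh, hl, hh

def haar_upsample(ll, lh, hl, hh):
    # Exact inverse: reassemble the full-resolution image, so no
    # information is lost by processing the sub-images separately.
    h, w = ll.shape
    x = np.zeros((2 * h, 2 * w), dtype=ll.dtype)
    x[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    x[0::2, 1::2] = (ll - lh + hl - hh) / 2.0
    x[1::2, 0::2] = (ll + lh - hl - hh) / 2.0
    x[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return x

image = np.arange(64, dtype=float).reshape(8, 8)
ll, lh, hl, hh = haar_downsample(image)
restored = haar_upsample(ll, lh, hl, hh)
print(ll.shape)                          # (4, 4)
print(np.allclose(image, restored))      # True
```

Because self-attention cost grows quadratically with token count, running attention on a half-resolution sub-image (a quarter of the tokens) costs roughly 1/16 of full-image attention, which is the efficiency the abstract attributes to global windowing via wavelet sampling.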