Asset Details
DPAL-BERT: A Faster and Lighter Question Answering Model
by AlQahtani, Salman A.; AlSanad, Ahmed; Yin, Zhengtong; Cai, Zhuohang; Li, Xiaolu; Yin, Lirong; Chen, Xiaobing; Wang, Ruiyang; Wang, Lei; Zheng, Wenfeng; Lu, Siyu
Journal Article, 2024
Subjects: Accuracy / Datasets / Distillation / Efficiency / Evolutionary algorithms / Inference / Knowledge / Language / Natural language processing / Probability distribution / Questions
Overview
Recent advancements in natural language processing have given rise to numerous pre-trained language models for question-answering systems. However, with the constant evolution of algorithms, data, and computing power, the increasing size and complexity of these models have driven up training costs and reduced efficiency. This study aims to minimize the inference time of such models while maintaining computational performance. It proposes DPAL-BERT, a distilled version of the PAL-BERT question-answering model: knowledge distillation is applied with PAL-BERT as the teacher model to train two student models, DPAL-BERT-Bi and DPAL-BERT-C. The dataset is augmented through techniques such as masking, replacement, and n-gram sampling to optimize knowledge transfer. Experimental results showed that the distilled models greatly outperform models trained from scratch. Although the distilled models exhibit a slight decrease in performance compared with PAL-BERT, they reduce inference time to just 0.25% of the original, demonstrating the effectiveness of the proposed approach in balancing model performance and efficiency.
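To make the teacher-student setup concrete, below is a minimal sketch of a standard soft-target distillation objective of the kind the abstract describes, written in PyTorch. The function name, the temperature, and the mixing weight `alpha` are illustrative assumptions, not values or APIs taken from the DPAL-BERT paper.

```python
# Minimal knowledge-distillation loss sketch (PyTorch).
# `temperature` and `alpha` are hypothetical defaults, not paper values.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Blend a soft-target loss against the teacher with a hard-label loss.

    The soft term pushes the student's temperature-softened distribution
    toward the teacher's; the hard term keeps it anchored to gold labels.
    """
    # Soft targets: KL divergence between softened distributions.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Hard targets: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)

    return alpha * soft + (1.0 - alpha) * hard

# Usage: the teacher runs frozen in inference mode; only the student updates.
# teacher.eval()
# with torch.no_grad():
#     t_logits = teacher(batch).logits
# loss = distillation_loss(student(batch).logits, t_logits, batch_labels)
# loss.backward()
```

The augmentation steps the abstract mentions (masking, replacement, n-gram sampling) would be applied to the training text before both models see it, enlarging the transfer set on which the student imitates the teacher.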
Publisher
Tech Science Press