Asset Details
Improving the Domain Adaptation of Retrieval Augmented Generation (RAG) Models for Open Domain Question Answering
Journal Article, 2023
by Wen, Elliott; Nanayakkara, Suranga; Rana, Rajib; Kaluarachchi, Tharindu; Weerasekera, Rivindu; Siriwardhana, Shamane
in Adaptation / Augmentation / Computational linguistics / COVID-19 / Credibility / Health care / Health services / Information sources / Internet / Knowledge / Knowledge base / Knowledge bases (artificial intelligence) / Question answer sequences / Questions / Retrieval / Training
Overview
Retrieval Augmented Generation (RAG) is a recent advancement in Open-Domain Question Answering (ODQA). RAG has only been trained and explored with a Wikipedia-based external knowledge base and is not optimized for use in other specialized domains such as healthcare and news. In this paper, we evaluate the impact of joint training of the retriever and generator components of RAG for the task of domain adaptation in ODQA. We propose RAG-end2end, an extension to RAG that can adapt to a domain-specific knowledge base by updating all components of the external knowledge base during training. In addition, we introduce an auxiliary training signal to inject more domain-specific knowledge. This auxiliary signal forces RAG-end2end to reconstruct a given sentence by accessing the relevant information from the external knowledge base. Our novel contribution is that, unlike RAG, RAG-end2end does joint training of the retriever and generator for the end QA task and domain adaptation. We evaluate our approach with datasets from three domains: COVID-19, News, and Conversations, and achieve significant performance improvements compared to the original RAG model. Our work has been open-sourced through the HuggingFace Transformers library, attesting to our work's credibility and technical consistency.
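For readers who want to see the retriever-plus-generator setup the abstract describes, the sketch below loads a stock RAG checkpoint through the HuggingFace Transformers API. It is a minimal illustration only: the checkpoint name (facebook/rag-token-nq), the dummy retrieval index, and the sample question are assumptions for demonstration, and this is not the paper's RAG-end2end training code, which is distributed in the authors' own open-source release.

```python
from transformers import RagTokenizer, RagRetriever, RagTokenForGeneration

# Load a pretrained RAG checkpoint (illustrative choice, not the paper's model).
tokenizer = RagTokenizer.from_pretrained("facebook/rag-token-nq")

# Use a small dummy index so the example runs without the full Wikipedia index;
# a domain-adapted system would point the retriever at its own knowledge base.
retriever = RagRetriever.from_pretrained(
    "facebook/rag-token-nq", index_name="exact", use_dummy_dataset=True
)

# Tie the generator to the retriever: documents are fetched at generation time.
model = RagTokenForGeneration.from_pretrained("facebook/rag-token-nq", retriever=retriever)

# Ask a sample open-domain question and decode the generated answer.
inputs = tokenizer("who holds the record in 100m freestyle", return_tensors="pt")
generated = model.generate(input_ids=inputs["input_ids"])
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```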
Publisher
The MIT Press