Asset Details
OmniDialog: An Omnipotent Pre-training Model for Task-Oriented Dialogue System
by Fu, Jinlan; Yang, Mingtao; See-Kiong Ng
in Datasets / Learning / Performance evaluation / Questions / Tracking / Training
Paper, 2023
Overview
Pre-trained conversation models (PCMs) have demonstrated remarkable results in task-oriented dialogue (TOD) systems. Most existing PCMs focus on dialogue management tasks such as dialogue state tracking, on dialogue generation tasks such as response generation, or on both. However, they seldom consider dialogue comprehension tasks, such as dialogue question answering and summarization, which allow a model to glean dialogue context from additional angles. This observation naturally raises the question: can the performance of downstream dialogue tasks be enhanced if a PCM is pre-trained on dialogue management, generation, and comprehension tasks together? To investigate this, we propose an Omnipotent Dialogue pre-training model (OmniDialog). It unifies these three dialogue task families in a monolithic framework through multi-task learning, fostering inter-task communication. The pre-training corpus of OmniDialog spans 7 dialogue-focused tasks, drawing on 15 datasets and encompassing over 3.2 million dialogue utterances. To our knowledge, OmniDialog is a pioneering PCM pre-trained across the dialogue management, generation, and comprehension domains. We evaluate its performance on four tasks: dialogue summarization, end-to-end dialogue modeling, dialogue state tracking, and intent classification. The results underscore its efficacy in domain-transfer, low-resource, and full-dataset scenarios. Furthermore, to gain a nuanced understanding of OmniDialog's strengths and potential pitfalls, we design a fine-grained analysis framework for dialogue-centric tasks. Experimental results show that OmniDialog excels on hard samples, such as long dialogues and lengthy responses.
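The abstract does not include implementation details, so the following is a minimal, hypothetical sketch of the core idea it describes: casting dialogue management (state tracking), dialogue generation (response generation), and dialogue comprehension (summarization) into one text-to-text format and mixing them in a single multi-task training loop. It assumes a T5-style backbone loaded through Hugging Face transformers; the task prefixes and toy examples are illustrative assumptions, not the authors' exact data format.

```python
# Minimal sketch (not the authors' code): multi-task text-to-text
# pre-training over dialogue management, generation, and comprehension,
# assuming a T5-style backbone. Prefixes and examples are illustrative.
import random
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# One toy example per task family; the real corpus mixes 7 tasks from
# 15 datasets with over 3.2 million utterances.
examples = [
    # dialogue management: dialogue state tracking
    ("track state: [user] I need a cheap hotel in the north.",
     "hotel-price=cheap; hotel-area=north"),
    # dialogue generation: response generation
    ("generate response: [user] I need a cheap hotel in the north. "
     "[state] hotel-price=cheap; hotel-area=north",
     "There are two cheap hotels in the north. Shall I book one?"),
    # dialogue comprehension: summarization
    ("summarize dialogue: [user] I need a cheap hotel in the north. "
     "[system] There are two cheap hotels in the north.",
     "The user wants a cheap hotel in the north; the system offers two."),
]

model.train()
for step in range(3):  # a real run iterates over the full mixed corpus
    source, target = random.choice(examples)  # uniform task mixing
    batch = tokenizer(source, return_tensors="pt")
    labels = tokenizer(target, return_tensors="pt").input_ids
    loss = model(**batch, labels=labels).loss  # one seq2seq loss for all tasks
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(f"step {step}: loss {loss.item():.3f}")
```

Because every task shares the same encoder-decoder parameters and the same sequence-to-sequence loss, gradients from all three task families update one model; this shared-parameter setup is the sense in which multi-task pre-training can foster inter-task communication.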