Asset Details
The Effectiveness of Intermediate-Task Training for Code-Switched Natural Language Understanding
by Jyothi, Preethi; Pathak, Shreya; Rehan, Mohammad Ali; Prasad, Archiki
in Data mining / Language / Multilingualism / Natural language / Sentiment analysis
2021
Paper
Overview
While recent benchmarks have spurred a lot of new work on improving the generalization of pretrained multilingual language models on multilingual tasks, techniques to improve code-switched natural language understanding tasks have been far less explored. In this work, we propose the use of bilingual intermediate pretraining as a reliable technique to derive large and consistent performance gains on three different NLP tasks using code-switched text. We achieve substantial absolute improvements of 7.87%, 20.15%, and 10.99% on the mean accuracies and F1 scores over previous state-of-the-art systems for Hindi-English Natural Language Inference (NLI), Question Answering (QA) tasks, and Spanish-English Sentiment Analysis (SA), respectively. We show consistent performance gains on four different code-switched language pairs (Hindi-English, Spanish-English, Tamil-English and Malayalam-English) for SA. We also present a code-switched masked language modelling (MLM) pretraining technique that consistently benefits SA compared to standard MLM pretraining using real code-switched text.
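The MLM pretraining step the abstract mentions can be illustrated with a short sketch. The following is a minimal example assuming a standard HuggingFace Transformers setup; the choice of mBERT, the corpus file name hi_en_codeswitched.txt, and all hyperparameters are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch of masked language modelling (MLM) pretraining on
# code-switched text, in the spirit of the intermediate pretraining the
# abstract describes. Model choice (mBERT), corpus file name, and
# hyperparameters are illustrative assumptions, not the paper's setup.
import torch
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)


class CodeSwitchedCorpus(torch.utils.data.Dataset):
    """One code-switched sentence per line, e.g. Hindi-English text."""

    def __init__(self, path, tokenizer, max_len=128):
        with open(path, encoding="utf-8") as f:
            lines = [line.strip() for line in f if line.strip()]
        self.encodings = tokenizer(
            lines, truncation=True, max_length=max_len, padding="max_length"
        )

    def __len__(self):
        return len(self.encodings["input_ids"])

    def __getitem__(self, idx):
        return {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}


tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# The collator randomly masks 15% of input tokens; the model is trained
# to recover them, which is the standard MLM objective.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="cs-mlm",
        num_train_epochs=3,
        per_device_train_batch_size=16,
    ),
    train_dataset=CodeSwitchedCorpus("hi_en_codeswitched.txt", tokenizer),
    data_collator=collator,
)
trainer.train()
```

After this intermediate pretraining stage, the model would be fine-tuned on the downstream task (NLI, QA, or SA) in the usual way.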
Publisher
Cornell University Library, arXiv.org
Subject
Data mining / Language / Multilingualism / Natural language / Sentiment analysis