Asset Details
Meta-Whisper: Speech-Based Meta-ICL for ASR on Low-Resource Languages
by Ming-Hao Hsu, Hung-yi Lee, Kuan Po Huang
in Automatic speech recognition / K-nearest neighbors algorithm / Languages
2024
Paper
Overview
This paper presents Meta-Whisper, a novel approach to improve automatic speech recognition (ASR) for low-resource languages using the Whisper model. By leveraging Meta In-Context Learning (Meta-ICL) and a k-Nearest Neighbors (KNN) algorithm for sample selection, Meta-Whisper enhances Whisper's ability to recognize speech in unfamiliar languages without extensive fine-tuning. Experiments on the ML-SUPERB dataset show that Meta-Whisper significantly reduces the Character Error Rate (CER) for low-resource languages compared to the original Whisper model. This method offers a promising solution for developing more adaptable multilingual ASR systems, particularly for languages with limited resources.
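The overview's core mechanism, choosing in-context examples for an unfamiliar language by k-nearest-neighbor search over speech representations, can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's implementation: the embedding vectors, pool data, and function name `knn_select_examples` are placeholders, and the real system would derive embeddings from Whisper's encoder rather than toy 2-D points.

```python
import numpy as np

def knn_select_examples(query_emb, pool_embs, k=3):
    """Pick the k candidate utterances whose embeddings lie closest to the query.

    query_emb: (d,) embedding of the target utterance
    pool_embs: (n, d) embeddings of candidate example utterances
    Returns the indices of the k nearest neighbors by Euclidean distance.
    """
    dists = np.linalg.norm(pool_embs - query_emb, axis=1)
    return np.argsort(dists)[:k]

# Toy demo: five candidate "utterance embeddings" in 2-D
pool = np.array([[0.0, 0.0], [1.0, 1.0], [0.1, 0.0], [5.0, 5.0], [0.0, 0.2]])
query = np.array([0.0, 0.1])
selected = knn_select_examples(query, pool, k=3)
print(selected)  # indices of the three embeddings nearest the query
```

The selected utterances (with their transcripts) would then be prepended as in-context examples before decoding the query utterance, which is what lets the model adapt without fine-tuning.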
Publisher
Cornell University Library, arXiv.org