Asset Details
What does Chinese BERT learn about syntactic knowledge?
by Zheng, Jianyu; Liu, Ying
in Analysis / Artificial Intelligence / BERT / Chinese / Computational Linguistics / Data Mining and Machine Learning / Fine-tune / Language processing / Natural language interfaces / NLP / Syntax
2023
Journal Article
Overview
Pre-trained language models such as Bidirectional Encoder Representations from Transformers (BERT) have been applied to a wide range of natural language processing (NLP) tasks and have achieved consistently strong results. A growing body of research has investigated why BERT is so effective and what linguistic knowledge it is able to learn. However, most of these works have focused almost exclusively on English. Few studies have explored the linguistic information, particularly syntactic information, that BERT learns in Chinese, which is written as sequences of characters. In this study, we adopted several probing methods to identify the syntactic knowledge stored in the attention heads and hidden states of Chinese BERT. The results suggest that some individual heads encode specific syntactic relations well, while combinations of heads do well in encoding overall syntactic structure. The hidden representations of each layer also contain syntactic information to varying degrees. We also analyzed Chinese BERT models fine-tuned for different tasks, covering all levels. Our results suggest that these fine-tuned models reflect changes in how language structure is preserved. These findings help explain why Chinese BERT can show such large improvements across many language-processing tasks.
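For readers who want to reproduce the kind of probing the abstract describes, the minimal sketch below (not the authors' code) shows how per-head attention maps and per-layer hidden states can be extracted from Chinese BERT, assuming the public bert-base-chinese checkpoint and the Hugging Face transformers library; the example sentence and the layer/head indices are purely illustrative.

import torch
from transformers import BertModel, BertTokenizer

# Load the public Chinese BERT checkpoint, asking the model to return
# all attention maps and all intermediate hidden states.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertModel.from_pretrained(
    "bert-base-chinese", output_attentions=True, output_hidden_states=True
)
model.eval()

sentence = "我喜欢自然语言处理"  # "I like natural language processing"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple of 12 tensors, one per layer, each shaped
# (batch, num_heads, seq_len, seq_len). A single head's map (here layer 8,
# head 3, chosen arbitrarily) could be scored against gold dependency arcs
# to ask whether that head tracks a particular syntactic relation.
head_attention = outputs.attentions[8][0, 3]

# outputs.hidden_states is a tuple of 13 tensors (the embedding layer plus
# 12 encoder layers), each shaped (batch, seq_len, hidden_size). Training a
# small probe (e.g. a linear classifier) on each layer measures how much
# syntactic information that layer encodes.
layer_representations = [h[0] for h in outputs.hidden_states]

print(head_attention.shape, layer_representations[-1].shape)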
Publisher
PeerJ Ltd, PeerJ Inc.