Asset Details
Research on user granularity-level personalized social text generation technology
by Gao, J T; Ma, R; Yang, L D; Gao, Y B
in Coders / Customization / Decoding / Encoders-Decoders / Modules / Physics
2022
Journal Article
Overview
With the introduction of large-scale pre-trained language models, breakthroughs have been made in text generation research. Building on this, and in order to help users complete personalized creation, this paper proposes a user-level fine-grained controlled generation model. First, we design an Encoder-Decoder framework based on the GPT2 architecture and encode the user's static personalized information on the Encoder side. We then add a bidirectional independent attention module on the Decoder side to receive the personalized feature vector, while the attention module in the original GPT2 structure captures the dynamic personalized features in the user's text, such as writing style and manner of expression. Next, the scores of the attention modules are weighted and fused to take part in subsequent decoding, so that the model automatically generates social texts constrained by the user's personalized feature attributes. However, the semantic sparsity of the user's basic information occasionally causes conflicts between the generated text and some personalized features. We therefore use an Alignment module to perform a secondary enhancement that enforces consistent understanding between the Decoder's output and the user's personalized features, finally realizing personalized social text generation. Experiments show that, compared with the GPT2 baseline model, the model improves fluency by 0.3%-0.6%, and, with no loss of language fluency, the social texts it generates exhibit significant user personalization characteristics: the two evaluation indicators of personalization and consistency both increase significantly, by 8.4% and 9% respectively.
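The abstract describes weighting and fusing the scores of the personalized attention module with those of the original GPT2 attention before decoding. The paper's exact formulation is not given here, so the following is only a minimal NumPy sketch of that idea: `fused_attention`, `persona_scores`, and the scalar mixing weight `alpha` are illustrative assumptions, not the authors' implementation (which may use learned, per-head fusion weights).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fused_attention(q, k, v, persona_scores, alpha=0.7):
    # Scaled dot-product scores from the ordinary (GPT2-style) attention,
    # which the abstract says capture dynamic features of the user's text.
    d = q.shape[-1]
    lm_scores = q @ k.T / np.sqrt(d)
    # Hypothetical weighted fusion with the scores produced by the
    # bidirectional personalized-attention module (persona_scores);
    # alpha is an assumed fixed mixing weight.
    fused = alpha * lm_scores + (1.0 - alpha) * persona_scores
    # The fused scores then drive the usual attention-weighted sum.
    return softmax(fused) @ v
```

With `alpha=1.0` the sketch reduces to plain scaled dot-product attention; smaller values of `alpha` let the personalized scores constrain which tokens the decoder attends to.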
Publisher
IOP Publishing