Asset Details
Contextual Distillation Model for Diversified Recommendation
Paper
by Li, Fan; Han, Kunyan; Yang, Song; Chen, Hechang; Xu, Si; Wang, Dingmin; Tang, Shisong; Zhou, Guorui; Han, Bing
in Algorithms / Context / Distillation / Ranking / User experience
2024
Overview
The diversity of recommendations is as crucial as accuracy in improving user experience. Existing approaches, e.g., Determinantal Point Process (DPP) and Maximal Marginal Relevance (MMR), employ a greedy paradigm to iteratively select items that optimize both accuracy and diversity. However, these methods typically exhibit quadratic complexity, which limits them to the re-ranking stage and makes them inapplicable to stages with larger candidate pools, such as pre-ranking and ranking. In this paper, we propose the Contextual Distillation Model (CDM), an efficient recommendation model that addresses diversification and is suitable for deployment in all stages of an industrial recommendation pipeline. Specifically, CDM uses the candidate items in the same user request as context to enhance the diversification of the results. We propose a contrastive context encoder that employs attention mechanisms to model both positive and negative contexts. To train CDM, we compare each target item with its context embedding and use a knowledge distillation framework to learn the win probability of each target item under the MMR algorithm, where the teacher is derived from MMR outputs. During inference, ranking is performed through a linear combination of the recommendation score and the student model score, ensuring both diversity and efficiency. We perform offline evaluations on two industrial datasets and conduct an online A/B test of CDM on the short-video platform KuaiShou. The considerable improvements observed in both recommendation quality and diversity metrics provide strong evidence for the effectiveness of CDM.
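Two pieces of the abstract lend themselves to a short illustration: the quadratic greedy MMR procedure that serves as the distillation teacher, and the linear score combination used at inference. The sketch below is a minimal, hypothetical rendering of both under assumptions not stated in the abstract (cosine similarity between item embeddings, a trade-off weight lambda_ for MMR, a mixing weight alpha for inference); the function names mmr_select and cdm_inference_rank are illustrative, not the authors' implementation.

```python
import numpy as np

def mmr_select(relevance, embeddings, k, lambda_=0.5):
    """Greedy Maximal Marginal Relevance. Building the pairwise similarity
    matrix alone is O(n^2) in the candidate count -- the cost that confines
    MMR-style diversification to the re-ranking stage."""
    # Cosine similarity between all candidate pairs (assumed kernel).
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = normed @ normed.T
    selected, remaining = [], list(range(len(relevance)))
    for _ in range(min(k, len(remaining))):
        def mmr_score(i):
            # Penalize similarity to anything already selected.
            redundancy = max(sim[i][j] for j in selected) if selected else 0.0
            return lambda_ * relevance[i] - (1.0 - lambda_) * redundancy
        best = max(remaining, key=mmr_score)
        selected.append(best)
        remaining.remove(best)
    return selected  # selection order supplies the teacher signal

def cdm_inference_rank(rec_scores, student_scores, alpha=0.3):
    """Inference-time ranking as described in the abstract: a linear
    combination of the recommendation score and the student score, so the
    cheap student stands in for greedy diversification at serving time."""
    final = np.asarray(rec_scores) + alpha * np.asarray(student_scores)
    return np.argsort(-final)  # candidate indices, best first
```

In this reading, the MMR selection order (or a win probability derived from it) is only needed offline to train the student; at serving time only cdm_inference_rank runs, which is linear in the number of candidates and therefore usable in pre-ranking and ranking as well as re-ranking.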
Publisher
Cornell University Library, arXiv.org