Asset Details
Hierarchical Self-Supervised Learning for Knowledge-Aware Recommendation
by
Zhou, Sihang
, Huang, Jian
, Wang, Dong
, Zhou, Cong
in
Collaboration
/ Computational linguistics
/ Knowledge
/ knowledge graph
/ Language processing
/ Lopez, Jennifer
/ Natural language interfaces
/ Neighborhoods
/ Paradigms
/ recommendation
/ Recommender systems
/ self-supervised learning
/ Semantics
/ Sparsity
/ User behavior
2024
Journal Article
Overview
Knowledge-aware recommendation systems have shown superior performance by connecting the user-item interaction graph (UIG) with a knowledge graph (KG) and enriching the semantic connections captured by the corresponding networks. Among existing methods, self-supervised learning has attracted the most attention for its effectiveness in extracting node self-discrimination auxiliary supervision, which can substantially improve recommendation quality. However, existing methods usually adopt a single perspective (either node or edge) for representation learning, over-emphasizing the pair-wise topological structure of the graph while overlooking the important semantic information in neighborhood-wise connections, which limits recommendation performance. To address this problem, we propose Hierarchical self-supervised learning for Knowledge-aware Recommendation (HKRec). The hierarchical nature of the method appears in two respects. First, to better reveal the semantic relations of the knowledge graph, we design a Triple-Graph Masked Autoencoder (T-GMAE) that forces the network to estimate masked node features, node connections, and node degrees. Second, to better align user-item recommendation knowledge with common knowledge, we conduct contrastive learning in a hybrid way: both neighborhood-level and edge-level dropout are applied in parallel to enable more comprehensive information distillation. We conduct an in-depth experimental evaluation on three real-world datasets, comparing the proposed HKRec with state-of-the-art baseline models to demonstrate its effectiveness and superiority. Recall@20 and NDCG@20 improve by 2.2% to 24.95% and 3.38% to 22.32% on the Last-FM dataset, by 7.0% to 23.82% and 5.7% to 39.66% on the MIND dataset, and by 1.76% to 34.73% and 1.62% to 35.13% on the Alibaba-iFashion dataset, respectively.
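The hybrid contrastive augmentation described in the abstract can be loosely sketched as two parallel corruptions of the same graph: one view drops individual edges, the other drops the entire neighborhood of randomly selected nodes. The toy graph, function names, and dropout rates below are illustrative assumptions, not taken from the paper itself.

```python
import random

def edge_dropout(edges, rate, rng):
    """Edge-level view: drop each edge independently with probability `rate`."""
    return [e for e in edges if rng.random() >= rate]

def neighborhood_dropout(edges, nodes, rate, rng):
    """Neighborhood-level view: pick nodes with probability `rate` and
    remove every edge incident to a picked node."""
    dropped = {n for n in nodes if rng.random() < rate}
    return [(u, v) for (u, v) in edges if u not in dropped and v not in dropped]

# Toy undirected graph given as an edge list (hypothetical example data).
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4)]
nodes = range(5)

rng = random.Random(0)
view_a = edge_dropout(edges, 0.3, rng)                  # edge-level view
view_b = neighborhood_dropout(edges, nodes, 0.3, rng)   # neighborhood-level view
```

In a full pipeline, node embeddings computed on `view_a` and `view_b` would be pulled together for the same node and pushed apart for different nodes via a contrastive loss; the sketch only covers the view-generation step.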
Publisher
MDPI AG