Asset Details
Projected Hamming Dissimilarity for Bit-Level Importance Coding in Collaborative Filtering
by Lioma, Christina; Simonsen, Jakob Grue; Hansen, Casper; Hansen, Christian
in Collaboration / Filtration / Learning / Representations / Weighting
2021
Paper
Overview
When reasoning about tasks that involve large amounts of data, a common approach is to represent data items as objects in the Hamming space, where operations can be done efficiently and effectively. Object similarity can then be computed by learning binary representations (hash codes) of the objects and computing their Hamming distance. While this is highly efficient, each bit dimension is weighted equally, which means that potentially discriminative information in the data is lost. A more expressive alternative is to use real-valued vector representations and compute their inner product; this allows varying the weight of each dimension, but is many orders of magnitude slower. To address this, we derive a new way of measuring the dissimilarity between two objects in the Hamming space with binary weighting of each dimension (i.e., disabling bits): we consider a field-agnostic dissimilarity that projects the vector of one object onto the vector of the other. When working in the Hamming space, this results in a novel projected Hamming dissimilarity, which, by choice of projection, effectively allows a binary importance weighting of the hash code of one object through the hash code of the other. We propose a variational hashing model for learning hash codes optimized for this projected Hamming dissimilarity, and evaluate it in collaborative filtering experiments. The resulting hash codes yield effectiveness gains of up to +7% in NDCG and +14% in MRR compared to state-of-the-art hashing-based collaborative filtering baselines, while requiring no additional storage and no computational overhead compared to using the Hamming distance.
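
To make the bit-level weighting idea concrete, here is a minimal sketch in Python/NumPy. It assumes {0,1}-valued hash codes and treats one object's code as a mask that disables bit dimensions of the comparison; the function names and the exact masking rule are illustrative assumptions based on the abstract above, not the paper's own formulation.

import numpy as np


def hamming_distance(a: np.ndarray, b: np.ndarray) -> int:
    # Standard Hamming distance: every bit dimension counts equally.
    return int(np.sum(a != b))


def masked_hamming_dissimilarity(mask_code: np.ndarray, other_code: np.ndarray) -> int:
    # Illustrative bit-level importance weighting: dimensions where
    # mask_code is 0 are "disabled" and contribute nothing; mismatches
    # are counted only where mask_code is 1. This is an assumed reading
    # of the abstract, not the paper's exact projected Hamming formula.
    return int(np.sum((mask_code == 1) & (mask_code != other_code)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m = 16                                   # hypothetical code length
    user_code = rng.integers(0, 2, size=m)   # e.g. a user's hash code
    item_code = rng.integers(0, 2, size=m)   # e.g. an item's hash code

    print("Hamming distance:    ", hamming_distance(user_code, item_code))
    print("Masked dissimilarity:", masked_hamming_dissimilarity(user_code, item_code))

Because such a masked count reduces to the same bitwise operations (XOR/AND plus popcount) used for the plain Hamming distance, a weighting of this kind is consistent with the abstract's claim of no computational overhead compared to using the Hamming distance.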
Publisher
Cornell University Library, arXiv.org