Stats: Members: 3645 | Articles: 2'501'711 | Articles rated: 2609
19 April 2024
Article overview
Title: Simplify and Robustify Negative Sampling for Implicit Collaborative Filtering
Authors: Jingtao Ding; Yuhan Quan; Quanming Yao; Yong Li; Depeng Jin
Date: 7 Sep 2020
Abstract: Negative sampling approaches are prevalent in implicit collaborative filtering for obtaining negative labels from massive unlabeled data. Efficiency and effectiveness, the two major concerns in negative sampling, are still not fully achieved by recent works, which use complicated structures and overlook the risk of false negative instances. In this paper, we first provide a novel understanding of negative instances by empirically observing that only a few instances are potentially important for model learning, and that false negatives tend to have stable predictions over many training iterations. These findings motivate us to simplify the model by sampling from a designed memory that stores only a few important candidates and, more importantly, to tackle the previously untouched false negative problem by favouring high-variance samples stored in the memory, which achieves efficient sampling of true negatives with high quality. Empirical results on two synthetic datasets and three real-world datasets demonstrate both the robustness and the superiority of our negative sampling method.
Source: arXiv, 2009.03376
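The abstract's key idea can be sketched in code: keep a small memory of candidate negatives, track each candidate's predicted score over recent training iterations, and prefer candidates whose scores are both high (hard negatives) and high-variance (false negatives tend to score stably high). The function name, the linear mean-plus-variance ranking, and the `alpha` weight below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def srns_style_sample(scores_history, memory_ids, alpha=1.0):
    """Pick one negative item from a small candidate memory.

    scores_history : (M, T) array of predicted scores for each of the
        M memory candidates over the last T training iterations.
    memory_ids : (M,) array of item ids held in the memory.
    alpha : illustrative weight trading hardness against variance.

    High mean score = hard negative; low variance over iterations is,
    per the paper's observation, a signal of a likely false negative,
    so we rank by mean + alpha * std to favour high-variance samples.
    """
    mean = scores_history.mean(axis=1)   # hardness of each candidate
    std = scores_history.std(axis=1)     # stability signal (false-negative filter)
    ranking = mean + alpha * std         # favour hard AND high-variance candidates
    return memory_ids[int(np.argmax(ranking))]
```

With `alpha = 0` this degenerates to plain hard-negative mining (highest mean score wins); increasing `alpha` shifts preference toward candidates whose scores fluctuate, screening out the stably high-scoring items that are likely false negatives.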
No review found.