Article overview
Rethinking Importance Weighting for Deep Learning under Distribution Shift
Authors: Tongtong Fang; Nan Lu; Gang Niu; Masashi Sugiyama
Date: 8 Jun 2020
Abstract: Under distribution shift (DS), where the training data distribution differs
from the test one, a powerful technique is importance weighting (IW) which
handles DS in two separate steps: weight estimation (WE) estimates the
test-over-training density ratio and weighted classification (WC) trains the
classifier from weighted training data. However, IW cannot work well on complex
data, since WE is incompatible with deep learning. In this paper, we rethink IW
and theoretically show it suffers from a circular dependency: we need not only
WE for WC, but also WC for WE, where a trained deep classifier is used as the
feature extractor (FE). To cut off the dependency, we try to pretrain FE from
unweighted training data, which leads to biased FE. To overcome the bias, we
propose an end-to-end solution, dynamic IW, that iterates between WE and WC and
combines them in a seamless manner, and hence our WE can also enjoy deep
networks and stochastic optimizers indirectly. Experiments with two
representative DSs on Fashion-MNIST and CIFAR-10/100 demonstrate that dynamic
IW compares favorably with state-of-the-art methods.
Source: arXiv, 2006.04662
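
The abstract names two steps: weight estimation (WE), which estimates the test-over-training density ratio, and weighted classification (WC), which trains on importance-weighted data. The sketch below is a minimal, generic illustration of those two steps, not the paper's dynamic IW algorithm. It assumes PyTorch; the names `feat_tr` and `feat_te` stand for features of training and test inputs produced by some feature extractor, and the density-ratio trick via a logistic discriminator is one common WE choice among several.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def estimate_density_ratio(feat_tr, feat_te, epochs=200, lr=0.05):
    """WE step (one common variant): train a linear discriminator to
    separate test features (label 1) from training features (label 0).
    Its odds, rescaled by the sample-size ratio, approximate the
    test-over-training density ratio on the training points."""
    disc = nn.Linear(feat_tr.shape[1], 1)
    opt = torch.optim.SGD(disc.parameters(), lr=lr)
    z = torch.cat([feat_tr, feat_te])
    y = torch.cat([torch.zeros(len(feat_tr)), torch.ones(len(feat_te))])
    for _ in range(epochs):
        opt.zero_grad()
        F.binary_cross_entropy_with_logits(disc(z).squeeze(1), y).backward()
        opt.step()
    with torch.no_grad():
        p = torch.sigmoid(disc(feat_tr).squeeze(1))
        p = p.clamp(1e-3, 1 - 1e-3)  # avoid division blow-up near p = 1
        return (len(feat_tr) / len(feat_te)) * p / (1.0 - p)

def weighted_classification_loss(logits, labels, w):
    """WC step: per-example cross-entropy scaled by the estimated
    density-ratio weights, averaged over the training batch."""
    per_example = F.cross_entropy(logits, labels, reduction="none")
    return (w * per_example).mean()
```

Per the abstract, dynamic IW closes the circular dependency by alternating these steps end to end: features from the classifier being trained feed WE, and the resulting weights feed WC, rather than relying on a feature extractor pretrained on unweighted (and hence biased) data.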