Article overview
Title: Unifying Approaches in Data Subset Selection via Fisher Information and Information-Theoretic Quantities

Authors: Andreas Kirsch; Yarin Gal

Date: 1 Aug 2022

Abstract: The mutual information between predictions and model parameters -- also referred to as expected information gain or BALD in machine learning -- measures informativeness. It is a popular acquisition function in Bayesian active learning and Bayesian optimal experiment design. In data subset selection, i.e. active learning and active sampling, several recent works use Fisher information, Hessians, similarity matrices based on the gradients, or simply the gradient lengths to compute the acquisition scores that guide sample selection. Are these different approaches connected, and if so, how? In this paper, we revisit the Fisher information and use it to show how several otherwise disparate methods are connected as approximations of information-theoretic quantities.

Source: arXiv, 2208.00549