Stats: Members: 3645 | Articles: 2'504'585 | Articles rated: 2609 | 24 April 2024
Article overview
Variational Gaussian Dropout is not Bayesian
Authors: Jiri Hron; Alexander G. de G. Matthews; Zoubin Ghahramani
Date: 8 Nov 2017
Abstract: Gaussian multiplicative noise is commonly used as a stochastic regularisation
technique in training of deterministic neural networks. A recent paper
reinterpreted the technique as a specific algorithm for approximate inference
in Bayesian neural networks; several extensions ensued. We show that the
log-uniform prior used in all the above publications does not generally induce
a proper posterior, and thus Bayesian inference in such models is ill-posed.
Independent of the log-uniform prior, the correlated weight noise approximation
has further issues leading to either infinite objective or high risk of
overfitting. The above implies that the reported sparsity of obtained solutions
cannot be explained by Bayesian or the related minimum description length
arguments. We thus study the objective from a non-Bayesian perspective, provide
its previously unknown analytical form which allows exact gradient evaluation,
and show that the later proposed additive reparametrisation introduces minima
not present in the original multiplicative parametrisation. Implications and
future research directions are discussed.
Source: arXiv, 1711.02989
Services: Forum | Review | PDF | Favorites
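The Gaussian multiplicative noise the abstract refers to can be sketched as follows. This is a minimal illustration, not the paper's code; the variance parameter `alpha` (conventionally set to p/(1-p), matching Bernoulli dropout with rate p) is an assumption of this sketch:

```python
import numpy as np

def gaussian_dropout(h, alpha, rng):
    """Multiply activations h elementwise by noise drawn from N(1, alpha).

    Because the noise has mean 1, the expected value of the noisy
    activations equals the original activations (an unbiased perturbation).
    """
    noise = rng.normal(loc=1.0, scale=np.sqrt(alpha), size=h.shape)
    return h * noise

# Illustration: averaged over many samples, the noise leaves
# the activations unchanged in expectation.
rng = np.random.default_rng(0)
h = np.ones(100_000)
noisy = gaussian_dropout(h, alpha=0.25, rng=rng)
print(noisy.mean())  # close to 1.0
```

At test time the noise is simply dropped, since the training-time perturbation is already mean-preserving.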
No review found.
Did you like this article?
Note: replies to reviews, and questions about the article, must be posted in the forum section.
Authors may not review their own articles; they can use the forum section instead.