Stats
Members: 3645 | Articles: 2'504'928 | Articles rated: 2609
25 April 2024
Article overview
Dropout as a Bayesian Approximation: Appendix
Authors: Yarin Gal; Zoubin Ghahramani
Date: 6 Jun 2015
Abstract: We show that a multilayer perceptron (MLP) with arbitrary depth and
nonlinearities, with dropout applied after every weight layer, is
mathematically equivalent to an approximation to a well known Bayesian model.
This interpretation offers an explanation to some of dropout's key properties,
such as its robustness to over-fitting. Our interpretation allows us to reason
about uncertainty in deep learning, and allows the introduction of the Bayesian
machinery into existing deep learning frameworks in a principled way.
This document is an appendix for the main paper "Dropout as a Bayesian
Approximation: Representing Model Uncertainty in Deep Learning" by Gal and
Ghahramani, 2015.
Source: arXiv, 1506.02157
Services: Forum | Review | PDF | Favorites
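The abstract's point about reasoning over uncertainty can be illustrated with a Monte Carlo dropout sketch: keep dropout active at prediction time and treat repeated stochastic forward passes as samples from the approximate posterior predictive. This is a minimal, untrained toy network of our own construction (the names `W1`, `W2`, and `forward` are illustrative assumptions, not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-hidden-layer MLP with random (untrained) weights -- this only
# illustrates the Monte Carlo dropout procedure, not a fitted model.
W1 = rng.normal(size=(1, 64))
W2 = rng.normal(size=(64, 1))

def forward(x, p_drop=0.5):
    # Dropout stays ON at prediction time: each call samples a fresh
    # Bernoulli mask, i.e. one draw from the approximate posterior.
    h = np.maximum(x @ W1, 0.0)            # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop    # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)          # inverted-dropout scaling
    return h @ W2

x = np.array([[0.5]])
samples = np.stack([forward(x) for _ in range(200)])  # T stochastic passes

mean = samples.mean(axis=0)  # predictive mean
std = samples.std(axis=0)    # spread across passes = uncertainty estimate
```

Averaging the passes recovers the usual dropout prediction, while the spread across passes gives the uncertainty estimate the abstract refers to.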
No review found.
Did you like this article?
Note: answers to reviews or questions about the article must be posted in the forum section.
Authors are not allowed to review their own article. They can use the forum section.
News, job offers and information for researchers and scientists