Article overview
Title: Overcoming Challenges in Fixed Point Training of Deep Convolutional Networks
Authors: Darryl D. Lin; Sachin S. Talathi
Date: 8 Jul 2016
Source: arXiv, 1607.02241

Abstract: It is known that training deep neural networks, in particular deep convolutional networks, with aggressively reduced numerical precision is challenging. The stochastic gradient descent algorithm becomes unstable in the presence of noisy gradient updates resulting from arithmetic with limited numeric precision. One well-accepted solution that facilitates the training of low-precision fixed-point networks is stochastic rounding. However, to the best of our knowledge, the source of the instability in training neural networks with noisy gradient updates has not been well investigated. This work is an attempt to draw a theoretical connection between low numerical precision and training algorithm stability. In doing so, we also propose, and verify through experiments, methods that improve the training performance of deep convolutional networks in fixed point.