Article overview
Title: On the Sparsity of Neural Machine Translation Models
Authors: Yong Wang; Longyue Wang; Victor O.K. Li; Zhaopeng Tu
Date: 6 Oct 2020
Source: arXiv, 2010.02646

Abstract: Modern neural machine translation (NMT) models employ a large number of parameters, which leads to serious over-parameterization and typically causes the underutilization of computational resources. In response to this problem, we empirically investigate whether the redundant parameters can be reused to achieve better performance. Experiments and analyses are systematically conducted on different datasets and NMT architectures. We show that: 1) the pruned parameters can be rejuvenated to improve the baseline model by up to +0.8 BLEU points; 2) the rejuvenated parameters are reallocated to enhance the ability of modeling low-level lexical information.