Article overview
Title: Go From the General to the Particular: Multi-Domain Translation with Domain Transformation Networks
Authors: Yong Wang; Longyue Wang; Shuming Shi; Victor O.K. Li; Zhaopeng Tu
Date: 22 Nov 2019
Abstract: The key challenge of multi-domain translation lies in simultaneously encoding
both the general knowledge shared across domains and the particular knowledge
distinctive to each domain in a unified model. Previous work shows that the
standard neural machine translation (NMT) model, trained on mixed-domain data,
generally captures the general knowledge, but misses the domain-specific
knowledge. In response to this problem, we augment the NMT model with additional
domain transformation networks to transform the general representations to
domain-specific representations, which are subsequently fed to the NMT decoder.
To guarantee the knowledge transformation, we also propose two complementary
supervision signals by leveraging the power of knowledge distillation and
adversarial learning. Experimental results on several language pairs, covering
both balanced and unbalanced multi-domain translation, demonstrate the
effectiveness and universality of the proposed approach. Encouragingly, the
proposed unified model achieves results comparable to the fine-tuning
approach that requires multiple models to preserve the particular knowledge.
Further analyses reveal that the domain transformation networks successfully
capture the domain-specific knowledge as expected.
Source: arXiv, 1911.09912
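The abstract outlines the architecture at a high level: a shared encoder produces general representations, domain transformation networks map them to domain-specific representations before they reach the decoder, and two auxiliary signals (knowledge distillation and adversarial learning) supervise the transformation. The sketch below shows one way such components might be wired up in PyTorch; the module structure, residual connection, pooling, and loss formulations are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of a domain transformation network (DTN) and the two
# auxiliary supervision signals described in the abstract. All names, sizes
# and loss details are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DomainTransformationNetwork(nn.Module):
    """Maps shared (general) encoder states to domain-specific states."""

    def __init__(self, hidden_size: int, num_domains: int):
        super().__init__()
        # One lightweight feed-forward transformation per domain
        # (an assumption; the paper may share parameters differently).
        self.transforms = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_size, hidden_size),
                nn.ReLU(),
                nn.Linear(hidden_size, hidden_size),
            )
            for _ in range(num_domains)
        )

    def forward(self, encoder_states: torch.Tensor, domain_id: int) -> torch.Tensor:
        # encoder_states: (batch, src_len, hidden_size) from the shared NMT encoder.
        # A residual connection keeps the general knowledge available to the decoder.
        return encoder_states + self.transforms[domain_id](encoder_states)


class DomainDiscriminator(nn.Module):
    """Domain classifier used for the adversarial signal."""

    def __init__(self, hidden_size: int, num_domains: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, num_domains)

    def forward(self, states: torch.Tensor) -> torch.Tensor:
        # Mean-pool over source positions, then predict the domain.
        return self.classifier(states.mean(dim=1))


def auxiliary_losses(transformed, teacher_states, discriminator, domain_id):
    """Two complementary supervision signals on the transformed representations."""
    # 1) Knowledge distillation: pull the transformed states towards the
    #    representations of a teacher fine-tuned on this domain (assumed MSE).
    kd_loss = F.mse_loss(transformed, teacher_states.detach())

    # 2) Adversarial learning: a domain classifier over the transformed states.
    #    How its gradient interacts with the DTN (standard vs. reversed update)
    #    follows the paper and is only sketched here as a plain classification loss.
    logits = discriminator(transformed)
    target = torch.full(
        (transformed.size(0),), domain_id, dtype=torch.long, device=transformed.device
    )
    adv_loss = F.cross_entropy(logits, target)

    return kd_loss, adv_loss
```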