
Article overview

Attention Link: An Efficient Attention-Based Low Resource Machine Translation Architecture
Zeping Min
Date: 1 Feb 2023

Abstract: Transformers have achieved great success in machine translation, but transformer-based NMT models often require a bilingual parallel corpus with millions of sentence pairs for training. In this paper, we propose a novel architecture, named attention link (AL), to improve the performance of transformer models, especially with limited training resources. We theoretically demonstrate the superiority of the attention link architecture in low-resource settings. In addition, we have conducted extensive experiments, covering the en-de, de-en, en-fr, en-it, it-en, and en-ro translation tasks on the IWSLT14 dataset, as well as real low-resource scenarios on the bn-gu and gu-ta translation tasks from the CVIT PIB dataset. All experimental results show that our attention link is powerful and leads to significant improvements. Furthermore, we achieve a BLEU score of 37.9 on the IWSLT14 de-en task, a new state of the art, by combining our attention link with other advanced methods.

Source: arXiv, 2302.00340
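The abstract reports results as BLEU scores. For readers unfamiliar with the metric, the sketch below computes standard corpus-level BLEU (Papineni et al., 2002) from scratch; it is a minimal stdlib illustration of the metric, not the authors' evaluation script, and the token-level whitespace splitting is an assumption.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # Multiset of all n-grams in a token list.
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def corpus_bleu(hypotheses, references, max_n=4):
    """Corpus-level BLEU: geometric mean of modified n-gram precisions
    (n = 1..max_n) multiplied by a brevity penalty. Returns a 0-100 score.
    Assumes whitespace-tokenized strings and one reference per hypothesis."""
    matches = [0] * max_n   # clipped n-gram matches, per order
    totals = [0] * max_n    # n-gram counts in the hypotheses, per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        h, r = hyp.split(), ref.split()
        hyp_len += len(h)
        ref_len += len(r)
        for n in range(1, max_n + 1):
            h_ngrams, r_ngrams = ngrams(h, n), ngrams(r, n)
            # Clip each hypothesis n-gram count by its count in the reference.
            matches[n - 1] += sum(min(c, r_ngrams[g]) for g, c in h_ngrams.items())
            totals[n - 1] += max(len(h) - n + 1, 0)
    if min(matches) == 0:
        return 0.0  # any zero precision makes the geometric mean zero
    log_prec = sum(math.log(m / t) for m, t in zip(matches, totals)) / max_n
    # Brevity penalty: penalize hypotheses shorter than the references.
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / max(hyp_len, 1))
    return 100 * bp * math.exp(log_prec)
```

A perfect match scores 100, e.g. `corpus_bleu(["the cat sat on the mat"], ["the cat sat on the mat"])` returns 100.0; the 37.9 de-en result cited above would correspond to roughly this quality scale on the IWSLT14 test set.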
Copyright © 2005-2024 - Scimetrica