Article overview
Fair Comparison between Efficient Attentions
Authors: Jiuk Hong; Chaehyeon Lee; Soyoun Bang; Heechul Jung
Date: 1 Jun 2022
Abstract: Transformers have been successfully used in various fields and are becoming the standard tools in computer vision. However, self-attention, a core component of transformers, has quadratic complexity, which limits the use of transformers in vision tasks that require dense prediction. Many studies aiming to solve this problem have been proposed. However, no comparative study of these methods at the same scale has been reported, owing to differing model configurations, training schemes, and newly introduced methods. In our paper, we validate these efficient attention models on the ImageNet1K classification task by changing only the attention operation and examining which efficient attention is better.
Source: arXiv, 2206.00244
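As background for the quadratic-complexity claim in the abstract, the following is a minimal sketch of standard scaled dot-product self-attention (not code from the paper); the shapes, names, and example sizes are illustrative assumptions. The (n, n) score matrix is what makes cost grow quadratically with the number of tokens.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over n tokens.

    x:            (n, d) token embeddings
    w_q/w_k/w_v:  (d, d) projection matrices
    The (n, n) score matrix makes time and memory scale as O(n^2).
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v              # each (n, d)
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (n, n) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ v                               # (n, d)

# Illustrative sizes only: for dense prediction, n is the number of image
# patches, so doubling the input resolution roughly quadruples n and
# multiplies the attention cost by ~16.
n, d = 196, 64                                       # e.g. 14x14 patches
rng = np.random.default_rng(0)
x = rng.standard_normal((n, d))
w_q, w_k, w_v = (rng.standard_normal((d, d)) * d ** -0.5 for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)                                     # (196, 64)
```

Efficient attentions of the kind compared in the paper replace the explicit (n, n) score matrix with approximations whose cost grows more slowly in n, which is why a like-for-like comparison that changes only this operation is informative.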