Article overview
Transformer with Fourier Integral Attentions

Authors: Tan Nguyen; Minh Pham; Tam Nguyen; Khai Nguyen; Stanley J. Osher; Nhat Ho
Date: 1 Jun 2022

Abstract: Multi-head attention empowers the recent success of transformers, the
state-of-the-art models that have achieved remarkable results in sequence
modeling and beyond. These attention mechanisms compute the pairwise dot
products between the queries and keys, which result from the use of
unnormalized Gaussian kernels under the assumption that the queries follow a
mixture of Gaussian distributions. There is no guarantee that this assumption is
valid in practice. In response, we first interpret attention in transformers as
a nonparametric kernel regression. We then propose the FourierFormer, a new
class of transformers in which the dot-product kernels are replaced by the
novel generalized Fourier integral kernels. Unlike the dot-product kernels,
which require choosing a good covariance matrix to capture the dependency
between data features, the generalized Fourier integral kernels capture such
dependency automatically and remove the need to tune the covariance matrix. We
theoretically prove that our proposed Fourier integral
kernels can efficiently approximate any key and query distributions. Compared
to the conventional transformers with dot-product attention, FourierFormers
attain better accuracy and reduce the redundancy between attention heads. We
empirically corroborate the advantages of FourierFormers over the baseline
transformers in a variety of practical applications including language modeling
and image classification.

Source: arXiv, 2206.00206
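
The abstract's reading of attention as nonparametric kernel regression can be made concrete with a short sketch. The NumPy snippet below computes each attention output as a kernel-weighted average of the value vectors, once with the standard exponential dot-product kernel and once with a sinc-product kernel in the spirit of the Fourier integral theorem. All names (kernel_regression_attention, sinc_product_kernel) and the sinc-product form are illustrative assumptions; the exact generalized Fourier integral kernel used by FourierFormer is not specified in the abstract.

import numpy as np

def kernel_regression_attention(Q, K, V, kernel):
    # Nadaraya-Watson style attention: each output row is a
    # kernel-weighted average of the value vectors.
    W = np.array([[kernel(q, k) for k in K] for q in Q])  # (n_q, n_k)
    W = W / (W.sum(axis=1, keepdims=True) + 1e-9)         # normalize per query
    return W @ V

def dot_product_kernel(q, k):
    # Unnormalized Gaussian / exponential kernel behind standard
    # softmax attention: exp(q . k / sqrt(d)).
    return np.exp(q @ k / np.sqrt(q.shape[-1]))

def sinc_product_kernel(q, k, R=2.0):
    # Hypothetical Fourier-integral-style kernel: a product of
    # sin(R * (q_j - k_j)) / (q_j - k_j) terms over feature dimensions.
    # The paper's generalized kernel is not given in the abstract; this
    # form is only an illustration (and can take negative values).
    diff = q - k
    safe = np.where(np.abs(diff) < 1e-8, 1.0, diff)        # avoid division by zero
    terms = np.where(np.abs(diff) < 1e-8, R, np.sin(R * safe) / safe)
    return np.prod(terms)

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(4, 8)), rng.normal(size=(6, 8)), rng.normal(size=(6, 8))
print(kernel_regression_attention(Q, K, V, dot_product_kernel).shape)   # (4, 8)
print(kernel_regression_attention(Q, K, V, sinc_product_kernel).shape)  # (4, 8)

Note that the sinc-product kernel needs no tuned covariance matrix: its width is set by the single scalar R, which is one way to read the abstract's claim that the Fourier integral kernels capture feature dependencies automatically.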