Article overview
Title: 1DFormer: Learning 1D Landmark Representations via Transformer for Facial Landmark Tracking
Authors: Shi Yin; Shijie Huan; Defu Lian; Shangfei Wang; Jinshui Hu; Tao Guo; Bing Yin; Baocai Yin; Cong Liu
Date: 1 Nov 2023
Abstract: Recently, heatmap regression methods based on 1D landmark representations have shown prominent performance in locating facial landmarks. However, previous methods have not deeply explored the potential of 1D landmark representations for sequential and structural modeling of multiple landmarks in facial landmark tracking. To address this limitation, we propose a Transformer architecture, namely 1DFormer, which learns informative 1D landmark representations by capturing the dynamic and geometric patterns of landmarks through token communications in both the temporal and spatial dimensions. For temporal modeling, we propose a recurrent token mixing mechanism, an axis-landmark-positional embedding mechanism, and a confidence-enhanced multi-head attention mechanism to adaptively and robustly embed long-term landmark dynamics into the 1D representations. For structural modeling, we design intra-group and inter-group structure modeling mechanisms that encode component-level and global-level facial structure patterns as a refinement of the 1D landmark representations, through token communications in the spatial dimension via 1D convolutional layers. Experimental results on the 300VW and TF databases show that 1DFormer successfully models long-range sequential patterns as well as the inherent facial structure to learn informative 1D representations of landmark sequences, and achieves state-of-the-art performance on facial landmark tracking.
Source: arXiv, 2311.00241
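The abstract names two ingredients that are concrete enough to sketch: a confidence-weighted temporal attention over each landmark's token sequence, and 1D convolutions along the landmark (spatial) axis that refine the representations with facial-structure context. The snippet below is not the authors' implementation; it is a minimal PyTorch illustration of a plausible reading of those two ideas, and every class, parameter, and shape convention in it (ConfidenceWeightedAttention, StructureRefinement, token dimension 64, 68 landmarks) is an assumption made only for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConfidenceWeightedAttention(nn.Module):
    """Hypothetical stand-in for the paper's confidence-enhanced multi-head
    attention: per-frame confidence scores bias the attention logits so that
    low-confidence frames contribute less to temporal token mixing."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, tokens: torch.Tensor, confidence: torch.Tensor) -> torch.Tensor:
        # tokens:     (batch, seq_len, dim)  one landmark's 1D tokens over time
        # confidence: (batch, seq_len)       detection confidence in (0, 1]
        b, s, _ = tokens.shape
        # Additive bias on the key axis: log(c) becomes strongly negative as
        # c -> 0, so unreliable frames are softly masked out of the attention.
        bias = torch.log(confidence.clamp_min(1e-6))               # (b, s)
        mask = bias[:, None, :].expand(b, s, s)                    # (b, query, key)
        mask = mask.repeat_interleave(self.attn.num_heads, dim=0)  # (b*heads, q, k)
        out, _ = self.attn(tokens, tokens, tokens, attn_mask=mask)
        return out


class StructureRefinement(nn.Module):
    """Hypothetical stand-in for intra-/inter-group structure modeling: a 1D
    convolution mixes tokens along the landmark axis, refining each landmark's
    representation with information from neighbouring landmarks."""

    def __init__(self, dim: int, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)

    def forward(self, landmark_tokens: torch.Tensor) -> torch.Tensor:
        # landmark_tokens: (batch, num_landmarks, dim)
        x = landmark_tokens.transpose(1, 2)          # (batch, dim, num_landmarks)
        x = F.relu(self.conv(x))
        return x.transpose(1, 2) + landmark_tokens   # residual refinement


if __name__ == "__main__":
    temporal = ConfidenceWeightedAttention(dim=64)
    spatial = StructureRefinement(dim=64)
    frame_tokens = torch.randn(2, 10, 64)        # one landmark over 10 frames
    conf = torch.rand(2, 10).clamp_min(0.1)
    print(temporal(frame_tokens, conf).shape)    # torch.Size([2, 10, 64])
    landmark_tokens = torch.randn(2, 68, 64)     # 68 landmark tokens, one frame
    print(spatial(landmark_tokens).shape)        # torch.Size([2, 68, 64])
```

The log-confidence bias simply drives attention logits for low-confidence frames toward large negative values, which is one possible interpretation of "confidence-enhanced" attention; the paper's actual mechanism, grouping scheme, and recurrent token mixing may differ.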