Stats | Members: 3643 | Articles: 2'488'730 | Articles rated: 2609 | 29 March 2024
Article overview
Action Quality Assessment with Temporal Parsing Transformer
Authors: Yang Bai; Desen Zhou; Songyang Zhang; Jian Wang; Errui Ding; Yu Guan; Yang Long; Jingdong Wang
Date: 19 Jul 2022

Abstract: Action Quality Assessment (AQA) is important for action understanding, and
resolving the task poses unique challenges due to subtle visual differences. Existing state-of-the-art methods typically rely on holistic video representations for score regression or ranking, which limits their ability to capture fine-grained intra-class variation. To overcome this limitation, we propose a temporal parsing transformer that decomposes the holistic feature into temporal part-level representations. Specifically, we use a set of learnable queries to represent the atomic temporal patterns of a specific action. Our decoding process converts the frame representations into a fixed number of temporally ordered part representations. To obtain the quality score, we apply state-of-the-art contrastive regression to the part representations. Since existing AQA datasets provide no temporal part-level labels or partitions, we propose two novel loss functions on the cross-attention responses of the decoder: a ranking loss that ensures the learnable queries satisfy the temporal order in cross attention, and a sparsity loss that encourages the part representations to be more discriminative. Extensive experiments show that our proposed method outperforms prior work on three public AQA benchmarks by a considerable margin.

Source: arXiv, 2207.09270
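The two decoder-side losses described in the abstract can be illustrated with a toy, framework-free sketch. This is a hedged reading, not the authors' implementation: the hinge/margin form of the ranking loss and the entropy surrogate for sparsity are assumptions. Here `attn[q][t]` holds the cross-attention weight of learnable query `q` on frame `t`, with each row assumed to sum to 1.

```python
import math

def temporal_centers(attn):
    """Attention-weighted frame index (temporal center of mass) per query."""
    return [sum(t * w for t, w in enumerate(row)) for row in attn]

def ranking_loss(attn, margin=1.0):
    """Hinge penalty when an earlier query's temporal center does not precede
    the next query's center by at least `margin` frames (illustrative form of
    the temporal-order ranking loss on cross-attention responses)."""
    centers = temporal_centers(attn)
    return sum(
        max(0.0, margin - (centers[i + 1] - centers[i]))
        for i in range(len(centers) - 1)
    )

def sparsity_loss(attn):
    """Mean entropy of the attention rows; lower entropy means each query
    attends to a narrower, more discriminative temporal part (one plausible
    sparsity surrogate)."""
    eps = 1e-9
    return sum(
        -sum(w * math.log(w + eps) for w in row) for row in attn
    ) / len(attn)
```

With ordered, peaked attention rows the ranking penalty is zero; reversing the query order raises the ranking term, and flattening the rows toward uniform raises the sparsity term.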
No review found.
Did you like this article?
Note: answers to reviews, and questions about the article, must be posted in the forum section.
Authors are not allowed to review their own article; they may use the forum section instead.
News, job offers and information for researchers and scientists.