Article overview
ForgeryNet -- Face Forgery Analysis Challenge 2021: Methods and Results

Authors: Yinan He; Lu Sheng; Jing Shao; Ziwei Liu; Zhaofan Zou; Zhizhi Guo; Shan Jiang; Curitis Sun; Guosheng Zhang; Keyao Wang; Haixiao Yue; Zhibin Hong; Wanguo Wang; Zhenyu Li; Qi Wang; Zhenli Wang; Ronghao Xu; Mingwen Zhang; Zhiheng Wang; Zhenhang Huang; Tianming Zhang; Ningning Zhao

Date: 15 Dec 2021

Abstract: The rapid progress of photorealistic synthesis techniques has reached a critical point where the boundary between real and manipulated images starts to blur. Recently, a mega-scale deep face forgery dataset, ForgeryNet, comprising 2.9 million images and 221,247 videos, has been released. It is by far the largest publicly available dataset in terms of data scale, manipulations (7 image-level approaches, 8 video-level approaches), perturbations (36 independent and more mixed perturbations), and annotations (6.3 million classification labels, 2.9 million manipulated-area annotations, and 221,247 temporal forgery segment labels). This paper reports the methods and results of the ForgeryNet - Face Forgery Analysis Challenge 2021, which employs the ForgeryNet benchmark. Model evaluation is conducted offline on the private test set. A total of 186 participants registered for the competition, and 11 teams made valid submissions. We analyze the top-ranked solutions and present some discussion on future work directions.

Source: arXiv, 2112.08325