Deep learning for deepfakes creation and detection: A survey
- Authors
- Thanh Thi Nguyen; Quoc Viet Hung Nguyen; Dung Tien Nguyen; Duc Thanh Nguyen; Thien Huynh-The; Saeid Nahavandi; Thanh Tam Nguyen; Quoc-Viet Pham; Cuong M. Nguyen
- Issue Date
- Oct-2022
- Publisher
- ACADEMIC PRESS INC ELSEVIER SCIENCE
- Keywords
- Deepfakes; Face manipulation; Artificial intelligence; Deep learning; Autoencoders; GAN; Forensics; Survey
- Citation
- COMPUTER VISION AND IMAGE UNDERSTANDING, v.223
- Journal Title
- COMPUTER VISION AND IMAGE UNDERSTANDING
- Volume
- 223
- URI
- https://scholarworks.bwise.kr/kumoh/handle/2020.sw.kumoh/28407
- DOI
- 10.1016/j.cviu.2022.103525
- ISSN
- 1077-3142
1090-235X
- Abstract
- Deep learning has been successfully applied to solve various complex problems, ranging from big data analytics to computer vision and human-level control. Advances in deep learning, however, have also been employed to create software that can threaten privacy, democracy and national security. One such recently emerged deep learning-powered application is deepfake. Deepfake algorithms can create fake images and videos that humans cannot distinguish from authentic ones. Technologies that can automatically detect and assess the integrity of digital visual media are therefore indispensable. This paper presents a survey of algorithms used to create deepfakes and, more importantly, of the methods proposed in the literature to date to detect them. We present extensive discussions of challenges, research trends and directions related to deepfake technologies. By reviewing the background of deepfakes and state-of-the-art deepfake detection methods, this study provides a comprehensive overview of deepfake techniques and facilitates the development of new and more robust methods to deal with increasingly challenging deepfakes.
- Appears in Collections
- ETC > 1. Journal Articles