Boosting deepfake detection features with attention units


Bibliographic Details
Main Authors: Waseem, Saima, Abu-Bakar, Syed A. R., Omar, Zaid, Ahmed, Bilal Ashfaq, Baloch, Saba
Format: Conference or Workshop Item
Published: 2023
Subjects:
Online Access:http://eprints.utm.my/107873/
http://dx.doi.org/10.1145/3631991.3632037
Description
Summary: One of the emerging problems of deep learning technology is deepfake videos. With easy access to powerful and inexpensive computing power, the internet is littered with fake material such as fake photos, videos, and audio. People’s identities, privacy, and reputations are at risk due to the widespread proliferation of fake media content. Since videos can have a potentially destructive effect on society, establishing their legitimacy is crucial. Thus, in this paper we investigate different attention mechanisms for deepfake detection. In videos, attention mechanisms direct a Convolutional Neural Network’s (CNN’s) emphasis to the most critical parts of a frame in terms of both content and context. We therefore answer two questions: how do you apply attention to deepfake detection, and what form of attention is effective for deepfake detection tasks? To address these questions, we conduct experiments on videos manipulated using four different methods drawn from the FaceForensics++ dataset. We perform a cross-data evaluation of the network with and without attention to assess its capacity to detect previously unseen manipulated images. The proposed approach outperformed conventional Convolutional Neural Networks for deepfake detection by 8% in AUC.
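The abstract does not specify which attention design was used, but the idea of directing a CNN's emphasis to the most critical parts of a frame can be sketched minimally. The following NumPy example shows a spatial-attention gate in the spirit of modules such as CBAM: a mask in (0, 1) is derived from channel-wise pooling and used to re-weight the feature map. The function name and the way the two pooled maps are combined (a simple sum instead of a learned convolution) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def spatial_attention(features: np.ndarray) -> np.ndarray:
    """Re-weight a CNN feature map of shape (C, H, W) with a spatial mask.

    The mask comes from channel-wise average and max pooling, the simplest
    form of the spatial gate used in attention modules such as CBAM.
    """
    avg_pool = features.mean(axis=0, keepdims=True)   # (1, H, W)
    max_pool = features.max(axis=0, keepdims=True)    # (1, H, W)
    # A real module would combine the pooled maps with a learned convolution;
    # summing them is an illustrative stand-in (assumption, not the paper's design).
    logits = avg_pool + max_pool
    mask = 1.0 / (1.0 + np.exp(-logits))              # sigmoid -> values in (0, 1)
    return features * mask                            # broadcast mask over channels

# Usage: a random 64-channel 7x7 feature map, as a CNN backbone might emit
feat = np.random.randn(64, 7, 7)
attended = spatial_attention(feat)
print(attended.shape)  # (64, 7, 7)
```

Because the sigmoid mask lies strictly between 0 and 1, every activation is scaled down in proportion to how "unimportant" its spatial location is judged; locations with strong pooled responses are suppressed the least.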