Frontal View Gait Recognition With Fusion of Depth Features From a Time of Flight Camera

Frontal view gait recognition for people identification has been carried out using single RGB, stereo RGB, Kinect 1.0, and Doppler radar. However, existing methods based on these camera technologies suffer from several problems. Therefore, we propose a four-part method for frontal view gait recognition based on the fusion of multiple features acquired from a Time-of-Flight (ToF) camera. We have developed a gait data set captured by a ToF camera. The data set includes two sessions recorded seven months apart, with 46 and 33 subjects, respectively, each with six walks covering five covariates. The four-part method includes: a new human silhouette extraction algorithm that reduces the multiple reflection problem experienced by ToF cameras; a frame selection method based on a new gait cycle detection algorithm; four new gait image representations; and a novel fusion classifier. Rigorous experiments are carried out to compare the proposed method with state-of-the-art methods. The results show distinct improvements in recognition rates for all covariates. The proposed method outperforms all major existing approaches for all covariates, achieving Rank 1 and Rank 5 recognition rates of 66.1% and 81.0%, respectively, over all covariates combined, compared with a best state-of-the-art performance of 35.7% and 57.7%.
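The abstract above describes a pipeline whose second stage selects frames using gait cycle detection. As a rough, generic illustration only (not the authors' published algorithm, which is detailed in the linked paper), the Python sketch below estimates the gait period from a per-frame silhouette-area signal via autocorrelation; the function name, lag bounds, and synthetic test signal are all assumptions made for the example.

import numpy as np

def estimate_gait_period(silhouette_areas, min_lag=10, max_lag=60):
    # Estimate the walking period (in frames) from a per-frame count of
    # foreground (silhouette) pixels by locating the dominant peak of the
    # autocorrelation. Parameter names and lag bounds are illustrative only.
    x = np.asarray(silhouette_areas, dtype=float)
    x = x - x.mean()                                    # remove the mean so periodicity dominates
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # one-sided autocorrelation (lags 0..N-1)
    ac /= ac[0] + 1e-12                                 # normalise; guard against an all-zero signal
    hi = min(max_lag, len(ac) - 1)
    return min_lag + int(np.argmax(ac[min_lag:hi + 1])) # strongest lag in the plausible range

# Toy usage: a noisy signal with a 30-frame period should yield a lag near 30.
t = np.arange(300)
areas = 5000 + 400 * np.sin(2 * np.pi * t / 30) + 50 * np.random.randn(300)
print(estimate_gait_period(areas))

Frames spanning one detected period could then be selected for building gait representations; a complete system would additionally need the silhouette extraction, image representation, and fusion stages that the paper proposes.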


Bibliographic Details
Main Authors: Tengku Mohd Afendi, Zulcaffle; Fatih, Kurugollu; Crookes, Danny; Ahmed, Bouridane; Mohsen, Farid
Format: Article
Language: English
Published: IEEE Xplore 2019
Subjects: QA75 Electronic computers. Computer science; QA76 Computer software
Online Access:http://ir.unimas.my/id/eprint/23508/1/Tengku.pdf
http://ir.unimas.my/id/eprint/23508/
https://ieeexplore.ieee.org/document/8466800
id my.unimas.ir.23508
record_format eprints
spelling my.unimas.ir.23508 2021-04-28T13:12:55Z http://ir.unimas.my/id/eprint/23508/ Article PeerReviewed text en
Tengku Mohd Afendi Zulcaffle, Fatih Kurugollu, Danny Crookes, Ahmed Bouridane and Mohsen Farid (2019) Frontal View Gait Recognition With Fusion of Depth Features From a Time of Flight Camera. IEEE Transactions on Information Forensics and Security, 14 (4). pp. 1067-1082. ISSN 1556-6013. IEEE Xplore, 2019. DOI: 10.1109/TIFS.2018.2870594
Full text: http://ir.unimas.my/id/eprint/23508/1/Tengku.pdf
Publisher version: https://ieeexplore.ieee.org/document/8466800
institution Universiti Malaysia Sarawak
building Centre for Academic Information Services (CAIS)
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Malaysia Sarawak
content_source UNIMAS Institutional Repository
url_provider http://ir.unimas.my/
language English
topic QA75 Electronic computers. Computer science
QA76 Computer software
description Frontal view gait recognition for people identification has been carried out using single RGB, stereo RGB, Kinect 1.0, and Doppler radar. However, existing methods based on these camera technologies suffer from several problems. Therefore, we propose a four-part method for frontal view gait recognition based on the fusion of multiple features acquired from a Time-of-Flight (ToF) camera. We have developed a gait data set captured by a ToF camera. The data set includes two sessions recorded seven months apart, with 46 and 33 subjects, respectively, each with six walks covering five covariates. The four-part method includes: a new human silhouette extraction algorithm that reduces the multiple reflection problem experienced by ToF cameras; a frame selection method based on a new gait cycle detection algorithm; four new gait image representations; and a novel fusion classifier. Rigorous experiments are carried out to compare the proposed method with state-of-the-art methods. The results show distinct improvements in recognition rates for all covariates. The proposed method outperforms all major existing approaches for all covariates, achieving Rank 1 and Rank 5 recognition rates of 66.1% and 81.0%, respectively, over all covariates combined, compared with a best state-of-the-art performance of 35.7% and 57.7%.
format Article
author Tengku Mohd Afendi, Zulcaffle
Fatih, Kurugollu
Crookes, Danny
Ahmed, Bouridane
Mohsen, Farid
title Frontal View Gait Recognition With Fusion of Depth Features From a Time of Flight Camera
publisher IEEE Xplore
publishDate 2019
url http://ir.unimas.my/id/eprint/23508/1/Tengku.pdf
http://ir.unimas.my/id/eprint/23508/
https://ieeexplore.ieee.org/document/8466800