Classification of facial part movement acquired from Kinect V1 and Kinect V2

The aim of this study is to determine which motion sensor, Kinect v1 or Kinect v2, gives better performance in recognising facial part movements. The study applies several classification methods, namely neural network, complex decision tree, cubic SVM, fine Gaussian SVM, fine kNN and QDA, to the datasets obtained from Kinect v1 and Kinect v2. The facial part movements are detected and extracted into 11 features and 15 classes. The chosen classifiers are then trained and tested on each dataset, and the Kinect sensor whose dataset achieves the highest testing accuracy will be selected, in terms of tracking performance and detection accuracy, for developing an assistive facial exercise application.
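As a rough illustration of the comparison described in the abstract, the sketch below trains the six named classifier types on one feature table per sensor and reports the best testing accuracy for each. This is not the authors' code: the classifier names correspond to common MATLAB Classification Learner presets and are approximated here with scikit-learn equivalents, and the file names, 70/30 split and hyper-parameter choices are assumptions.

    # Minimal sketch (assumed workflow, not the authors' implementation).
    # Each CSV is assumed to hold the 11 extracted features followed by a class
    # label column covering the 15 facial-movement classes.
    import pandas as pd
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    CLASSIFIERS = {
        "neural network": MLPClassifier(hidden_layer_sizes=(50,), max_iter=2000, random_state=0),
        "complex decision tree": DecisionTreeClassifier(random_state=0),  # no depth limit
        "cubic SVM": SVC(kernel="poly", degree=3),
        "fine Gaussian SVM": SVC(kernel="rbf"),  # MATLAB's "fine" preset shrinks the kernel scale
        "fine kNN": KNeighborsClassifier(n_neighbors=1),
        "QDA": QuadraticDiscriminantAnalysis(),
    }

    def test_accuracies(csv_path):
        """Train and test every classifier on one sensor's dataset."""
        data = pd.read_csv(csv_path)                # hypothetical feature table
        X, y = data.iloc[:, :11], data.iloc[:, 11]  # 11 features, 1 label column
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.3, stratify=y, random_state=0)
        scores = {}
        for name, clf in CLASSIFIERS.items():
            model = make_pipeline(StandardScaler(), clf)  # scale features for SVM/kNN/MLP
            model.fit(X_tr, y_tr)
            scores[name] = model.score(X_te, y_te)        # testing accuracy
        return scores

    # Compare the two sensors by their best-performing classifier, as the study describes.
    for sensor, path in [("Kinect v1", "kinect_v1_features.csv"),
                         ("Kinect v2", "kinect_v2_features.csv")]:
        acc = test_accuracies(path)
        best = max(acc, key=acc.get)
        print(f"{sensor}: best classifier = {best}, testing accuracy = {acc[best]:.3f}")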

Bibliographic Details
Main Authors: Sheng, Guang Heng; Rosdiyana, Samad; Mahfuzah, Mustafa; Zainah, Md. Zain; Nor Rul Hasma, Abdullah; Dwi, Pebrianti
Format: Conference or Workshop Item
Language: English
Published: Springer 2021
Subjects: TK Electrical engineering. Electronics. Nuclear engineering
Online Access:http://umpir.ump.edu.my/id/eprint/33563/1/Classification%20of%20facial%20part%20movement%20acquired%20from%20Kinect%20V1%20.pdf
http://umpir.ump.edu.my/id/eprint/33563/2/Classification%20of%20facial%20part%20movement%20acquired%20from%20Kinect%20V1_FULL.pdf
http://umpir.ump.edu.my/id/eprint/33563/
https://doi.org/10.1007/978-981-15-5281-6_65
Record ID: my.ump.umpir.33563
Peer reviewed: Yes
Citation: Sheng, Guang Heng; Rosdiyana, Samad; Mahfuzah, Mustafa; Zainah, Md. Zain; Nor Rul Hasma, Abdullah; Dwi, Pebrianti (2021) Classification of facial part movement acquired from Kinect V1 and Kinect V2. In: Lecture Notes in Electrical Engineering, vol. 666 (11th National Technical Symposium on Unmanned System Technology, NUSYS 2019, 2-3 December 2019, Kuantan, Malaysia). Springer, pp. 911-924. ISSN 1876-1100. ISBN 9789811552816. https://doi.org/10.1007/978-981-15-5281-6_65
Institution: Universiti Malaysia Pahang (UMP Library)
Collection: UMP Institutional Repository, http://umpir.ump.edu.my/