Parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics

The following research describes the potential of classifying emotions using a wearable EEG headset while using a virtual environment to stimulate the users' responses.

Bibliographic Details
Main Authors: Nazmi Sofian Suhaimi, James Mountstephens, Jason Teo
Format: Article
Language:English
Published: 2020
Subjects:
Online Access:https://eprints.ums.edu.my/id/eprint/25668/1/Parameter%20tuning%20for%20enhancing%20inter-subject%20emotion%20classification%20in%20four%20classes%20for%20vr-eeg%20predictive%20analytics.pdf
https://eprints.ums.edu.my/id/eprint/25668/
id my.ums.eprints.25668
record_format eprints
spelling my.ums.eprints.25668 2020-07-22T03:39:39Z https://eprints.ums.edu.my/id/eprint/25668/ Parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics Nazmi Sofian Suhaimi James Mountstephens Jason Teo TJ Mechanical engineering and machinery The following research describes the potential of classifying emotions using a wearable EEG headset while using a virtual environment to stimulate the users' responses. Current developments in emotion classification have largely steered towards the use of a clinical-grade EEG headset with a 2D monitor screen for stimulus evocation, which may introduce additional artifacts or inaccurate readings into the dataset because users are unable to give their full attention to the given stimuli, even though the stimuli presented should have been advantageous in provoking emotional reactions. Furthermore, the clinical-grade EEG headset requires a lengthy setup, during which hindrances such as hair obstructing the electrodes from collecting the brainwave signals, or electrodes coming loose, require additional time to fix. With the lengthy duration of setting up the EEG headset, the user may experience fatigue and become incapable of responding naturally to the emotions presented by the stimuli. Therefore, this research introduces the use of a wearable low-cost EEG headset with dry electrodes that requires only a trivial amount of time to set up, together with a Virtual Reality (VR) headset for the presentation of the emotional stimuli in an immersive VR environment, paired with earphones to provide the full immersive experience needed for the evocation of the emotions. The 360° video stimuli are designed and stitched together according to the arousal-valence space (AVS) model, with each quadrant having an 80-second stimulus presentation period followed by a 10-second rest period between quadrants. The EEG dataset is then collected through the wearable low-cost EEG headset using four channels located at TP9, TP10, AF7, and AF8. The collected dataset is then fed into machine learning algorithms, namely KNN, SVM, and deep learning, with the dataset focused on inter-subject test approaches using 10-fold cross-validation. The results obtained found that SVM using Radial Basis Function Kernel 1 achieved the highest accuracy at 85.01%. This suggests that a wearable low-cost EEG headset, with a significantly lower-resolution signal than clinical-grade equipment and utilizing only a very limited number of electrodes, appears to be highly promising as an emotion classification BCI tool and may thus open up myriad practical, affordable and cost-friendly solutions applicable to the medical, education, military, and entertainment domains. 2020 Article PeerReviewed text en https://eprints.ums.edu.my/id/eprint/25668/1/Parameter%20tuning%20for%20enhancing%20inter-subject%20emotion%20classification%20in%20four%20classes%20for%20vr-eeg%20predictive%20analytics.pdf Nazmi Sofian Suhaimi and James Mountstephens and Jason Teo (2020) Parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics. International Journal of Advanced Science and Technology, 29 (6s). p. 1483.
institution Universiti Malaysia Sabah
building UMS Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Malaysia Sabah
content_source UMS Institutional Repository
url_provider http://eprints.ums.edu.my/
language English
topic TJ Mechanical engineering and machinery
spellingShingle TJ Mechanical engineering and machinery
Nazmi Sofian Suhaimi
James Mountstephens
Jason Teo
Parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics
description The following research describes the potential of classifying emotions using a wearable EEG headset while using a virtual environment to stimulate the users' responses. Current developments in emotion classification have largely steered towards the use of a clinical-grade EEG headset with a 2D monitor screen for stimulus evocation, which may introduce additional artifacts or inaccurate readings into the dataset because users are unable to give their full attention to the given stimuli, even though the stimuli presented should have been advantageous in provoking emotional reactions. Furthermore, the clinical-grade EEG headset requires a lengthy setup, during which hindrances such as hair obstructing the electrodes from collecting the brainwave signals, or electrodes coming loose, require additional time to fix. With the lengthy duration of setting up the EEG headset, the user may experience fatigue and become incapable of responding naturally to the emotions presented by the stimuli. Therefore, this research introduces the use of a wearable low-cost EEG headset with dry electrodes that requires only a trivial amount of time to set up, together with a Virtual Reality (VR) headset for the presentation of the emotional stimuli in an immersive VR environment, paired with earphones to provide the full immersive experience needed for the evocation of the emotions. The 360° video stimuli are designed and stitched together according to the arousal-valence space (AVS) model, with each quadrant having an 80-second stimulus presentation period followed by a 10-second rest period between quadrants. The EEG dataset is then collected through the wearable low-cost EEG headset using four channels located at TP9, TP10, AF7, and AF8. The collected dataset is then fed into machine learning algorithms, namely KNN, SVM, and deep learning, with the dataset focused on inter-subject test approaches using 10-fold cross-validation. The results obtained found that SVM using Radial Basis Function Kernel 1 achieved the highest accuracy at 85.01%. This suggests that a wearable low-cost EEG headset, with a significantly lower-resolution signal than clinical-grade equipment and utilizing only a very limited number of electrodes, appears to be highly promising as an emotion classification BCI tool and may thus open up myriad practical, affordable and cost-friendly solutions applicable to the medical, education, military, and entertainment domains.
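As an illustration of the classification setup outlined in the description (four-channel EEG from TP9, TP10, AF7 and AF8, an SVM with a Radial Basis Function kernel, and 10-fold cross-validation), a minimal Python sketch follows. This is not the authors' code: the band-power features, the assumed 256 Hz sampling rate, the hypothetical arrays eeg_epochs and labels, and the SVM hyperparameters are illustrative assumptions; the paper itself reports 85.01% accuracy for its own tuned RBF-kernel SVM.

# Illustrative sketch only (not the authors' released code). Assumes a NumPy
# array `eeg_epochs` of shape (n_epochs, 4, n_samples) holding epochs from the
# TP9, TP10, AF7 and AF8 channels, and integer `labels` 0-3 for the four
# arousal-valence quadrants; the sampling rate, band-power features and SVM
# hyperparameters are assumptions, not values reported in the paper.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 256  # assumed sampling rate of the low-cost headset, in Hz

def band_power_features(epoch, fs=FS):
    """Mean spectral power per channel in the theta, alpha, beta and gamma bands."""
    bands = [(4, 8), (8, 13), (13, 30), (30, 44)]
    freqs, psd = welch(epoch, fs=fs, nperseg=min(fs, epoch.shape[-1]))
    feats = []
    for low, high in bands:
        mask = (freqs >= low) & (freqs < high)
        feats.extend(psd[:, mask].mean(axis=1))  # one value per channel per band
    return np.asarray(feats)

def rbf_svm_cv_accuracy(eeg_epochs, labels):
    """10-fold cross-validated accuracy of an RBF-kernel SVM on band-power features."""
    X = np.stack([band_power_features(e) for e in eeg_epochs])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    return cross_val_score(clf, X, labels, cv=cv).mean()

For a strictly inter-subject evaluation, scikit-learn's GroupKFold keyed on subject identifiers could be substituted for the plain stratified split shown above.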
format Article
author Nazmi Sofian Suhaimi
James Mountstephens
Jason Teo
author_facet Nazmi Sofian Suhaimi
James Mountstephens
Jason Teo
author_sort Nazmi Sofian Suhaimi
title Parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics
title_short Parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics
title_full Parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics
title_fullStr Parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics
title_full_unstemmed Parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics
title_sort parameter tuning for enhancing inter-subject emotion classification in four classes for vr-eeg predictive analytics
publishDate 2020
url https://eprints.ums.edu.my/id/eprint/25668/1/Parameter%20tuning%20for%20enhancing%20inter-subject%20emotion%20classification%20in%20four%20classes%20for%20vr-eeg%20predictive%20analytics.pdf
https://eprints.ums.edu.my/id/eprint/25668/
_version_ 1760230398071144448