Recognition of Radar-Based Deaf Sign Language Using Convolution Neural Network
Main Authors:
Other Authors:
Format: Article
Published: Penerbit UTHM, 2024
Subjects:
Summary: The difficulties in communication between deaf and hearing people through sign language can be overcome by applying deep learning to gesture signal recognition. The use of a Convolutional Neural Network (CNN) to distinguish radar-based gesture signals of deaf sign language has not been investigated. This paper describes the recognition of deaf sign language gestures using radar and a CNN. Six deaf sign language gestures were acquired from normal subjects using a radar system and processed. The Short-Time Fourier Transform was applied to extract gesture features, and classification was performed with a CNN. The performance of the CNN was examined using two types of inputs …
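The summary describes a pipeline of STFT-based feature extraction followed by CNN classification. The following is a minimal sketch of such a pipeline, not the authors' implementation: the sampling rate, window sizes, network architecture, and the helper names `gesture_spectrogram` and `GestureCNN` are illustrative assumptions, and the six-class output only mirrors the six gestures mentioned in the abstract.

```python
# Hypothetical sketch: log-magnitude STFT spectrogram of a radar gesture signal,
# classified by a small 2-D CNN with six output classes. All names, shapes, and
# hyperparameters here are assumptions for illustration.
import numpy as np
from scipy.signal import stft
import torch
import torch.nn as nn

def gesture_spectrogram(signal, fs=2000, nperseg=128, noverlap=96):
    """Compute a log-magnitude STFT spectrogram of a 1-D radar return."""
    _, _, Z = stft(signal, fs=fs, nperseg=nperseg, noverlap=noverlap)
    return np.log1p(np.abs(Z)).astype(np.float32)  # shape: (freq_bins, time_frames)

class GestureCNN(nn.Module):
    """Small CNN over spectrogram 'images', ending in a six-way classifier."""
    def __init__(self, n_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):                     # x: (batch, 1, freq, time)
        x = self.features(x)
        return self.classifier(x.flatten(1))  # class logits

# Toy usage with a synthetic signal standing in for a recorded radar gesture.
sig = np.random.randn(4000)
spec = gesture_spectrogram(sig)
batch = torch.from_numpy(spec)[None, None]    # (1, 1, freq_bins, time_frames)
logits = GestureCNN()(batch)
print(logits.shape)                           # torch.Size([1, 6])
```

In practice the spectrograms would be computed from the recorded radar returns for each gesture and the network trained with a standard cross-entropy loss; those details are outside what the record's summary states.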