Gesture recognition of everyday activities


Bibliographic Details
Main Author: Ku Nurul Fazira, Ku Azir
Format: Thesis
Language:English
Published: University of Birmingham 2016
Online Access:http://dspace.unimap.edu.my:80/xmlui/handle/123456789/40860
Description
Summary: Multimodal recognition systems are becoming more common interaction tools in the fields of ubiquitous and wearable computing. Recent technologies and developments in multimodal human-computer interaction have encouraged the study and analysis of multimodality in everyday human activities. This thesis explores the concept of multimodality, i.e. speech and gesture recognition, in everyday life activities. It proposes an approach to recognising the goal of an activity by detecting and analysing the sequences of gestures, speech, objects, actions and locations manipulated by the user. Domains such as cooking, which involve many similar and repeated objects and actions, are a valuable and interesting area for studying multimodality in everyday activities. An experiment on gesture and speech in a cooking activity was analysed in terms of object manipulation and action sequences, using video analysis and RFID-tagged objects, and the results were compared with multimodality in computerised interaction. The study demonstrates that multimodality is also employed during cooking activity. The combination of speech and gesture yields a set sequence of actions that can be used to determine the goal of the activity through a multimodal ontology. It also demonstrates that sequences of actions, objects and locations point towards new multimodalities in real-life activities.
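To make the approach in the abstract concrete, the following is a minimal, hypothetical sketch of how a goal of activity might be inferred from a sequence of RFID-tagged (object, action) events. The goal signatures and event names are illustrative assumptions, not taken from the thesis; the thesis's actual ontology-based method is likely richer than this ordered-subsequence matching.

```python
# Hypothetical sketch: inferring a cooking-activity goal from a sequence of
# RFID-tagged (object, action) events. All goals, objects, and actions below
# are invented for illustration and do not come from the thesis itself.

from typing import List, Tuple

# Toy "ontology": each goal is characterised by an ordered subsequence of
# (object, action) events that must appear during the activity.
GOAL_SIGNATURES = {
    "make_tea": [("kettle", "fill"), ("kettle", "boil"), ("cup", "pour")],
    "make_sandwich": [("bread", "pick_up"), ("knife", "spread"), ("bread", "place")],
}

def is_subsequence(pattern: List[Tuple[str, str]],
                   events: List[Tuple[str, str]]) -> bool:
    """True if `pattern` occurs in order (not necessarily contiguously) in `events`."""
    it = iter(events)
    return all(step in it for step in pattern)

def infer_goal(events: List[Tuple[str, str]]) -> str:
    """Return the first goal whose signature matches the observed event sequence."""
    for goal, signature in GOAL_SIGNATURES.items():
        if is_subsequence(signature, events):
            return goal
    return "unknown"

observed = [("kettle", "fill"), ("spoon", "pick_up"),
            ("kettle", "boil"), ("cup", "pour")]
print(infer_goal(observed))  # make_tea
```

In practice such a matcher would be combined with the speech channel (e.g. spoken commands disambiguating between goals whose object/action sequences overlap), which is the multimodal combination the thesis investigates.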