A Framework for Continuous Multimodal Sign Language Recognition
Kelly, Daniel; Delannoy, Jane Reilly; McDonald, John; Markham, Charles
We present a multimodal system for the recognition of manual signs and non-manual signals within continuous sign language sentences. In sign language, information is mainly conveyed through hand gestures (manual signs). Non-manual signals, such as facial expressions, head movements, body postures and torso movements, express a large part of the grammar and some aspects of the syntax of sign language. In this paper we propose a multichannel HMM-based system to recognize manual signs and non-manual signals. We choose a single non-manual signal, head movement, to evaluate our framework on non-manual signal recognition. Manual signs and non-manual signals are processed independently using continuous multidimensional HMMs and an HMM threshold model. Experiments demonstrate that our system achieves a detection ratio of 0.95 and a reliability measure of 0.93.
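The threshold-model idea mentioned in the abstract can be illustrated with a minimal sketch: a candidate segment is accepted as a sign only when a dedicated gesture HMM scores it higher than a generic "threshold" HMM that sets the acceptance bar. All model names, parameters and numbers below are illustrative assumptions, not the authors' actual models (which are continuous and multidimensional; this toy uses discrete emissions for brevity).

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    via the scaled forward algorithm (rescaling avoids underflow)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    scale = sum(alpha)
    log_lik = math.log(scale)
    alpha = [a / scale for a in alpha]
    for t in range(1, len(obs)):
        alpha = [B[j][obs[t]] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
        scale = sum(alpha)
        log_lik += math.log(scale)
        alpha = [a / scale for a in alpha]
    return log_lik

# Toy 2-state left-to-right "sign" model that expects symbol 0 then symbol 1,
# and a deliberately uninformative "threshold" model (uniform everywhere)
# that supplies the rejection baseline. Numbers are made up for illustration.
SIGN_MODEL = ([1.0, 0.0],                  # initial state distribution
              [[0.5, 0.5], [0.0, 1.0]],    # left-to-right transitions
              [[0.9, 0.1], [0.1, 0.9]])    # emission probabilities
THRESHOLD_MODEL = ([0.5, 0.5],
                   [[0.5, 0.5], [0.5, 0.5]],
                   [[0.5, 0.5], [0.5, 0.5]])

def spot_sign(obs, sign_model, threshold_model):
    """Accept a segment only if the sign model beats the threshold model."""
    return (forward_log_likelihood(obs, *sign_model)
            > forward_log_likelihood(obs, *threshold_model))
```

A sequence matching the modelled pattern, such as `[0, 0, 1, 1]`, is accepted, while a non-matching one like `[1, 0, 1, 0]` falls below the threshold model's score and is rejected as a non-sign movement.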
Keyword(s): Sign Language; Non-Manual Signals; HMM
Publication Date:
Type: Book chapter
Peer-Reviewed: Yes
Institution: Maynooth University
Citation(s): Kelly, Daniel and Delannoy, Jane Reilly and McDonald, John and Markham, Charles (2009) A Framework for Continuous Multimodal Sign Language Recognition. In: ICMI-MLMI '09 Proceedings of the 2009 international conference on Multimodal interfaces. ACM, pp. 351-358. ISBN 9781605587721
Publisher(s): ACM
File Format(s): other