Multi-modal interaction system for smart TV environments

Injae Lee, Jihun Cha, Ohseok Kwon

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

This paper presents a multi-modal interaction system based on gaze tracking and gesture recognition. It allows a user to select a specific menu or object by gazing at it and then control it through hand gestures. This gives users a convenient way to consume various multimedia content in smart TV environments.
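The select-by-gaze, control-by-gesture flow described in the abstract can be illustrated with a minimal sketch. The class names, event types, and gesture labels below are hypothetical illustrations, not the authors' implementation, which the paper does not publish.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class GazeEvent:
    """Screen coordinates reported by a gaze tracker (hypothetical event type)."""
    x: float
    y: float


@dataclass
class GestureEvent:
    """A recognized hand gesture, e.g. 'select' or 'swipe_left' (hypothetical labels)."""
    name: str


@dataclass
class MenuItem:
    label: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, x: float, y: float) -> bool:
        # True if the gaze point falls inside this item's bounding box.
        return (self.x <= x <= self.x + self.width and
                self.y <= y <= self.y + self.height)


class MultiModalController:
    """Focuses the on-screen item the user is gazing at, then applies
    gesture commands to that focused item (select by gaze, act by gesture)."""

    def __init__(self, items: list[MenuItem]):
        self.items = items
        self.focused: Optional[MenuItem] = None

    def on_gaze(self, event: GazeEvent) -> None:
        # Focus whichever menu item the current gaze point lies inside, if any.
        self.focused = next(
            (item for item in self.items if item.contains(event.x, event.y)),
            None,
        )

    def on_gesture(self, event: GestureEvent) -> None:
        # Gestures only act on the currently gazed-at item.
        if self.focused is None:
            return
        if event.name == "select":
            print(f"Activating '{self.focused.label}'")
        elif event.name == "swipe_left":
            print(f"Dismissing '{self.focused.label}'")


# Example usage with synthetic events.
controller = MultiModalController([MenuItem("Movies", 0, 0, 100, 50),
                                   MenuItem("Music", 0, 60, 100, 50)])
controller.on_gaze(GazeEvent(x=30, y=70))      # user looks at "Music"
controller.on_gesture(GestureEvent("select"))  # hand gesture activates it
```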

Original language: English
Title of host publication: Proceedings - 2014 IEEE International Symposium on Multimedia, ISM 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 263-264
Number of pages: 2
ISBN (Electronic): 9781479943111
DOIs
State: Published - 5 Feb 2015
Externally published: Yes
Event: 16th IEEE International Symposium on Multimedia, ISM 2014 - Taichung, Taiwan, Province of China
Duration: 10 Dec 2014 - 12 Dec 2014

Publication series

Name: Proceedings - 2014 IEEE International Symposium on Multimedia, ISM 2014

Conference

Conference: 16th IEEE International Symposium on Multimedia, ISM 2014
Country/Territory: Taiwan, Province of China
City: Taichung
Period: 10/12/14 - 12/12/14

Keywords

  • gaze tracking
  • gesture
  • interaction
  • modality
  • multi-modal
  • smart TV
