Eye Gesture in a Mixed Reality Environment - ENAC - École nationale de l'aviation civile
Conference paper, 2019

Eye Gesture in a Mixed Reality Environment

Almoctar Hassoumi
  • Function: Author
  • PersonId: 1015038

Abstract

Using a simple approach, we demonstrate that eye gestures can provide a highly accurate interaction modality in a mixed reality environment. Such interaction has previously been proposed for desktop and mobile devices. Recently, gaze gestures have gained special interest in Human-Computer Interaction, opening up new interaction possibilities, particularly for accessibility. We introduce a new approach to investigate how gaze tracking technologies could help people with ALS or other motor impairments interact with computing devices. In this paper, we propose a touch-free, eye-movement-based entry mechanism for mixed reality environments that can be used without any prior calibration. We evaluate the usability of the system with 7 participants, describe the implementation of the method, and discuss its advantages over traditional input modalities.
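The paper itself details the entry mechanism; as a rough intuition for what an "eye gesture" recognizer does, a common generic scheme (not the authors' published algorithm; all names and thresholds below are illustrative assumptions) is to quantize successive gaze samples into discrete stroke directions and match the resulting token sequence against gesture templates:

```python
import math

def quantize_directions(points, min_dist=0.05):
    """Turn a gaze trace [(x, y), ...] into stroke tokens R/L/U/D.

    min_dist is an illustrative jitter threshold: gaze samples closer
    than this to the previous accepted sample are treated as fixation
    noise and skipped.
    """
    tokens = []
    last = points[0]
    for x, y in points[1:]:
        dx, dy = x - last[0], y - last[1]
        if math.hypot(dx, dy) < min_dist:  # ignore fixation jitter
            continue
        if abs(dx) >= abs(dy):
            tokens.append('R' if dx > 0 else 'L')
        else:
            tokens.append('U' if dy > 0 else 'D')
        last = (x, y)
    # Collapse runs of identical tokens into single strokes.
    return [t for i, t in enumerate(tokens) if i == 0 or t != tokens[i - 1]]

def match_gesture(trace, templates):
    """Return the name of the template whose stroke sequence matches, else None."""
    strokes = quantize_directions(trace)
    for name, pattern in templates.items():
        if strokes == pattern:
            return name
    return None

# Example: a rightward-then-upward gaze sweep in normalized coordinates.
trace = [(0.0, 0.0), (0.2, 0.0), (0.4, 0.0), (0.4, 0.2), (0.4, 0.4)]
templates = {'confirm': ['R', 'U'], 'cancel': ['L', 'D']}
print(match_gesture(trace, templates))  # → confirm
```

Because only relative gaze movement is compared, a scheme of this kind can operate without per-user calibration, which is the property the abstract highlights.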
Main file
HUCAPP_2019_24_CR (5).pdf (8.82 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-02073441, version 1 (24-05-2019)

License

Attribution - NonCommercial - NoDerivatives

Identifiers

Cite

Almoctar Hassoumi, Christophe Hurter. Eye Gesture in a Mixed Reality Environment. HUCAPP 2019, 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, Feb 2019, Prague, Czech Republic. pp. 183-187, ISBN: 978-989-758-354-4, ⟨10.5220/0007684001830187⟩. ⟨hal-02073441⟩

Collections

ENAC DEVI
222 Views
330 Downloads
