Eye Gesture in a Mixed Reality Environment

Abstract: Using a simple approach, we demonstrate that eye gestures can provide a highly accurate interaction modality in a mixed reality environment. Such interaction has previously been proposed for desktop and mobile devices. Recently, gaze gestures have gained special interest in Human-Computer Interaction and opened new interaction possibilities, particularly for accessibility. We introduce a new approach to investigate how gaze tracking technologies can help people with ALS or other motor impairments interact with computing devices. In this paper, we propose a touch-free, eye-movement-based entry mechanism for mixed reality environments that can be used without any prior calibration. We evaluate the usability of the system with 7 participants, describe the implementation of the method, and discuss its advantages over traditional input modalities.
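The abstract does not detail the authors' recognition pipeline, but a calibration-free gaze-gesture scheme of the kind described typically relies on relative eye movement rather than absolute gaze position. The sketch below is an illustrative assumption, not the paper's method: it classifies successive pupil-centre displacements into cardinal strokes and matches the resulting stroke sequence against a small gesture vocabulary (`TEMPLATES`, `min_len`, and the gesture names are all hypothetical).

```python
import math

def stroke_direction(dx, dy, min_len=5.0):
    """Classify a pupil-centre displacement into a cardinal stroke.

    Returns 'L', 'R', 'U', or 'D', or None when the movement is too
    small to count as a deliberate stroke (a crude fixation filter).
    """
    if math.hypot(dx, dy) < min_len:
        return None
    if abs(dx) >= abs(dy):
        return 'R' if dx > 0 else 'L'
    return 'D' if dy > 0 else 'U'

def recognize(samples, templates, min_len=5.0):
    """Turn a list of (x, y) pupil positions into a stroke string
    and look it up in a dictionary of known gestures."""
    strokes = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        s = stroke_direction(x1 - x0, y1 - y0, min_len)
        if s and (not strokes or strokes[-1] != s):  # collapse repeats
            strokes.append(s)
    return templates.get(''.join(strokes))

# Hypothetical gesture vocabulary: stroke sequence -> command.
TEMPLATES = {'RD': 'select', 'LU': 'back'}

# Eye moves right, then down: recognized as 'select'.
path = [(0, 0), (10, 0), (20, 0), (20, 10), (20, 20)]
print(recognize(path, TEMPLATES))  # -> select
```

Because only relative displacements are used, no mapping from eye position to screen coordinates is needed, which is what makes such schemes usable without prior calibration.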

https://hal-enac.archives-ouvertes.fr/hal-02073441
Contributor: Almoctar Hassoumi
Submitted on : Friday, May 24, 2019 - 11:14:31 PM


Citation

Almoctar Hassoumi, Christophe Hurter. Eye Gesture in a Mixed Reality Environment. HUCAPP 2019: 3rd International Conference on Human Computer Interaction Theory and Applications, Feb 2019, Prague, Czech Republic. pp. 183-187. ISBN: 978-989-758-354-4. ⟨10.5220/0007684001830187⟩. ⟨hal-02073441⟩
