Conference papers

Eye Gesture in a Mixed Reality Environment

Abstract: Using a simple approach, we demonstrate that eye gestures can provide a highly accurate interaction modality in mixed reality environments. Such interaction has previously been proposed for desktop and mobile devices. Recently, gaze gestures have gained special interest in Human-Computer Interaction, opening new interaction possibilities, particularly for accessibility. We introduce a new approach to investigate how gaze-tracking technologies can help people with ALS or other motor impairments interact with computing devices. In this paper, we propose a touch-free, eye-movement-based entry mechanism for mixed reality environments that can be used without any prior calibration. We evaluate the usability of the system with 7 participants, describe the implementation of the method, and discuss its advantages over traditional input modalities.
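The abstract does not spell out the recognition algorithm, but calibration-free gaze gestures are commonly built on *relative* eye movement: classifying the direction of each saccade requires no mapping from pupil position to screen coordinates, which is why no prior calibration is needed. The sketch below is a hypothetical illustration of that general idea, not the authors' method; the function name, thresholds, and coordinate conventions are assumptions.

```python
# Hypothetical sketch (not the paper's implementation): classify a stream of
# pupil-center samples into direction strokes using only relative displacement,
# so no gaze-to-screen calibration is required.

def classify_strokes(samples, min_disp=15.0):
    """Turn a sequence of (x, y) pupil-center samples into direction strokes.

    samples  -- list of (x, y) positions in image coordinates
    min_disp -- minimum displacement (pixels, assumed value) to count as a
                deliberate stroke; smaller movements are treated as jitter.
    """
    strokes = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        dx, dy = x1 - x0, y1 - y0
        if max(abs(dx), abs(dy)) < min_disp:
            continue  # ignore fixation jitter / micro-saccades
        if abs(dx) >= abs(dy):
            direction = "R" if dx > 0 else "L"
        else:
            direction = "D" if dy > 0 else "U"  # image y grows downward
        if not strokes or strokes[-1] != direction:
            strokes.append(direction)  # collapse repeated same-direction moves
    return strokes

# A rightward then upward eye movement yields the gesture ["R", "U"]:
print(classify_strokes([(0, 0), (40, 2), (42, -35)]))
```

A sequence of such strokes could then be matched against a gesture dictionary (e.g. "R, U" selects one item); the matching stage is likewise an assumption here.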

Cited literature: 18 references
Contributor: Almoctar Hassoumi
Submitted on: Friday, May 24, 2019 - 11:14:31 PM
Last modification on: Tuesday, October 19, 2021 - 11:02:54 AM
Long-term archiving on: Monday, September 30, 2019 - 7:09:23 PM


HUCAPP_2019_24_CR (5).pdf
Files produced by the author(s)




Almoctar Hassoumi, Christophe Hurter. Eye Gesture in a Mixed Reality Environment. HUCAPP 2019, 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, Feb 2019, Prague, Czech Republic. pp. 183-187. ISBN: 978-989-758-354-4. ⟨10.5220/0007684001830187⟩. ⟨hal-02073441⟩


