
Audio Focus: Interactive spatial sound coupled with haptics to improve sound source location in poor visibility

Abstract: In an effort to simplify human resource management and reduce costs, control towers are now increasingly designed to operate not directly at the airport but remotely. This concept, known as the Remote Control Tower, offers a "digital" working context because the view of the runways is broadcast remotely via cameras located at the physical airport. This gives researchers and engineers the possibility to develop novel interaction techniques. However, this technology relies heavily on the sense of sight, which is already the operator's main channel for information and interaction and is becoming overloaded. In this paper, we focus on the design and testing of new forms of interaction that rely on the human senses of hearing and touch. More precisely, our study aims at quantifying the contribution of a multimodal interaction technique based on spatial sound and vibrotactile feedback to improve aircraft location. Applied to the Remote Tower environment, the final purpose is to enhance Air Traffic Controllers' perception and increase safety. Three different interaction modalities were compared by involving 22 Air Traffic Controllers in a simulated environment. The experimental task consisted of locating aircraft in different airspace positions using the senses of hearing and touch, under two visibility conditions. In the first modality (spatial sound only), the sound sources (i.e. aircraft) had the same amplification factor. In the second modality (called Audio Focus), the amplification factor of the sound sources located along the participant's head sagittal axis was increased, while the intensity of the sound sources located outside this axis was decreased. In the last modality, Audio Focus was coupled with vibrotactile feedback to additionally indicate the vertical positions of aircraft. Behavioral (i.e. accuracy and response time measurements) and subjective (i.e. questionnaires) results showed significantly higher performance in poor visibility when using the Audio Focus interaction. In particular, interactive spatial sound gave the participants notably higher accuracy in degraded visibility compared to spatial sound only. This result was even better when coupled with vibrotactile feedback. Meanwhile, response times were significantly longer when using the Audio Focus modality (whether coupled with vibrotactile feedback or not), while remaining acceptably short. This study can be seen as the initial step in the development of a novel interaction technique that uses sound as a means of location when the sense of sight alone is not enough.
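The abstract does not give the exact gain law used by Audio Focus; the following is a minimal, hypothetical sketch of the kind of head-coupled gain modulation it describes, where sources near the head's sagittal axis are boosted and off-axis sources are attenuated. All function names and numeric values (`focus_width_deg`, `boost`, `cut`) are illustrative assumptions, not taken from the paper.

```python
def audio_focus_gain(source_azimuth_deg, head_azimuth_deg,
                     focus_width_deg=30.0, boost=2.0, cut=0.5):
    """Return an illustrative amplification factor for one sound source.

    Sources aligned with the listener's head sagittal axis are boosted;
    sources outside the focus cone are attenuated. Parameter values are
    hypothetical, not the paper's actual settings.
    """
    # Smallest angular deviation between the source and the head direction
    deviation = abs((source_azimuth_deg - head_azimuth_deg + 180) % 360 - 180)
    if deviation <= focus_width_deg:
        # Interpolate from full boost (on-axis) down to unity gain at the edge
        t = deviation / focus_width_deg
        return boost + (1.0 - boost) * t
    return cut  # off-axis sources are attenuated

# A source directly ahead is amplified; one far off-axis is attenuated
print(audio_focus_gain(0.0, 0.0))    # 2.0 (on-axis boost)
print(audio_focus_gain(90.0, 0.0))   # 0.5 (off-axis cut)
```

In a real spatial-audio pipeline, this factor would scale each source's signal before HRTF-based binaural rendering, so head movements interactively re-weight the auditory scene.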
Document type:
Journal article

Cited literature [57 references]

https://hal-enac.archives-ouvertes.fr/hal-02100767
Contributor: Laurence Porte
Submitted on: Monday, April 22, 2019 - 20:54:08
Last modified on: Wednesday, November 3, 2021 - 14:24:39

File

Reynal2019-IJHCS_author-versio...
Files produced by the author(s)

Identifiers

Collections

Citation

Maxime Reynal, Jean-Paul Imbert, Pietro Aricò, Jérôme Toupillier, Gianluca Borghini, et al.. Audio Focus: Interactive spatial sound coupled with haptics to improve sound source location in poor visibility. International Journal of Human-Computer Studies, Elsevier, 2019, 129, pp.116-128. ⟨10.1016/j.ijhcs.2019.04.001⟩. ⟨hal-02100767⟩


Metrics

Record views

223

File downloads

235