Audio Focus: Interactive spatial sound coupled with haptics to improve sound source location in poor visibility

Abstract: In an effort to simplify human resource management and reduce costs, control towers are increasingly designed to operate remotely rather than on the airport itself. This concept, known as the Remote Control Tower, offers a "digital" working context: the view of the runways is streamed via cameras located at the physical airport. This gives researchers and engineers the opportunity to develop novel interaction techniques. However, this technology relies on the sense of sight, which already carries most of the operator's information and interaction and is becoming overloaded. In this paper, we focus on the design and testing of new forms of interaction that rely on the human senses of hearing and touch. More precisely, our study aims to quantify the contribution of a multimodal interaction technique based on spatial sound and vibrotactile feedback to improving aircraft location. Applied to the Remote Tower environment, the final purpose is to enhance Air Traffic Controllers' perception and increase safety. Three interaction modalities were compared by involving 22 Air Traffic Controllers in a simulated environment. The experimental task consisted of locating aircraft at different airspace positions using the senses of hearing and touch, under two visibility conditions. In the first modality (spatial sound only), all sound sources (i.e. aircraft) had the same amplification factor. In the second modality (called Audio Focus), the amplification factor of the sound sources located along the participant's head sagittal axis was increased, while the intensity of the sound sources located outside this axis was decreased. In the last modality, Audio Focus was coupled with vibrotactile feedback to additionally indicate the vertical positions of aircraft. Behavioral (accuracy and response time measurements) and subjective (questionnaire) results showed significantly higher performance in poor visibility when using the Audio Focus interaction. In particular, interactive spatial sound gave participants notably higher accuracy in degraded visibility compared to spatial sound only, and this result improved further when coupled with vibrotactile feedback. Meanwhile, response times were significantly longer with the Audio Focus modality (with or without vibrotactile feedback), while remaining acceptably short. This study can be seen as the initial step in the development of a novel interaction technique that uses sound as a means of location when the sense of sight alone is not enough.
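The Audio Focus principle described in the abstract, boosting sound sources close to the head's sagittal (straight-ahead) axis and attenuating the rest, can be sketched as a simple head-relative gain function. The cone width, boost, and cut values below are illustrative assumptions, not parameters reported in the paper:

```python
import math

def audio_focus_gain(source_azimuth_deg: float, head_azimuth_deg: float,
                     focus_width_deg: float = 30.0,
                     boost: float = 2.0, cut: float = 0.5) -> float:
    """Return an amplification factor for one sound source.

    Sources within `focus_width_deg` of the listener's sagittal axis are
    boosted (full boost on-axis, tapering to unity gain at the cone edge);
    sources outside the cone are attenuated to `cut`. All numeric values
    are illustrative, not taken from the study.
    """
    # Smallest angular difference between source and head direction, in [0, 180].
    diff = abs((source_azimuth_deg - head_azimuth_deg + 180.0) % 360.0 - 180.0)
    if diff <= focus_width_deg:
        # Linear taper from `boost` (on-axis) down to 1.0 at the cone edge.
        return boost - (boost - 1.0) * (diff / focus_width_deg)
    return cut

# A source straight ahead is amplified; one behind the listener is attenuated.
print(audio_focus_gain(0.0, 0.0))    # on-axis: full boost
print(audio_focus_gain(180.0, 0.0))  # off-axis: attenuated
```

In a real spatial-audio pipeline this gain would be applied per source, per frame, using the head orientation reported by a tracker, so that turning the head "scans" the soundscape.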
Document type: Journal article
Cited literature: 38 references

https://hal-enac.archives-ouvertes.fr/hal-02100767
Contributor: Laurence Porte
Submitted on: Monday, April 22, 2019 - 8:54:08 PM
Last modification on: Tuesday, June 18, 2019 - 10:33:12 AM


Citation

Maxime Reynal, Jean-Paul Imbert, Pietro Aricò, Jérôme Toupillier, Gianluca Borghini, et al.. Audio Focus: Interactive spatial sound coupled with haptics to improve sound source location in poor visibility. International Journal of Human-Computer Studies, Elsevier, 2019, 129, pp.116-128. ⟨10.1016/j.ijhcs.2019.04.001⟩. ⟨hal-02100767⟩
