Impact of HRTF individualisation and head movements in a real/virtual localisation task (2510.09161v1)
Abstract: The objective of Audio Augmented Reality (AAR) applications is to seamlessly integrate virtual sound sources within a real environment. It is critical for these applications that virtual sources are localised precisely at the intended position and that the acoustic environments are accurately matched. One effective method for spatialising sound over headphones is the use of Head-Related Transfer Functions (HRTFs), which characterise how a listener's physical features modify sound waves before they reach the eardrum. This study examines the influence of individualised HRTFs on the localisation and perceived realism of virtual sound sources associated with a real visual object. Participants were tasked with localising virtual and real speech sources presented via headphones and through a spherical loudspeaker array, respectively. The assessment focussed on perceived realism and source location. All sources were associated with one of thirty real visual sources (loudspeakers) arranged in a semi-anechoic room. Several sound source renderings were compared, including single-loudspeaker rendering and binaural rendering with individualised or non-individualised HRTFs. Additionally, the impact of head movements was explored: ten participants completed the same task with and without the possibility of moving their head. The results showed that individual HRTFs improved perceived realism but not localisation performance in the static scenario. Surprisingly, the opposite was observed when head movements were possible and encouraged.
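For readers unfamiliar with HRTF-based spatialisation, the sketch below illustrates the core operation the abstract refers to: rendering a mono signal binaurally by convolving it with a pair of head-related impulse responses (HRIRs, the time-domain counterpart of HRTFs). This is a minimal illustration, not the paper's rendering pipeline; the placeholder HRIRs, sample rate, and delay/gain values are assumptions, where in practice the HRIRs would come from an individualised or generic measurement set for each source direction.

```python
# Minimal sketch of binaural rendering via HRTF/HRIR convolution.
# NOTE: the HRIRs below are hypothetical placeholders; real HRIRs are
# measured (or selected) per listener and per source direction.
import numpy as np
from scipy.signal import fftconvolve

fs = 48000  # sample rate in Hz (assumed)

# Mono source signal: 1 s of noise as a stand-in for the speech stimulus.
rng = np.random.default_rng(0)
mono = rng.standard_normal(fs).astype(np.float32)

# Placeholder HRIR pair for a single source direction. A real HRIR pair
# encodes the interaural time and level differences plus the spectral
# cues imposed by the pinnae, head, and torso.
hrir_left = np.zeros(256, dtype=np.float32)
hrir_right = np.zeros(256, dtype=np.float32)
hrir_left[10] = 1.0    # earlier, louder arrival at the nearer ear
hrir_right[25] = 0.6   # delayed, attenuated arrival at the farther ear

# Binaural rendering: convolve the mono signal with each ear's HRIR
# and stack the results into a 2-channel signal for headphone playback.
left = fftconvolve(mono, hrir_left)
right = fftconvolve(mono, hrir_right)
binaural = np.stack([left, right], axis=-1)  # shape: (samples, 2)
```

In a head-tracked condition such as the one studied here, the HRIR pair would be re-selected (or interpolated) as the listener's head orientation changes, so that the virtual source stays anchored to the real visual object.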