An Interactive Immersive Audio Headset (2018)
This prototype rendering system supports the development and testing of audio-centred virtual and augmented reality applications. It is implemented in C++ on an embedded microcomputing platform (Bela) and integrates an inertial measurement unit (IMU) to achieve low-latency, head-tracked binaural synthesis. An adaptation of state-of-the-art scattering delay network (SDN) synthetic reverberation provides efficient, real-time auralisation for mobile use cases. Auditory display systems running on the device are controlled through graphical interfaces built in TouchOSC middleware on a networked iOS device, allowing untethered use. The system has served as the basis of a PhD investigation examining head-related transfer function (HRTF) preference on this custom platform, evaluating the impact of that preference on interactive sound localisation accuracy, and designing and evaluating a spatial auditory browsing system for music exploration.