Comparative Perceptual Studies
Development of Virtual Auditory Environments for the Blind
Within the framework of developing assistive virtual auditory environments for the visually impaired, this project explores the relationship between movement, auditory perception, and sound localisation.
Through comparative perceptual studies in both real-world and virtual-world experiments, the project aims to address three broad questions: Can the role that movement (chiefly head movement) plays in auditory perception inform design factors, such as spatial resolution, in virtual auditory systems? How can movement be translated into such a system, either as part of the control interface or as a facet of the audio rendering itself? Do differences in auditory perception exist between sighted and visually impaired individuals?
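As a minimal illustration of the second question (an assumed sketch, not part of the project's own software), one common way head movement feeds into binaural rendering is to re-express each source's azimuth in the listener's head frame before selecting HRTF filters, so that turning the head sweeps the source across the ear axis just as it would in the real world:

```python
def relative_azimuth(source_az_deg: float, head_yaw_deg: float) -> float:
    """Azimuth of a sound source in the listener's head frame, in degrees.

    source_az_deg: source azimuth in the world frame (0 = ahead of the
                   untracked listener, positive = clockwise from above).
    head_yaw_deg:  tracked head yaw in the same convention.
    Returns a value in (-180, 180], suitable for HRTF lookup.
    """
    rel = (source_az_deg - head_yaw_deg) % 360.0
    if rel > 180.0:
        rel -= 360.0
    return rel

# A source straight ahead in the world, with the head turned 90° right,
# should appear at the listener's left ear (-90°):
print(relative_azimuth(0.0, 90.0))
```

Keeping sources anchored in the world frame in this way is also one standard mitigation for front-back confusions in headphone reproduction, since self-motion disambiguates locations that static binaural cues leave ambiguous.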
It is hoped that the project will not only provide insights into perception, and into the differences in auditory perception between blind and sighted individuals, but also that these insights will translate directly into improved virtual systems, for instance by mitigating common issues such as ambiguity of sound source location, which can arise from the practical limitations of rendering binaural (headphone-based '3D') audio.
Participants: Chris Feakes, Dr Lorenzo Picinali, Dr Dylan Menzies