Our paper titled ‘Approaches to Visualising the Spatial Position of Sound-objects’ has been accepted for publication as a paper and demo at EVA London 2016. The paper, which draws heavily on research conducted as part of our AHRC-funded Transforming Transformation project, explores ways in which the spatial location and trajectories of sound can be visualised in a virtual environment. The paper will be presented on Tuesday 12th July. The abstract is shown below.
In this paper we present the rationale and design for two systems (developed by the Integra Lab research group at Birmingham Conservatoire) implementing a common approach to interactive visualisation of the spatial position of ‘sound-objects’. The first system forms part of the AHRC-funded project ‘Transforming Transformation: 3D Models for Interactive Sound Design’, which entails the development of a new interaction model for audio processing whereby sound can be manipulated through grasp as if it were an invisible 3D object. The second system concerns the spatial manipulation of ‘beatboxer’ vocal sound using handheld mobile devices through already-learned physical movement. In both cases a means to visualise the spatial position of multiple sound sources within a 3D ‘stereo image’ is central to the system design, so a common model for this task was developed. This paper describes the ways in which sound and spatial information are implemented to meet the practical demands of these systems, whilst relating this to the wider context of extant and potential future methods for spatial audio visualisation.
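The paper itself sets out the common visualisation model in detail; as a purely illustrative sketch of the kind of mapping such a system relies on, the following Python snippet converts a sound-object's position from listener-centred spherical coordinates (azimuth, elevation, distance) to the Cartesian coordinates a 3D visualisation would plot. The function name, coordinate convention, and example sources are our own assumptions, not taken from the paper.

    import math

    def sound_object_position(azimuth_deg, elevation_deg, distance):
        # Hypothetical convention (not from the paper): azimuth measured
        # clockwise from straight ahead, elevation upward from the
        # horizontal plane, distance in metres from the listener.
        az = math.radians(azimuth_deg)
        el = math.radians(elevation_deg)
        x = distance * math.cos(el) * math.sin(az)  # left (-) / right (+)
        y = distance * math.cos(el) * math.cos(az)  # behind (-) / ahead (+)
        z = distance * math.sin(el)                 # below (-) / above (+)
        return x, y, z

    # Two example sound-objects placed in a 3D 'stereo image'
    for name, (az, el, d) in {'voice': (-30.0, 0.0, 1.0),
                              'beat': (30.0, 15.0, 2.0)}.items():
        print(name, sound_object_position(az, el, d))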