Our paper, ‘Approaches to Visualising the Spatial Position of Sound-objects’, has been accepted for presentation as a paper and demo at the International Conference on Live Interfaces. It brings together research conducted to date as part of our AHRC-funded Transforming Transformation project, a collaboration with Glasgow School of Art’s Digital Design Studios and the immersive and interactive audio company TwoBigEars. The paper will be presented on Tuesday 12th July; the abstract is shown below.

In this position paper we present research from the AHRC-funded project Transforming Transformation: 3D Models for Interactive Sound Design. The project entails exploration of a new interface for live audio transformation whereby sound can be manipulated through grasp as if it were an invisible 3D object. This approach contrasts with existing GUI-based systems, which primarily use 2D input and 2D visualisation. In the paper we describe the first phase of this research, which enables audio sources to be positioned and moved within a 3D space by grabbing them from a palette and controlling their spatial location using hand movements. Feedback regarding spatial location is provided through a visualisation of these sources within a virtual 3D space. Spatial trajectories can be ‘drawn’ in the air, and sounds can be ‘rolled’ along these trajectories, thus providing a ‘direct manipulation’ interface for specifying spatio-temporal dynamics. We describe the design of the system along with findings of the initial system usability tests.
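For readers curious how such a ‘direct manipulation’ mapping might look in code, the sketch below is a minimal illustration only and is not taken from the project’s implementation: a hypothetical SoundObject holds a 3D position, accumulates trajectory points as they are ‘drawn’ by the tracked hand, and interpolates along that path when the sound is ‘rolled’.

```python
import math
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class SoundObject:
    """A sound source with a 3D position and an optional drawn trajectory.

    Hypothetical sketch: names and behaviour are illustrative assumptions,
    not the system described in the paper.
    """
    name: str
    position: Vec3 = (0.0, 0.0, 0.0)
    trajectory: List[Vec3] = field(default_factory=list)

    def grab_to(self, hand_position: Vec3) -> None:
        """Follow the tracked hand while the object is grasped."""
        self.position = hand_position

    def draw_point(self, hand_position: Vec3) -> None:
        """Append a point to the trajectory being 'drawn' in the air."""
        self.trajectory.append(hand_position)

    def roll(self, t: float) -> Vec3:
        """Return the position at normalised time t in [0, 1] along the
        trajectory, linearly interpolating between drawn points."""
        if not self.trajectory:
            return self.position
        if len(self.trajectory) == 1 or t <= 0.0:
            return self.trajectory[0]
        if t >= 1.0:
            return self.trajectory[-1]
        # Map t onto a segment index plus a fractional offset within it.
        scaled = t * (len(self.trajectory) - 1)
        i = int(math.floor(scaled))
        frac = scaled - i
        a, b = self.trajectory[i], self.trajectory[i + 1]
        return tuple(a[k] + frac * (b[k] - a[k]) for k in range(3))


# Example: draw a short trajectory and roll the sound part-way along it.
src = SoundObject("rain")
for p in [(0.0, 1.5, 0.0), (0.5, 1.5, 0.5), (1.0, 1.5, 1.0)]:
    src.draw_point(p)
src.position = src.roll(0.25)  # a quarter of the way along the drawn path
print(src.position)
```

In a real system the trajectory points would arrive from a hand tracker and the resulting positions would drive a spatial audio renderer; the sketch only shows the position/trajectory bookkeeping.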

Published on: April 18th, 2016
