The project opens up a new field of music technology research by exploring human-centred approaches to creative sound design. It will involve the development of a proof-of-concept system that enables sound to be manipulated through hand movements as if it were an invisible 3D object. The interaction will be enhanced by real-time visualisation within a virtual 3D space. The research surfaces the notion of a ‘natural’ interface for sound design, one that takes account of already-learned behaviours and innate skills such as motor memory, gesture and spatial awareness. The aim is to enable musicians and sound designers to ‘think in sound’ when working with technology, catalysing a shift from technology-centric to human-centric models. As a pilot project, we will concentrate on one specific audio processing technique, sound spatialisation, with a view to later extending our research findings in a larger project addressing a wider range of sound design practices. In simple terms, our system will explore the idea that we can make users feel as if they are directly positioning sound in space by ‘picking it up and moving it’ with their hands, serving as an exemplar for future, more detailed studies.
Sound design entails the manipulation of sound for dramatic, realistic, musical, or emotional effect. Typically, audio is processed using techniques such as filtering, time-stretching, pitch shifting, granulation, attenuation and panning. These techniques are exposed to the sound designer through discrete controls such as ‘sliders’ and ‘dials’, each of which controls a single parameter in the underlying signal-processing system. By contrast, human concepts of sound are characterised by polymorphous, perceptual, metaphorical, symbolic and onomatopoeic associations. Many of these associations are cross-sensory, for example: ‘warm’, ‘bright’, ‘distant’, ‘metallic’. Furthermore, they suggest physicality and invite the possibility of direct quasi-tactile manipulation. This is especially true of spatialisation, where there exists a sense of sound localisation within an acoustic space, further evoking the idea of tangibility: ‘moving’ or ‘placing’ a sound, or even ‘pushing’ a sound to give it inertia in a given direction.
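The one-control-per-parameter model described above can be illustrated with a conventional pan control. The sketch below is illustrative only and not part of the proposed system: it implements a standard equal-power pan law in which a single `pan` parameter, of the kind normally bound to a slider, determines the left/right gains for a mono sample. A gestural interface would instead derive such a parameter from, say, the x-coordinate of a tracked hand.

```python
import math

def equal_power_pan(sample: float, pan: float) -> tuple[float, float]:
    """Split a mono sample into (left, right) using an equal-power
    pan law. `pan` runs from -1.0 (hard left) to +1.0 (hard right);
    total power (left^2 + right^2) stays constant across positions."""
    theta = (pan + 1.0) * math.pi / 4.0  # map [-1, 1] onto [0, pi/2]
    return sample * math.cos(theta), sample * math.sin(theta)

# Conventional use: `pan` comes from a slider. A gestural system
# could instead feed in a normalised hand position.
left, right = equal_power_pan(1.0, 0.0)  # centred: equal gains
```

The point of the example is the bottleneck it exposes: however the audio engine spatialises sound, the user ultimately steers one scalar at a time, which is precisely the mapping the project seeks to replace with direct spatial manipulation.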
Our research therefore challenges existing sound design models and explores the notion of a ‘natural’ and direct corporeal link between imagined sound and acoustic results. To achieve this, we will draw upon the benefits offered by the PI’s research lab within a leading music Conservatoire. Being based in a Conservatoire environment provides the project with immediate and unfettered access to high-quality musicians from a range of backgrounds. This grounds the research firmly in artistic practice, allowing artistic ideas both to test extant models and to serve as the basis for new ones. Formal user experience reviews will be undertaken at milestone stages in the project, allowing us to surface users’ tacit views and opinions on their experience of using the developed system. The research method will also entail the commissioning of a new musical work utilising the system, to be premiered in a public workshop and concert in the final month of the project. This will serve as both an opportunity for testing in a ‘real-world’ creative scenario and a means of disseminating the project’s outputs in a public forum. There will also be a mid-project ‘experimental jam’ using the in-development system, broadcast live on the internet.