The New Interfaces for Musical Expression (NIME) conference started out as a workshop at the Conference on Human Factors in Computing Systems (CHI) in 2001. It is now an international conference, drawing approximately 250–500 attendees annually, and is one of the larger music technology / computer music conferences. I presented papers at NIME 2009 (Oslo), NIME 2014 (London) and NIME 2015 (Louisiana), and I have also been on the NIME paper review panel. My paper this year, co-authored with Ali Momeni, was on ml.lib, our machine learning library for Max and Pure Data, which can be downloaded here.

Co-presenting at NIME Louisiana with appropriate t-shirt!

The conference covers a broad range of topics around NIMEs, but has a strong focus on digital musical instruments, particularly digital interfaces for performing improvised and/or experimental music. The conference has a “scientific” strand, which includes papers, demos and posters, and an artistic strand, which primarily includes performances and installations. This year’s NIME was quite small and had an intimate feel to it. There was a single paper track with plenty of time for reflection and conversation between sessions. Inevitably, it is in these between-session gaps that the most interesting conversations at a conference spark up.

One of the highlights for me was R. Luke DuBois’s keynote. Luke’s interest is in maps (in the broad sense), and he convincingly argued that most NIMEs are essentially maps of some sort. He gave a provocative definition of music as “the emotional manipulation of people through data”. Following this logic through, NIMEs are interfaces that map data to emotion, which is an interesting thought. A promising approach to interface design, then, is to start with the emotions and feelings we want to evoke and work ‘backwards’ from there (see also: emotional design, Kansei engineering).
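To make the “NIME as map” idea concrete, here is a toy Python sketch of an instrument reduced to its mapping: a function from sensor data to synthesis parameters. None of this comes from the keynote; the sensor names, parameter names and coefficients are all invented for illustration.

```python
# A toy "NIME as map": the instrument is just a function from input
# data to sound parameters, chosen to evoke an intended feeling.
# All names and numbers here are illustrative, not from the keynote.

def nime_map(pressure: float, tilt: float) -> dict:
    """Map two normalised sensor readings (0..1) to synthesis parameters."""
    return {
        "amplitude": pressure,                      # harder press -> louder
        "brightness": 0.2 + 0.8 * tilt,             # tilting opens the filter
        "vibrato_hz": 4.0 + 3.0 * pressure * tilt,  # combined gesture adds vibrato
    }

# Designing "backwards" means fixing a target quality first (e.g. "calm":
# quiet, dark, little vibrato) and then choosing the map that reaches it.
print(nime_map(pressure=0.7, tilt=0.3))
```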

I was also struck by Andrew McPherson’s paper, “Exposing the Scaffolding of Digital Instruments with Hardware-Software Feedback Loops”. The essence of the paper was that many digital musical instruments (DMIs) are “black boxes” that take physical input and output sound without exposing their internal workings, when in practice, according to Andrew, “implementation matters”! Andrew gave the piano as an example of an instrument whose “inside” can be played and manipulated in a variety of expressive ways that are not accessible via its primary interface. This raises interesting questions about whether, how, and to what extent the “inner workings” of software instruments should be exposed to musicians.
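As a way of picturing the argument, here is a minimal sketch (my own, not from the paper) contrasting a black-box software instrument with one that exposes its internal signal chain; the class and signal names are hypothetical.

```python
# A black box: physical input in, sound out, nothing in between playable.
class BlackBoxInstrument:
    def play(self, velocity: float) -> float:
        return self._envelope(velocity) * self._oscillator()

    def _envelope(self, velocity: float) -> float:
        return min(1.0, velocity * 1.2)

    def _oscillator(self) -> float:
        return 0.5  # stand-in for an audio sample


# The same instrument with its scaffolding exposed: intermediate signals
# become attributes a performer (or external hardware) could tap and
# reroute, a little like reaching inside a piano.
class ScaffoldedInstrument(BlackBoxInstrument):
    def play(self, velocity: float) -> float:
        self.envelope_out = self._envelope(velocity)  # tappable point
        self.oscillator_out = self._oscillator()      # tappable point
        return self.envelope_out * self.oscillator_out


inst = ScaffoldedInstrument()
inst.play(0.8)
print(inst.envelope_out, inst.oscillator_out)
```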

An artistic highlight for me was David Ogborn’s networked live coding performance, in which he performed with a tabla player in a remote location. The player’s hands and drum were displayed on-screen via Skype; I think the audio feed was handled using JackTrip. David built up layers of rhythms and harmonies by typing code that was projected next to the Skype feed. The generated sounds played both with and against the tabla, which in turn responded to David’s electronic sounds. The result worked really well and was probably the most exciting live coding performance I’ve seen.

Show us your screens! Live coding performance complete with Skype feed and instructions for the audience.

Another artistic highlight for me was Soft Revolvers by Myriam Bleau, performed in the Monday night concert. The NIME for this was a set of four 10″ transparent acrylic “spinning tops” set out on a large table. When spun, the tops would glow and control various forms of audio playback. A camera feed of the table was projected onto a large screen, so the effect was a collection of spinning spirals drifting and colliding whilst broken, glitchy beats emerged, slowed, and spun up again. Spinning the tops appeared to involve significant physical effort, making for an energetic performance.
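I don’t know how Soft Revolvers is implemented internally, but the behaviour suggests a mapping along these lines; this Python sketch is purely hypothetical, with made-up names and numbers.

```python
# Hypothetical mapping from a sensor-equipped spinning top to loop
# playback. Nothing here comes from Soft Revolvers itself.

def top_to_playback(spin_hz: float, wobble: float) -> dict:
    """Map a top's rotation speed and wobble to loop-playback parameters."""
    rate = spin_hz / 5.0                    # nominal speed at 5 rotations/sec
    return {
        "playback_rate": max(0.1, rate),    # a slowing top drags the loop down
        "glitch_amount": min(1.0, wobble),  # wobble breaks the beat up
        "volume": min(1.0, rate),           # a dying spin fades the layer out
    }

for spin in (8.0, 5.0, 2.0, 0.5):           # a top gradually slowing down
    print(top_to_playback(spin_hz=spin, wobble=0.2))
```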

Soft Revolvers (long) from Myriam Bleau on Vimeo.


Published on: June 4th, 2015
