A few months ago I posted a video (at the end of the page) about one of my experiments, but I haven't written anything about it so far. Since summer in the UK isn't always sunny and warm, I'm now putting down everything I never wrote about.

In an effort to define the hardware and software framework I'm going to use for my future research (stay tuned to know more about it), I explored a wide range of gestural devices. One of the devices I most enjoyed was the Myo armband.
After a supervision meeting with my supervisor, Jamie Bullock, concerning a previous experiment on controlling an audio synthesis algorithm through EMG data, I started to explore possibilities for interaction within the virtual and mixed reality world.

One day, while I was testing the Integra Live – Myo combination, I crumpled up a piece of paper. Since I happened to be wearing the Myo at the time, I realised that the forearm's muscle activity is primarily responsible for performing such a gesture.
Because the Myo armband tracks those muscles well, I came up with the idea of simulating the sound of crumpling paper by performing the gesture empty-handed, just as one would with a sheet of paper between the hands.

To do so, I created an Integra Live project that included an Amplitude Modulation module driven by EMG data coming from the Myo. MyoMapper was used to send the EMG data to Integra Live over MIDI.
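As a rough illustration of that kind of mapping (not the actual MyoMapper code or configuration), here is a minimal Python sketch showing how a raw EMG amplitude could be scaled to a 7-bit MIDI control-change value and sent to a host such as Integra Live using the mido library. The port name, controller number, and EMG range are all assumptions for the example.

```python
import mido

# Hypothetical names: the virtual MIDI port and CC number are placeholders,
# not the real MyoMapper settings.
PORT_NAME = "IAC Driver Bus 1"   # assumed virtual MIDI port feeding the audio host
CC_NUMBER = 20                   # assumed controller number mapped to the AM depth

def emg_to_cc(emg_value, emg_max=127.0):
    """Scale a raw EMG amplitude (assumed range 0..emg_max) to a 0..127 MIDI CC value."""
    clamped = max(0.0, min(float(emg_value), emg_max))
    return int(round(clamped / emg_max * 127))

def send_emg(port, emg_value):
    """Send one EMG reading as a MIDI control-change message."""
    port.send(mido.Message('control_change', control=CC_NUMBER, value=emg_to_cc(emg_value)))

if __name__ == "__main__":
    with mido.open_output(PORT_NAME) as port:
        # A short, made-up burst of EMG values standing in for a crumpling gesture.
        for sample in [5, 30, 90, 120, 60, 10]:
            send_emg(port, sample)
```

In the actual setup this mapping happens inside MyoMapper; the sketch only shows the general idea of normalising muscle activity into a MIDI control stream that a modulation parameter can follow.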

After this experiment, I became aware of how important it is to choose the appropriate method for tracking a gesture, and to reproduce that gesture in the appropriate context. More importantly, these results confirm that everyday gestures can be used in applications where the real and virtual worlds coexist.

Published on: August 14th, 2015
