Live-blogging nime: mobilemuse: integral music control goes mobile

music sensors and emotion
Integral musical control involves both state and physical interaction. The performer interacts with other performers, with the audience and with the instrument. Sounds arise from emotional states. Performers normally interact by looking and listening, but these guys have added emotional state.
Audiences also communicate in the same way. This guy wants to measure the audience: temperature, heart rate, respiration, EEG and other things you can't really attach to an audience.
The measurements are sent to a pattern recognition system. The performer wears sensors.
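To give a flavour of what that pattern recognition step might involve, here's a minimal nearest-centroid sketch in Python. The features, centroid values and state labels are all my own invented placeholders for illustration, not anything from the talk:

```python
import math

# Hypothetical per-state centroids of (heart rate in bpm, skin temp in °C).
# A real system would learn these from labelled physiological data.
CENTROIDS = {
    "calm":    (65.0, 33.5),
    "aroused": (95.0, 31.0),
}

def classify(heart_rate, skin_temp):
    """Guess an emotional state by picking the nearest centroid
    (illustrative only, not the presenters' actual model)."""
    return min(CENTROIDS,
               key=lambda s: math.dist((heart_rate, skin_temp),
                                       CENTROIDS[s]))
```

A resting reading like `classify(60, 33.8)` lands on `"calm"`, while an elevated one like `classify(100, 30.5)` lands on `"aroused"`.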
He's showing a graph of emotional states where a performer's state and an audience member's state track almost exactly.
They actually do attach things to the audience. This turns out to be a pain in the arse. They now have a small sensor thing called a "fuzzball" which attaches to a mobile phone.
Despite me blogging this from my phone, I find it hugely problematic that this level of technology and economic privilege would be required to even go to a concert….
They monitor lie-detector sorts of things. The mobile phone demodulates the signals and can plot them. There is a huge mess of licence issues around connecting hardware to a phone, so they encode the sensor data into the audio input instead.
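Encoding sensor data into the audio input presumably means modulating each reading onto a tone that the phone then demodulates, something in the spirit of the sketch below. The frequencies, frame sizes and mapping are my own assumptions; the talk didn't give specifics:

```python
import math

SAMPLE_RATE = 8000  # Hz; assumed audio-in sample rate

def encode_value(value, f_low=1000.0, f_high=3000.0, n_samples=800):
    """Map a normalised sensor reading (0..1) onto a tone frequency
    and synthesise one frame of audio samples (hypothetical scheme)."""
    freq = f_low + value * (f_high - f_low)
    return [math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            for n in range(n_samples)]

def goertzel_power(samples, freq):
    """Signal power at `freq`, via the Goertzel algorithm."""
    coeff = 2 * math.cos(2 * math.pi * freq / SAMPLE_RATE)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def decode_value(samples, f_low=1000.0, f_high=3000.0, step=50.0):
    """Demodulate a frame by finding the strongest candidate tone,
    then map the frequency back to a 0..1 reading."""
    candidates = (f_low + i * step
                  for i in range(int((f_high - f_low) / step) + 1))
    best_f = max(candidates, key=lambda f: goertzel_power(samples, f))
    return (best_f - f_low) / (f_high - f_low)
```

Round-tripping a reading, e.g. `decode_value(encode_value(0.5))`, recovers the value to within the frequency-step resolution.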
They did a project where a movie's scenes and their order were set by the audience's state.
The application is open source.

Published by

Charles Céleste Hutchins

Supercolliding since 2003
