Dissertation Draft: BiLE, XYZ and gesture tracking

My colleague Shelly Knotts wrote a piece that uses the network features much more fully. Her piece, XYZ, uses gestural data and state-based OSC messages to create a game system in which players “fight” for control of sounds. Because BiLE does not follow the composer-programmer model of most LOrks, I ended up writing most of the non-sound-producing code for the SuperCollider implementation of Knotts’s piece. This involved tracking who is “fighting” for a particular value, picking a winner from among them, and specifying the OSC messages that would be used for the piece. I also created the GUI for SuperCollider users. All of this relied heavily on my BileTools classes, especially NetAPI and SharedResource.
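The winner-picking logic can be sketched briefly. The following is a minimal illustration in SuperCollider, not the code used in the piece: the /xyz/fight and /xyz/winner message names, the broadcast address and the random-choice rule are all invented for the example, and the real implementation routed its messages through NetAPI.

 (
 var contenders = Dictionary.new; // parameter name -> Set of players fighting for it
 var broadcast = NetAddr("192.168.0.255", 57120); // placeholder address; the piece used NetAPI

 // note who is currently fighting for a given parameter
 OSCFunc({ |msg|
  var param = msg[1], player = msg[2];
  var set = contenders[param] ?? { Set.new };
  set.add(player);
  contenders[param] = set;
 }, '/xyz/fight');

 // every two seconds, pick a winner for each contested parameter and announce it
 Routine({
  loop {
   contenders.keysValuesDo { |param, players|
    broadcast.sendMsg('/xyz/winner', param, players.asArray.choose);
   };
   contenders = Dictionary.new;
   2.wait;
  }
 }).play;
 )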

Her piece specifically stipulated the use of gestural controllers. Other players used Wiimotes and iPhones, but I was assigned the Microsoft Kinect. There is an existing OSC skeleton-tracking application, OSCeleton, but I was having problems with it segfaulting. Also, full-body tracking sends many more OSC messages than I needed and requires that a calibration pose be held while the software waits to recognise a user (http://tohmjudson.com/?p=30). This seemed like overkill, so I decided it would be best to write an application that tracks one hand only.

Microsoft had not released official drivers when I started this and, as far as I know, their only drivers so far are Windows-only (http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/about.aspx). I therefore had to use several third-party, open-source driver and middleware layers (http://tohmjudson.com/?p=30). I did most of my coding at the level of the PrimeSense NITE middleware (http://www.primesense.com/?p=515). NITE ships with a sample application called PointViewer that tracks one hand (http://tohmjudson.com/?p=30). I modified the programme’s source code so that, in addition to tracking the user’s hand, it sends an OSC message with the hand’s x, y and z coordinates. This allows a user to generate gestural data from hand position with a Kinect.
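Receiving these messages in SuperCollider takes only a few lines. A minimal sketch, using the /hand/x, /hand/y and /hand/z message names and the default port (57120) that the application actually sends to, both documented below:

 (
 ~handPos = [0, 0, 0]; // most recent x, y and z
 OSCFunc({ |msg| ~handPos[0] = msg[1] }, '/hand/x');
 OSCFunc({ |msg| ~handPos[1] = msg[1] }, '/hand/y');
 OSCFunc({ |msg| ~handPos[2] = msg[1]; ~handPos.postln }, '/hand/z');
 )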

In future versions of my Kinect application, I would like to create a “TouchlessOSC,” similar to the TouchOSC iPhone app, but with dual hand tracking. One hand would simply send its x, y and z coordinates, but the other would move within pre-defined regions to send its location within a particular square, move a slider, or “press” a button. This will require me to create a way for users to define shapes and actions, as well as to recognise gestures related to button pressing; a sketch of the region idea follows. I expect to release an alpha version around January 2012.
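To make the region idea concrete, here is a purely hypothetical SuperCollider sketch; none of this exists yet, and the region, the z threshold and the message name are all invented for the example. A button is a rectangle in normalised hand coordinates, and pushing the hand forward (a small z) inside it counts as a press:

 (
 ~button = Rect(0.1, 0.1, 0.2, 0.2); // hypothetical button region in normalised coordinates
 ~target = NetAddr("127.0.0.1", 57120); // hypothetical destination
 ~checkHand = { |x, y, z|
  // count the hand as "pressing" when it is inside the region and pushed forward
  if(~button.containsPoint(Point(x, y)) and: { z < 0.3 }) {
   ~target.sendMsg('/button/1', 1);
  };
 };
 )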

For the SuperCollider side of things, I wrote some classes, OSCHID and OscSlot (see attached), that mimic the Human Interface Device (HID) classes, but for HIDs that communicate via OSC through third-party applications such as the one I wrote. They also work with DarwiinOSC (http://code.google.com/p/darwiinosc/) and TouchOSC (http://hexler.net/software/touchosc) on the iPhone. As they have the same structure and methods as the regular HID classes, they should be relatively easy for programmers to pick up. The Wiimote subclass, WiiOSCClient (see attached), in particular, is drop-in compatible with the pre-existing SuperCollider Wiimote class, which, unfortunately, is currently not working.

All of my BiLE-related class libraries have been posted to SourceForge (http://sourceforge.net/projects/biletools/) and will be released as a SuperCollider quark. My Kinect code has been posted only to my blog (http://www.celesteh.com/blog/2011/05/23/xyz-with-kinec/), but I’ve gotten email indicating that at least a few people are using the programme.

Kinect and OSC Human Interface Devices

To make up for the boring title of this post, let’s start off with a video:

“XYZ with Kinect”, a video by celesteh on Flickr.

This is a sneak preview of the system I wrote to play XYZ by Shelly Knotts. Her score calls for every player to make a drone that’s controllable by the x, y, and z parameters of a gestural controller. For my controller, I’m using a Kinect.

I’m using a little C++ program based on OpenNI and NITE to find my hand position and then sending out OSC messages with those coordinates. I’ve written a class for OSCHIDs in SuperCollider which will automatically scale the values for me, based on the largest and smallest inputs it’s seen so far. In an actual performance, I would need to calibrate it by waving my arms around a bit before starting to play.
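The scaling idea itself is easy to sketch. This is not the actual OscSlot code, just an illustration of the approach: track the extremes seen so far and map each new value into 0..1 relative to them, which is why waving my arms around calibrates it:

 (
 ~min = inf;
 ~max = -inf;
 ~scale = { |raw|
  ~min = min(~min, raw); // widen the known range as new extremes arrive
  ~max = max(~max, raw);
  // map into 0..1 relative to the extremes seen so far
  if(~max > ~min) { raw.linlin(~min, ~max, 0, 1) } { 0.5 };
 };
 ~scale.value(0.3); // returns 0.3's position within the range seen so far
 )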

You can see that I’m selecting myself in a drop-down menu as I start using those x, y and z values. If this had been a real performance, other players’ names would have been there too, and there is a mechanism wherein we duel for control of each other’s sounds!

We’re doing a sneak preview of this piece on campus on Wednesday, which I’m not allowed to invite the public to (something about fire regulations), but the proper premiere will be at NIME in Oslo, on Tuesday 31st May at 9.00pm at Chateau Neuf (street address: Slemdalsveien 15). More information about the performance is available via BiLE’s blog.

The SuperCollider Code

I’ve blogged about this earlier, but have since updated WiiOSCClient.sc to be more immediately useful to people working with TouchOSC or OSCeleton or other weird OSC devices. I’ve also generated several helpfiles!
OSCHID allows one to describe single OSC devices and define “slots” for them.
Those are called OscSlots and are meant to be quite a lot like GeneralHIDSlots, except that OSCHIDs and their slots do not call actions while they are calibrating.
The OSC WiiMote class that uses DarWiinRemote OSC is still called WiiOSCClient and, as far as I recall, has not changed its API since I last posted.
Note that, except for people using smart devices like iPhones or whatever, OSC HIDs require helper apps to actually talk to the WiiMote or the Kinect. Speaking of which…

The Kinect Code

Compiling / Installing

This code is, frankly, a complete mess and should be considered pre-alpha. I’m only sharing it because I’m hoping somebody knows how to add support for changing the tilt, or how to package this as a proper Mac application. And because I like to share. As far as I know, this code should be cross-platform, but I make no promises at all.
First, there are dependencies. You have to install a lot of crap: SensorKinect, OpenNI and NITE. Installation instructions are in the tutorial linked above (http://tohmjudson.com/?p=30).
Then you need to install the OSC library. Everybody normally uses oscpack because it’s easy and stuff… except it was segfaulting for me, so bugger that. Go install libOSC++ instead.
Ok, now you can download my source code: OscHand.zip. (Isn’t that a clever name? Anyway…) Go to your NITE folder and look for a subfolder called Samples. Put OscHand into that folder. Then go to the terminal, cd into the directory, and type make. God willing and the floodwaters don’t rise, it should compile and put an executable into the ../Bin directory.
You need to invoke the program from the terminal, so cd over to Bin and type ./OscHand, and it should work.

Using

This program needs an XML file, which lurks a few directories below in ../../Data/Sample-Tracking.xml. If you leave everything where it is in Bin, you don’t need to specify anything, but if you want to move stuff around, you need to provide the path to this XML file as the first command-line argument.
The program generates OSC messages named /hand/x, /hand/y and /hand/z, each followed by a single floating-point number. It does not bundle them together, because I couldn’t get oscpack to work, so this is what it is. By default, it sends these to port 57120, because that is the port I most want to use (it’s SuperCollider’s default language port). Theoretically, if you give it -p followed by a number as the second and third arguments, it will send to whatever port you want. Because I have not made this as lovely as possible, you MUST specify the XML file path before you specify the port number. (As this is an easy fix, it’s high on my todo list, but it’s not happening this week.)
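If you want to test your patch without a Kinect plugged in, you can fake the program’s output from SuperCollider itself. A quick sketch (the circular hand path is obviously made up):

 (
 var target = NetAddr("127.0.0.1", 57120);
 Routine({
  200.do { |i|
   var t = i / 200;
   // trace a circle in x and y while z ramps up
   target.sendMsg('/hand/x', sin(t * 2pi));
   target.sendMsg('/hand/y', cos(t * 2pi));
   target.sendMsg('/hand/z', t);
   0.05.wait;
  }
 }).play;
 )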
There are some keyboard options available in the program’s window while it is running. Typing s turns smoothing on or off. Unless you’re doing very small gestures, you probably want smoothing on.
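If you’d rather control smoothing yourself, you can also do it on the SuperCollider side. A minimal sketch of one-pole smoothing on the incoming x values; the coefficient is an arbitrary choice:

 (
 ~smoothX = 0;
 ~alpha = 0.2; // arbitrary smoothing coefficient; closer to 0 smooths more heavily
 OSCFunc({ |msg|
  ~smoothX = (~alpha * msg[1]) + ((1 - ~alpha) * ~smoothX);
 }, '/hand/x');
 )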
If you want to adjust the tilt, you’re SOL, as I have been unable to solve this problem. If you also download libfreenect, you can write a little program to aim the thing, which you will then have to quit before you can use this program. Which is just awesome. There are some Processing sketches which can also be used for aiming.
You should be able to figure out how to use this in SuperCollider with the classes above, but here’s a wee bit of example code to get you started:

 k = OSCHID.new.spec_((
  ax: OscSlot(\relative, '/hand/x'),
  ay: OscSlot(\relative, '/hand/y'),
  az: OscSlot(\relative, '/hand/z')
 ));

 // wave your arms a bit to calibrate, then turn calibration off

 k.calibrate = false;

 // post the scaled x coordinate whenever it arrives
 k.setAction(\ax, { |val| val.value.postln });

And one more teaser

You can see the GUIs of a few other BiLE Tools in the video at the top, including the Chat client and a shared stopwatch. There’s also a network API. I’m going to do a big code release in the fall, so stay tuned.