NexusUI: simplified expressive mobile development

This is a distributed performance system for the web. It started out focused on the server, but changed to help with user interface development tools. Anything that uses a browser can use it, but they're mostly interested in mobile devices.
They started with things like knobs and sliders, and now offer widgets of various sorts. This is slightly gimmicky, but ok.
NexusUI.js allows you to access the interface. The example is very short and has some toys on it.
They're being very hand-wavy about how and where audio happens. (They say this runs on a refrigerator with a browser in it, though tilt probably isn't supported in that case.)
Audio! You can use Web Audio if you love JavaScript. You can use AJAX to send data to servers, or to Node.js, Rails, whatever. You can also send to libPD on iOS. nx.sendTo('node') targets node.js.
They are showing a slide of how to get OSC data from the UI object.
This is a great competitor to TouchOSC, as far as I can tell from this paper.
However, Nexus is a platform. There is a template for building new interfaces. It’s got nifty core features.
They are showing a demo of a video game for iPhone that uses libPD.

Now they are testifying as to ease of use. They have made a bunch of Max tutorials for each Nexus object, plus tutorials on how to set up a local server. Their nexusDrop UI builder makes this very competitive with TouchOSC, but more generally useful. It comes with an included server or something.
NexusUP is a Max thingee that will automagically build a NexusUI based on your pre-existing Max patch. (Whoah.)
Free and open source software
Building a bunch of tools for their mobile phone orchestra.

Tactile overlays

Laser cut a thingee in the shape of your UI, put it on your iPad, and you get a tactile sense of the interface.

Questions

Can they show this at the Friday hackathon?
Yes

Kinect and OSC Human Interface Devices

To make up for the boring title of this post, let's start off with a video:

XYZ with Kinect, a video by celesteh on Flickr.

This is a sneak preview of the system I wrote to play XYZ by Shelly Knotts. Her score calls for every player to make a drone that's controllable by x, y, and z parameters of a gestural controller. For my controller, I'm using a Kinect.

I'm using a little C++ program based on OpenNI and NITE to find my hand position and then sending out OSC messages with those coordinates. I've written a class for OSCHIDs in SuperCollider, which will automatically scale the values for me, based on the largest and smallest inputs it's seen so far. In an actual performance, I would need to calibrate it by waving my arms around a bit before starting to play.

You can see that I'm selecting myself in a drop-down menu as I start using those x, y and z values. If this had been a real performance, other players' names would have been there too, and there is a mechanism wherein we duel for control of each other's sounds!

We're doing a sneak preview of this piece on campus on Wednesday, which I'm not allowed to invite the public to (something about fire regulations), but the proper premiere will be at NIME in Oslo, on Tuesday 31st May @ 9.00pm at Chateau Neuf (street address: Slemdalsveien 15). More information about the performance is available via BiLE's blog.

The SuperCollider Code

I've blogged about this earlier, but have since updated WiiOSCClient.sc to be more immediately useful to people working with TouchOSC or OSCeleton or other weird OSC devices. I've also generated several helpfiles!
OSCHID allows one to describe single OSC devices and define “slots” for them.
Those are called OscSlots and are meant to be quite a lot like GeneralHIDSlots, except that OSCHIDs and their slots do not call actions while they are calibrating.
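To make the auto-scaling idea concrete, here's a minimal sketch of that calibration logic in plain SuperCollider. This is just the idea, not the actual OscSlot internals: remember the smallest and largest raw values seen so far, and scale each incoming value into 0..1.

 // sketch of auto-calibration (not the real OscSlot code):
 // remember the extremes seen so far, scale raw input to 0..1
 (
 var lo = inf, hi = -inf;
 ~autoScale = { |raw|
  lo = lo.min(raw);
  hi = hi.max(raw);
  if(hi > lo) { (raw - lo) / (hi - lo) } { 0 };
 };
 )

 ~autoScale.(0.2); // early values just stretch the known range
 ~autoScale.(0.8); // -> 1.0, the largest value seen so far
 ~autoScale.(0.5); // -> 0.5, scaled between the extremes seen

This is why you wave your arms around before playing: until the slot has seen the extremes of your gestures, the scaled output isn't meaningful, which is also why the slots don't call actions while calibrating.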
The OSC WiiMote class that uses DarWiinRemote OSC is still called WiiOSCClient and, as far as I recall, has not changed its API since I last posted.
Note that, except for people using smart devices like iPhones or whatever, OSC HIDs require helper apps to actually talk to the WiiMote or the Kinect. Speaking of which…

The Kinect Code

Compiling / Installing

This code is, frankly, a complete mess and should be considered pre-alpha. I'm only sharing it because I'm hoping somebody knows how to add support for changing the tilt, or how to package this as a proper Mac application. And because I like to share. As far as I know, this code should be cross-platform, but I make no promises at all.
First, there are dependencies. You have to install a lot of crap: SensorKinect, OpenNI and NITE. Find instructions here or here.
Then you need to install an OSC library. Everybody normally uses oscpack because it's easy and stuff… except it was segfaulting for me, so bugger that. Go install libOSC++.
Ok, now you can download my source code: OscHand.zip. (Isn't that a clever name? Anyway…) Go to your NITE folder and look for a subfolder called Samples. You need to put this into that folder. Then go to the terminal, cd into the directory and type make. God willing and the floodwaters don't rise, it should compile and put an executable into the ../Bin directory.
You need to invoke the program from the terminal, so cd over to Bin and type ./OscHand and it should work.

Using

This program needs an XML file which is lurking a few directories below in ../../Data/Sample-Tracking.xml. If you leave everything where it is in Bin, you don’t need to specify anything, but if you want to move stuff around, you need to provide the path to this XML file as the first argument on the command line.
The program generates three OSC messages: /hand/x, /hand/y and /hand/z, each followed by a single floating point number. It does not bundle them together because I couldn't get oscpack to work, so this is what it is. By default, it sends these to port 57120, because that is the port I most want to use. Theoretically, if you give it -p followed by a number as the second and third arguments, it will send to the port you want. Because I have not made this as lovely as possible, you MUST specify the XML file path before you specify the port number, e.g. ./OscHand ../../Data/Sample-Tracking.xml -p 9999. (As this is an easy fix, it's high on my todo list, but it's not happening this week.)
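Before wiring up any of my classes, you can sanity-check that the messages are arriving at all with the standard OSCdef class. A minimal sketch, assuming sclang is listening on its usual port, 57120, which is where the program sends by default:

 // post each incoming hand coordinate as it arrives
 (
 OSCdef(\handX, { |msg| ("x:" + msg[1]).postln }, '/hand/x');
 OSCdef(\handY, { |msg| ("y:" + msg[1]).postln }, '/hand/y');
 OSCdef(\handZ, { |msg| ("z:" + msg[1]).postln }, '/hand/z');
 )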
There are some keyboard options you can do in the window while the program is running. Typing s turns smoothing on or off. Unless you’re doing very small gestures, you probably want smoothing on.
If you want to adjust the tilt, you’re SOL, as I have been unable to solve this problem. If you also download libfreenect, you can write a little program to aim the thing, which you will then have to quit before you can use this program. Which is just awesome. There are some Processing sketches which can also be used for aiming.
You should be able to figure out how to use this in SuperCollider with the classes above, but here’s a wee bit of example code to get you started:




 // describe the Kinect helper app as an OSC HID with three slots
 k = OSCHID.new.spec_((
  ax: OscSlot(\relative, '/hand/x'),
  ay: OscSlot(\relative, '/hand/y'),
  az: OscSlot(\relative, '/hand/z')
  ));

 // wave your arms a bit to calibrate, then turn calibration off
 k.calibrate = false;

 // post the scaled x values as they come in
 k.setAction(\ax, { |val| val.value.postln });
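And since the point of all this is a drone controlled by x, y and z, here's a hedged sketch of how those calibrated values might drive a synth. The SynthDef and the parameter mappings are my own invention for illustration, not the actual sound of Shelly's piece:

 // a hypothetical drone synth, just for illustration
 (
 SynthDef(\drone, { |freq = 100, amp = 0.1, cutoff = 800|
  var sig = Saw.ar(freq) + Saw.ar(freq * 1.01);
  Out.ar(0, RLPF.ar(sig * amp, cutoff, 0.3).dup);
 }).add;
 )

 x = Synth(\drone);

 // map the calibrated 0..1 slot values onto musical ranges
 k.setAction(\ax, { |val| x.set(\freq, val.value.linexp(0, 1, 50, 400)) });
 k.setAction(\ay, { |val| x.set(\amp, val.value * 0.3) });
 k.setAction(\az, { |val| x.set(\cutoff, val.value.linexp(0, 1, 200, 4000)) });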

And one more teaser

You can see the GUIs of a few other BiLE Tools in the video at the top, including the Chat client and a shared stopwatch. There’s also a network API. I’m going to do a big code release in the fall, so stay tuned.