Compiling SuperCollider on Ubuntu Studio 13.04 beta 2 (and otherwise setting up audio stuff)

The list of required libraries has changed somewhat from different versions. This is what I did:

sudo apt-get install git cmake libsndfile1-dev libfftw3-dev  build-essential  libqt4-dev libqtwebkit-dev libasound2-dev libavahi-client-dev libicu-dev libreadline6-dev libxt-dev pkg-config subversion libcwiid1 libjack-jackd2-dev emacs gnome-alsamixer  libbluetooth-dev libcwiid-dev netatalk

git clone --recursive

cd supercollider

mkdir build

cd build

cmake ..

make


If all that worked, then you should install it:

sudo make install


If it starts, you’re all good!
Users may note that this version of Ubuntu Studio can compile in Supernova support, so that’s very exciting.
I’ve gone to a beta version of Ubuntu Studio because Jack was giving me a bit of trouble on my previous install, so we’ll see if this sorts it out.
Note in the apt-get part that emacs is extremely optional and netatalk allows me to mount Apple Mac file systems that are shared via AppleTalk, something I need to do with my laptop ensemble, but which not everyone will need. Gnome-alsamixer is also optional and is a GNOME app. It's a GUI mixer application which lets you set levels on your sound card. Mine was sending the ins straight to the outs, which is not what I wanted, so I could fix it this way or by writing and running a script. Being lazy, I thought the GUI would be a bit easier. There's also a command line application called alsamixer, if you like that retro '80s computing feeling.
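If you'd rather take the script route, something along these lines can be run from within SuperCollider. The control names below are placeholders only – run `amixer scontrols` in a terminal to see what your particular sound card calls them:

```supercollider
// Sketch of the scripted mixer fix, run from sclang.
// 'Line' and 'Master' are hypothetical control names; substitute
// whatever `amixer scontrols` reports for your card.
"amixer sset 'Line' mute".unixCmd;         // stop the input feeding straight to the output
"amixer sset 'Master' 80% unmute".unixCmd; // and set a sane output level
```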
It can also be handy to sometimes kill pulse audio without it respawning over and over. Fortunately, it’s possible to do this:

sudo gedit /etc/pulse/client.conf

Add in these two lines:

autospawn = no
daemon-binary = /bin/true

I still want pulse to start by default when I log in, though, so I've set it to start automatically. I found the application called Startup Applications and clicked Add. For the name, I put pulseaudio. For the command, I put:

pulseaudio --start

Then I clicked the add button on that dialog screen and it’s added. When I want to kill pulseaudio, I will open a terminal and type:

pulseaudio --kill

and when I want it back again, I’ll type:

pulseaudio --start

(I have not yet had a killing and a restarting throw any of my applications into silent confusion, but I’m sure it will happen at some point.)
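If you're already in SuperCollider when JACK needs the sound card, the same commands can be issued from sclang rather than switching to a terminal:

```supercollider
// Kill and restart PulseAudio from within SuperCollider.
// (This assumes the autospawn = no tweak above, so pulse stays dead until asked back.)
"pulseaudio --kill".unixCmd;

// ... start JACK, do your SuperCollider work ...

"pulseaudio --start".unixCmd;
```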


It’s time for everybody’s favourite collaborative real time network live coding tool for SuperCollider.
Invented by PowerBooks UnPlugged – granular synthesis playing across a bunch of unplugged laptops.
Then some of them started Republic111, which is named after the number of the room where they taught the workshop.
Code reading is interesting in network music partly because of stealing, but also to understand somebody else’s code quickly, or to actively understand it by changing it. Live coding is a public or collective thinking action.
If you evaluate code, it shows up in a history file, and gets sent to everybody else in the Republic. You can stop the sound of everybody on the network. All the SynthDefs are saved. People play ‘really equally’ on everybody’s computer. Users don’t feel obligated to act, but rather to respond. Participants spend most of their time listening.
Republic is mainly one big class, which is a weakness; it should be broken up into smaller classes that can be used separately. Scott Wilson is working on a newer version, which is on GitHub. Look up ‘The Way Things May Go’ on Vimeo.
Graham and Jonas have done a system which allows you to see a map of who is emitting what sound and you can click on it and get the Tdef that made it.
Scott is putting out a call for participation and discussion about how it should be.

Benjamin Graf – mblght

Lighting guys sit behind lighting desks and hit buttons for the duration of concerts, so lights in shows are actually usually boring, despite having valuable and variable equipment.
Wouldn’t it be great if you could do stochastic lights with envelope controls?
SuperCollider does solid timing and has support for different methods of dispersing stuff and has flexible signal routing.
He’s got an object that holds descriptions of the capabilities of any lighting fixture – moving, colour, on, off, etc.
He uses events in the pattern system as one way of changing stuff.
He’s added light support to the Server, so you can do SinOsc control of light changes, sending to control busses. He’s also made light UGens.
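I don’t have the mblght code to hand, but the idea of driving a light parameter from a control-rate UGen might look roughly like this. Everything here is illustrative – the actual class names and fixture interface in mblght will differ:

```supercollider
// Sketch only: a control bus carrying a slow sine, which a hypothetical
// light fixture object could read as a dimmer level.
(
~lightBus = Bus.control(s, 1);
s.waitForBoot({
	SynthDef(\lightLFO, {|bus, rate = 0.25|
		// a slow sine scaled into 0..1, standing in for a dimmer level, rate, 0, 0.5, 0.5));
	}).add;
	s.sync;
	Synth(\lightLFO, [\bus, ~lightBus]);
	// stand-in for mblght's fixture update: poll the bus and post the level
	Routine({
		loop {
			~lightBus.get({|val| ("dimmer level:" + val.round(0.01)).postln });
			0.5.wait;
		}
});
)
```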
He ended up live coding the lights for a festival.


What about machine listening? It would be easy to do in this system.
The code is on github.

Fun with Cellular Automata

The SuperCollider code is stolen from redFrik’s blog post about cellular automata. All I have added is the SynthDef, one line at the top to set sound variables and one line in the drawing function to make new synths based on which blocks are coloured in.
Ergo, I haven’t really got anything to add, but this is still kind of fun and Star-Trek-ish.


SynthDef(\sinegrain2, {arg pan = 0, freq, amp, grainDur; var grain, env;

 env =*2, amp), doneAction: 2); // have some overlap
 grain =, 0, env);,, pan))}).add;


//game of life /redFrik
// add sound-related variables
var grainDur = 1/20, lowfreq = 200, hifreq = 800, lowamp = 0.005, hiamp = 0.085, rows = 50, cols = 50;
        var envir, copy, neighbours, preset, rule, wrap, i;
        var w, u, width= 200, height= 200, cellWidth, cellHeight;
        w= Window("ca - 2 pen", Rect(128, 64, width, height), false);
        u= UserView(w, Rect(0, 0, width, height));
        u.background= Color.white;
        cellWidth= width/cols;
        cellHeight= height/rows;
        wrap= true;                     //if borderless envir
        /*-- select rule here --*/
        //rule= #[[], [3]];
        //rule= #[[5, 6, 7, 8], [3, 5, 6, 7, 8]];
        //rule= #[[], [2]];                                             //rule "/2" seeds
        //rule= #[[], [2, 3, 4]];
        //rule= #[[1, 2, 3, 4, 5], [3]];
        //rule= #[[1, 2, 5], [3, 6]];
        //rule= #[[1, 3, 5, 7], [1, 3, 5, 7]];
        //rule= #[[1, 3, 5, 8], [3, 5, 7]];
        rule= #[[2, 3], [3]];                                           //rule "23/3" conway's life
        //rule= #[[2, 3], [3, 6]];                                      //rule "23/36" highlife
        //rule= #[[2, 3, 5, 6, 7, 8], [3, 6, 7, 8]];
        //rule= #[[2, 3, 5, 6, 7, 8], [3, 7, 8]];
        //rule= #[[2, 3, 8], [3, 5, 7]];
        //rule= #[[2, 4, 5], [3]];
        //rule= #[[2, 4, 5], [3, 6, 8]];
        //rule= #[[3, 4], [3, 4]];
        //rule= #[[3, 4, 6, 7, 8], [3, 6, 7, 8]];               //rule "34578/3678" day&night
        //rule= #[[4, 5, 6, 7], [3, 5, 6, 7, 8]];
        //rule= #[[4, 5, 6], [3, 5, 6, 7, 8]];
        //rule= #[[4, 5, 6, 7, 8], [3]];
        //rule= #[[5], [3, 4, 6]];
        neighbours= #[[-1, -1], [0, -1], [1, -1], [-1, 0], [1, 0], [-1, 1], [0, 1], [1, 1]];
        envir= Array2D(rows, cols);
        copy= Array2D(rows, cols);{|x|{|y| envir.put(x, y, 0)}};
        /*-- select preset here --*/
        //preset= #[[0, 0], [1, 0], [0, 1], [1, 1]]+(cols/2); //block
        //preset= #[[0, 0], [1, 0], [2, 0]]+(cols/2); //blinker
        //preset= #[[0, 0], [1, 0], [2, 0], [1, 1], [2, 1], [3, 1]]+(cols/2); //toad
        //preset= #[[1, 0], [0, 1], [0, 2], [1, 2], [2, 2]]+(cols/2); //glider
        //preset= #[[0, 0], [1, 0], [2, 0], [3, 0], [0, 1], [4, 1], [0, 2], [1, 3], [4, 3]]+(cols/2); //lwss
        //preset= #[[1, 0], [5, 0], [6, 0], [7, 0], [0, 1], [1, 1], [6, 2]]+(cols/2); //diehard
        //preset= #[[0, 0], [1, 0], [4, 0], [5, 0], [6, 0], [3, 1], [1, 2]]+(cols/2); //acorn
        preset= #[[12, 0], [13, 0], [11, 1], [15, 1], [10, 2], [16, 2], [24, 2], [0, 3], [1, 3], [10, 3], [14, 3], [16, 3], [17, 3], [22, 3], [24, 3], [0, 4], [1, 4], [10, 4], [16, 4], [20, 4], [21, 4], [11, 5], [15, 5], [20, 5], [21, 5], [34, 5], [35, 5], [12, 6], [13, 6], [20, 6], [21, 6], [34, 6], [35, 6], [22, 7], [24, 7], [24, 8]]+(cols/4); //gosper glider gun
        //preset= #[[0, 0], [2, 0], [2, 1], [4, 2], [4, 3], [6, 3], [4, 4], [6, 4], [7, 4], [6, 5]]+(cols/2); //infinite1
        //preset= #[[0, 0], [2, 0], [4, 0], [1, 1], [2, 1], [4, 1], [3, 2], [4, 2], [0, 3], [0, 4], [1, 4], [2, 4], [4, 4]]+(cols/2); //infinite2
        //preset= #[[0, 0], [1, 0], [2, 0], [3, 0], [4, 0], [5, 0], [6, 0], [7, 0], [9, 0], [10, 0], [11, 0], [12, 0], [13, 0], [17, 0], [18, 0], [19, 0], [26, 0], [27, 0], [28, 0], [29, 0], [30, 0], [31, 0], [32, 0], [34, 0], [35, 0], [36, 0], [37, 0], [38, 0]]+(cols/4); //infinite3
        //preset= Array.fill(cols*rows, {[cols.rand, rows.rand]}); //random{|point| envir.put(point[0], point[1], 1)};
        i= 0;
        u.drawFunc= {
                i= i+1;
                Pen.fillColor= Color.grey(0.5);
      {|x|
              {|y|
                                if(, y)==1, {
                                        Pen.addRect(Rect(x*cellWidth, height-(y*cellHeight), cellWidth, cellHeight));
                                        // the new line
                                        Synth(\sinegrain2, [\freq, x.linexp(0, cols, lowfreq, hifreq), \amp, y.linexp(0, rows, lowamp, hiamp), \pan, 0, \grainDur, grainDur]);
                                });
                        };
                };
                Pen.fill;
      {|x|
              {|y|
                                var sum= 0;
                      {|point|
                                        var nX= x+point[0];
                                        var nY= y+point[1];
                                        if(wrap, {
                                                sum= sum+, nY%rows); //no borders
                                        }, {
                                                if((nX>=0)&&(nY>=0)&&(nX<cols)&&(nY<rows), {sum= sum+, nY)}); //borders
                                        });
                                };
                                if(rule[1].includes(sum), {     //borne
                                        copy.put(x, y, 1);
                                }, {
                                        if(rule[0].includes(sum), {     //lives on
                                                copy.put(x, y,, y));
                                        }, {    //dies
                                                copy.put(x, y, 0);
                                        });
                                });
                        };
                };
                envir= copy.deepCopy;
        };
        w.front;
        Routine({while({w.isClosed.not}, {u.refresh; i.postln; (1/20).wait})}).play(AppClock);
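One thing worth remembering before evaluating the block above: the SynthDef only makes sound if the server is running, so boot it first.

```supercollider
// Boot the server before running the game of life code above,
// otherwise the Synth calls in the draw function will fail silently.
s.waitForBoot({ "server ready - now evaluate the CA code".postln });
```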

Recording Audio and Video from SuperCollider on Ubuntu Studio

Recently, I needed to record my screen and audio while SuperCollidering on Ubuntu Studio. (This will work on other operating systems too, with some tweaks.) This is the code I included:


("ffmpeg -f jack -ac 2 -i ffmpeg -f x11grab -r 30 -s $(xwininfo -root | grep 'geometry' | awk '{print $2;}') -i :0.0 -acodec pcm_s16le -vcodec libx264 -vpre lossless_ultrafast -threads 0" +
    "/home/celesteh/Documents/" ++ Date.getDate.bootSeconds ++ ".mkv").runInTerminal;



As far as I am able to determine, that allows you to record stereo audio from jack and the screen content of your primary screen. Obviously the syntax is slightly dense.
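To unpack the dense parts, here is the same command built up option by option in sclang, with each piece commented. The screen size and output path here are hard-coded examples rather than the xwininfo lookup used above:

```supercollider
// Commented version of the capture command; sizes and paths are examples.
(
var cmd = "ffmpeg"
	+ "-f jack -ac 2 -i ffmpeg"  // 2-channel JACK input; the JACK client will be named 'ffmpeg'
	+ "-f x11grab -r 30"         // grab the X11 screen at 30 frames per second
	+ "-s 1024x768 -i :0.0"      // ...at this size, from display 0, screen 0
	+ "-acodec pcm_s16le"        // uncompressed 16-bit audio
	+ "-vcodec libx264 -vpre lossless_ultrafast" // fast, lossless h.264 video
	+ "-threads 0"               // let ffmpeg pick the thread count
	+ "~/test-capture.mkv";      // output file
cmd.runInTerminal;
)
```

In sclang, `+` between strings concatenates with a space, which is why each option can live on its own commented line.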

I was unable to figure out how to tell it to automatically get the jack output I wanted, so in order to run this, I first started JACK Control, used that to start jack, and then evaluated my SC code, which included the above line. That opened a terminal window.

Then I went back to JACK Control and clicked the Connect button to open a window with a list of connections. I took the supercollider output connections and dragged them to the ffmpeg input connections, which enabled the sound. This step is very important, as otherwise the recording will be silent. The dragging is illustrated in the accompanying image by the thick, red line.

Then AFTER making the audio connection, I started making sound with SuperCollider, which was then recorded. When I finished, I went to the terminal window opened by supercollider and typed control-c to stop the recording. Then, I quit the SC server and JACK.

After a few moments, the ffmpeg process finished quitting. I could then watch the video and hear sound via VLC (on my system, VLC does not play audio if jack is running, so check that if your film is silent). I used OpenShot to cut off the beginning part, which included a recording of me doing the jack connections.

The recommended way of recording desktop output on ubuntu studio is recordmydesktop, but there are many advantages to doing it with ffmpeg instead. There is a bug in recordmydesktop which means it won’t accept jack input, so you have to record the video and audio separately and splice them together. (If you chose to do this, note you need to quit jack before recordmydesktop will generate your output file). Also, I found that recordmydesktop took up a lot of processing power and the recording was too glitchy to actually use. The command line for it is a LOT easier, though, so pick your preference.

You need not include the command to start ffmpeg in your SuperCollider code. Instead of using the runInTerminal message, you can just run it in a terminal. As above, make sure you start JACK before you start this code and don’t forget to make the connections between it and ffmpeg. I prefer to put it in the code because then I don’t forget to run it, but this is a matter entirely for personal preference.


Almost all of the programs linked above are cross-platform, so this is very likely to work on Mac or Windows with only a few changes. Mac users have two ways to proceed. One is to use JACK with QuickTime: you can use JACK to route audio on your system so the SuperCollider output goes to the QuickTime input. Or, if you’re on Windows or just want to use ffmpeg, you will need to change the command line so that it gets the screen geometry in a different way and so that it captures from a different device. Check the man pages for ffmpeg and for avconv, which comes with it. Users on the channel on freenode can also help. Leave a comment if you’ve figured it out.

Compiling SuperCollider on Ubuntu Studio 12.04

For various annoying reasons, I’ve just reformatted my hard drive and freshly installed the latest Ubuntu Studio LTS. As soon as I’d restored my home directory, I set to work compiling SuperCollider.
The README_Linux.txt file covers most of what you need to know, but here’s what I had to type to get going:

sudo apt-get install git cmake libsndfile1-dev libfftw3-dev  build-essential  libqt4-dev libqtwebkit-dev libasound2-dev libavahi-client-dev libicu-dev libreadline6-dev libxt-dev pkg-config subversion libcwiid1 libjack-jackd2-dev emacs

cd ~/Documents

git clone --recursive

cd supercollider

mkdir build

cd build

cmake -DSUPERNOVA=OFF ..

make

You have to disable supernova because it requires a newer version of gcc. If all that works, then you can install it:

sudo make install

I had something go wrong with my installed SC libs, but that’s a side issue.

Granular Performance Thingee

I’ve been working on a thing where I draw shapes and they turn into granular synthesis clouds. I have some ideas for what I want it to do, but one thing I’d really like to do is be able to record video of me playing it, via recording my desktop – with audio. Indeed, I need to sort this out very shortly, as I’d like to submit a proposal for the SuperCollider Symposium. (If anybody has advice on how to record this on Ubuntu, please do leave a comment.)

As it was, all I’ve got is a screen shot of post-drawing and an audio file of what it sounded like whilst playing.

Although this has quite a few bugs, some people have expressed some interest in the system, so I’m making the source code available. It’s written in SuperCollider. This is nowhere near a release version and the code is ugly, so be forewarned, but it mostly works. It relies on the Conductor quark by Ron Kuivila. There are two files. Put in your Extensions folder. Then open up test.scd in the lovely new SC 3.6 IDE (it will probably work in earlier versions too). Select the whole file and evaluate it.
One of the windows that opens is a controller with a play and a stop button. One is a large black window. And one is a bunch of blank buttons. I have no idea why they don’t have labels. Draw in the black window using click and drag (or a stylus if you’re lucky). Press play in the controller window. When the cursor gets to the cloud, you can hopefully hear it playing. There are some sliders in that window to change the speed of the cursor. You can do this while it is playing. You can also draw new shapes while it is playing. The blank buttons have sound parameters attached to them, so if you press one, the next cloud you draw will have that button’s parameters. (The buttons are supposed to say things like ‘Short Sine Grains’ or ‘Long Saw Grains’.) If you want to modify a cloud, you can right-click on it to get a popup window that changes some of the sound settings.
You can’t save your work, but the server window has a record button on it. If you press that, then play your drawing, then hit the ‘stop recording’ button in the server window, you’ll be able to record the audio in real time.
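The same recording can also be scripted instead of using the server window buttons, which is handy if you want recording to start along with the piece:

```supercollider
// Code equivalent of the server window's record button.
s.record;          // starts writing to a file in the default recordings directory
// ... draw and play ...
s.stopRecording;   // finalises the file; its path is posted to the post window
```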
I had the idea that I could draw on this with my stylus while on stage in real time, but when I plugged my computer into the projector, it changed my screen parameters and my stylus calibration failed. I’m sure there is a work around, which, ideally, I’d like to find by next week.

LiveBlogging: Modality – modal control in SuperCollider

by many people

Modality is a loose collaboration to make a toolkit to hook up controllers to SC.  Does mapping, including some complex stuff and some on-the-fly stuff.

Marije spoke a bit of how they began collaborating

Concept – support many devices over many protocols. Make a common interface. Easily remap.


They currently support MIDI and HID. The common interface is MKtl, which provides a system to process the data. They have templates for common ways of processing. Same interface for MKtl and MDispatch. (They may move to FRP – I don’t know what that is.)

Ktl quark is out of date.

(I think I might be interested in contributing to this project – or at least provide templates for stuff)

Different protocols have different transport mechanisms. Things vary by OS. Different controllers have different semantics.

A general solution is not trivial.

Scaling is different on different OSes. Names of devices may have variations. MIDI has some device name issues. Real MIDI devices (non-USB) will not report their names, but use MIDI ports. Similar issues will arise with OSC or SerialPort.

The device description index is an identity dictionary. It’s got some NanoKontrol stuff in it. I am definitely interested in this…

They’ve got some templates, but it’s still a bit vapourware.

For every button or input on your device, they define what it is, where it is, etc.  This is good stuff.  You can also set the I/O type.

Device descriptions have names, specifications, platform differences, hierarchical naming (for use in pattern-matching). You can programmatically fill in the description
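I don’t have the Modality spec in front of me, so this is only a guess at the shape of such a description – an IdentityDictionary mapping element names to their type, position and I/O information:

```supercollider
// Illustrative only - not Modality's actual description format.
(
~myFaderBoxDesc = IdentityDictionary[
	\sl1 -> (type: \slider, ioType: \in, midiMsgType: \cc, midiNum: 0, spec: \midiCC, position: 0@0),
	\sl2 -> (type: \slider, ioType: \in, midiMsgType: \cc, midiNum: 1, spec: \midiCC, position: 1@0),
	\bt1 -> (type: \button, ioType: \in, midiMsgType: \cc, midiNum: 32, spec: \midiBut, position: 0@1)
];
)
```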

nanoKontrol, Gamepad, DanceMat, a bunch of things.

Events and signals

Functional reactive processing: events, data flow, change propagation. FRP – functional reactive programming.

These are functions without side effects until you get to the output phase.

In the FP Quark – functional programming Quark.

Events are encoded in an event stream. An event source with a do method adds a side effect. When something happens (is “fired”), do the do. Only event sources can be fired.
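I haven’t checked the FP quark’s actual class names, but the core idea can be sketched in plain sclang: an event source is a list of registered actions, and firing it runs them all.

```supercollider
// Plain-sclang sketch of an event source (not the FP quark's API).
(
~actions = List.new;
~do = {|func| ~actions.add(func) };                  // register a side effect
~fire = {|event| {|func| func.value(event) } }; // 'firing' runs every action

~do.value({|e| ("got event:" + e).postln });
~fire.value(\noteOn);   // posts: got event: noteOn
)
```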

The network starts with an event source.

Signals are similar but have state? You can ask for the value and change it.

To create the network, you use combinators.

inject has state internally.

Dynamic event switching limits an event depending on a selector. This is kind of like the gate thing in Max.

With Modality, every control has elements, every element has a signal and a source. Controls have keys.

You can combine values, attach stuff to knob changes. Easy to attach event streams to functions.

This is complex to describe, but works intuitively in practice. You can do deltas, accumulators, etc.

Closing remarks

This is on GitHub, but is not yet released. It depends on the FP quark.

Needs GUI replacements. Needs a backend for OSC devices.

Needs some hacking in the SC source.


  • Would you be interested in doing the descriptors in JSON, so they can be used by non-SC folks? Yeah, why not. This is a good plan, even.

Liveblogging the SC Symposium: Overtone Library

Collaborative programmable music. Runs in Clojure, a dialect of Lisp that runs on the JVM. It’s got concurrency stuff. It’s programmable.

Deals with the SC server.  This sort of looks like it’s running in emacs…

All SC UGens are available. He built a bunch of metadata for this, a lot like the SC classes for the UGens. There is in-line documentation, which is nice. The node tree shows all currently running UGens.

MIDI events are received as events and can be used by any function. Wiggle your nano controller. Because it runs on the JVM, all Java libraries are supported. OSC support. Serial support.

Synth code and musical expression code can be written in the same language. Specify phrases in a score, concat them. The language is relatively readable, as far as Lisp goes. Most things are immutable, so this is good for concurrency. Too many variables can confuse the programmer.

He’s using a monome. Every button has a callback function, which gets the X, Y coordinates, whether it’s pressed or released, and a history of all other button presses.

Now he’s doing some monome-controlled dubstep.

C-Gens are re-usable UGen trees, possibly a bit like SynthDefs. Can do groups also.

This can also use Java stuff, because it’s on the JVM. OpenGL graphics are also supported. They can hook into any UGen.

Anything can be glued together.

This is kind of cool, but you need to deal with both Java and Lisp.


  • Collaboration?  It helps you deal with shared state, without blocking or locking.