Live Blogging the SC Symposium – Flocking by Colin Clark

Flocking – audio synthesis in javascript on the web

flockingjs.org
github/colinbdclark/flocking

audio synthesis framework written in javascript

specifically intended to support artists

Inspired by SC

Web is everywhere

programming environments that have graphical tools

Flocking is highly declarative

Synth graphs declare trees of named unit generators – you write data structures, not code

Data is easy to manipulate

flock.synth({
  synthDef: {
    ugen: "flock.ugen.sinOsc",
    freq: 440,
    mul: 0.25
  }
});

He skips rate: "audio" because that’s the default.
Modulation:

flock.synth({
  synthDef: {
    ugen: "flock.ugen.sinOsc",
    freq: 440,
    mul: {
      ugen: "flock.ugen.line"
      ....
    }
  }
});
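
For reference, a filled-in version of that modulation example might look something like this; the start, end, and duration input names on flock.ugen.line are my assumption, not something he showed on screen.

// a guess at the completed example – the line ugen's input names are assumed
flock.synth({
  synthDef: {
    ugen: "flock.ugen.sinOsc",
    freq: 440,
    mul: {
      ugen: "flock.ugen.line",
      start: 0,
      end: 0.25,
      duration: 5
    }
  }
});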

It handles buffers and whatnot, but not multichannel expansion.
Scheduling is unreliable…but works

A useful script

The best way to remember to do something when you’re going to run some important program is to put it in the program itself. Or at the very least, put it in a script that you use to invoke the program.
I have a few things I need to remember for the performance I’m preparing for. One has to do with a projector. I’m using a stylus to draw cloud shapes on my screen. One way I can do this is to mirror my screen to a projector so the audience can see my GUI. However, doing this usually changes the geometry of my laptop screen, so that instead of extending all the way to the edges, there are empty black bars on either side of the used portion of my display. That’s fine, except the stylus doesn’t know and doesn’t adjust. So to get the pointer to the far right edge of the drawn portion of the screen, I need to touch the far right edge of the physical screen, which puts over a centimetre between the stylus tip and the arrow pointer. Suboptimal!
Ideally, I’d like any change in screen geometry to trigger a script that changes the settings for the stylus (I have ideas about how that may or may not work, using upstart, xrandr and xsetwacom), but in the absence of that, I just want to launch a manual calibration program. If I launch the settings panel, there’s a button on it that launches one. So the top part of my script checks whether the screen resolution differs from normal and launches the settings panel if it does.
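As a sketch of the automated approach (untested – the device name here is hypothetical; use whatever xsetwacom list devices reports), it might look something like this:

# untested sketch: remap the stylus to the currently connected output after a geometry change
DEVICE="Wacom Bamboo Pen stylus"                             # hypothetical device name
OUTPUT=$(xrandr | awk '$2 == "connected" {print $1; exit}')  # first connected output
xsetwacom set "$DEVICE" MapToOutput "$OUTPUT"
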
The next things I need to remember are audio related. I need to kill pulseaudio. If my soundcard (a Fast Track Ultra) is attached, I need to change the amplitude settings internally so it doesn’t send the input straight to the output. Then I need to start jack using it. Or if it’s not attached, I need to start jack using a default device. Then, because it’s useful, I should start up Jack Control, so I can do some routing, should I need it. (Note: in 12.04 if you start qjackctl after starting jackd, it doesn’t work properly. This is fixed by 13.04.) Finally, I should see if SuperCollider is already running and if not, I should start it.
That’s a bit too much to remember for a performance, so I wrote a script. The one thing I need to remember with this script is that if I want to kill jack, it won’t die from Jack Control, so I’ll need to do a kill -9 from the prompt. Hopefully this will not be an issue on stage.
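(For the record, the kill in question is just something like this, assuming the process is called jackd:)

kill -9 $(pidof jackd)
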
This is my script:

#!/bin/bash

# first check the screen

LINE=`xrandr -q | grep Screen`
WIDTH=`echo ${LINE} | awk '{ print $8 }'`
HEIGHT=`echo ${LINE} | awk '{ print $10 }' | awk -F"," '{ print $1 }'`

if [[ ${WIDTH} != 1366 || ${HEIGHT} != 768 ]]; then
    gnome-control-center wacom
else
    echo normal resolution
fi


# now setup the audio

pulseaudio --kill

# is the ultra attached?
if aplay -l | grep -qi ultra; then
    echo ultra

    # adjust amplitude
    i=0
    j=0
    for i in $(seq 8); do
        for j in $(seq 8); do
            if [ "$i" != "$j" ]; then
                amixer -c Ultra set "DIn$i - Out$j" 0% > /dev/null
            else
                amixer -c Ultra set "DIn$i - Out$j" 100% > /dev/null
            fi
            amixer -c Ultra set "AIn$i - Out$j" 0% > /dev/null
        done
    done

    #for i in $(seq 4); do
    #    amixer -c Ultra set "Effects return $i" 0% > /dev/null
    #done

    # start jack
    jackd -d alsa -d hw:Ultra &
else
    # start jack with default hardware
    jackd -d alsa -d hw:0 &
fi

sleep 2

# jack control
qjackctl &

sleep 1

# is supercollider running?
if ps aux | grep -vi grep | grep -q scide; then
    echo already running
else
    scide test.scd &
fi

Compiling SuperCollider on Ubuntu Studio 13.04 beta 2 (and otherwise setting up audio stuff)

The list of required libraries has changed somewhat between versions. This is what I did:

sudo apt-get install git cmake libsndfile1-dev libfftw3-dev  build-essential  libqt4-dev libqtwebkit-dev libasound2-dev libavahi-client-dev libicu-dev libreadline6-dev libxt-dev pkg-config subversion libcwiid1 libjack-jackd2-dev emacs gnome-alsamixer  libbluetooth-dev libcwiid-dev netatalk

git clone --recursive https://github.com/supercollider/supercollider.git

cd supercollider

mkdir build

cd build

cmake ..

make

If all that worked, then you should install it:

sudo make install

scide

If it starts, you’re all good!
Users may note that this version of Ubuntu Studio can compile in Supernova support, so that’s very exciting.
I’ve gone to a beta version of Ubuntu Studio because Jack was giving me a bit of trouble on my previous install, so we’ll see if this sorts it out.
Note in the apt-get part that emacs is extremely optional, and netatalk lets me mount Apple file systems shared via AppleTalk, something I need to do with my laptop ensemble but which not everyone will need. Gnome-alsamixer is also optional and is a GNOME app. It’s a GUI mixer application which lets you set levels on your sound card. Mine was sending the ins straight to the outs, which is not what I wanted, so I could fix it this way or by writing and running a script. Being lazy, I thought the GUI would be a bit easier. There’s also a command line application called alsamixer, if you like that retro ’80s computing feeling.
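If you’d rather fix the routing from a script, it comes down to one amixer call per connection, like the loop in the performance script earlier on this page; for example:

amixer -c Ultra set "AIn1 - Out1" 0%    # mute analogue input 1 feeding straight to output 1

(The card and control names are specific to the Fast Track Ultra.)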
It can also be handy to sometimes kill pulseaudio without it respawning over and over. Fortunately, it’s possible to do this:

sudo gedit /etc/pulse/client.conf

Add in these two lines:

autospawn = no
daemon-binary = /bin/true

I still want pulse to start by default when I login, though, so I’ve set it to start automatically. I found the application called Startup Applications and clicked Add. For the name, I put pulseaudio. For the command, I put:

pulseaudio --start

Then I clicked the add button on that dialog screen and it’s added. When I want to kill pulseaudio, I will open a terminal and type:

pulseaudio --kill

and when I want it back again, I’ll type:

pulseaudio --start

(I have not yet had a killing and a restarting throw any of my applications into silent confusion, but I’m sure it will happen at some point.)
There’s more on this

Republic

It’s time for everybody’s favourite collaborative real time network live coding tool for SuperCollider.
Invented by PowerBooks UnPlugged – granular synthesis playing across a bunch of unplugged laptops.
Then some of them started Republic111, which is named for the room number of the workshop where they taught.
Code reading is interesting in network music partly because of stealing, but also to understand somebody else’s code quickly, or to actively understand it by changing it. Live coding is a public or collective thinking action.
If you evaluate code, it shows up in a history file and gets sent to everybody else in the Republic. You can stop the sound of everybody on the network. All the SynthDefs are saved. People play ‘really equally’ on everybody’s computer. Users don’t feel obligated to act, but rather to respond. Participants spend most of their time listening.
Republic is mainly one big class, which is a weakness; it should be broken up into smaller classes that can be used separately. Scott Wilson is working on a newer version, which is on GitHub. Look up ‘The Way Things May Go’ on Vimeo.
Graham and Jonas have done a system which allows you to see a map of who is emitting what sound and you can click on it and get the Tdef that made it.
Scott is putting out a call for participation and discussion about how it should be.

Benjamin Graf – mblght

Lighting guys sit behind lighting desks and hit buttons for the duration of concerts, so the lights in shows are usually boring, despite the valuable and variable equipment available.
Wouldn’t it be great if you could do stochastic lights with envelope controls?
SuperCollider does solid timing, has support for different methods of dispersing stuff, and has flexible signal routing.
He’s got an object that holds descriptions of the capabilities of any lighting fixture – moving, colour, on, off, etc.
He uses events in the pattern system as one way of changing stuff.
He’s added light support to the Server, so you can do SinOsc control of light changes, sending to control busses. He’s also made light UGens.
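His light classes aren’t shown here, but the plain-SuperCollider idiom this builds on – a UGen writing to a control bus that something else reads – looks roughly like this:

(
// plain SuperCollider sketch, not mblght itself
b = Bus.control(s, 1);                            // a control bus for, say, dimmer level
{ Out.kr(b, SinOsc.kr(0.5).range(0, 1)) }.play;   // a slow sine writes to the bus
// whatever reads b – in his case a light fixture – now follows that sine
)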
He ended up live coding the lights for a festival.

Questions

What about machine listening? It would be easy to do in this system.
The code is on github.

Fun with Cellular Automata

The SuperCollider code is stolen from redFrik’s blog post about cellular automata. All I have added is the SynthDef, one line at the top to set sound variables and one line in the drawing function to make new synths based on which blocks are coloured in.
Ergo, I haven’t really got anything to add, but this is still kind of fun and Star-Trek-ish.

(

SynthDef(\sinegrain2, { arg pan = 0, freq, amp, grainDur;
    var grain, env;
    env = EnvGen.kr(Env.sine(grainDur * 2, amp), doneAction: 2); // have some overlap
    grain = SinOsc.ar(freq, 0, env);
    Out.ar(0, Pan2.ar(grain, pan));
}).add;

)


//game of life /redFrik
(
// add sound-related variables
var grainDur = 1/20, lowfreq = 200, hifreq = 800, lowamp =0.005, hiamp = 0.085, rows = 50, cols = 50;


      var envir, copy, neighbours, preset, rule, wrap;
        var w, u, width= 200, height= 200, cellWidth, cellHeight;
        w= Window("ca - 2 pen", Rect(128, 64, width, height), false);
        u= UserView(w, Rect(0, 0, width, height));
        u.background= Color.white;
        cellWidth= width/cols;
        cellHeight= height/rows;
        wrap= true;                     //if borderless envir
        /*-- select rule here --*/
        //rule= #[[], [3]];
        //rule= #[[5, 6, 7, 8], [3, 5, 6, 7, 8]];
        //rule= #[[], [2]];                                             //rule "/2" seeds
        //rule= #[[], [2, 3, 4]];
        //rule= #[[1, 2, 3, 4, 5], [3]];
        //rule= #[[1, 2, 5], [3, 6]];
        //rule= #[[1, 3, 5, 7], [1, 3, 5, 7]];
        //rule= #[[1, 3, 5, 8], [3, 5, 7]];
        rule= #[[2, 3], [3]];                                           //rule "23/3" conway's life
        //rule= #[[2, 3], [3, 6]];                                      //rule "23/36" highlife
        //rule= #[[2, 3, 5, 6, 7, 8], [3, 6, 7, 8]];
        //rule= #[[2, 3, 5, 6, 7, 8], [3, 7, 8]];
        //rule= #[[2, 3, 8], [3, 5, 7]];
        //rule= #[[2, 4, 5], [3]];
        //rule= #[[2, 4, 5], [3, 6, 8]];
        //rule= #[[3, 4], [3, 4]];
        //rule= #[[3, 4, 6, 7, 8], [3, 6, 7, 8]];               //rule "34578/3678" day&night
        //rule= #[[4, 5, 6, 7], [3, 5, 6, 7, 8]];
        //rule= #[[4, 5, 6], [3, 5, 6, 7, 8]];
        //rule= #[[4, 5, 6, 7, 8], [3]];
        //rule= #[[5], [3, 4, 6]];
        neighbours= #[[-1, -1], [0, -1], [1, -1], [-1, 0], [1, 0], [-1, 1], [0, 1], [1, 1]];
        envir= Array2D(rows, cols);
        copy= Array2D(rows, cols);
        cols.do{|x| rows.do{|y| envir.put(x, y, 0)}};
        /*-- select preset here --*/
        //preset= #[[0, 0], [1, 0], [0, 1], [1, 1]]+(cols/2); //block
        //preset= #[[0, 0], [1, 0], [2, 0]]+(cols/2); //blinker
        //preset= #[[0, 0], [1, 0], [2, 0], [1, 1], [2, 1], [3, 1]]+(cols/2); //toad
        //preset= #[[1, 0], [0, 1], [0, 2], [1, 2], [2, 2]]+(cols/2); //glider
        //preset= #[[0, 0], [1, 0], [2, 0], [3, 0], [0, 1], [4, 1], [0, 2], [1, 3], [4, 3]]+(cols/2); //lwss
        //preset= #[[1, 0], [5, 0], [6, 0], [7, 0], [0, 1], [1, 1], [6, 2]]+(cols/2); //diehard
        //preset= #[[0, 0], [1, 0], [4, 0], [5, 0], [6, 0], [3, 1], [1, 2]]+(cols/2); //acorn
        preset= #[[12, 0], [13, 0], [11, 1], [15, 1], [10, 2], [16, 2], [24, 2], [0, 3], [1, 3], [10, 3], [14, 3], [16, 3], [17, 3], [22, 3], [24, 3], [0, 4], [1, 4], [10, 4], [16, 4], [20, 4], [21, 4], [11, 5], [15, 5], [20, 5], [21, 5], [34, 5], [35, 5], [12, 6], [13, 6], [20, 6], [21, 6], [34, 6], [35, 6], [22, 7], [24, 7], [24, 8]]+(cols/4); //gosper glider gun
        //preset= #[[0, 0], [2, 0], [2, 1], [4, 2], [4, 3], [6, 3], [4, 4], [6, 4], [7, 4], [6, 5]]+(cols/2); //infinite1
        //preset= #[[0, 0], [2, 0], [4, 0], [1, 1], [2, 1], [4, 1], [3, 2], [4, 2], [0, 3], [0, 4], [1, 4], [2, 4], [4, 4]]+(cols/2); //infinite2
        //preset= #[[0, 0], [1, 0], [2, 0], [3, 0], [4, 0], [5, 0], [6, 0], [7, 0], [9, 0], [10, 0], [11, 0], [12, 0], [13, 0], [17, 0], [18, 0], [19, 0], [26, 0], [27, 0], [28, 0], [29, 0], [30, 0], [31, 0], [32, 0], [34, 0], [35, 0], [36, 0], [37, 0], [38, 0]]+(cols/4); //infinite3
        //preset= Array.fill(cols*rows, {[cols.rand, rows.rand]});
        preset.do{|point| envir.put(point[0], point[1], 1)};
        i= 0;
        u.drawFunc= {
                i= i+1;
                Pen.fillColor= Color.black;
                cols.do{|x|
                        rows.do{|y|
                                if(envir.at(x, y)==1, {
                                        Pen.addRect(Rect(x*cellWidth, height-(y*cellHeight), cellWidth, cellHeight));
                                        // the new line
                                        Synth.new(\sinegrain2, [\freq, x.linexp(0, cols, lowfreq, hifreq), \amp, y.linexp(0, rows, lowamp, hiamp), \pan, 0, \grainDur, grainDur])
                                });
                        };
                };
                Pen.fill;
                cols.do{|x|
                        rows.do{|y|
                                var sum= 0;
                                neighbours.do{|point|
                                        var nX= x+point[0];
                                        var nY= y+point[1];
                                        if(wrap, {
                                                sum= sum+envir.at(nX%cols, nY%rows); //no borders
                                        }, {
                                                if((nX>=0)&&(nY>=0)&&(nX<cols)&&(nY<rows), {sum= sum+envir.at(nX, nY)}); //borders
                                        });
                                };
                                if(rule[1].includes(sum), {     //borne
                                        copy.put(x, y, 1);
                                }, {
                                        if(rule[0].includes(sum), {     //lives on
                                                copy.put(x, y, envir.at(x, y));
                                        }, {    //dies
                                                copy.put(x, y, 0);
                                        });
                                });
                        };
                };
                envir= copy.deepCopy;
        };
        Routine({while{w.isClosed.not} {u.refresh; i.postln; (1/20).wait}}).play(AppClock);
        w.front;
)


Recording Audio and Video from SuperCollider on Ubuntu Studio

Recently, I needed to record my screen and audio while SuperCollidering on Ubuntu Studio. (This will work on other operating systems too, with some tweaks.) This is the code I included:

s.waitForBoot({

("ffmpeg -f jack -ac 2 -i ffmpeg -f x11grab -r 30 -s $(xwininfo -root | grep 'geometry' | awk '{print $2;}') -i :0.0 -acodec pcm_s16le -vcodec libx264 -vpre lossless_ultrafast -threads 0" +
    "/home/celesteh/Documents/" ++ Date.getDate.bootSeconds ++ ".mkv"
        ).runInTerminal;

....

})

As far as I am able to determine, that allows you to record stereo audio from jack and the screen content of your primary screen. Obviously the syntax is slightly dense.

I was unable to figure out how to tell it to automatically get the jack output I wanted, so in order to run this, I first started JACK Control, used that to start jack, and then evaluated my SC code, which included the above line. That opened a terminal window.

Then I went back to JACK Control and clicked the Connect button to open a window with a list of connections. I took the supercollider output connections and dragged them to the ffmpeg input connections, which enabled the sound. This step is very important, as otherwise the recording will be silent. The dragging is illustrated in the accompanying image by the thick, red line.

Then AFTER making the audio connection, I started making sound with SuperCollider, which was then recorded. When I finished, I went to the terminal window opened by supercollider and typed control-c to stop the recording. Then, I quit the SC server and JACK.

After a few moments, the ffmpeg process finished quitting. I could then watch the video and hear sound via VLC (on my system, VLC does not play audio if jack is running, so check that if your film is silent). I used OpenShot to cut off the beginning part, which included a recording of me doing the jack connections.

The recommended way of recording desktop output on Ubuntu Studio is recordmydesktop, but there are many advantages to doing it with ffmpeg instead. There is a bug in recordmydesktop which means it won’t accept jack input, so you have to record the video and audio separately and splice them together. (If you choose to do this, note you need to quit jack before recordmydesktop will generate your output file.) Also, I found that recordmydesktop took up a lot of processing power and the recording was too glitchy to actually use. The command line for it is a LOT easier, though, so pick your preference.
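
For comparison, the video-only recordmydesktop invocation (given the jack bug) is just something like this, with the filename being an example:

recordmydesktop --no-sound -o screencast.ogv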

You need not include the command to start ffmpeg in your SuperCollider code. Instead of using the runInTerminal message, you can just run it in a terminal. As above, make sure you start JACK before you start this code, and don’t forget to make the connections between it and ffmpeg. I prefer to put it in the code because then I don’t forget to run it, but this is entirely a matter of personal preference.
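
For reference, the same command run directly from a terminal looks like this (the output path is just an example):

ffmpeg -f jack -ac 2 -i ffmpeg -f x11grab -r 30 -s $(xwininfo -root | grep 'geometry' | awk '{print $2;}') -i :0.0 -acodec pcm_s16le -vcodec libx264 -vpre lossless_ultrafast -threads 0 ~/Documents/screencast.mkv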

Cross-platform

Almost all of the programs linked above are cross-platform, so this is very likely to work on Mac or Windows with only a few changes. Mac users have two ways to proceed. One is to use jack with QuickTime: you can use jack to route audio on your system so the SuperCollider output goes to the QuickTime input. Or, if you’re on Windows or just want to use ffmpeg, you will need to change the command line so that it gets the screen geometry in a different way and so that it captures from a different device. Check the man pages for ffmpeg and for avconv, which comes with it. Users on the channel on freenode can also help. Leave a comment if you’ve figured it out.

Compiling SuperCollider on Ubuntu Studio 12.04

For various annoying reasons, I’ve just reformatted my hard drive and freshly installed the latest Ubuntu Studio LTS. As soon as I’d restored my home directory, I set to work compiling SuperCollider.
The README_Linux.txt file covers most of what you need to know, but here’s what I had to type to get going:

sudo apt-get install git cmake libsndfile1-dev libfftw3-dev  build-essential  libqt4-dev libqtwebkit-dev libasound2-dev libavahi-client-dev libicu-dev libreadline6-dev libxt-dev pkg-config subversion libcwiid1 libjack-jackd2-dev emacs

cd ~/Documents

git clone --recursive https://github.com/supercollider/supercollider.git

cd supercollider

mkdir build

cd build

cmake .. -DSUPERNOVA=OFF

make

You have to disable supernova because it requires a newer version of gcc. If all that works, then you can install it:

sudo make install

I had something go wrong with my installed SC libs, but that’s a side issue.

Granular Performance Thingee

I’ve been working on a thing where I draw shapes and they turn into granular synthesis clouds. I have some ideas for what I want it to do, but one thing I’d really like to do is be able to record video of me playing it, via recording my desktop – with audio. Indeed, I need to sort this out very shortly, as I’d like to submit a proposal for the SuperCollider Symposium. (If anybody has advice on how to record this on Ubuntu, please do leave a comment.)

As it was, all I’ve got is a screen shot of post-drawing and an audio file of what it sounded like whilst playing.

Although this has quite a few bugs, some people have expressed some interest in the system, so I’m making the source code available. It’s written in SuperCollider. This is nowhere near a release version and the code is ugly, so be forewarned, but it mostly works. It relies on the Conductor quark by Ron Kuivila. There are two files. Put Cloud.sc in your Extensions folder. Then open up test.scd in the lovely new SC 3.6 IDE (it will probably work in earlier versions too). Select the whole file and evaluate it.
One of the windows that opens is a controller with a play and stop button. One is a large black window. And one is a bunch of blank buttons. I have no idea why they don’t have labels. Draw in the black window using click and drag (or a stylus if you’re lucky). Press play in the controller window. When the cursor gets to the cloud, you can hopefully hear it playing. There are some sliders in that window to change the speed of the cursor. You can do this while it is playing. You can also draw new shapes while it is playing. The blank buttons have sound parameters attached to them. So if you press one, the next cloud you draw will have that button’s parameters. (The buttons are supposed to say things like ‘Short Sine Grains’ or ‘Long Saw Grains’.) If you want to modify a cloud, you can right click on it to get a popup window that changes some of the sound settings.
You can’t save your work, but the server window has a record button on it. If you press that, then play your drawing, and then hit the ‘stop recording’ button in the server window, you’ll be able to record the audio in real time.
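If you’d rather do that from code, the equivalent of those buttons is the server’s record methods:

s.record;          // start writing the server's output to a file
// ... play the drawing ...
s.stopRecording;   // finish; the file path is posted in the post window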
I had the idea that I could draw on this with my stylus while on stage in real time, but when I plugged my computer into the projector, it changed my screen parameters and my stylus calibration failed. I’m sure there is a work around, which, ideally, I’d like to find by next week.