Benjamin Graf – mblght

Lighting operators sit behind lighting desks and hit buttons for the duration of a concert, so the lights at shows are usually boring, despite the valuable and versatile equipment involved.
Wouldn’t it be great if you could do stochastic lights with envelope controls?
SuperCollider does solid timing, supports different methods of dispersing events and has flexible signal routing.
He’s got an object that holds descriptions of the capabilities of any lighting fixture – moving, colour, on, off, etc.
He uses events in the pattern system as one way of changing stuff.
He’s added light support to the Server, so you can do SinOsc control of light changes, sending to control busses. He’s also made light UGens.
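The talk only showed this live, but the control-bus idea can be sketched in a few lines. In this hypothetical sketch the bus and the polling action are my assumptions; only the write-an-LFO-to-a-control-bus technique is from the talk:

```supercollider
(
// a control bus standing in for one dimmer channel (assumption)
~dimmerBus = Bus.control(s, 1);

// a slow sine wave drives the dimmer level between 0 and 1
{ Out.kr(~dimmerBus, SinOsc.kr(0.25).range(0, 1)) }.play;

// a fixture object could then read this bus and turn it into DMX values
~dimmerBus.get({ |level| ("dimmer level:" + level).postln });
)
```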
He ended up live coding the lights for a festival.

Questions

What about machine listening? It would be easy to do in this system.
The code is on github.

Fun with Cellular Automata

The SuperCollider code is stolen from redFrik’s blog post about cellular automata. All I have added is the SynthDef, one line at the top to set sound variables and one line in the drawing function to make new synths based on which blocks are coloured in.
Ergo, I haven’t really got anything to add, but this is still kind of fun and Star-Trek-ish.

(

SynthDef(\sinegrain2, { arg pan = 0, freq, amp, grainDur;
    var grain, env;

    env = EnvGen.kr(Env.sine(grainDur * 2, amp), doneAction: 2); // have some overlap
    grain = SinOsc.ar(freq, 0, env);

    Out.ar(0, Pan2.ar(grain, pan))
}).add;

)


//game of life /redFrik
(
// add sound-related variables
var grainDur = 1/20, lowfreq = 200, hifreq = 800, lowamp =0.005, hiamp = 0.085, rows = 50, cols = 50;


      var envir, copy, neighbours, preset, rule, wrap;
        var w, u, width= 200, height= 200, cellWidth, cellHeight;
        w= Window("ca - 2 pen", Rect(128, 64, width, height), false);
        u= UserView(w, Rect(0, 0, width, height));
        u.background= Color.white;
        cellWidth= width/cols;
        cellHeight= height/rows;
        wrap= true;                     //if borderless envir
        /*-- select rule here --*/
        //rule= #[[], [3]];
        //rule= #[[5, 6, 7, 8], [3, 5, 6, 7, 8]];
        //rule= #[[], [2]];                                             //rule "/2" seeds
        //rule= #[[], [2, 3, 4]];
        //rule= #[[1, 2, 3, 4, 5], [3]];
        //rule= #[[1, 2, 5], [3, 6]];
        //rule= #[[1, 3, 5, 7], [1, 3, 5, 7]];
        //rule= #[[1, 3, 5, 8], [3, 5, 7]];
        rule= #[[2, 3], [3]];                                           //rule "23/3" conway's life
        //rule= #[[2, 3], [3, 6]];                                      //rule "23/36" highlife
        //rule= #[[2, 3, 5, 6, 7, 8], [3, 6, 7, 8]];
        //rule= #[[2, 3, 5, 6, 7, 8], [3, 7, 8]];
        //rule= #[[2, 3, 8], [3, 5, 7]];
        //rule= #[[2, 4, 5], [3]];
        //rule= #[[2, 4, 5], [3, 6, 8]];
        //rule= #[[3, 4], [3, 4]];
        //rule= #[[3, 4, 6, 7, 8], [3, 6, 7, 8]];               //rule "34578/3678" day&night
        //rule= #[[4, 5, 6, 7], [3, 5, 6, 7, 8]];
        //rule= #[[4, 5, 6], [3, 5, 6, 7, 8]];
        //rule= #[[4, 5, 6, 7, 8], [3]];
        //rule= #[[5], [3, 4, 6]];
        neighbours= #[[-1, -1], [0, -1], [1, -1], [-1, 0], [1, 0], [-1, 1], [0, 1], [1, 1]];
        envir= Array2D(rows, cols);
        copy= Array2D(rows, cols);
        cols.do{|x| rows.do{|y| envir.put(x, y, 0)}};
        /*-- select preset here --*/
        //preset= #[[0, 0], [1, 0], [0, 1], [1, 1]]+(cols/2); //block
        //preset= #[[0, 0], [1, 0], [2, 0]]+(cols/2); //blinker
        //preset= #[[0, 0], [1, 0], [2, 0], [1, 1], [2, 1], [3, 1]]+(cols/2); //toad
        //preset= #[[1, 0], [0, 1], [0, 2], [1, 2], [2, 2]]+(cols/2); //glider
        //preset= #[[0, 0], [1, 0], [2, 0], [3, 0], [0, 1], [4, 1], [0, 2], [1, 3], [4, 3]]+(cols/2); //lwss
        //preset= #[[1, 0], [5, 0], [6, 0], [7, 0], [0, 1], [1, 1], [6, 2]]+(cols/2); //diehard
        //preset= #[[0, 0], [1, 0], [4, 0], [5, 0], [6, 0], [3, 1], [1, 2]]+(cols/2); //acorn
        preset= #[[12, 0], [13, 0], [11, 1], [15, 1], [10, 2], [16, 2], [24, 2], [0, 3], [1, 3], [10, 3], [14, 3], [16, 3], [17, 3], [22, 3], [24, 3], [0, 4], [1, 4], [10, 4], [16, 4], [20, 4], [21, 4], [11, 5], [15, 5], [20, 5], [21, 5], [34, 5], [35, 5], [12, 6], [13, 6], [20, 6], [21, 6], [34, 6], [35, 6], [22, 7], [24, 7], [24, 8]]+(cols/4); //gosper glider gun
        //preset= #[[0, 0], [2, 0], [2, 1], [4, 2], [4, 3], [6, 3], [4, 4], [6, 4], [7, 4], [6, 5]]+(cols/2); //infinite1
        //preset= #[[0, 0], [2, 0], [4, 0], [1, 1], [2, 1], [4, 1], [3, 2], [4, 2], [0, 3], [0, 4], [1, 4], [2, 4], [4, 4]]+(cols/2); //infinite2
        //preset= #[[0, 0], [1, 0], [2, 0], [3, 0], [4, 0], [5, 0], [6, 0], [7, 0], [9, 0], [10, 0], [11, 0], [12, 0], [13, 0], [17, 0], [18, 0], [19, 0], [26, 0], [27, 0], [28, 0], [29, 0], [30, 0], [31, 0], [32, 0], [34, 0], [35, 0], [36, 0], [37, 0], [38, 0]]+(cols/4); //infinite3
        //preset= Array.fill(cols*rows, {[cols.rand, rows.rand]});
        preset.do{|point| envir.put(point[0], point[1], 1)};
        i= 0;
        u.drawFunc= {
                i= i+1;
                Pen.fillColor= Color.black;
                cols.do{|x|
                        rows.do{|y|
                                if(envir.at(x, y)==1, {
                                        Pen.addRect(Rect(x*cellWidth, height-(y*cellHeight), cellWidth, cellHeight));
    // the new line: note the symbol keys (\freq etc.) for the synth arguments
    Synth.new(\sinegrain2, [\freq, x.linexp(0, cols, lowfreq, hifreq), \amp, y.linexp(0, rows, lowamp, hiamp), \pan, 0, \grainDur, grainDur])
                                });
                        };
                };
                Pen.fill;
                cols.do{|x|
                        rows.do{|y|
                                var sum= 0;
                                neighbours.do{|point|
                                        var nX= x+point[0];
                                        var nY= y+point[1];
                                        if(wrap, {
                                                sum= sum+envir.at(nX%cols, nY%rows); //no borders
                                        }, {
                                                if((nX>=0)&&(nY>=0)&&(nX<cols)&&(nY<rows), {sum= sum+envir.at(nX, nY)}); //borders
                                        });
                                };
                                if(rule[1].includes(sum), {     //borne
                                        copy.put(x, y, 1);
                                }, {
                                        if(rule[0].includes(sum), {     //lives on
                                                copy.put(x, y, envir.at(x, y));
                                        }, {    //dies
                                                copy.put(x, y, 0);
                                        });
                                });
                        };
                };
                envir= copy.deepCopy;
        };
        Routine({while{w.isClosed.not} {u.refresh; i.postln; (1/20).wait}}).play(AppClock);
        w.front;
)


Recording Audio and Video from SuperCollider on Ubuntu Studio

Recently, I needed to record my screen and audio while SuperCollidering on Ubuntu Studio. (This will also work on other operating systems with some tweaks.) This is the code I included:

s.waitForBoot({

("ffmpeg -f jack -ac 2 -i ffmpeg -f x11grab -r 30 -s $(xwininfo -root | grep 'geometry' | awk '{print $2;}') -i :0.0 -acodec pcm_s16le -vcodec libx264 -vpre lossless_ultrafast -threads 0" +
    "/home/celesteh/Documents/" ++ Date.getDate.bootSeconds ++ ".mkv"
        ).runInTerminal;

....

})

As far as I am able to determine, that allows you to record stereo audio from jack and the screen content of your primary screen. Obviously the syntax is slightly dense.
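The densest part is the `$(...)` substitution, which asks X for the root window size so ffmpeg knows the capture geometry. Here is a sketch of what that pipeline does, simulated with `echo` so it runs without an X display (the sample geometry string is an assumption):

```shell
# xwininfo -root prints a line like "  -geometry 1920x1080+0+0".
# grep selects that line; awk keeps only the second field,
# which becomes the -s (size) argument to ffmpeg.
echo '  -geometry 1920x1080+0+0' | grep 'geometry' | awk '{print $2;}'
# prints: 1920x1080+0+0
```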

I was unable to figure out how to tell it to automatically get the jack output I wanted, so in order to run this, I first started JACK Control, used that to start jack, and then evaluated my SC code, which included the above line. That opened a terminal window.

Then I went back to JACK Control and clicked the Connect button to open a window with a list of connections. I took the supercollider output connections and dragged them to the ffmpeg input connections, which enabled the sound. This step is very important, as otherwise the recording will be silent. The dragging is illustrated in the accompanying image by the thick, red line.

Then AFTER making the audio connection, I started making sound with SuperCollider, which was then recorded. When I finished, I went to the terminal window opened by supercollider and typed control-c to stop the recording. Then, I quit the SC server and JACK.

After a few moments, the ffmpeg process finished quitting. I could then watch the video and hear sound via VLC (on my system, VLC does not play audio if jack is running, so check that if your film is silent). I used OpenShot to cut off the beginning part, which included a recording of me doing the jack connections.

The recommended way of recording desktop output on Ubuntu Studio is recordmydesktop, but there are many advantages to doing it with ffmpeg instead. There is a bug in recordmydesktop which means it won’t accept jack input, so you have to record the video and audio separately and splice them together. (If you choose to do this, note you need to quit jack before recordmydesktop will generate your output file.) Also, I found that recordmydesktop took up a lot of processing power and the recording was too glitchy to actually use. The command line for it is a LOT easier, though, so pick your preference.

You need not include the command to start ffmpeg in your SuperCollider code. Instead of using the runInTerminal message, you can just run it in a terminal. As above, make sure you start JACK before you start this code and don’t forget to make the connections between it and ffmpeg. I prefer to put it in the code because then I don’t forget to run it, but this is a matter entirely for personal preference.

Cross-platform

Almost all of the programs linked above are cross-platform, so this is very likely to work on mac or windows with only a few changes. Mac users have two ways to proceed. One is to use jack with QuickTime. You can use jack to route audio on your system so the SuperCollider output goes to the quicktime input. Or, if you’re in windows or just want to use ffmpeg, you will need to change the command line so that it gets the screen geometry in a different way and so that it captures from a different device. Check the man pages for ffmpeg and for avconv, which comes with it. Users on the channel on freenode can also help. Leave a comment if you’ve figured it out.

Compiling SuperCollider on Ubuntu Studio 12.04

For various annoying reasons, I’ve just reformatted my hard drive and freshly installed the latest Ubuntu Studio LTS. As soon as I’d restored my home directory, I set to work compiling SuperCollider.
The README_Linux.txt file covers most of what you need to know, but here’s what I had to type to get going:

sudo apt-get install git cmake libsndfile1-dev libfftw3-dev  build-essential  libqt4-dev libqtwebkit-dev libasound2-dev libavahi-client-dev libicu-dev libreadline6-dev libxt-dev pkg-config subversion libcwiid1 libjack-jackd2-dev emacs

cd ~/Documents

git clone --recursive https://github.com/supercollider/supercollider.git

cd supercollider

mkdir build

cd build

cmake .. -DSUPERNOVA=OFF

make

You have to disable supernova because it requires a newer version of gcc. If all that works, then you can install it:

sudo make install

I had something go wrong with my installed SC libs, but that’s a side issue.

Granular Performance Thingee

I’ve been working on a thing where I draw shapes and they turn into granular synthesis clouds. I have some ideas for what I want it to do, but one thing I’d really like to do is be able to record video of me playing it, via recording my desktop – with audio. Indeed, I need to sort this out very shortly, as I’d like to submit a proposal for the SuperCollider Symposium. (If anybody has advice on how to record this on Ubuntu, please do leave a comment.)

As it was, all I’ve got is a screen shot of post-drawing and an audio file of what it sounded like whilst playing.

Although this has quite a few bugs, some people have expressed some interest in the system, so I’m making the source code available. It’s written in SuperCollider. This is nowhere near a release version and the code is ugly, so be forewarned, but it mostly works. It relies on the Conductor quark by Ron Kuivila. There are two files. Put Cloud.sc in your Extensions folder. Then open up test.scd in the lovely new SC 3.6 IDE (it will probably also work in earlier versions). Select the whole file and evaluate it.
One of the windows that opens is a controller with a play and stop button. One is a large black window. And one is a bunch of blank buttons. I have no idea why they don’t have labels. Draw in the black window using click and drag (or a stylus if you’re lucky). Press play in the controller window. When the cursor gets to the cloud, you can hopefully hear it playing. There are some sliders in that window to change the speed of the cursor. You can do this while it is playing. You can also draw new shapes while it is playing. The blank buttons have sound parameters attached to them. So if you press one, the next cloud you draw will have that button’s parameters. (The buttons are supposed to say things like ‘Short Sine Grains’ or ‘Long Saw Grains’.) If you want to modify a cloud, you can right click on it to get a popup window that changes some of the sound settings.
You can’t save your work, but the server window has a record button on it. If you press that and then play your drawing and then hit the ‘stop recording’ button in the server window, you’ll be able to record the audio in real time.
I had the idea that I could draw on this with my stylus while on stage in real time, but when I plugged my computer into the projector, it changed my screen parameters and my stylus calibration failed. I’m sure there is a work around, which, ideally, I’d like to find by next week.

LiveBlogging: Modality – modal control in SuperCollider

by many people

Modality is a loose collaboration to make a toolkit to hook up controllers to SC.  Does mapping, including some complex stuff and some on-the-fly stuff.

Marije spoke a bit of how they began collaborating

Concept – support many devices over many protocols. Make a common interface. Easily remap.

Devices

They currently support MIDI and HID. The common interface is MKtl. It provides a system to process the data. They have templates: templates for common ways of processing. Same interface for MKtl and MDispatch. (They may move to FRP – I don’t know what that is.)

Ktl quark is out of date.

(I think I might be interested in contributing to this project – or at least provide templates for stuff)

Different protocols have different transport mechanisms. Things vary by OS. Different controllers have different semantics.

A general solution is not trivial.

Scaling is different on different OSes. Names of devices may have variations. MIDI has some device name issues: real MIDI devices (non-USB) will not report their names, but use MIDI ports. Similar issues will arise with OSC or SerialPort.

The device description index is an identity dictionary. It’s got some NanoKontrol stuff in it. I am definitely interested in this…

They’ve got some templates, but it’s still a bit vapourware.

For every button or input on your device, they define what it is, where it is, etc.  This is good stuff.  You can also set the I/O type.

Device descriptions have names, specifications, platform differences, hierarchical naming (for use in pattern-matching). You can programmatically fill in the description

nanoKontrol, Gamepad, DanceMat, a bunch of things.
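A device description might look roughly like the following. This is a hypothetical sketch: the keys, element names and spec values are my assumptions, not Modality’s actual format – only the idea of an identity dictionary describing each element is from the talk:

```supercollider
(
// hypothetical device description (keys and values are assumptions)
~myNanoDesc = (
    device: "nanoKONTROL",
    protocol: \midi,
    elements: (
        sl1: ( type: \slider, midiType: \cc, midiNum: 2, spec: \midiCC, ioType: \in ),
        kn1: ( type: \knob, midiType: \cc, midiNum: 14, spec: \midiCC, ioType: \in )
    )
);
~myNanoDesc[\elements][\sl1][\type].postln; // -> slider
)
```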

Events and signals

Functional reactive processing. Events, data flow, change propagation. FRP – functional reactive programming.

These are functions without sideFX until you get to the output phase.

In the FP Quark – functional programming Quark.

Events are encoded in an event stream. An EventSource with a do method adds a side effect. When something happens (is “fired”), do the do. Only event sources can be fired.

The network starts with an event source.

Signals are similar but have state? You can ask for the value and change it.

To create the network use combinators.

inject has state internally.

Dynamic Event Switching limits an event depending on a selector. This is kind of like the gate object in Max.

With Modality, every control has elements, every element has a signal and a source. Controls have keys.

You can combine values, attach stuff to knob changes. Easy to attach event streams to functions.

This is complex to describe, but works intuitively in practice. You can do deltas, accumulators, etc.

Closing remarks

This is on GitHub, but is not yet released. It depends on the FP quark.

Needs GUI replacements. Needs a backend for OSC devices.

Needs some hacking in the SC source.

Questions

  • Would you be interested in doing the descriptors in JSON, so it can be used by non-SC guys? Yeah, why not.  This is a good plan, even.

Liveblogging the SC Symposium: Overtone Library

Collaborative programmable music. It runs in Clojure, a dialect of Lisp that runs in the JVM. It’s got concurrency stuff. It’s programmable.

Deals with the SC server.  This sort of looks like it’s running in emacs…

All SC UGens are available. He built a bunch of metadata for this, a lot like the SC classes for the UGens. There is in-line documentation, which is nice. The node tree shows all currently running UGens.

MIDI events are received as events and can be used by any function. Wiggle your nano controller. This comes with the JVM, so all Java libraries are supported. OSC support. Serial support.

Synth code and musical expression code can be written in the same language. Specify phrases in a score, concat them. The language is relatively readable, as far as Lisp goes. Most things are immutable, which is good for concurrency. Too many variables can confuse the programmer.

He’s using a monome. Every button call has a function, which has the X,Y coordinate, whether it’s pressed or released and a history of all other button presses.

Now he’s doing some monome-controlled dubstep.

C-Gens are re-usable UGen trees, possibly a bit like SynthDefs. Can do groups also.

This can also use Processing.org stuff, because it’s got Java. OpenGL graphics are also supported. They can hook into any UGen.

Anything can be glued together.

This is kind of cool, but you need to deal with both Java and Lisp.

Questions

  • Collaboration?  It helps you deal with shared state, without blocking or locking.

LiveBlogging SC: Mx

by Chris Satinger (aka Felix Crucial)

Mx is a tool for connecting objects together: audio, control, MIDI, etc.

Anything that plays on a bus can go in, and it can be put on a mixer.

This mixer is a GUI thing. You can use it just to glue on things like fadeouts or amplitude control.

Just write a descriptor file.

The system is not the gui, it’s the patching framework.

You can patch SynthDefs together and edit the SynthDefs on the fly.

This patches things a wee bit like Pd.

It checks for bad values and prevents explosions.

There is no timeline system. It’s a hosting system and only manages connections and starts and stops. You can put in other timelines.

It uses environment variables. ~this is the unit.

~this.sched(32, { … }, { … })

You can put documents in the Mx. Those can change the Mx as it runs, so it’s all very self-modifying. (When I was an undergrad, they told me this was naughty, but like many other naughty things, it can be very cool.)

Things have outlets and inlets that you can connect.   There is apparently a querying system which we will learn about.

He gets good music out of the system despite having no idea what’s going on a lot of the time.

Dragging cables is fun for a while, but then…

Questions

  • Adaptors? They describe what an object is and describe the inlets and outlets. There’s also a system for announcements. Cable strategies also define behaviours.

Liveblogging SC: live coding with assembler

Dave – 

Esoteric programming languages are an interesting thing we might care about.

CPUs built in Minecraft – you can see the processing.

Space invaders assembler with lines showing the order of execution.

Very slow execution can show what’s going on. This can be sonified.

 Till – 

BetaBlocker is a quark in sc3-plugins

(Talk to him if you want to go work in Helsinki.)

BBlocker never crashes, but it might not do anything. It has a stack, a heap and a program counter.

This is like Dave’s grid on the DS, where it runs in an infinite loop.

UGens

DetaBlockerBuf is a demand-rate UGen, so you can do weird computations in your UGen graph. It does a program step every time it gets triggered.

The programs are stored in buffers. You can do random ones.

There is also a visual thingee.

BBlockerBuf exposes the stack and the program counter.

BBlockerProgram holds a beta blocker program for the assembler. 

You can create a program with the assembler code and play it.

BetaBlockerProgram([NOP, POP, ADD]) etc

Tom Hall – 

John Cage would be 100 this year.

A metaphorically digital, constrained, sonic system. An invitation to listen

Questions

  • Is the heap a wave table? No, the output of the program is the sound.
  • Is it a coincidence that it sounds like putting an induction coil on a laptop? Um, maybe. He says it sounds very 8-bit-y. Maybe because it’s 8-bit.
  • Is it easy to write logical-seeming programs, or are they mostly random? It is possible to write things that make sense. The fun of it is the weirdness and things getting trashed by accident. Dave is doing genetic programming with a system like this.
  • The output is one byte at a time? No, each step does something and the output is something I didn’t understand.
  • Graphics question? Not Till’s field.

I think this could be really useful for students or teenagers who are sort of interested in programming.

LiveBlogging the SC Symposium: Keynote – Takeko Akamatsu

Using SC since 2000.

Main project is Craftwife. (All members are housewives, she says.) Going since 2008. There are 5 members now. They are between pop and art culture.

She started initially doing demos of Remkon, an iOS OSC app.  How to make this popular? 

  • Borrow the image of something already famous – Kraftwerk.
  • What is Originality? – SC patterns
  • Crash of music industry – live to record, record to live. Craftwife should be live only

Influenced by “the Work of Art in the Age of Mechanical Reproduction”

She makes extensive use of PatternProxies

She also works with Craftwife + Kaseo+. Kaseo+ is a circuit bender. She controls strobe lights, an analogue synthesiser, etc.

SuperCollider.jp

SC in Japan. They have a meetup in Tokyo. She posts on twitter. She does workshops.

During her show in the Hague in 2007, she got frustrated and smashed her computer. And then quit making computer music for a year and grew vegetables.

She held a workshop at a place called the WombLounge.  Not everyone was a musician. She covered interaction between many environments.

SuperColliderSpeedCodingShow

She will give people a theme and five minutes and they have to make a sound.

4 people are quickly coding something on the theme of spring.

SuperCollider.future

She wants the book as an eBook in Japanese.

SuperCollider.cycling

She has attached a sensor to her exercise bike and uses this during her workout routine.

She’s tired of loud sounds. And sound systems are annoying.

She played a video of JMC saying what he wants for sc4. It’s not client server and it’s a lot smaller.