Communication, Control and Stage Sharing in Networked Live Coding

Collaborative live coding, he says, is more than one performer live coding at the same time, networked or not.
Network music can be synchronous or asynchronous, co-located or remote.
There are many networked live coding environments.
You can add instrumental performers to live coding, for example by live-generating notation, or by having somebody play an electronic instrument that is being modified on the fly in software.
How can a live coding environment facilitate mixed collaboration? How and what should people share? Code text? State? Clock? Variables? How should people communicate? How do you share control? So many questions!
They have a client/server model where only one machine makes sound, so no synchronisation is required and there is only one master state. However, there are risks of collision and conflict, and questions of version control.
The editor runs in a web browser (because everything runs in a browser now).
The interface shows a variable window, a chat window and a large text area. The variable window shows the live value of every variable in the program state, and can also show the network value.
He then shows the collision risk in this: if two coders use the same variable name, it creates a conflict. Alice corrupts Bob's code, though maybe Bob is actually corrupting hers. Their solution is that every coder gets their own namespace and can't access each other's variables, which seems reductive; maybe Bob should just be more considerate. The live variable view shows both Alice's and Bob's variables under separate tabs.
His demo has a note at the top saying 'skip this demo if late'.
How do people collaborate if they want to mess around with each other's variables? They can put some variables in a shared namespace: click your variables, hit the 'shared' button, and they are shared.
How do you share control?
Chat messages appear on the mobile instrument screen for the iPad performer. The programmer can submit a function to the performer in such a way that the performer has agency in deciding when to run it.
The tool for all of this is called UrMus.
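A sketch of that hand-off (written here in SuperCollider rather than in their system; the "/submitFunction" address and the GUI button are hypothetical placeholders): the coder sends code text, the performer's machine compiles it into a function, and the performer decides when to trigger it.

    // sketch: the coder submits code, the performer decides when to run it
    (
    ~pending = nil;
    OSCdef(\submit, { |msg|
        ~pending = msg[1].asString.compile;   // compile incoming code into a function, but don't run it yet
        "new function received".postln;
    }, "/submitFunction");

    // the performer's trigger: a GUI button standing in for the mobile instrument screen
    w = Window("performer").front;
    Button(w, Rect(10, 10, 120, 40)).states_([["run it"]]).action_({ ~pending.value });
    )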

Questions

Would live coders actually be monitoring each other’s variables in a performance?
Of course; it is also used in general coding. Plus some hand waving.

Live code, code-based interfaces and live patching – theory and practice

Some theory

Not every use of code interaction on stage is an instance of live coding. When I first started working with SuperCollider a decade ago, I didn’t create GUIs. I started and stopped code on stage. I had comments in the code giving me instructions on how to do this. One piece instructed me to count to ten between evaluating code blocks. Another told me to take a deep breath.
Part of this was because I hadn't yet learned how to invoke callback methods or use tasks. Some of it was to create a musical timing: a deep breath is not the same as a two-second pause. This was undoubtedly using code interactions to make music. But in no sense were these programs examples of live coding. Once a block was started, there was no further intervention possible aside from halting execution or turning down a fader to slowly mute the output. These pieces were live realisations of generative music, which means they had virtually no interactivity once started, whether by code or by other means.
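A hypothetical sketch of what that style of piece looked like (not an actual score, just the shape of it):

    // Block 1: evaluate this, then count slowly to ten
    (
    x = { SinOsc.ar([60, 60.3].midicps, 0, 0.1) }.play;
    )

    // Block 2: take a deep breath, then evaluate
    (
    y = { RLPF.ar(BrownNoise.ar(0.1), LFNoise2.kr(0.2).range(200, 800), 0.2) ! 2 }.play;
    )

    // Final block: the only remaining intervention is stopping
    x.release(10); y.release(10);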
There is not a bright line separating code-based interfaces from live coding, but rather a continuum between pieces like the ones I used to write and blank-slate live coding. The greater the interactivity of the code, the farther along this continuum something falls. Levels of interaction could be thought to include starting and stopping, changing parameters on the fly, and changing the logic or signal graph on the fly. Changing logic or the signal graph puts one closer to blank-slate coding than just changing numbers on something while it plays.
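In SuperCollider terms, those levels might look roughly like this (a generic sketch using Ndef, not drawn from any particular piece):

    // Level 1: starting and stopping
    Ndef(\a, { SinOsc.ar(220, 0, 0.1) ! 2 }).play;
    Ndef(\a).stop;

    // Level 2: changing parameters on the fly
    Ndef(\a, { |freq = 220| SinOsc.ar(freq, 0, 0.1) ! 2 }).play;
    Ndef(\a).set(\freq, 330);

    // Level 3: changing the logic or signal graph on the fly
    Ndef(\a, { |freq = 220| RLPF.ar(Pulse.ar(freq, 0.3, 0.1), freq * 4, 0.2) ! 2 });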
This argument does imply a value judgement about authenticity; however, that is not my purpose. Different types of code interaction are better suited to different circumstances. A piece that is more live coded isn't necessarily sonically or objectively better. However, this kind of value judgement is useful in applying the metaphor of live coding to other interactions.
I have been pondering for a while whether or not live synthesiser patching is an analogue form of live coding, a question first posed by Julian Rohrhuber (2011) on the live coding email list. On the one hand, the kind of analogue modules used for modular synthesisers were originally developed for analogue computers. The synthesiser itself is a general-purpose tool for sound, although obviously limited to whatever modules are available (thus putting it somewhere between a chainsaw and an idea (TopLap 2010)). Both computer programs and live patches can quickly grow in complexity to where the performer can no longer comprehend exactly what's happening (Collins 2007).
On the other hand, there is no code. However, I'm not sure how much that matters. A Pd or Max patch created on the fly that creates a signal graph is clearly an example of live coding. If, for some reason, the patching language had hard limits on which unit generators were available and in what quantity, it would still count. The transition from virtual to physical therefore seems small. Instead of focussing on the code itself, then, let's look at the metaphor.
Knob twirling is an example of changing numbers on the fly. Modular synthesisers do contain logic in the form of gates and switches. This logic and the musical signal routing can be changed on the fly via re-patching. Therefore, a live patching performance that contained all of these elements would be an example of analogue live coding.

Gig Report

I very recently did some live patching at the Live Code Festival in Karlsruhe. Alas, this reasoning about what is or is not live coding did not become clear to me until I was reflecting on my performance afterwards. This was the first time I had patched with goals beyond making nice sounds, which meant I was pushing against the places the sounds wanted to settle, and I realised on stage that I was inadequately prepared. Both live coding and live patching share the problem of how to prepare for a show, something they have in common with other forms of improvised or partially improvised music.
I had a conversation with Scott Wilson about how to practise improvised music that has an agenda. I should have spent the few days before the show building patches that use gates to control timbral or graph changes. I should also have practised making graph changes in the middle of playing. Instead, I spent the days beforehand wrestling with problems with Jack on Linux. I use SuperCollider to manage some panning and recording for me and was having tech problems with it. Mixing analogue and digital systems in this way exposes one to the greater inherent instability of computers. I could make my own stereo autopanner with some envelope followers, a comparator and a panner, so I'll be looking into putting something together out of guitar pedals or seeing if there is an off-the-shelf solution available.
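The digital version of that envelope-follower-and-comparator design is only a few lines of SuperCollider. This is a sketch of the idea, not the code I was actually running:

    // sketch: autopan a mono input based on its own envelope
    (
    { var in, env, side;
      in = SoundIn.ar(0);                      // mono input from the synthesiser
      env = Amplitude.kr(in, 0.01, 0.3);       // envelope follower
      side = ToggleFF.kr(env > 0.2);           // comparator: flip sides on each loud attack
      Pan2.ar(in, Lag.kr(side * 2 - 1, 0.5));  // pan left/right with a little lag
    }.play;
    )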
For this performance, I decided to colour-code my cables, following the colour conventions of Sonology in The Hague: black for audio, blue for control voltages and red for triggers. This was so audience members with synthesiser knowledge might be able to at least partly decode my patches. However, this caused a few problems. Normally, I play with cables around my neck, and I'd never before done anything live with cable colours. This time, every time I looked down, I saw only red cables and never actually wanted to use them. For the audience, I tried to make up for the difficulty of seeing a distant synth by using a webcam to project live images of it, but the colour was lost in the webcam's low resolution. People who tried to look at the synth directly would have had trouble perceiving black cables on a black synth. If I do colour coding again, I need to swap the colours I use and not wear them around my neck. A better webcam might also help.
Aside from the low resolution, the webcam part was successful. I also set the program up so that if I pressed certain buttons on my MIDI device, slides of modules would be displayed. So when I patched an oscillator, I pushed the oscillator button on the MIDI controller and a labelled picture of an oscillator appeared in the upper left corner. I didn't always remember to push the buttons, but the audience appreciated the slides, and I may extend this in future with more of my modules (I forgot to do the ring mod module) and also with more combinations, so I would have a slide of two oscillators showing FM and one of three showing chaos.
Sonically, the patching seems to have been a success, although it was not fun to do, because I had an agenda I was trying to push towards but had not rehearsed adequately. I want to spend a lot of time now working this out, getting it right and doing another show, but that was it for now. My next presentation will be all SuperCollider and I need to work on that. I am thinking a lot, though, about what I might do for the Other Minds festival next spring. I wonder if live patching would be adequately ambitious for such a high-profile gig…
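The mechanism is simple; here is a sketch of it, with hypothetical note numbers and a text label standing in for the slide images:

    // sketch: map MIDI buttons to projected module labels
    (
    var slides, win, label;
    slides = IdentityDictionary.new;
    slides.put(36, "Oscillator");
    slides.put(37, "Filter");
    slides.put(38, "Ring Mod");
    win = Window("modules", Rect(100, 100, 600, 200)).front;
    label = StaticText(win, win.view.bounds).font_(Font("Helvetica", 72)).align_(\center);
    MIDIIn.connectAll;
    MIDIFunc.noteOn({ |vel, note|
        { label.string = slides[note] ? "" }.defer;   // GUI updates must happen on AppClock
    });
    )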

Citations

Collins, Nick. “Live Coding Practice,” 2007. The Proceedings of NIME 2007 [E-JOURNAL]
Available at: <http://www.nime.org/2007/proceedings.php> [Accessed: 3 March 2012].
Rohrhuber, Julian. “[livecode] analogue live coding?” 19 February 2011. [Email to livecode list].
Available at: <http://lists.lurk.org/mailman/private/livecode/2011-February/001176.html>
[Accessed: 1 March 2012].
TopLap, “ManifestoDraft.” 14 November 2010. TopLap. [ONLINE] Available at:
<http://toplap.org/index.php/ManifestoDraft> [Accessed: 12 September 2011].

Republic

It's time for everybody's favourite collaborative real-time networked live coding tool for SuperCollider.
Invented by PowerBooks UnPlugged – granular synthesis playing across a bunch of unplugged laptops.
Then some of them started Republic111, which is named for the room number of the workshop where they taught.
Code reading is interesting in network music, partly because of stealing, but also for understanding somebody else's code quickly, or actively understanding it by changing it. Live coding is a public or collective thinking action.
If you evaluate code, it shows up in a history file and gets sent to everybody else in the Republic. You can stop the sound of everybody on the network. All the SynthDefs are saved. People play 'really equally' on everybody's computers. Users don't feel obligated to act, but rather to respond. Participants spend most of their time listening.
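The basic mechanics are easy to sketch with plain OSC. This is a generic illustration, not Republic's actual API, and the peer addresses are hypothetical:

    // sketch: broadcast evaluated code to peers and evaluate whatever arrives
    (
    ~peers = [ NetAddr("192.168.1.12", 57120), NetAddr("192.168.1.13", 57120) ];
    ~shout = { |code|
        code.interpret;                                    // run it locally
        ~peers.do { |p| p.sendMsg("/sharedCode", code) };  // and send the text to everyone else
    };
    OSCdef(\sharedCode, { |msg|
        msg[1].asString.interpret;                         // evaluate code received from the network
    }, "/sharedCode");
    )

    ~shout.("{ PinkNoise.ar(0.05) ! 2 }.play");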
Republic is mainly one big class, which is a weakness; it should be broken up into smaller classes that can be used separately. Scott Wilson is working on a newer version, which is on GitHub. Look up 'The Way Things May Go' on Vimeo.
Graham and Jonas have made a system that shows a map of who is emitting which sound; you can click on a sound and get the Tdef that made it.
Scott is putting out a call for participation and discussion about how it should work.

Cyber-physical programming in Extempore – Andrew Sorensen

Cybernetics: a cyber-physical system is any closed-loop control system.
Cyber-physical programming is one of these, but with a programmer stuck in the loop. The programmer is both above the loop and in the loop: a coupling of human, machine and environment.
By changing the world, we understand the world better.
Real-time, real-time: writing in real time on a real-time system.
A giant radio telescope is in the works in Australia; they will have huge amounts of data to deal with in real time.

Extempore

This is a high-performance computing environment for real-time work. It tries to analyse code to figure out how long it will take to run. It supports embedded computing.
xtlang is a sub-language of this? Everything is hot-swappable.
It's a member of the Lisp family, sort of. Static typing. No garbage collection. It's fast and determinate: it will always run at the same speed every time you run it.
This is a systems language that feels like a dynamic language. You think you're doing Lisp, but it's all cyber-physical.

Questions

How old is this language? 2.5 years.
What did the code look like for the example of its use in the video he showed? There are online examples.
… this talk was somewhat over my head, as are many of the questions….
Is the compiler written in Extempore? Not yet. It's in Scheme right now.

Benjamin Graf – mblght

Lighting people sit behind lighting desks and hit buttons for the duration of concerts, so the lights at shows are usually boring, despite venues having valuable and variable equipment.
Wouldn’t it be great if you could do stochastic lights with envelope controls?
SuperCollider does solid timing, has support for different methods of dispatching things, and has flexible signal routing.
He’s got an object that holds descriptions of the capabilities of any lighting fixture – moving, colour, on, off, etc.
He uses events in the pattern system as one way of changing light states.
He has added light support to the server, so you can do SinOsc control of light changes, sending to control busses. He has also made light UGens.
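Roughly the idea, sketched generically rather than with his actual classes (~sendToFixture is a hypothetical placeholder for whatever actually talks to the DMX interface):

    // sketch: drive a dimmer channel from a control-rate UGen
    (
    ~dimmerBus = Bus.control(s, 1);
    { Out.kr(~dimmerBus, SinOsc.kr(0.25).range(0, 1)) }.play;   // slow sine "LFO" on the light

    // poll the bus and hand the value to the thing that talks DMX
    Tdef(\lightOut, { loop {
        ~dimmerBus.get { |val| ~sendToFixture.value((val * 255).asInteger) };
        0.05.wait;   // roughly 20 updates per second
    } }).play;
    )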
He ended up live coding the lights for a festival.

Questions

What about machine listening? It would be easy to do in this system.
The code is on GitHub.

David Ogborn: EspGrid

In Canada, laptop orchestras get tons of gigs.
Naive sync methods: do redundant packet transmission, i.e. send the same value several times in a row. This actually increases the chance of collision, but probably one copy will get through. Or you can schedule further in advance and schedule larger chunks, so send a measure instead of just sending a beat.
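A sketch of the redundant-send idea in SuperCollider terms (the broadcast address and message name are hypothetical):

    // sketch: naive redundancy, send the same beat message several times
    (
    NetAddr.broadcastFlag = true;                     // allow UDP broadcast
    ~ensemble = NetAddr("255.255.255.255", 57120);    // hypothetical broadcast address and port
    ~sendBeat = { |beat|
        3.do { ~ensemble.sendMsg("/beat", beat) };    // redundant copies: probably one gets through
    };
    )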
Download it from esp.mcmasters.ca. Mac only.
Five design principles:

  • Immediacy – launch it and you’ve got stuff going right away
  • Decentralisation – everything is peer to peer
  • Neutrality – works with ChucK, SuperCollider, whatever
  • Hybridity – they can even use different software on the same computer at the same time
  • Extensibility – it can schedule arbitrary stuff

The grid has public and private parts. EspGrid communicates with other apps via localhost OSC. Your copy of SuperCollider does not talk to the larger network; EspGrid handles all that.
The "private protocol" is not OSC; it is going to use a binary format for transmission. Interoperability is thus based only on the client software, not on the middleware.
Because EspGrid speaks OSC to its clients, it can run on a neighbour's computer and send the OSC messages to Linux users or users of other unsupported OSes.
The program is largely meant to run in the background. You can turn a beat on or off, and this is shared across the network. You can chat. You can share clipboards. Also, ChucK will dump stuff directly.
Arbitrary OSC messages will be echoed out, with a timestamp. You can schedule them for the future.
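From a client's point of view that looks something like this sketch (the port number and OSC addresses are hypothetical placeholders; check the EspGrid documentation for the real ones):

    // sketch: hand an arbitrary message to the local EspGrid and listen for the echo
    (
    ~espGrid = NetAddr("127.0.0.1", 5510);               // hypothetical localhost port
    ~espGrid.sendMsg("/esp/sendOSC", "/myApp/cue", 7);   // hypothetical address: ask the grid to echo this

    OSCdef(\fromGrid, { |msg, time|
        [time, msg].postln;                              // echoed back with a timestamp
    }, "/myApp/cue");
    )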
You can publish papers on this stuff or use it to run tests for papers, e.g. swap sync methods and test which works best.
Reference Beacon does triangulation to figure out latencies.
He wants to add WAN stuff, but not change the UI, so the users won’t notice.

Question

Have they considered client/server topology for time sync? No. A server is a point of failure.
Security implications? He has not considered the possibility of sending naughty messages or how to stop them.
Licence? Some open source one, maybe GPL2. It's on Google Code.

Chad McKinney – Lich.js – A Networked Audio / Visual Live Coding Language

They started with SuperCollider and have gone on from there. He's interested in the recent improvements in browser technologies.
He decided to write a language first as a way to start live coding.
Uses Web Audio and WebGL.
This language is GPL2 and is on GitHub.
If you’re on Chrome, go mess with http://www.chadmckinneyaudio.com/Lich.js/Lich.html

Battery dying

Alex McLean

He did command-line scripts from 2001 to 2004, then started working in Perl, then Haskell.
slub – writing code to make music to drink beer to.
Feedback.pl – writing code to write code to make music to drink beer to
He read Laurie Spiegel's paper on manipulations of musical patterns, which got him into pattern languages (e.g. HMSL, Common Music, SuperCollider).
Tidal is embedded in Haskell and is for pattern manipulation for music: complexity through the combination of simplicity.
Structures could be trees of structures…
So he moved to functions of time. Time is an integer: give a Pattern a time and it gives you back values and the periodicity. This is limited because time is an integer.
Now, he thinks of time as cyclic, with repetition. So make time a floating point instead of an int. But this makes lookups hard.
So he thought of having patterns be either a sequence of discrete things or a signal, which is indexed by a rational number and is non-discrete. However, mixing analogue and digital in one data type is a problem.
So he made separate types, but then this caused massive code inflation.
He’s gone back to one type again.
Haskell is intense…..
and REALLY concise

Questions

Does the system have state? No, which means he can’t even have random numbers.
Is time still a loop in the current version? Notionally, yes. But representationally, it's a number, just in the functions, so it's what you make it.