Collaboration in Pd by Kerry Hagan

She has been collaborating with Miller Puckette since 2014. She is a composer, he is a maths guy. The partnership works well.

She uses Markov chains in everything she writes. Xenakis used stochastic Markov chains with equilibrium states. She uses finite state tables with equilibrium states.
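
For anyone (like me) who needed a refresher, here is a minimal sketch of the idea, not her actual patch: a finite transition table drives a first-order Markov chain, and if you run it long enough the output tends toward the table's equilibrium (stationary) distribution. The states and weights are made up.

```python
import random

# A minimal sketch (not her patch): a first-order Markov chain driven by
# a finite transition table. Run long enough, the proportion of A/B/C in
# the output settles toward the table's equilibrium distribution.
transition_table = {
    "A": [("A", 0.1), ("B", 0.6), ("C", 0.3)],
    "B": [("A", 0.5), ("B", 0.2), ("C", 0.3)],
    "C": [("A", 0.4), ("B", 0.4), ("C", 0.2)],
}

def walk(start, steps):
    state = start
    for _ in range(steps):
        states, weights = zip(*transition_table[state])
        state = random.choices(states, weights=weights)[0]
        yield state

print("".join(walk("A", 40)))
```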

Miller thinks composers want Markov chains in order to get particular non-equilibrium percentages in their outputs. He wants maximally uniform random results.

This created repeating patterns rather than stochastic results. If you use a Fibonacci pattern, however, there is no repeat aside from rhythmic motifs. Irrational ratios create non-repeating patterns.
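
Rough illustration of the rational-vs-irrational point (my own sketch, not Puckette's code): step a phase by a fixed ratio and mark a pulse whenever it wraps. A rational ratio gives a repeating pattern; the golden ratio never repeats exactly.

```python
# Rational ratios repeat, irrational ones don't: accumulate a phase and
# emit a pulse ("x") each time it wraps past 1.
def pulse_pattern(ratio, steps):
    phase, out = 0.0, []
    for _ in range(steps):
        phase += ratio
        out.append("x" if phase >= 1.0 else ".")
        phase %= 1.0
    return "".join(out)

print(pulse_pattern(3 / 4, 32))               # repeats every 4 steps
print(pulse_pattern((5 ** 0.5 - 1) / 2, 32))  # golden ratio: no exact repeat
```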


Puckette calls this z12 or claves.

How do you use this musically? Try speeding it up to the audio rate!

You get a nice stochastic drone!

They used delays and band-pass filters to separate out different lines of material from a single z12.

She does acousmatic music without gesture. This is to surround people with immersive textures.

Now she’s talking about coupled oscillators, which is slightly beyond me. It’s a mass-spring system? It’s nonlinear.

She picked a bunch of cool sounds and had the coefficients ramp from one to another.

rev3~ is a good reverb. It has since been updated, so it really should be rev4~, but anyway.

Oh god the lines on pd patches are yikes.

This is a very fruitful collaboration.

Bowers does a random walk involving weighted bit flipping. This is a bounded walk: lots of small steps, fewer big steps. She’s making a piece with dentist-drill sorts of noises that will play on bone-conduction headphones.
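
My reading of the "weighted bit flipping" walk, sketched below as an assumption rather than Bowers' actual code: each step, every bit of an n-bit value can flip, but higher bits flip with much lower probability. Low bits flip often (small steps), high bits rarely (occasional big jumps), and the n-bit range keeps the walk bounded.

```python
import random

# Bounded random walk via weighted bit flipping (my sketch):
# bit k flips with probability 1/2^(k+1), so small moves dominate.
def bit_flip_walk(value, n_bits=8, steps=100):
    for _ in range(steps):
        for k in range(n_bits):
            if random.random() < 0.5 ** (k + 1):
                value ^= 1 << k
        yield value  # always stays within 0 .. 2^n_bits - 1

print(list(bit_flip_walk(128)))
```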

Q. Z12?

A. They are maximally uniform probabilities. It normalises the ratio to 1, then looks at the ratio of x to y and sends the output most needed to reach the desired ratio.

So if it’s 3 to 4, it will send whichever output is most needed to get the right ratio. With rational numbers the pattern repeats.
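
Here is how I read that answer, as a sketch: given target weights, always emit the output that is currently furthest below its target share. The deficit rule and the code are my guess, not Puckette's implementation, but with a 3:4 weighting it does settle into a repeating, maximally even "claves"-like pattern.

```python
# Maximally uniform outputs for a 3:4 ratio (my sketch of the idea).
def maximally_uniform(weights, steps):
    total = sum(weights.values())
    counts = {k: 0 for k in weights}
    out = []
    for n in range(1, steps + 1):
        # pick the output most "owed": largest deficit against its target share
        pick = max(weights, key=lambda k: n * weights[k] / total - counts[k])
        counts[pick] += 1
        out.append(pick)
    return "".join(out)

print(maximally_uniform({"x": 3, "y": 4}, 28))  # repeats with period 7
```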

Q. ?

A. She got a spatialisation algorithm which is exactly the one I want for panning. Each speaker has an angle. Each signal has an angle and a width. She used a cosine signal to pan it. In order to get rid of the idea of trajectory, she used a sine wave with a variable delay.
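
My guess at the kind of gain law she described, covering just the cosine panning part (speaker angles, the width behaviour and the function name are my assumptions, not her patch):

```python
import math

# Per-speaker gains for a signal with an angle and a width,
# using a cosine taper inside the width and zero outside it.
def speaker_gains(signal_angle, width, speaker_angles):
    gains = []
    for sp in speaker_angles:
        # shortest angular distance between signal and speaker, in degrees
        d = (sp - signal_angle + 180.0) % 360.0 - 180.0
        if abs(d) < width:
            gains.append(math.cos(0.5 * math.pi * d / width))
        else:
            gains.append(0.0)
    return gains

# e.g. 8 speakers in a ring, signal at 30 degrees with a 90-degree width
print(speaker_gains(30.0, 90.0, [i * 45.0 for i in range(8)]))
```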

Q. Can you think of ways to enable other cross discipline collaborations?

A. With students, the challenge is that students don’t respect programmers, and programmers think composing is easy. Therefore successful collaborators have a little knowledge of one field and are experts in the other. This was tricky to manage at IRCAM, for example. Successful collaborators respect each other’s skills. This can lead to issues of authorship.

Q. Why use vanilla?

A. More stable! Also she likes building her own stuff. It’s also more useful for mobile platforms.

Q. Coupled oscillators?

A. They take impulses, like tapping a mass-spring object.
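
To make that concrete for myself, a toy version (assumptions throughout, and linear where her system is nonlinear): two masses on springs, coupled by a third spring, tapped with an impulse and integrated with plain Euler steps.

```python
# Two coupled mass-spring oscillators excited by an impulse "tap" (toy sketch).
def coupled_springs(steps=1000, dt=0.001, k=200.0, coupling=50.0, damping=0.5):
    x1 = v1 = x2 = v2 = 0.0
    v1 += 1.0  # the impulse: tap mass 1
    out = []
    for _ in range(steps):
        a1 = -k * x1 - coupling * (x1 - x2) - damping * v1
        a2 = -k * x2 - coupling * (x2 - x1) - damping * v2
        v1 += a1 * dt; x1 += v1 * dt
        v2 += a2 * dt; x2 += v2 * dt
        out.append(x2)  # "listen" to the second mass
    return out

print(max(coupled_springs()))
```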

Orla Hughes on the animal iPad orchestra

Facilitates musical play between parent and child in advance of music therapy sessions.

It is like a toy, as there is free play.

The target children are aged 3–5. It helps parents and children bond, and kids are already familiar with iPads. MobMuPlat is the iPad tool used: it is free and builds UIs for Pd apps on iOS and Android.

It seems to be an OSC app.
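
Not part of her app, just a sketch of what talking OSC to a Pd patch can look like, using the python-osc package. The port and address are made up; on the Pd side something like [netreceive -u -b 8000] into [oscparse] would unpack the message.

```python
# Send a single OSC message to a Pd patch (port and address are assumptions).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)
client.send_message("/tap", [0.75])  # e.g. a normalised pad position
```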

Her project is a great example of an app built for devices. She tested it by taking it to a parent and child group.

Users did not know what a tuba looked like, so she changed the UI to show pictures of the instruments.

Q. This lecturer has a bee in his bonnet about sampling vs synthesis… Will the use of samples help get kids interested in real instruments? This is one of the great challenges of our time.

A. She thought about the iPad as an instrument in its own right.

Q. Would this work for children and people in extreme old age?

A. Let’s try it out!

Q. How long did this take to develop?

A. Four months.

Q. This is a non-idiomatic use of Pd!

A. Yes.

Simon Kilshaw on libpd

Pd runs inside the Unity game engine. It takes up almost no disk space.

You can attach scripts to an avatar’s feet to get Pd-generated footsteps. That way you don’t need a bunch of footstep samples.

A C# script sends a floating-point value to KalimbaPd, so when the y position crosses a threshold, it sends a bang.

A receive object in Pd gets the float or bang.
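
The gist of the foot-threshold idea, rewritten in Python rather than the C# he showed (the class and threshold value are mine): fire only on the frame where the foot’s y position crosses below the threshold, not on every frame it is low.

```python
# Edge-detect a downward threshold crossing and treat it as a "bang".
class FootstepTrigger:
    def __init__(self, threshold=0.05):
        self.threshold = threshold
        self.was_above = True

    def update(self, y):
        is_above = y > self.threshold
        fire = self.was_above and not is_above  # downward crossing only
        self.was_above = is_above
        return fire  # caller would forward this as a bang to the Pd patch

trigger = FootstepTrigger()
for y in [0.30, 0.20, 0.10, 0.04, 0.02, 0.08, 0.03]:
    if trigger.update(y):
        print("bang at y =", y)
```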

Kalimba lets you look at the patch, but you can’t deploy it. Lib4pd costs money but can be deployed.

He’s shown a theremin-style app using a Leap Motion.

Q. Is there an advantage to doing synthesis instead of samples?

A. Recordings of ambient sound might take hours of disk space.

Q. Are gamers happy with synthetic sounds?

A. Maybe.

Q. Do visual cues help with figuring out the semantic meaning of a sound FX?

A. Probably.

Q. How flexible is this?

A. As flexible as you want. Things can be partly predetermined so you can focus on expressive parameters.

Mitchell Turner

Speaking about his piece Blues for Dublin. Influenced by the Delta Blues, but post-minimalist blues, like Stereolab? He gets bored by Steve Reich.

Works with the electric guitar. Uses a tiny travel guitar. It sounds shit, so he uses a Pd patch to fix it.

He’s got an ABA form. He is describing his guitar playing techniques.

His pieces each solve one problem. His Orbits pieces were live mixes of recorded guitar.

His patch is extremely legible. He is good at encapsulation.

He’s built a wee fixed filter and used some reverb. Then he’s got an FX patcher: a huge mess of connections just to change the amplitudes of the FX.

His template is left audio in, right audio in, amplitude.

He’s got a tap delay line for a stutter delay that’s a bit smeary, but not granular.
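
A bare-bones guess at a "smeary stutter" tap delay (my sketch, not his patch): a circular buffer read at a fixed tap, with feedback so repeats blur into each other rather than chopping cleanly. Delay length, feedback and mix values are invented.

```python
# Single-tap delay line with feedback and a dry/wet mix.
def stutter_delay(signal, delay_samples=2205, feedback=0.6, mix=0.5):
    buf = [0.0] * delay_samples
    write = 0
    out = []
    for x in signal:
        delayed = buf[write]                 # read the tap
        buf[write] = x + delayed * feedback  # write input plus feedback
        write = (write + 1) % delay_samples
        out.append((1.0 - mix) * x + mix * delayed)
    return out

print(stutter_delay([1.0] + [0.0] * 9999)[2205])  # the first echo
```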

Q. Is this a digital guitar pedal?

A. He has not yet gotten a pedal casing.

Live blogging the Pd mini con

Interactive music recordings (à la RjDj) from Keith Hennigan.

1. Write some music. The quality of the music is the most important parameter for a successful piece.

2. Figure out the interaction. What and how you record will be driven by the interaction.

Videogame music is interactive, adaptive or generative. Interactive music has intention on the user’s part. Adaptive music responds to user actions aimed at changing non-musical parameters. Generative music changes on its own initiative. This is all dynamic music.

Adaptive recordings could react to GPS, time of day, etc.

Music could change structurally, so the order ABACD could become ACBDA etc.

Or it could play the sections always in the same order, but with different stems for alternate lyrics, etc.
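
A toy version of the structural idea (section names, stem filenames and the "context" flag are all invented for the example): either reshuffle the section order, or keep the order fixed and swap in alternate stems per section.

```python
import random

# Sections and alternate stems per section (all names are made up).
sections = ["A", "B", "A", "C", "D"]
stems = {s: {"day": f"{s}_day.wav", "night": f"{s}_night.wav"}
         for s in ["A", "B", "C", "D"]}

def playlist(context="night", reshuffle=False):
    order = random.sample(sections, len(sections)) if reshuffle else sections
    return [stems[s][context] for s in order]

print(playlist(reshuffle=True))  # ABACD might come out as ACBDA, etc.
print(playlist(context="day"))   # same order, alternate stems
```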

3. Programming

The platform you use must be distributable to users. Pd runs on everything.

You are likely to be working with stems, because artists often want to control audio quality. Use readsf~ to stream them from disk.

Or you may wish to work with samples in order to pitch shift, etc.

Or take full advantage of the system and do your own synthesis. The more Pd you use, the greater the potential for real-time control and interaction.

4. Create an interface
It must be usable by non-programmers. You can use third-party programs like Unity to create nifty interfaces. This can make apps.

Questions

Q. Why do this?

A. We have the technology for the first time. Dynamic sound is like choose-your-own-adventure music. Consumers want this. They don’t want to always be passive.

Q. Classical music is goal oriented. Will the big challenge be coming up with modular pieces of music?

A. Yes, this majorly increases the composer’s workload. He recommends people give it a go.

Q. Are the transitions what make this music exciting? They are a liminal space.

A. Cue-to-cue transitions are the most difficult to get right. Some game composers have a transition matrix of elements to use at transitional points: change instrumentation, add a stinger, etc.

Q. What makes music good aside from cultural capital?

A. There can be technical issues within an app, such as unintentional glitches. Music can fail on its own terms. Novelty is not enough.

Q. Does using Unity make music expensive to publish?

A. They work with indie devs a lot. Whether that is favourable compared to paying fees per piece is unclear. Maybe find a free kit or work with a designer.

Q. Does this change the definition of authorship of a piece of music?

A. If a piece always sounds like a single piece, then it’s a piece. If it changes unrecognisably, you’ve built an instrument. The intention of the creator also counts.

Musical Interface Design – experience-oriented frameworks

Why define frameworks? It’s established practice in Human-Computer Interaction, and it is useful for designing. They propose heuristics or interaction dimensions.
Existing frameworks have little use in practice. Also, interactive installations are often missing from them (I don’t fully understand this point). Things are very complex and often arbitrary.
He’s showing some cool instruments/installations and asking us to consider the player’s experience.
Their framework is the Musical INterface for User Experience Tracking – MINUET.
It focuses on player experience. It is a design process unfolding in time. It includes DMIs and interactive installations (by which they mean they consider installations, not just instruments).
Design goal: the purpose of the interactions. Design specification: how to fulfil the objectives.
Design goals are about a high-level user story, like a playful experience, or stimulating creativity, education, etc.
Goals can be about people, activities or contexts.
Contexts: musical style, physical environment, social environment.
Case study: Hexagon.
An educational interface to train musical perception.
They wanted to make the interface easy to master but have a high ceiling for the educational bit.
It’s for tonal music….
Speeding through the last slides.

Questions

Maraije (sp) wants to know about player, participant and audience as three user types on a scale, with more or less informed participants. She also wants to know about presentation: how to teach people to use the instrument or environment.
Interesting points!

The prospects for eye-controlled musical performance

Instrument controlled only by rotation of eyeballs!
He’s only found 6 examples of eyeball music.
Eye trackers shine an IR light on your eyeball and then look for a reflection vector with a camera. Calibrate by having users look at spots on a display.
Eyes move to look at objects in peripheral vision. Eyes jump fast.
Data is error-prone. Eyes need visual targets. Eyes make only about 4 movements per second.
Eyes can only move to focus on something, so an eyeball piece needs visual targets.
Audiences have nothing to see with eyeball instruments. Also, eyeball trackers are super expensive.
People have built eyeball instruments for computer musicians or for people with disabilities.
He is showing us documentation of 6 existing pieces.
“It’s sad to see a person [who] is severely disabled” says the presenter, and yet nobody in the documentation looks sad at all….

Questions

Q: Many of the NIME performances were designed for people with disabilities.
A: Ok

Dimensionality and appropriation in digital music instrument design

Musicians play instruments in unexpected ways. So they decided to build something and see what people did with it.
Appropriation is a process in which a performer develops a working relationship with the instrument. This is the exploitation or subversion of the design features of a technology (e.g. turntablism).
Appropriation is related to style. Appropriation is a very different way of working with something.
How does the design of the instrument affect style and appropriation?
They gave 10 musicians a highly constrained instrument to see if they got style diversity. They made a box with a speaker in it, plus a position and pressure sensor. They mapped timbre to pressure and pitch to movement. A control group had only pitch.
People did some unexpected things, like tapping the box, putting the hand over the speaker and licking the sensor (ewww)
The users developed unconventional techniques because of and in spite of constraints
One degree of freedom had more hidden affordances. Two degrees of freedom had only 2 additional variations and no additional affordances.
Users in the one-degree-of-freedom group described it as a richer and more complex device which they had not fully explored. Users of the more complex instrument felt they had explored all options and were upset about the pitch range.
The presenter believes there is a cognitive bandwidth for appropriation. More built-in options limit exploration of hidden affordances.
This was a very information-rich presentation of a really interesting study.

Questions

Q: Is pitch so dominant that it skews everything? What if you did an instrument that did just timbre?
A: Nobody complained about loudness.
Q: If participants were all musicians, did their primary instrument affect their performance?
A: Some participants were NIMErs, others were acoustic players. They’re studying whether backgrounds affected performance style.

Constraining movement for DMI …..something….

Someone is playing a sort of a squeeze box that is electronic, so you can spin the end of it. Plus it has accelerometers. It’s kind of demo-y, but there’s potential there.
This comes from a movement-based design approach. They came up with the movement first. Design for actions.
Movement-based design need not use the whole body. It need not be Kinect-y and non-touch. The material has a physicality.
Now we see a slide with tiny tiny words. It’s about LMA, which is Laban Movement Analysis. For doing a taxonomy of movements. They want to make expressive movement not just for dancers.
Observe movement, explore movement, devise movement, design to support the designed movement. (is this how traditional instrument makers work??)
They analysed a violinist and a no-input mixer. They made a shape-change graph. There is a movement called ‘carving’ which he liked. The squeeze box uses that movement and is called ‘the twister’.
They gave the prototype to a user study and told participants to try moving it (with no sound). Everybody did the carving movement. People asked for buttons and accelerometers (really??)
The demo video was an artistic commission, specifically meant to be demo-y (obvs).

Questions

Laban theory includes ‘effort theory’ about the body. Did the instruments offer resistance?
They looked at interface stiffness, but decided not to focus on it. Effort describes movement dynamics, but is personal to performers.

From Shapes to Bodies: Design for Manufacturing in the Prosthetic Instruments

The Prosthetic Instruments are a family of wearable instruments designed for use by dancers in a professional context. The instruments go on tour without their creators.
The piece was called ‘Les Gestes’ and toured Canada. The instrument designers and composers were from McGill. The choreography and dancing were from a professional dance company in Montreal, Van Grimde Corps Secret.
There were a fuckton of people involved in the production. Lighting designers, costume designers, etc. all had a stake in instrument design.
One of these instruments was in a concert here and was great. It looks like part of a costume.
The three instruments are called spine, ribs and visors. They are hypothetical extensions to the body. Extra limbs for your body. Dancers wear them in performance. They are removable in this context.
Ribs and Visors are extremely similar; they are touch sensitive. The Spine has vertebrae connected by PVC tubing and a PET-G rod.
Professional artistic considerations: durability, usability, backups required, limited funding and timeframes, small-scale manufacturing. How are these stored and transported? What about batteries? Is there anything that needs special consideration or explanation (e.g. how to reboot)?
Collaboration requires iterative design and tweaking.
Bill Buxton talks of the ‘artist spec’, the most demanding standard of design. People have spent years developing a technique, and your tool needs to fit into that technique.

Questions

  • Why mix acrylic and PVC?
    There is a lot of stress on the instruments, so they use tough materials.
  • Can you talk about the dancers’ experiences?
    The dancers did not seek technical knowledge, but they wanted to know how to experience and interact with the instruments. They had preferences for certain instruments.