Putting Ubuntu on Intel Macs

I’ve started putting Ubuntu on 64-bit Intel MacBooks, some of which are 15 years old. It’s gone surprisingly well. Computers that were zippy when they were new return to being a delight. Alas, under-specced Macs remain under-specced. However, all Macs have excellent cameras and audio I/O, so even a sluggish Mac can be very useful for applications like Zoom or even webcasting. Putting Linux on one gives it the security updates you need to confidently go online.

To follow this guide you will need a USB stick (or possibly two that you can plug in at the same time – this can be via a hub) and an ethernet connection to the internet. If you don’t have ethernet access, it’s still possible to get everything working, but you may have to refer to another tutorial for how to install drivers for your wifi.

First, you’ll need to download the latest Ubuntu LTS and use it to make a bootable USB stick. (How to do that is beyond the scope of this tutorial, because you might be on any OS for this step, but you can find a lot of guides.)

Turn the target Mac off, put the USB drive in a USB slot, and turn it back on while holding down the option key. You should get to a menu that includes your USB stick as one of the options.

If you do not see your USB stick, never fear. You will need a second USB stick! Download rEFIt. This will be an IMG disk image. If the bootable disk maker you used previously doesn’t work with IMGs, Raspberry Pi Imager will. In the middle dropdown menu, select “Use custom”, find the img in your download folder, and select your USB stick in the Storage menu.

Raspberry Pi Imager v1.8.5. Operating System: Other specific-purpose OS, Freemium and paid-for OS, Misc utility images, Eras, Use custom
Raspberry Pi Imager

Plug in both USB sticks (you may need a hub), press option while the computer is booting, pick your rEFIt stick in the first menu, and then that stick’s program will present you with a second menu, which should allow you to boot the Ubuntu image.

You’ll eventually get to a screen that asks for your language, then whether you need accessibility help, and then what keyboard you’re using. Be sure to find your keyboard’s country AND specify that it’s a Mac keyboard.

You will get to a screen about getting on the internet. Don’t worry if your wifi isn’t working yet, just plug in the ethernet cable.

You want to do an interactive installation. You want to install optional extras. You want to install proprietary software and drivers and you probably want to reformat the computer’s disk drive. I always accept the default suggestion for how to set up the reformatted drive.

After you finish installing, you will be prompted to reboot and remove your USB sticks. You can also unplug the ethernet cable.

Most people will find that their wifi now works. They need only run a system update and they’re good to go. Congratulations! If you do audio, video, graphics or other production work, you might want to check out the Ubuntu Studio installer to get some nice bundles of software for that.

Some users will find that their wifi does not work and will need to do a few extra steps. Those users should plug the ethernet cable back in.

If your wifi worked when the machine was still running macOS, it can work under Ubuntu. Your Mac probably has a Broadcom chip, and these can sometimes take a few extra steps to sort out. You’ll need to use the terminal.

Start following the first answer to this question: https://askubuntu.com/questions/55868/installing-broadcom-wireless-drivers BUT

If you get to a step that says to run sudo apt install firmware-b43-installer, and you are installing 20.04 (the most recent LTS as of writing), you will get some errors. If you do get errors, never fear. According to this helpful answer, just type:

sudo sed -i 's,https://www.lwfinger.com/b43-firmware/${DOWNLOAD},https://github.com/minios-linux/b43-firmware/releases/download/b43-firmware/${DOWNLOAD},' /var/lib/dpkg/info/firmware-b43-installer.postinst

Then run sudo apt update && sudo apt upgrade

Returning to the main tutorial you’re following: for most users, the next step will be to run sudo apt install linux-firmware. However, some users will find a message saying this is already installed. In that case, run sudo apt reinstall linux-firmware

After you finish installing and are ready to reboot, you can try unplugging the ethernet cable again. This should solve your wifi problems. If not, there is a Q&A section below the first answer at https://askubuntu.com/questions/55868/installing-broadcom-wireless-drivers , which can help you figure out what the issue is.

Do not RSVP to protests

Dear Madam or Sir,

I am writing about your RSVP policy. I’m a member of your organisation and I think it’s great that this group is turning out to protest Trump. I also understand that most of your events run more smoothly when you know who is going to show up.

However, I would like to very strongly encourage you to stop asking for RSVPs for protests. In the current political climate this is extremely unwise. Elderly pensioners holding peace placards are being arrested at their homes post-protest. Imagine how much easier this would be if there were a list of participants! For those of us who might have to return to the US, at a time when senior figures in the administration are calling for all liberal and left activists to be arrested, this also puts participants at risk.

I know that in normal times, running a peaceful and perfectly legal protest aimed at a politician would obviously carry no risk whatsoever. But we are really not in normal times.

I understand your reluctance to put the meetup location in a mass email, but do note that bad actors can easily find it by also RSVPing.

Thank you for your consideration. Please note that this issue is quite urgent and serious.

Best,
Charles

Joining the Greens

But first things first. Here’s an email I’ve just sent to Labour:

To whom it may concern,

I wish to resign my membership. Labour’s scapegoating of both trans people and migrants makes it exceptionally clear that I, a trans migrant, am not welcome. I put up with this for too long out of a fear that the other parties would be worse, but seeing Starmer and now Cooper jumping on the bandwagon of increasingly fascistic messaging is too much. I am ashamed of the party. I’m ashamed that I waited until things got this bad before leaving.

Your online form requires a membership number, which I’m having trouble locating. If you could send me the number or remove me from the party lists, either would be equally helpful and I’d appreciate it.

My name is Charles and my address is [redacted].

Thank you for your aid in this matter. I’ve been a member of Labour since the day I became a citizen and I do sincerely hope that you’re able to sort out your priorities.

Best,

Charles

The issue, of course, is not the messaging, but the material harms. The government is encouraging violence. Indeed, they’re already doing their own arbitrary detentions. In some ways, the UK is running ahead of the US in the race to the bottom. The anti-protesting bill’s criminalisation of trespass puts GRT communities at risk. Their decision that they can strip the citizenship of anyone who might qualify for citizenship in another country makes the citizenship of virtually all migrants, many of their children, and virtually all Jews precarious. The UK has structures in place to round up and expel large populations and, frankly, this does seem like what they may decide to do instead of addressing climate change.

Post-Post Digital Failure: Tuba as HCI Glitch

Kim Cascone wrote in his seminal paper, The Aesthetics of Failure, that “the medium is no longer the message in glitch music: the tool has become the message.” He identifies glitch music as emerging “from the ‘failure’ of digital technology . . . bugs, application errors, system crashes, clipping, aliasing, distortion, quantization noise, and even the noise floor of computer sound cards are the raw materials . . ..” He identifies this as “post-digital”[1].

Vanhanen, also writing about the origins of glitch, describes it as “the unintentional sounds of a supposedly silent medium.” This is “the result of a two-way relationship between hardware/software and the producer (mis)using it.”[2] Shelly Knotts writes about live-code-driven failure as an “inevitability of imperfection.” Error does not generally result from intentional misuse but is a constant possibility which extends to our entire environment. Live code errors make the audience and practitioners aware of the “imperfection of technical systems” even as these systems surround us and we rely on them, making them potentially a critique of liberal technocracy and of capitalism more generally. [3]

However, all of these writers place all of the sound within the machine. Knotts notes that while “a jazz musician [might] suffer a broken string or reed in a performance, it’s unlikely their entire instrument will collapse.” [3] George Lewis, however, draws associations between 1980s computer music, which often did involve setting up systems in front of the audience, and (free / jazz) improvisation.

Lewis wrote that the computer music scene of the 1980s in the San Francisco Bay Area “was also widely viewed as providing possibilities for itinerant social formations that could challenge institutional authority and power.” This music was played in a band setting and formed an improvisational practice, “from a collaborative rather than an instrumental standpoint, negotiating with their machines rather than fully controlling them.” (Unfortunately, this exciting beginning led not only to live coding, but also to “ubiquitous computing”, which lives on as IoT.)[4]

Lewis himself also experimented with the borders of failure with systems coupled with his trombone. When I was at Sonology in 2005-6, Clarence Barlow described a theatre piece that I believe he attributed to Lewis. Lewis was on stage with his trombone and an effects box, but the effects box was not working. A tech came to assist, then another, then another, until a team of engineers disassembled the entire effects unit. While this was happening, Lewis sat down to eat his dinner. The piece ended when the box was completely disassembled. Unfortunately, I can’t find a reference to this piece, although searching for it did lead me to the writing mentioned above.

Lewis’s piece would more traditionally be classed as theatre rather than glitch. At a stretch, one could claim the effects box is misused. The technicians collaborate on “fixing” the box, and the piece becomes a ritual of debugging, a communal, music-not-making practice. In taking a dinner break, Lewis reflects on how tech outages cause work stoppages. In his piece, the tech “failure” causes the entire piece to “collapse”. His trombone is silent.

Domifare is also a brass piece that intentionally incorporates technical error. But, unlike the “post-digital” “glitch” pieces of 25 years ago, the errors don’t lie in mangled sound output, but rather in input failure. When the piece functions, it functions around 20% of the time. Sometimes less. On Monday, it was a lot less. Over 15 minutes, not a single command executed.

By placing the instrument as an input to the REPL loop, it queers the acoustic / digital binary and makes total failure audible. If 2000 was post-digital for Cascone, clearly 25 years later, with low bass, we are post again. Indeed, while my computer had no output, the input was constantly present. Although the system uses the logics of live code, especially ixilang, functionally it bears a lot of similarity to responsive systems, such as Voyager by Lewis[5] or Diamond Curtain Wall by Anthony Braxton[6]. Arguably, it’s a simpler system because the results are deterministic – or are when the system works.

Several years ago, I played a free improv set with others at The Luggage Store Gallery in San Francisco. I brought my laptop and my tuba, with the intention to switch between them part way through the set. As we started, my computer would only make static. Something was wildly wrong and after a few minutes of trying to fix it, I switched to tuba for the remainder of the set. Speaking to others afterwards, Matt Davignon said that he thought the static had been on purpose, and he thought I was “one of those computer musicians.” My old double bass teacher, Damon Smith, put it in a positive, enthusiastic light. “It doesn’t matter, because you have a tuba!” He went on “all computer musicians should have tubas with them!”

When I announced I was giving up on Domifare, Evan Rascob, echoing Smith, called out that I should have just played tuba for a few minutes. I should have.

Although it directly contradicts the TopLap Manifesto regarding backups [7], perhaps Smith is right. All computer musicians should have tubas.

Works Cited

[1] K. Cascone, ‘The Aesthetics of Failure: “Post-Digital” Tendencies in Contemporary Computer Music’, Comput. Music J., vol. 24, no. 4, pp. 12–18, Dec. 2000, doi: 10.1162/014892600559489.

[2] J. Vanhanen, ‘Virtual Sound: Examining Glitch and Production’, Contemp. Music Rev., vol. 22, no. 4, pp. 45–52, Dec. 2003, doi: 10.1080/0749446032000156946.

[3] S. Knotts, A. Hamilton, and L. Pearson, ‘Live coding and failure’, Aesthet. Imperfection Music Arts Spontaneity Flaws Unfinished, pp. 189–201, 2020.

[4] G. Lewis, ‘From Network Bands to Ubiquitous Computing: Rich Gold and the Social Aesthetics of Interactivity’, in Improvisation and Social Aesthetics, G. Born, E. Lewis, and W. Straw, Eds., Improvisation, Community, and Social Practice. Durham: Duke University Press, 2017, pp. 91–109.

[5] G. Lewis, Voyager. 1985.

[6] A. Braxton, Diamond Curtain Wall. 2005.

[7] ‘ManifestoDraft – Toplap’. Accessed: Jul. 16, 2025. [Online]. Available: https://toplap.org/wiki/ManifestoDraft

Domifare at Folklore

Evan Rascob (aka BITPRINT) has been organising regular Live Code gigs at Folklore in Hackney. I played Domifare last night . . . sort of.

I’ve blogged before about pitch recognition being flaky. And it is, but usually within the first three minutes or so, the SuperCollider autocorrelation UGen does actually recognise the pitches and the piece runs.

Not last night. Instead, I spent 15 minutes playing the same four-note phrase over and over and over again, in front of an audience.

What went wrong

  • Normally, when I play this, I have the mic right down in the bell, and it was up slightly higher this time, which may have caused problems.
  • When I practise this, I lip the pitch up or down slightly and this often works. This level of subtlety and control is extremely difficult after several minutes of failure on stage. Instead, my playing got messier and messier over the course of the set.
  • While getting ready, I couldn’t decide whether to use my old mouthpiece or my new one, which is slightly more difficult but gives greater freedom. It didn’t seem to make a difference when I was practising, so I went for the newer, freer one, which might have been a mistake.
  • My sound card’s output was also extremely low, which is a problem I’ve had before with PipeWire. This was concerning during the tech setup, but turned out not to be an issue during the performance.
  • My laptop was sat on a stool in front of me, which was not a distance that worked at all with my glasses. The screen was so blurry that I couldn’t properly tell what notes were arriving.

How to fix it

  • If I need consistent mic placement down in the bell, I should make a mount that goes into the bell. This would be a cork-covered ring with spokes and a mic suspended in the middle.
  • Flucoma would allow me to train a neural net to recognise a series of pitches as a cue. Because the tuba spectrum is weird and the mic is most sensitive at the weird points, I would probably have to do the training on stage. Would this be more tedious than 15 minutes of failed command input? No.
  • Practising this piece is essentially training myself to be decipherable to the algorithm, which is subtly different from normal practice goals or technique. I did not get as much practice as I would have liked. I spent a lot of time building lip strength, with the idea that it would make my notes clearer, but not as much time getting feedback from the autocorrelation algorithm. It may be that more practice with the program would have helped. Or, if the algorithm was confused by background noise or mic placement, perhaps it would have made no difference whatsoever.
  • Taking the bus with a tuba, a laptop, an audio card, cables, a mic, a mic stand and so forth is already a bit much, but it may be the case that I also need a laptop stand so I can ensure my computer is at a height and location where I can see it. Also, my old reading glasses require more and more distance; maybe a laptop on a stool is not a good use for them.

How I dealt with everything

I think my stage presence was fine, actually, except for when I was giving up at the end. I should have launched into a few minutes of solo improv starting from and around the cue phrase. I’m going to practise this a bit, not that I expect the piece to fail like this again.

This was not my first performance of this piece. It went fine when I played it in Austria, 3 years ago.

Well, at least the failure of that piece was all that went wrong

Shelly Knotts and I were also meant to play some MOO, but discovered during the sound check that most of it wasn’t working, so we cut it from the programme.

Audience Reactions

People were generally positive. Multiple people used the word “futility” but with a positive intention. Which goes to show you can’t trust nerds.

To do

  • Incorporate Flucoma
  • Play this on Serpent because it’s more portable and I really do have more freedom of pitch.
Video by Shelly Knotts

Domifare GUI improvements

A SuperCollider GUI window with buttons across the top, a server meter, a large text area, a list to the side, an input bar, a bass clef with nothing following it, four sliders controlling thresholds, and two images containing four bass clef systems between them, labelled “Record Loop”, “Stop”, “Shake” and “Unshake”
The GUI for Domifare

Domifare is back under development because I will be performing with it on Monday evening in London, at Folklore. https://lu.ma/2rkkzmcz

All the improvements thus far have been to the GUI. It’s come a long way, but there are still some persistent bugs: you must resize the window to get the GUI to lay out correctly. The new SuperCollider sclang version is a release candidate right now, so I’m holding off on fixing everything, as I hope the many conflicting GUI methods will be better harmonised in that version.

The project is relying on BiLETools for several of the widgets because I was having problems with EZSliders. Again, after the major version update, I plan to remove this dependency.

The Key class in TuningLib is no longer required. It was always overkill, and some of its functionality has broken in the last three years.

In general, the pitch tracking is working better than it did three years ago, especially the autocorrelation, although there is still a high error rate. The built-in “cheat sheet” makes this much easier to use, although I fear it lets the audience in a bit too much on how simplistic this whole setup is.
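
For the curious, here is a minimal sketch of what pitch-tracked input involves. It uses the standard Pitch and Onsets UGens rather than Domifare’s actual autocorrelation tracker, and the SynthDef name and OSC path are invented for illustration:

(
SynthDef(\domifareInSketch, { |in = 0, thresh = 0.05|
	var sig, freq, hasFreq, onset;
	sig = SoundIn.ar(in);
	// tuba fundamentals sit roughly between 30 and 300 Hz
	# freq, hasFreq = Pitch.kr(sig, minFreq: 30, maxFreq: 300);
	onset = Onsets.kr(FFT(LocalBuf(512), sig), thresh);
	// report a MIDI note number to the language on each detected attack
	SendReply.kr(onset * hasFreq, '/domifare/note', freq.cpsmidi);
}).add;

OSCdef(\domifareNoteWatcher, { |msg|
	msg[3].round.postln; // a language-side parser would match phrases here
}, '/domifare/note');
)

With the server booted, Synth(\domifareInSketch) posts a rounded note number on each detected attack, which is roughly the stream the language parser then has to turn into commands.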

The notation is all generated via MuseScore, saved as SVG and then edited in Inkscape. The version of SC on my computer won’t open SVG files (or at least not Inkscape SVGs), so these are exported to PNG. The lone F clef in the middle of the image adds notes to the right as it recognises them. The language parser does not track octaves, so it displays noteheads inside the lower part of the staff.

Adding octave tracking is only partially fiddly, but doing it properly would entail turning the new class CleffView into a proper notation layout class. That has obvious utility, but it’s more geometry than I want to get into right now.
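
As a rough illustration (and emphatically not the actual CleffView class), a notehead display of this kind can be sketched with a UserView and Pen: draw the staff lines, then append one notehead per recognised note, ignoring octaves just as the parser does:

(
var win, view, notes;
notes = List[];
win = Window("staff sketch", Rect(100, 100, 420, 140)).front;
view = UserView(win, Rect(10, 10, 400, 120));
view.drawFunc_({
	var top = 30, gap = 16;
	// five staff lines
	5.do({ |i| Pen.line(0@(top + (i * gap)), 400@(top + (i * gap))) });
	Pen.stroke;
	// one notehead per recognised note, placed by scale step
	notes.do({ |step, i|
		Pen.fillOval(Rect(50 + (i * 24), top + (4 * gap) - (step * gap / 2) - 5, 14, 10));
	});
});
// pretend the parser has just recognised do-mi-fa-re
notes.addAll([0, 2, 3, 1]);
view.refresh;
)

Turning something like this into a real layout class would mean handling clefs, ledger lines and spacing, which is exactly the geometry in question.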

I may upgrade the variable list from being a BtListView / EZListView to being a stack of ObjectGUIs for the DomifareLoop class. This would be valuable because I could indicate whether they were playing or had been shaken. It could also include their names spelled via CleffViews. Or I could just learn to play directly from solfège.

Another possible future improvement could be the inclusion of Solresol character glyphs, which would have a fun alien vibe. The problems are that (last I looked) there is no nice Linux font that supports these, and that it would require a time investment to be able to play variable names listed that way.

I’ve also misspelled “clef” as “cleff” but a find and replace is giving me crash errors, so idk. Shrug. Sad face. You can see the latest version on GitHub, although this will move to Codeberg hopefully soon.

There’s a very obvious case for integrating Flucoma into this project to recognise gestures. This would also entail building a training interface. The informational webpage for the SuperCollider release candidate specifically mentions that Flucoma does not work with it, so this is also deferred until that gets fixed. I don’t want to have to hold off upgrading SuperCollider, so I cannot create a situation where upgrading breaks this project.

Some of the London live coders (Lu) have expressed enthusiasm for the idea of doing the training as part of the performance. That has precedent in cybernetic pieces like Hornpipe by Gordon Mumma and remains a very good idea, but still not for Monday.

If this sounds cool and you can’t come on Monday, maybe you could let me know of another gigging opportunity in your town that I could play at? I’d like to take this show on the road!

Navigating ScMoo

In my previous post, I described how to install and start the SuperCollider Moo. In this post, I’m going to talk about how to use and move around the Moo.

Note this is an early version and it’s my intention to gradually make the SC Moo align more closely with LambdaMOO commands and syntax. Note also that the database is static. Your changes will be visible to yourself and others while you are logged in, but they will not be saved. If you make something you really like, please keep a local copy of it.

When you log in, you’ll see a lot of messages in the SC Post window, including loads of error messages. Just hold on until the GUI opens. This GUI will have a random colour, but it will have three text areas.

The Moo's GUI. The mouse is clicking and dragging on the gray line between the two panes to change the relative sizes.
The Moo GUI

The left hand side is the Moo’s output. The right hand side and bottom are both input. The right hand side is designed to enable entering code. If you just want to use the Moo as a user, you can make that side smaller by clicking and dragging on the grey line that separates the two sides. Then just enter your Moo commands in the bottom text area, followed by enter or return to evaluate them.

When you log in, your character will connect to the lobby. You will see text describing the room. Below that, you will see a list of objects, if any are present. Below that, a list of other users, if any are present. And finally, a list of exits. If there are no exits, it will say “There is no way out”, because I thought that was funny.

You can look at the room again at any time by typing “look” (without the quotes) and pressing enter in the bottom text area of the window. In the current iteration of the database, the lobby contains a flyer. To look at it, you can type “look flyer” and hit enter. To look at yourself, type “look me”.

The default message for what you look like isn’t very exciting, so you might take a moment to change it. Type ‘ describe me as "My description goes here." ‘ (without the single quotes). Put your description inside double quotes. After you change it, try look me again. This is what others will see when they look at your character.

You can move through the Moo by typing the names of the exits. From the lobby, you can type “north”. This will place you in the bar, which has several objects. You can look at all of them, and some of them have additional verbs – that is, they are interactive. To see the verbs on an object, type “verbs” followed by the name of the object. For example, “verbs cage” will tell you the verbs on the cage. From that, you’ll see that one of the verbs is climb, so try climb cage.
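
Put together, a first session typed into the bottom text area might look something like this (the description text is just an example):

look
describe me as "A quiet figure carrying a tuba."
look me
north
verbs cage
climb cage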

Some of the verbs have audio on them and some don’t. Some of the objects have attached sound and some don’t. Let’s look at the jukebox. The verbs on it aren’t promising. We can’t describe it because we’re not the owner, and it doesn’t seem to have anything about playing it. So let’s put our own audio on it.

Go to the right hand side of the GUI and evaluate the SuperCollider code (Moo.default.me ++ Moo.default.me.location).push; Now we can live code the jukebox, which has been added to the environment as ~jukebox.

We can set a pattern with ~jukebox.pattern = Pbind(); and then play it with ~jukebox.play; The resulting pattern is super boring, so why not modify it? ~jukebox.pattern = Pbind(\freq, 330); You can then live code it as you would with Pbinds or any other kind of pattern. You can set any SynthDef you’d like. As of now, all of these interventions are entirely local, alas, but networking them is coming.
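
Collected in one place, the whole jukebox detour is only a few lines, each evaluated separately in the right hand code pane:

(Moo.default.me ++ Moo.default.me.location).push; // adds ~jukebox (and the other objects here) to the environment
~jukebox.pattern = Pbind(); // attach a very boring default pattern
~jukebox.play; // start it
~jukebox.pattern = Pbind(\freq, 330); // live code it like any other Pbind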

Logging in to SCMoo

Friends, I wrote a Moo in SuperCollider.

Well, it’s partially written. Anyway, the point is that you can log in and play with the database developed by Shelly Knotts.

A proper tutorial will be coming, but between this, the included help files and the GitHub readme, you should have enough to get started. If you experience any problems with this script, please leave a comment or reply.

Before you run it the first time, you must install two Quarks.

Quarks.install("https://github.com/celesteh/BiLETools.git");
Quarks.install("https://github.com/celesteh/SCMoo.git");

This should cause the required Quark, JSONlib, to automatically install.

Then you will need to run all of the following code, which you may wish to save in an scd file. Please edit it so that the variable a gets your name.


(
a = "MyName";
g = "Moo".toUpper;

w = MooWebSocketResponder(a,"UserPassword",g,"GroupPassword", "https://moo.blessing.exchange/osc.html").echo_(true);

s.waitForBoot({
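
	// SynthDefs for the objects in the database (the trapdoor, bats, bar inhabitants and so on)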

	//trapdoor

	SynthDef(\trapdoorCrash, {|out, amp=02, gate=1, pan=0, dur=1|

		var env, noise, chaos, panner, sin, senv, sline, sfreq;

		chaos = EnvGen.kr(Env.perc) * 2;
		noise = Crackle.ar(chaos, amp);
		senv =  EnvGen.kr(Env.perc) * 200;
		sfreq = Rand(60, 80);
		sline = XLine.kr(sfreq, 50, 1);
		sin = SinOsc.ar(sline + senv) / 2;
		env = EnvGen.kr(Env.perc, doneAction:2);
		panner = Pan2.ar(noise + sin, pan, env);
		Out.ar(out, panner);
	}).add;



	// 7 midi
	SynthDef(\trapdoorSines, {|midinote, amp, dur, gate=1, out|
		var sines, env, filter;

		env = EnvGen.kr(Env.asr, gate, doneAction:2);
		sines = Splay.ar(
			[0.5, 1, 2, 4].collect({|i|
				[
					SinOsc.ar(((midinote - 7).midicps *i), 0, amp),
					SinOsc.ar(((midinote).midicps *i), 0, amp),
					SinOsc.ar(((midinote + 7).midicps *i), 0, amp)
				]
		}).flat, 1, env);
		filter = BBandPass.ar(sines, 500);
		Out.ar(out, filter);
	}).add;

	SynthDef(\trapdoorOpen, {|out, amp=02, gate=1, dur=1|
		var pos, saw, env, freq, panner;

		pos = SinOsc.ar(2/dur);
		freq = (pos * 150) + 150;
		saw = Saw.ar(freq, amp);
		env = EnvGen.kr(Env.asr, gate, doneAction:2);
		panner = Pan2.ar(saw, pos, env);

		Out.ar(out, panner);
	}).add;

	SynthDef(\trapdoorSplash, {|out, amp=02, gate=1, dur=1, pan=0|

		var noise, panner, env, fenv, filter;

		noise = WhiteNoise.ar(amp*2);
		fenv = (EnvGen.kr(Env.perc(releaseTime:dur)) * 400) + XLine.kr(800, 200, dur);
		filter = RLPF.ar(noise, fenv);
		env = EnvGen.kr(Env.adsr, gate, doneAction:2);
		panner = Pan2.ar(filter, pan, env);

		Out.ar(out, panner);
	}).add;


	// bats

	SynthDef(\batAttack, {|freq, amp, dur, gate=1, pan, out|

		var sin, panner, env, pmenv, pm;

		pmenv = (EnvGen.kr(Env.adsr) * pi) + (pi/3);
		pm= SinOsc.ar(freq * (37/41), 0, pmenv);
		sin = SinOsc.ar(freq, pm, amp);
		env = EnvGen.kr(Env.adsr, gate, doneAction:2);
		panner = Pan2.ar(sin, pan, env);
		Out.ar(out, panner);
	}).add;

	SynthDef(\batSing, {|freq, amp, dur, gate=1, pan, out|

		var trig, vosim, panner, filter, env;

		trig = Impulse.ar(freq/2, 0.1, EnvGen.kr(Env.asr, gate));
		vosim = VOSIM.ar(trig, freq*3);
		filter = BPF.ar(vosim, freq*2);
		env = EnvGen.kr(Env.triangle(dur), doneAction:2);
		panner = Pan2.ar(filter, pan, env);
		Out.ar(out, panner)
	}).add;

	SynthDef(\batPing, {|freq, amp, dur, gate=1, pan, out|

		var sin, env, panner;

		sin = SinOsc.ar(freq, 0, amp*2);
		env = EnvGen.kr(Env.perc, doneAction:2);
		panner = Pan2.ar(sin, pan, env);
		Out.ar(out, panner);
	}).add;

	SynthDef(\bass, { |out=0,amp=0.1,sustain=0.2,freq=200,fb=0, room=3, mix=0.5, res=0, nois=0.2, trem_freq=4, depth=0.9, rel=0.1, att=0.01, frange=50, del=0.05, comb=0.2, freq_n=3, width=1.0, dec=0.01|
		var snd, env, ctrl;

		//ctrl = ;
		snd = Saw.ar([freq, freq+10], 1).tanh; // * LFNoise1.kr(trem_freq).range(depth, 1);
		snd = snd + Pulse.ar(freq*0.5, 0.6).dup;
		snd = snd + Pluck.ar(WhiteNoise.ar(1), 1, freq.reciprocal, freq.reciprocal, 10, 0);
		// SinOscFB.ar([freq, freq + 10], fb,1).tanh * LFNoise1.kr(trem_freq).range(depth, 1);
		// snd = snd + BrownNoise.ar(nois).tanh;
		// snd = snd * (Crackle.ar(LFNoise1.kr(3).range(1.0, 2.0)) * 0.3).tanh;
		snd = RLPF.ar(snd, freq + 100, 0.8);
		snd = FreeVerb.ar(snd, mix, room).tanh;
		// snd = DFM1.ar(snd, freq, res);
		env = EnvGen.ar(Env.perc(att,rel),doneAction:2);
		// env = EnvGen.ar(Env.linen(att,sustain, rel),doneAction:2);
		OffsetOut.ar(out, snd.dup * env * amp);
	}).add;

	SynthDef(\whale, { |out=0,amp=0.1,sustain=0.01,freq=200,fb=0, room=15, mix=0.8, res=0, nois=0.5, trem_freq=4, depth=0.9, rel=0.5, att=0.1, frange=50, del=0.05, comb=10, freq_n=3|
		var snd, env, ctrl;

		//ctrl = ;
		// snd = Formants.ar(LFNoise1.kr([freq_n, freq_n+1, freq_n-1, freq_n+2]).range(freq, freq+frange) * [1, 1.1, 1.2, 1.3], Vowel([\e, \o, \u], [\alto, \tenor])) * 3; // * LFNoise1.kr(trem_freq).range(depth, 1);

		snd = Splay.ar(SinOscFB.ar([freq, freq + 10, freq + 20, freq + 30], fb,1).tanh);
		snd = Splay.ar(PitchShift.ar(snd, 0.2, LFNoise1.kr(0.2).range(1, Array.fill(4, { rrand(0.5, 0.2)} )))).tanh;
		snd = DelayC.ar(RLPF.ar(snd, Rand(100, 3000), 0.03), 1, 1 / (2), 1, snd * 0.5);
		// snd = BrownNoise.ar(nois).tanh;
		// snd = snd * (Crackle.ar(LFNoise1.kr(0.7).range(1.0, 2.0)) * 0.3).tanh;
		// snd = snd * (PinkNoise.ar(LFNoise1.kr(0.3).range(1.0, 2.0)) * 0.3).tanh;
		// snd = CombC.ar(snd, 0.1, LFNoise1.kr(comb).range(0.03, 0.1));
		snd = FreeVerb.ar(snd, mix, room).tanh;
		// snd = DFM1.ar(snd, freq, res);
		env = EnvGen.ar(Env.linen(att,sustain, rel),doneAction:2);
		OffsetOut.ar(out, snd.dup * env * amp);
	}).add;

	SynthDef(\lake_eels, { |out=0,amp=0.1,sustain=0.01,freq=200,fb=0, room=3, mix=0.5, res=0, nois=0.2, trem_freq=4, depth=0.9, rel=0.5, att=0.1, frange=50, del=0.05, comb=0.3, freq_n=3|
		var snd, env, ctrl;

		//ctrl = ;
		snd = SinOsc.ar([440, 442] * SinOsc.ar(LFTri.kr(0.5).range(5, 50), 0, LFTri.kr(0.4).range(5, 50)), 0, 1); // * LFNoise1.kr(trem_freq).range(depth, 1);

		// SinOscFB.ar([freq, freq + 10], fb,1).tanh * LFNoise1.kr(trem_freq).range(depth, 1);
		snd = snd + BrownNoise.ar(nois).tanh;
		// snd = snd * (Crackle.ar(LFNoise1.kr(3).range(1.0, 2.0)) * 0.3).tanh;
		snd = CombC.ar(snd, 0.1, LFNoise1.kr(comb).range(0.03, 0.1));
		snd = RLPF.ar(snd, LFSaw.kr(0.3).range(2000, 500));
		snd = Mix.ar(FreeVerb.ar(snd, mix, room)).tanh;
		snd = Pan2.ar(snd, SinOsc.ar(LFTri.kr(0.1).range(1, 3)).range(-1, 1));
		// snd = DFM1.ar(snd, freq, res);
		env = EnvGen.ar(Env.linen(att,sustain, rel),doneAction:2);
		OffsetOut.ar(out, snd * env * amp);
	}).add;

	\Formants.asClass.notNil.if({
		"""
SynthDef(\witch, { |out=0,amp=0.1,sustain=0.01,freq=200,fb=0, room=3, mix=0.5, res=0, nois=0.2, trem_freq=4, depth=0.9, rel=0.5, att=0.1, frange=50, del=0.05, comb=10, freq_n=3, mult=1|
var snd, env, ctrl;

//ctrl = ;
snd = Formants.ar(LFNoise1.kr([freq_n, freq_n+1, freq_n-1, freq_n+2]).range(freq, freq+frange) * mult, Vowel([\e, \o, \u], [\alto, \soprano])) * 3; // * LFNoise1.kr(trem_freq).range(depth, 1);
snd = snd + SinOsc.ar([freq, freq*1.05] * SinOsc.ar(LFSaw.kr(0.5).range(50, 20), 0, LFSaw.kr(0.4).range(20, 50)), 0, 1);

// SinOscFB.ar([freq, freq + 10], fb,1).tanh * LFNoise1.kr(trem_freq).range(depth, 1);
// snd = snd + BrownNoise.ar(nois).tanh;
// snd = snd * (Crackle.ar(LFNoise1.kr(3).range(1.0, 2.0)) * 0.3).tanh;
// snd = CombC.ar(snd, 0.1, LFSaw.kr(comb).range(0.05, 0.01));
snd = FreeVerb.ar(snd, mix, room).tanh;
// snd = DFM1.ar(snd, freq, res);
env = EnvGen.ar(Env.linen(att,sustain, rel),doneAction:2);
OffsetOut.ar(out, snd.dup * env * amp);
}).add;
""".interpret;
	});


	SynthDef(\spider, { |out=0,amp=0.1,sustain=0.01,freq=200,fb=0, room=3, mix=0.5, res=0, nois=0.2, trem_freq=4, depth=0.9, rel=0.5, att=0.1, frange=50, del=0.05, comb=0.2, freq_n=3, width=1.0|
		var snd, env, ctrl;

		//ctrl = ;
		snd = Pulse.ar([freq, freq+10], LFTri.kr(1).range(0, width), 1).tanh; // * LFNoise1.kr(trem_freq).range(depth, 1);

		// SinOscFB.ar([freq, freq + 10], fb,1).tanh * LFNoise1.kr(trem_freq).range(depth, 1);
		// snd = snd + BrownNoise.ar(nois).tanh;
		// snd = snd * (Crackle.ar(LFNoise1.kr(3).range(1.0, 2.0)) * 0.3).tanh;
		snd = CombC.ar(snd, 0.1, LFNoise1.kr(comb).range(0.03, 0.1));
		snd = FreeVerb.ar(snd, mix, room).tanh;
		// snd = DFM1.ar(snd, freq, res);
		env = EnvGen.ar(Env.perc(att,sustain),doneAction:2);
		// env = EnvGen.ar(Env.linen(att,sustain, rel),doneAction:2);
		OffsetOut.ar(out, snd.dup * env * amp);
	}).add;


	\Formants.asClass.notNil.if({

		"""
SynthDef(\ghosts, { |out=0,amp=0.1,sustain=0.01,freq=200,fb=0, room=3, mix=0.5, res=0, nois=0.2, trem_freq=4, depth=0.9, rel=0.5, att=0.1, frange=50, del=0.05, comb=10, freq_n=3, cfreq=3|
var snd, env, ctrl;

//ctrl = ;
snd = Formants.ar(LFNoise1.kr([freq_n, freq_n+1, freq_n-1, freq_n+2]).range(freq, freq+frange) * [1, 1.1, 1.2, 1.3], Vowel([\e, \o, \u], [\alto, \tenor])) * 3; // * LFNoise1.kr(trem_freq).range(depth, 1);

// SinOscFB.ar([freq, freq + 10], fb,1).tanh * LFNoise1.kr(trem_freq).range(depth, 1);
snd = snd + BrownNoise.ar(nois).tanh;
snd = snd * (Crackle.ar(LFNoise1.kr(cfreq).range(1.0, 2.0)) * 0.3).tanh;
snd = CombC.ar(snd, 0.1, LFNoise1.kr(comb).range(0.03, 0.1));
snd = FreeVerb.ar(snd, mix, room).tanh;
// snd = DFM1.ar(snd, freq, res);
env = EnvGen.ar(Env.linen(att,sustain, rel),doneAction:2);
OffsetOut.ar(out, snd.dup * env * amp);
}).add;
""".interpret;
	});


	SynthDef(\dog, { |out=0,amp=0.1,sustain=0.01,freq=440,fb=0, room=3, mix=0.5, res=0, nois=0.5, trem_freq=4, depth=0.8, rel=0.5, att=0.1|
		var snd, env, ctrl;

		//ctrl = ;
		snd = SinOscFB.ar([freq, freq + 10], fb,1).tanh * LFNoise1.kr(trem_freq).range(depth, 1);
		// snd = snd + BrownNoise.ar(nois).tanh;
		// snd = snd * (Crackle.ar(LFNoise1.kr(3).range(1.0, 2.0)) * 4).tanh;
		snd = FreeVerb.ar(snd, mix, room).tanh;
		// snd = DFM1.ar(snd, freq, res);
		env = EnvGen.ar(Env.linen(att,sustain, rel),doneAction:2);
		OffsetOut.ar(out, snd.dup * env * amp);
	}).add;

	SynthDef(\bar1, {|out=0,amp=0.1,sustain=0.01,freq=440,fb=0, room=3, mix=0.5, res=0.99, nois=0.5, trem_freq=4, depth=0.8, rel=0.5, att=0.1|
		var snd, env, ctrl;

		//ctrl = ;
		snd = SinOscFB.ar([freq, freq], fb, 1).tanh;
		// snd = snd + BrownNoise.ar(nois).tanh;
		// snd = snd * (Crackle.ar(LFNoise1.kr(3).range(1.0, 2.0)) * 4).tanh;
		snd = FreeVerb.ar(snd, mix, room).tanh;
		// snd = DFM1.ar(snd, freq, res);
		env = EnvGen.ar(Env.perc(att,sustain),doneAction:2);
		OffsetOut.ar(out, snd.dup * env * amp);
	}).add;

	SynthDef(\mocktail, { |out=0,amp=0.1,sustain=0.01,freq=440,fb=0, room=3, mix=0.5, res=0.99, nois=0.5, trem_freq=4, depth=0.8, rel=0.5, att=0.1|
		var snd, env, ctrl;

		//ctrl = ;
		snd = SinOscFB.ar([freq], fb,1).tanh * LFNoise1.kr(trem_freq).range(depth, 1);
		// snd = snd + BrownNoise.ar(nois).tanh;
		snd = snd * (Crackle.ar(LFNoise1.kr(3).range(1.0, 2.0)) * 4).tanh;
		snd = snd + Dust.ar(10);
		// snd = Decay2.ar(snd, 0.01, 0.1, WhiteNoise.ar);
		snd = DelayN.ar(snd, 0.2, 0.2, 1, snd);
		// snd = DFM1.ar(snd, freq, res);
		snd = snd * (Crackle.ar(LFNoise1.kr(10).range(1.0, 2.0)) * 4).tanh;
		snd = FreeVerb.ar(snd, mix, room).tanh;
		// snd = DFM1.ar(snd, freq, res);
		env = EnvGen.ar(Env.linen(att,sustain, rel),doneAction:2);
		OffsetOut.ar(out, snd.dup * env * amp);
	}).add;

	SynthDef(\cage, {|out=0,amp=0.1,sustain=0.01,freq=440,fb=0, room=3, mix=0.5, res=0.99, nois=0.5, trem_freq=4, depth=0.8, rel=0.5, att=0.1|
		var snd, env, ctrl;

		//ctrl = ;
		snd = SinOscFB.ar([freq, freq], fb, 1).tanh;
		// snd = snd + BrownNoise.ar(nois).tanh;
		// snd = snd * (Crackle.ar(LFNoise1.kr(3).range(1.0, 2.0)) * 4).tanh;
		snd = FreeVerb.ar(snd, mix, room).tanh;
		// snd = DFM1.ar(snd, freq, res);
		env = EnvGen.ar(Env.perc(att,sustain),doneAction:2);
		OffsetOut.ar(out, snd.dup * env * amp);
	}).add;


	\Formants.asClass.notNil.if({
		"""
SynthDef(\barperson, { |out=0,amp=0.1,sustain=0.01,freq=200,fb=0, room=3, mix=0.5, res=0, nois=0.5, trem_freq=4, depth=0.9, rel=0.5, att=0.1, frange=50, del=0.08|
var snd, env, ctrl;

//ctrl = ;
snd = Formants.ar(LFNoise1.kr(5).range(freq, freq+frange), Vowel([\e, \o], [\alto, \tenor])) * 2 * LFNoise1.kr(trem_freq).range(depth, 1);

// SinOscFB.ar([freq, freq + 10], fb,1).tanh * LFNoise1.kr(trem_freq).range(depth, 1);
// snd = snd + BrownNoise.ar(nois).tanh;
snd = snd * (Crackle.ar(LFNoise1.kr(3).range(1.0, 2.0)) * 0.3).tanh;
snd = CombC.ar(snd, 0.1, del);
snd = FreeVerb.ar(snd, mix, room).tanh;
// snd = DFM1.ar(snd, freq, res);
env = EnvGen.ar(Env.linen(att,sustain, rel),doneAction:2);
OffsetOut.ar(out, snd.dup * env * amp);
}).add;
""".interpret;
	});


	s.sync;

	//w.getJSON({|json| json.debug("command line"); j = json; "JSON retrieved".postln});
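	// Fetch the database JSON, open the network connection and log in,
	// then attach patterns to a number of objects from the database.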

	AppClock.sched(0, {
		w.getJSON({|json| json.debug("command line");

			AppClock.sched(0.5, {
				n = NetAPI.other(a, g, path:w);

				{
					4.wait;
					m = Moo.login(n, json, \parseText, rest:0.03);
					AppClock.sched(1, {
						m.gui.fontSize = 14;
						{
							var spider, ghosts, witch, lake, barperson, cage, mocktail, dog;

							spider = m[5929];
							ghosts = m[7668];
							witch = m[6360];
							lake = m[8829];
							barperson = m[6832];
							cage = m[5746];
							mocktail = m[4203];
							dog = m[3486];

							barperson.stop;
							barperson.pattern_(Pbind(\instrument, \barperson,
								\dur, barperson[\dur],
								\degree, Prand([1, 4, 2], 5) * (((Pfunc({barperson[\degree].value}) + 1)).round(1)),
								// \legato, 0.001,
								\scale, Scale.minor,
								\att, barperson[\att],
								\rel,  barperson[\rel],
								\room, 0.5,
								\octave, Prand([5, 4], inf),
							));

							cage.stop;
							cage.pattern_(Pbind(\instrument, \cage,
								\dur, Pfunc({cage[\dur].value}) * Pstutter(8, Pseq([1, 0.8, 0.6],inf), inf),
								\degree, (Pseq([0, 1, 2, 3, 4], 5) + Pstutter(5, Pseq([0, 3, 6, 9],inf), inf)) + (((Pfunc({ cage[\degree].value}) + 1)).round(1)),
								// \legato, 0.001,
								\scale, Scale.minor,
								\att, cage[\att],
								\rel, cage[\rel],
								\fb, cage[\fb],
								\room, cage[\room],
								\octave, Pstutter(4, Pseq([5, 6, 7], inf), inf),
							));

							mocktail.stop;
							mocktail.pattern_(Pbind(\instrument, \mocktail,
								\dur, mocktail[\dur],
								\degree, Pseq([1, 2, 7, 5, 2, 4], 25) * (((Pfunc({ mocktail[\degree].value}) + 1)).round(1)),
								// \legato, 0.001,
								\scale, Scale.minor,
								\att, mocktail[\att],
								\rel, mocktail[\rel],
								\fb, mocktail[\fb],
								\room, 0.5,
								\octave, Pstutter(4, Pseq([6, 5, 4], inf), inf),
							));

							dog.stop;
							dog.pattern_(Pbind(\instrument, \dog,
								\dur, dog[\dur],
								\degree, Pseq([1, 4, 2], 2) * (((Pfunc({dog[\degree].value}) + 1)).round(1)),
								// \legato, 0.001,
								\scale, Scale.minor,
								\att, dog[\att],
								\rel,  dog[\rel],
								\fb, dog[\fb],
								\room, 0.5,
								\octave, Pstutter(3, Pseq([5, 4], inf), inf),
							));


							spider.stop;
							spider.pattern_(Pbind(\instrument, \spider,
								\dur, spider[\dur],
								\degree, Pseq((0..8), 3) + Pstutter(5, Pseq([2, 4, 6, 8], inf), inf) + (((Pfunc({spider[\degree].value}) + 1)).round(1)),
								// \legato, 0.001,
								\scale, Scale.minor,
								\freq_n, spider[\freq_n],
								\att, spider[\att],
								\rel,  spider[\rel],
								\fb, spider[\fb],
								\room, spider[\room],
								\octave, spider[\oct],
							));


							ghosts.stop;
							ghosts.pattern_(Pbind(\instrument, \ghosts,
								\dur, ghosts[\dur],
								\degree, Pseq([1], 1) * (((Pfunc({ghosts[\degree].value}) + 1)).round(1)),
								// \legato, 0.001,
								\scale, Scale.minor,
								\freq_n, ghosts[\freq_n],
								\att, ghosts[\att],
								\rel,  ghosts[\rel],
								\nois, ghosts[\nois],
								\fb, ghosts[\fb],
								\room, ghosts[\room],
								\cfreq, ghosts[\cfreq],
								\frange, ghosts[\frange],
								\comb, ghosts[\comb],
								\octave, ghosts[\oct], //Pseq([4], inf),
							));


							witch.stop;
							witch.pattern_(Pbind(\instrument, \witch,
								\dur, witch[\dur],
								\degree, (Pseq([5, 4, 3, 2, 1], 2) + Pstutter(5, Pseq([0, -1], inf), inf)) * (((Pfunc({witch[\degree].value}) + 1)).round(1)),
								// \legato, 0.001,
								\scale, Scale.minor,
								\freq_n, witch[\freq_n],
								\att, witch[\att],
								\rel,  witch[\rel],
								\sustain, 0.5,
								\fb, witch[\fb],
								\room, 0.5,
								\octave, Pseq([4], inf),
							));

							lake.stop;
							lake.pattern_(Pbind(\instrument, \lake_eels,
								\dur, lake[\dur],
								\degree, Pseq([1], 2) * (((Pfunc({lake[\degree].value}) + 1)).round(1)),
								// \legato, 0.001,
								\scale, Scale.minor,
								\freq_n, lake[\freq_n],
								\att, lake[\att],
								\rel,  lake[\rel],
								\fb, lake[\fb],
								\room, 0.5,
								\octave, Pseq([4], inf),
							));



						}.fork;
					}, nil);

				}.fork

			}, nil);

		});
	}, nil)

});


)


//n = NetAPI.other(a, g, path:w, joinAction:{"Join the Moo now".postln});
//m = Moo.login(n, j, \parseText);

I met my MP today

This was in the works for a while. I wrote my MP a few weeks ago in response to the Trans Actual campaign to ensure that every MP has met a trans person. They suggested that I should bring my spouse, so we logged into Teams together for a 20 minute chat.

When I asked for the meeting, I had a couple of topics in mind, but the recent Supreme Court decision superseded them, so instead I talked about how trans rights is a generational issue – most young people do know a trans person and they’re not going to forgive Labour if it doesn’t act to protect their friends.

More important than the content of the meeting was that it happened. Our MPs need to hear from us. Trans people, especially, but also allies.

If you’re trans, please both write and ask for a meeting. If you’re cis, please write. (Example scripts here and here, but do use at least some of your own words. Find email addresses here.)

Indeed, if you’re cis, don’t just write your MP, but please also check in with your trans friends.

And check in with your cis friends. Make sure they also know you support trans rights. Ask them to write their MPs also.

Even More Asks

The most important thing I’m asking for is to write an email. It can be super short, just be clear that you support trans rights and want legislative action to protect them.

But, alas, in this climate, bigots are emboldened, so one important part of allyship needs to include helping de-embolden them. If somebody keeps saying “men and women” to mean people, keep adding “and enbies”. This is slightly awkward, but it’s less awkward for you than it would be for an enby. You’re doing G-d’s work.

Talk shit about transphobia. Don’t leave people guessing where your alliances lie. Correct transphobic terminology. People are not “biological” or not. There are women: trans and cis. Men: trans and cis. And non-binary people (sometimes known as “enbies”). Use this terminology. Gently correct others by repeating what they just said back to them with the right wording. Remember that the more polite thing to do is to speak up.

If you witness transphobia impact a person, don’t stay quiet or do nothing, but act on the 5Ds. Most people subject to random transphobic harassment are cis, so watch out for your lesbian friends and others that bigots mistake for trans people.

As far as non-random harassment goes, it tends to be minor, but ongoing. It’s death by a thousand cuts, but your speaking up can make a real difference. Repeat back what was just said but with the right name and pronouns. Disagree with troublesome assertions. Be clear that terfs are weird and don’t speak for you or your friends.

Toilets

Unfortunately, the Supreme Court ruling may create problems for people who just need to relieve themselves. When you are using the facilities, please be aware of vibes and vulnerable users. The advice below is for cis (passing) people.

Disabled / ungendered loos

Anyone who wishes to use these spaces should be allowed to. While people with disabilities do get first crack and get to skip queues, nobody should be challenged on seeking to use this space.

If you witness a challenge, speak up and remind the challenger that the loos are for everybody.

Similarly, nobody should be forced to use these loos if they assert that they belong in the women’s or the men’s.

Women’s loos

If somebody is too interested in another person or the vibes seem off, be aware. Transphobes will sometimes try to get another person to speak, in order to judge their vocal register. If somebody in the queue asks another person in the queue a question, be ready to answer it yourself. Disrupt their attempts to sus out who might be trans.

Men’s loos

If you are in the loo and you notice somebody who looks like they might not be normally a user of the men’s room or like they might be vulnerable in some way, let them use the loo without being weird at them, but also be aware of vibes. Are other people in there being weird or creepy? Does the situation seem like it might be risky or just feel off? Linger by combing your hair or handwashing or whatever until the other user gets out ok.

If someone is forced to use the wrong loo

Even with the court ruling, everyone has the right to use the loo which best fits their identity. However, in practice, you may witness somebody forced into the wrong space. You need to make sure they get out ok. If they’re forced into non-gendered loos, they’re still at risk of harassment, so keep an eye out. If they’re forced into the men’s, somebody friendly must accompany them.

The ideal people to accompany them into the men’s are women and non-binary people. Strangers can and should volunteer for this.

If only men are available, their escort really ought to be somebody they know who is not creepy. It can be more than one person, especially if the first person to volunteer seems weird for any reason.

If you are a strange man, this is socially tricky. Ask the person how and if you can help. Do they want you to ask other users to leave so they can have the loo to themselves? Do they want you to wait for them outside? Do you know where there’s a different toilet nearby?

Ideally, in a case where things seem off around the use of loos or changing rooms, you can hopefully see off trouble before it starts. In case you can’t, be aware of bystander training.

What’s the deal with mumsnet

I wrote this and posted it to Mastodon less than a fortnight ago. Here it is again on my blog, with some minor edits and updated with recent events.

What is mumsnet and how did it get like this

Somebody may have written a proper paper on this, but I want to speculate briefly on how mumsnet happened, which hopefully might point towards preventative measures.

Cis Britons know mumsnet as a webforum for mothers. However, over the last 20 years, it has gradually turned into a highly influential transphobic hate site, while also still being a major, mainstream site, especially for new mothers.

The site launched 25-ish years ago, when webforums were still exciting. This was during the digital divide era, so the initial culture was established by middle class women.

It exists partly as a response to some cultural problems in the UK. The thing about England is that people here hate children. Loathe them. This is not a joke. Indeed, even more than people hate children, they abhor babies. At most venues, indoor smoking is more welcome than babies. There aren’t really that many places that people staying home with new babies can go.

Also, despite national myths, England is as sexist as America. New mums are specifically vulnerable to this. They’re cut off from their normal support networks. For natal mothers, embodied aspects of childbirth also bring the full weight of “women are icky” down upon their heads.

So, are you feeling lonely, isolated, disempowered, at odds with your husband who suddenly has some traditional ideas about nappy changing or whatever? Well, here’s a place where you can go to talk to other people just like yourself! Other lonely, aggrieved, relatively wealthy, entitled (but suddenly cut off from various forms of power) people, trying to figure out how to navigate a very hostile system. Share stories! Trade tips! Organise!

Politicians started having to go talk to mumsnet to win voters. This impacted policy. Political organisation on the forums worked!

This is rather a lot for a chat board for mothers, which, by design, was never intended to address parenthood (let alone personhood) more generally. For example, it’s not intended for men on paternity leave, who also have the same isolation and hostile systems (although not the same kind of sexism). People of other genders who have given birth have to deal with a highly gendered space if they want to join. Adoptive mums have their own battles on the platform. And it only solves part of the problem it’s tackling: people meeting up online are still isolated from face to face connections.

However, the evolution from social space to political space also created additional problems which the site did not address. Decades ago, Jo Freeman wrote ‘The Tyranny of Structurelessness‘ which describes how consciousness raising groups (women’s mutual support groups) did not transfer well to being political groups, because political groups need to be organised in ways that acknowledge and account for power relationships. Mutual support groups can kind of ignore those, but as soon as they try to go out and do something, systems of oppressive power are replicated, cliquiness and/or bullying arises. Mumsnet has all of those problems at mass scale.

Finally, the biggest problem of all, especially as mumsnet moved into politics: it equates womanhood with motherhood.

Despite or perhaps because of these flaws, it was still a lively discussion platform. It became a major centre for discussion of ‘women’s issues’ and feminism. The political forums of the site started to attract women who did not actually have kids. This creates a structural tension, as we have a definition of womanhood essentially rooted in giving birth, but also a population of invested users who have not and will not do so. Part of the project of the forum therefore includes navigating that tension – the definition of womanhood is inherently under negotiation in the space. Middle class white feminists want in, so they need to redraw the boundary so that a different Other is excluded. This discussion is also taking place where many of the participants are entitled but isolated and vulnerable. Some people in this position are searching for an Other to blame. Men are an obvious target, but not a safe one. The pre-existing tradition of British terfism, already popular with some of the political participants, provided an answer to all of these problems.

In retrospect, it seems inevitable it would turn into a hate site. Indeed, every social site eventually turns into a hate site if it is popular and not moderated. Mumsnet moderators did a calculation of how many haters loved the site, how many people didn’t care about the issue, and how many trans mums they were driving off. The compromise they came up with was to keep the hatred relatively contained in radioactive corners of the site. And this has worked great. It’s still a major site for mums. It has developed terfism from a fringe concern to the UK’s mainstream feminism. Everyone (that matters) is happy.

The Supreme Court

In the time since I originally wrote the above, the Supreme Court of the UK has issued a ruling on gender. The influence of mumsnet has reached the highest court – the official definition of womanhood is now cemented in mumsnet logic.

I want to very strongly assert that this really is mumsnet’s victory. It is not a coincidence that the hatred and arguments hashed out by connected, professional, influential, posh women has changed government policy in very real ways that have real impacts on other women, men and non-binary people.

A good reputation

After writing this, I had dinner with some cis friends who are parents and tried broaching the topic of whether mumsnet might be problematic. For many cis people, the success and affordances of the site vastly outweigh the minor inconvenience of trans people losing their human rights.

Do try talking to your friends, sure, but it may not be enough, and this may be something trans people wish to delegate.

Solving the problem

So, what do we do? When they had their billboard co-advertising campaign “Brand X is endorsed by mumsnet”, I wrote to several of the advertisers I saw, linking them to an NGO’s designation of them as a hate site. (A link I no longer possess and can’t recover because websearch has ceased working. However, Vice does have an article.)

The bigger solution, of course, is to do something about England. The trend of kids in pubs is helping. But better access to childcare, more tolerance of babies in public spaces, more access to babysitting/childcare are all needed.

And while we’re at it: lift the benefit cap, make sure everyone has their basic needs met, recognise that the UN’s human rights includes not only food and shelter and so forth, but also leisure. There need to be free activities welcoming to mixed groups including people with kids and people without.

Women would stop joining mumsnet if they could get their social needs met by their existing networks. The stigmatisation of giving birth needs to end. And, in the meantime, getting new mums on to, say, federated social media like Mastodon is honestly kind of a hard sell. For cis people who don’t really care about their trans friends’ political wellbeing outside of an abstract oh-dearism, leaving mumsnet doesn’t have the urgency of leaving US big tech. Federated social media does not have a ready-made user base for new parents on the same scale. Again, this is a conversation people should try to have.

But, let’s be real, by “people” I mean cis people. Especially parents. Especially mothers. Because that’s who is going to have sway.