{"id":4744,"date":"2016-09-08T13:43:46","date_gmt":"2016-09-08T12:43:46","guid":{"rendered":"http:\/\/www.celesteh.com\/blog\/?p=4744"},"modified":"2016-09-08T13:43:46","modified_gmt":"2016-09-08T12:43:46","slug":"algorithms-and-authorship","status":"publish","type":"post","link":"https:\/\/www.celesteh.com\/blog\/2016\/09\/08\/algorithms-and-authorship\/","title":{"rendered":"Algorithms and Authorship"},"content":{"rendered":"<p>A recent Wall Street Journal <a href=\"http:\/\/www.wsj.com\/articles\/facebooks-trending-feature-exhibits-flaws-under-new-algorithm-1473176652\">article<\/a> (paywalled, see below for relevant quotes) felt it necessary to quote associate professor Zeynep Tufekci on the seemingly self-evident <a href=\"https:\/\/twitter.com\/zeynep\/status\/773504373397880832\">assertion<\/a> that \u2018Choosing what to highlight in the trending section, whether by algorithms or humans, is an editorial process\u2019.  This quote was necessary, as Zuckerburg <a href=\"https:\/\/twitter.com\/zeynep\/status\/773506254195023873\">asserts<\/a> Facebook is a technology company, building tools but not content. He thus seeks to absolve himself of responsibility for the output of his algorithms.<\/p>\n<blockquote class=\"twitter-tweet\" data-lang=\"en\">\n<p lang=\"en\" dir=\"ltr\">Algorithmic decision-making is decision-making. Algorithmic editorializing: editorializing. 
<a href=\"https:\/\/t.co\/58tJYvE8bL\">https:\/\/t.co\/58tJYvE8bL<\/a> <a href=\"https:\/\/t.co\/hzFDUMS5yU\">pic.twitter.com\/hzFDUMS5yU<\/a><\/p>\n<p>&mdash; Zeynep Tufekci (@zeynep) <a href=\"https:\/\/twitter.com\/zeynep\/status\/773504373397880832\">September 7, 2016<\/a><\/p><\/blockquote>\n<p><script async src=\"\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<blockquote class=\"twitter-tweet\" data-conversation=\"none\" data-lang=\"en\">\n<p lang=\"en\" dir=\"ltr\">What Mark Zuckerberg misses here: Media&#39;s key role is *choosing* what to highlight\u2014just like Facebook&#39;s algorithm. <a href=\"https:\/\/t.co\/SnUr65y9HS\">pic.twitter.com\/SnUr65y9HS<\/a><\/p>\n<p>&mdash; Zeynep Tufekci (@zeynep) <a href=\"https:\/\/twitter.com\/zeynep\/status\/773506254195023873\">September 7, 2016<\/a><\/p><\/blockquote>\n<p><script async src=\"\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>It\u2019s surprising he\u2019s taken this argument, and not just because it didn\u2019t help Microsoft when they tried it after <a href=\"http:\/\/www.telegraph.co.uk\/technology\/2016\/03\/24\/microsofts-teen-girl-ai-turns-into-a-hitler-loving-sex-robot-wit\/\">their twitter bot turned into a Nazi.<\/a><\/p>\n<p>Facebook is acting as if the question of authorship of algorithmic output is an open question, when this has been settled in the arts for decades. Musicians have been using algorithmic processes for years. Some John Cage scores are lists of operations performers should undertake in order to generate a \u2018performance score\u2019 which is then \u2018realised\u2019. The 1958 score of <i><a href=\"http:\/\/johncage.org\/pp\/John-Cage-Work-Detail.cfm?work_ID=79\">Fonatana Mix<\/a><\/i>  \u2018consists of 10 sheets of paper and 12 transparencies\u2019 and a set of instructions on how to use these materials. 
(ibid) Any concert programme for a <a href=\"https:\/\/www.youtube.com\/watch?v=BB6vRvc58TI\">performance<\/a> of this piece would list Cage as the composer. That is, he assumes authorship of algorithmic output. The question of authorship has had an answer for at least 58 years.<\/p>\n<p>Indeed, other Silicon Valley companies, some located just down the road from Facebook, have quite clearly acknowledged this. The Google-sponsored \u2018Net.art\u2019 exhibition, part of the Digital Revolution show at London\u2019s Barbican in 2014, included artist attribution next to every single piece, including those making copious use of algorithms.<\/p>\n<p>Art has even tackled the issues of collective and collaborative algorithmic authorship. In 1969 Cornelius Cardew published <i>Nature Study Notes: Improvisation Rites<\/i>, a collection of text pieces by Scratch Orchestra members. Each of the short pieces, or \u2018rites\u2019, has an individually listed author. However, when programmed for performance in 2015 at Cafe Oto, <a href=\"https:\/\/www.cafeoto.co.uk\/events\/nature-study-notes\/\">the programme<\/a> was billed as \u2018The Scratch Orchestra\u2019s Nature Study Notes\u2019, thus indicating both individual and corporate authorship. Some of these pieces are best described as algorithms, and indeed they have been influential in tech circles. As Simon Yuill points out in his paper <i><a href=\"http:\/\/www.metamute.org\/editorial\/articles\/all-problems-notation-will-be-solved-masses\">All Problems of Notation Will Be Solved By The Masses<\/a><\/i>, the anti-copyright notice included with the score uses copyleft mechanisms to encourage modification.<\/p>\n<p>Some may argue that the artist gains authorship through a curatorial process of selecting algorithmic output. Unlike Iannis Xenakis, John Cage never altered the output of his formulas. He did, however, throw away results that he deemed unsatisfactory. 
Similarly, <i>Nature Study Notes<\/i> was curated by its listed editor, Cardew. One can assume that performing musicians would make musical choices during performances of algorithmic scores. It&#8217;s arguable that these musical choices are also a form of curation. However, composers have been making music that is played without human performers since the invention of the music box. To take a more recent algorithmic example, Clarence Barlow\u2019s piece <i><a href=\"http:\/\/www.musikwissenschaft.uni-mainz.de\/Autobusk\/\">Autobusk<\/a><\/i>, first released in 1986, is a fully autonomous music-generation program for the Atari. The piece uses algorithms to endlessly noodle out MIDI notes. Although phrasing the description of the piece in this way would seem to bestow some sort of agency upon it, any released recording of the piece would certainly list Barlow as the composer.<\/p>\n<p>Facebook\u2019s odd claims to distance itself from its tools fail by any standard I can think of. It\u2019s strange they would attempt this now, in light of not just Net.Art, but also Algorave music. That is, dance music created by algorithms \u2013 an art form that is having a moment and which is tied in closely with the &#8216;live-code&#8217; movement. Composer\/performers <a href=\"http:\/\/yaxu.org\/\">Alex McLean<\/a>, <a href=\"https:\/\/composerprogrammer.com\/\">Nick Collins<\/a>, and <a href=\"https:\/\/shellyknotts.wordpress.com\/\">Shelly Knotts<\/a> are all examples of \u2018live-code\u2019 artists, who write algorithms on stage to produce music. This is the form of artistic programming that is perhaps the closest analogue to writing code for a live web service. Performers generate algorithms and try them out \u2013 live \u2013 to see if they work. Algorithms are deployed for as long as they are useful in context and are then tweaked, changed or replaced as needed. 
Results may be unpredictable or even undesired, but a skilled performer can put a stop to elements that are going awry. Obviously, should someone\u2019s kick drum go out of control in a problematic way, that\u2019s still attributable to the performer, not the algorithm. As the saying goes, \u2018a poor craftsman blames his tools.\u2019<\/p>\n<p>Algoraving is a slightly niche art form, but one that is moving towards the mainstream &#8211; the BBC covered live-coded dance music in <a href=\"https:\/\/www.youtube.com\/watch?v=JJ5h1albAzY\">an interview with Dan Stowell<\/a> in 2009 and has programmed Algorave events since. Given Algorave\u2019s close relationship with technology, it tends to be performed at tech events. For example, the Electromagnetic Field festival of 2016 had an Algorave tent, sponsored by Spotify. As would be expected, acts in the tent were billed under the performers\u2019 names, not their tools. So the performance information for one act read \u2018Shelly Knotts and Holger Ballweg\u2019, omitting any reference to their programming languages or code libraries.<\/p>\n<p>Should someone\u2019s algorithmically generated content somehow run afoul of a Code of Conduct (either that of the festival or <a href=\"https:\/\/supercollider.github.io\/community\/code-of-conduct\">the one used by several live-code communities<\/a>), it is the performer who would be asked to stop or leave, not their laptop. Live coders say that <a href=\"http:\/\/toplap.org\/wiki\/ManifestoDraft\">algorithms are more like ideas than tools<\/a>, but ideas do not have their own agency.<\/p>\n<p>Zuckerberg\u2019s assertion, \u2018Facebook builds tools\u2019, is equally true of Algoravers. Indeed, like Algoravers, it is Facebook that is responsible for the final output. Shrugging their shoulders at clearly settled questions of authorship is a weak defence for a company that has been promoting fascism to racists. 
Like a live coder, surely they can alter their algorithms when they go wrong \u2013 which they should be doing right now. To mount such a weak defence seems almost an admission that their actions are indefensible.<\/p>\n<p>Like many other young Silicon Valley millionaires, Zuckerberg is certainly aware of his own cleverness and of the willingness of some members of a credulous press to cut and paste his assertions, however unconvincing. Perhaps he expects Wall Street Journal readers to be entirely unaware of the history of algorithmic art and music, but his milieu, which includes Google\u2019s sponsorship of such art, is certainly better informed. His disingenuous assertion insults us all.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A recent Wall Street Journal article (paywalled, see below for relevant quotes) felt it necessary to quote associate professor Zeynep Tufekci on the seemingly self-evident assertion that \u2018Choosing what to highlight in the trending section, whether by algorithms or humans, is an editorial process\u2019. 
This quote was necessary, as Zuckerberg asserts Facebook is a technology &hellip; <a href=\"https:\/\/www.celesteh.com\/blog\/2016\/09\/08\/algorithms-and-authorship\/\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">Algorithms and Authorship<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"activitypub_content_warning":"","activitypub_content_visibility":"","activitypub_max_image_attachments":4,"activitypub_interaction_policy_quote":"anyone","activitypub_status":"","footnotes":""},"categories":[1],"tags":[363,362,361,59],"class_list":["post-4744","post","type-post","status-publish","format-standard","hentry","category-uncategorised","tag-algorave","tag-alogrithm","tag-authourship","tag-live-code"],"_links":{"self":[{"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/posts\/4744","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/comments?post=4744"}],"version-history":[{"count":15,"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/posts\/4744\/revisions"}],"predecessor-version":[{"id":4759,"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/posts\/4744\/revisions\/4759"}],"wp:attachment":[{"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/media?parent=4744"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/categories?post=4744"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.celesteh.com\/blog\/wp-json\/wp\/v2\/tags?post=4744"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org
\/{rel}","templated":true}]}}