Disorderly Conduct: some initial reflections on file-sharing
The phenomenon of "peer-to-peer", or "P2P", file-sharing over the internet is a transglobal expression of techno-social relations. We could say the same about other popular domesticised forms of internet usage, such as email, searches, blogging and photo sharing. However, P2P is different, like the 'special' child who doesn't really fit in with the rest of the family.
This syncretic phenomenon weaves together disparate traditions, ideologies, philosophies and practices. The result is a globalised, anational sphere of relations, simultaneously simple and complex, horizontal with some verticals in the mix. These relations influence the social shaping of the internet. In particular, P2P reanimates the primary function of the internet's early predecessor—the co-operative building of a network for sharing knowledge. Moreover, it extends the World Wide Web's core aim of easy document sharing.2 By unpicking some of the dimensions of peer to peer, examining it as a set of interconnected technologies, a generative sphere, and a field of intense communicative and cultural exchange, I propose that the phenomenon exhibits and exploits both hierarchies of order and tangles of disorder.
Peer to peer is a constitutive force with wider political implications, which is what makes it so interesting. It is a manifestation of innovative, constantly renewing forms of production and exchange that devour capitalist relations with oft-expressed relish. Its field of operations is within the heart of informational capitalism itself. Meanwhile, the aging body of the old order is burdened with pre-digital industrial, cultural and legal structures. This bloomered legacy is finding it increasingly difficult to get a leg up, let alone a leg over, in an era where peers are defiantly doing it for themselves, and each other.
I have been involved at the user level of P2P for only a few years, making me a relative latecomer compared to enthusiasts who began in the 1990s. Delving into the technical operations of P2P at the code and protocol levels I discovered that researchers approach the phenomenon using quite narrow frames of reference. Within the humanities field, the focus is on the tensions between intellectual property, copyright regimes, privatisation of knowledge and culture, and the evolution of a 'digital commons'. And in the Information Technology field, a search of academic databases uncovers a vast mass of articles dealing with technical aspects of P2P—papers with titles like 'Resource demand and supply in BitTorrent content-sharing communities', 'Safe Peer-to-Peer Self-Downloading', and 'Profiling a Million User DHT'. Clearly P2P has been exciting software engineers and network analysts for quite some time.
Yet peer to peer could be examined through other lenses, to encompass other fields of knowledge. Not only philosophy but economics, linguistics, art history, politics, cognitive sciences, game theory, globalism studies, and ethics, could enrich our understanding of this expanding phenomenon. For example, in terms of P2P's productive potential it is helpful to employ the Fluxus concept of 'intermedia'. What might be created in the interstitial spaces between P2P and architecture, between P2P and neuroscience, between the syncretism of P2P and the syncretism of voudou? And how could order and disorder play out in these particular instances of intermediality? Similarly, a serious exploration of the praxis of DIY, or Do It Yourself, and its prefiguring, and enactment, of social change provides another lens to illuminate P2P, just as it does with punk culture.
But before such metaphysical explorations can take place, a basic understanding of the mechanics of P2P is useful. And so we start with some definitions and significations.
Firstly, peer to peer refers to the social process of internet-based 'file sharing', a form of information exchange occurring typically amongst people whose real-life identities are masked by their online nicknames or their IP addresses (the numerical addresses of their computers on the internet). The shared files are generally cultural artefacts, and include digitised films, television series, music recordings, graphic files, software packages, electronic books, technical tutorials and computer games. Usually the copyright attached to these artefacts has been assigned to so-called content owners (generally not creators of original material, but rather publishing houses, national and multinational media conglomerates). Therefore their localised digital reproduction and translocal gratis exchange amongst file-sharing peers contravenes copyright law to varying degrees in the many jurisdictions that have legislated intellectual property regimes. The enforcement of such regimes is a different kettle of fish however.
Due to its ambivalent and relative legal status, the practice of P2P is sometimes said to be inhabiting, or creating, a 'grey commons'. This shadowy grey commons conjures up different images, evoking for some total social disorder, and for others the seeds of a new social order. Protagonists of P2P are cast, or cast themselves, as pirates, outlaws, data dandies, or info anarchists. Activist coders Palle Torsson and Rasmus Fleischer explored some of these ideas in their widely distributed address to a computer congress in Berlin in December 2005. In The Grey Commons - strategic considerations in the copyfight, they argue that:
It is not a grey commons in terms of the law but as possibility, as technology and technique. It is not optional but inscribed in the technique we use every day. The grey is not here exactly by an effort but rather as the shortest way to make life work with technology. The shading, the tuning and twisting is omnipresent; it is not something you can wish away. What this really is about is our conditions of living, how information is used, transferred and owned in society. As humans, creators, amateurs or fans, in a desire for pleasure and in a chain of small habits we make the world appear....With the remix as the norm, steps to a democratization of creativity are taken and in the process we are liberating the myth of a special class of artists isolated from the rest of us fans, amateurs or consumers.3
While proponents of the grey commons embrace the creative chaos of promiscuous sharing, the players on the neighbouring conceptual turf of the digital commons have a different approach, siding more with the rule of law and order, arguing to change the law rather than questioning its underlying premises. The notional digital commons has been heavily promoted, made respectable and ready for semi- or full commodification by power players in academia and legal spheres. The quest of techno-libertarian Lawrence Lessig and his foot soldiers is well-documented, and orderly with its proliferation of 'content' licenses. It is, I suspect, almost as boring as Scientology, but without the fun of those funky Thetans. The custodians, spokespersons and gleaners on this more orderly electronic commons often seem quite earnest—the wikipedians, Creative Commons licensees, citizen journos, bloggers and their reading cadres. Of course these two renderings of an immaterial commons, so far in time and space from local histories of fertile forests and fields, do not signify mutually exclusive spheres of ideas and practices. In reality, the relations are intertwined, and to some extent, interdependent, and both involve the core practice of sharing information, knowledges, and opinions. MOOers, MUDDERs, MIPPERS, and Second Lifers often straddle both spheres for example.
The second meaning of P2P refers to the suite of software programs that enable this form of digital exchange. They are based around the breaking up of whole files into numerous discrete parts which can be shared amongst people connected to the internet. Release details and technical discussions of P2P softwares and their development trajectories occur in mainly geeky realms, such as slashdot.org. The various kinds of P2P softwares for digitising material and uploading the digitised files can be cross-platform (different versions of the same program to run on Linux, PC or Mac operating systems), or platform-specific.
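The "breaking up of whole files into numerous discrete parts" can be sketched in a few lines of Python. This is a toy illustration, not the code of any real client: the piece size is an arbitrary choice here, and real programs also record a hash of each piece so that parts arriving from strangers can be verified.

```python
import hashlib

PIECE_SIZE = 256 * 1024  # an illustrative piece size; real clients choose per file


def split_into_pieces(data: bytes, piece_size: int = PIECE_SIZE):
    """Split a file's bytes into fixed-size pieces and hash each one.

    Because every piece can be checked against its hash, the pieces can
    arrive from many different peers, in any order, and still be trusted.
    """
    pieces = [data[i:i + piece_size] for i in range(0, len(data), piece_size)]
    hashes = [hashlib.sha1(p).digest() for p in pieces]
    return pieces, hashes


# Reassembly is simply concatenation once every piece has arrived intact.
pieces, hashes = split_into_pieces(b"x" * (600 * 1024))
```

A 600 KB input with a 256 KB piece size yields three pieces (two full, one partial), each with its own 20-byte SHA-1 digest.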
The programs have been developed by enthusiasts—individual developers and small groups who often are aligned with open source and/or free software (FLOSS) philosophies, and sometimes with specific communities. Consequently, the production of these programs has been subject to the relatively ordered and communicative regime of distributed software development. Order is either mutually agreed upon or, in some cases, 'decreed' by the lead developer, to avoid the technical problem of 'forking'.4
Adhering to FLOSS principles, the typically small P2P softwares are distributed online, free of charge. Specific softwares emerge, are subsequently improved upon, and sometimes disappear. The 'disappeared' have generally removed themselves, or been removed, through threatened and enacted legal actions instigated by corporate content owners and industry lobby groups. The infamous antecedent of P2P, the Napster file-sharing software, more homo sedens than homo sapiens in the evolutionary scheme of P2P, is a case in point.5
But resistance is not always futile, as demonstrated by the Swedish-based P2P facilitators The Pirate Bay (TPB). The Pirate Bay facilitates file-sharing by storing on its servers meta-data indices of material that is scattered across the internet on people's home computers and available to be downloaded. The digitised materials themselves are not stored by TPB, only small 'tracker' files which point to their locations on peer computers. The Pirate Bay is run by a small autonomous group, whose members refused to capitulate to corporate pressure exerted through United States laws when their servers were seized by the Swedish police. Their defiant stance has gained widespread support not only in Sweden, but amongst an international cohort of file-sharers. The verdict in the recent court case against them found them guilty of aiding file-sharing, and imposed hefty fines and a year's jail time for each of the four defendants.6 However, as the case will be appealed, many commentators anticipate the ruling will be overturned. Big Media's hegemonic order cannot sleep soundly yet.
The third sense of P2P refers to the wider sphere of production and circulation. This includes the human nodes and material networks of facilitating techno-social 'machines' such as TPB, Mininova, Demonoid and others. Internet Service Providers (ISPs) are important components of this sphere. The original raison d'etre of ISPs in the mid-1990s was to facilitate people's access to internet services such as email, news groups, and the emerging World Wide Web. The line in was slow—remember the thrill of upgrading from a 1200 baud modem to a 14,400? Today the landscape has totally changed and ISPs make big profits by selling contracts offering ultra-fast download speeds plus gigabytes of download traffic. Now the only reason someone besides the online gamers would want it so fast and hard is because they are engaging in P2P. So a very sizable industry exists, built on the desires of millions of peers in thousands of locations across the globe. Order (business models, technical infrastructures, returns on investments) literally streaming out of a very disorderly multitude. It is hardly surprising that ISPs in most countries are currently vigorously resisting the attempts of the State/Big Media complex to monitor and constrain the activities of their profit-producing client base. The symbiotic relationship between P2P and ISPs, explored through the lens of productive disorder, is a subject meriting some serious research.
Production in the context of P2P is in fact more a matter of post-production than the creation of something 'original'. It involves the digitisation of cultural artefacts, either by capturing live streamed content (from digital tv and pay television) with software such as AverTV and compressing it with a different program, or by ripping material from digital storage media such as audio CDs, DVDs, software disks, and e-books using capture and processing softwares. There are different processes and softwares for digitising different kinds of original materials.
People who undertake this labour, digitising the cultural artefacts and making them available online via their own computers (seeding), are both rippers and initial seeders. Seeding also encompasses the activity of downloaders, who in an ideal P2P world, maintain the practice of keeping artefacts accessible on their individual computers for a period of time, thereby maintaining a viable network of file sharers, or peers, for any one file. When a file is not reseeded by anyone it is for all intents and purposes dead.
Consequently, it is considered neighbourly to do both of these things, rip and seed, especially the latter. Seeding is a way of giving back to the virtual and ephemeral community that forms around an artefact. Anyone who has the minimum bandwidth for uploading (seeding) files, and who is not constrained by legal factors (in some jurisdictions downloading is permitted whereas uploading is not), can, in theory, be a seeder. Ripping, however, requires some technical knowledge about how to optimise video and audio file compression. Given the poor quality of numerous rips out in the world, ripping is not for everyone. Many people, however, remain in the role of leechers: those who take without doing the initial work, or who download and then take their copy of the artefact off the network, thereby not contributing to the ongoing and broadly distributed task of file sharing.7 To be a leecher is to be scorned by the community-that-is-not-a-community.
How do ripping, seeding and leeching relate to the workshop's theme of disorder? The process of ripping, to be productive in the sense of achieving a good outcome, that is, a decent copy of the original file, requires an orderly and methodical approach. As people build online reputations based on the quality of their rips, order is valorised, in terms of peer esteem. A few dud rips and years of net rep can go down the drain. This is not the place for a DIY attitude based solely on disorder and intuitiveness, no staring into chaos's abyss.
Ripping is an act that is both destructive and creative. For example, an initial seeder might rip an original DVD of the Taviani Brothers' film Kaos using the HandBrake software, or capture a video stream using VirtualDub. These softwares in a way atomise the digital data packets, then recompress the components, and stitch it all back together again. The copy is visibly and audibly daggier than the original; this is no seamless reproduction, but a bit of a hack, a messy workaround for the problem of compressing an original file down to something a tenth of its size. And in the digital world, size does matter.
After an artefact has been ripped, it needs to be placed online and indexed, so that peer downloaders can locate it. Many popular P2P clients such as µTorrent and Vuze implement the BitTorrent file-sharing protocol.8 Each BitTorrent client can prepare, request and transmit any digital artefact over a computer network via this protocol. µTorrent and other clients work in the following way:
To share a file or group of files, a seeder first creates a small file called a torrent (e.g. kaos_DVDrip.torrent). This file contains metadata about the files to be shared and about the tracker, the computer that coordinates the file distribution.
The seeder then uploads the torrent to the internet, sending it to one or more tracker sites.
Peers wanting to download the file must first obtain a torrent file for it, and connect to the specified tracker, which automates connections with other peers who have the complete file, or who are in the process of downloading it, packet by packet.9
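The steps above can be sketched as a simplified data structure. Real .torrent files are 'bencoded' and carry further fields; this hypothetical stand-in keeps only the essentials the text describes: where the tracker lives, how the file is split, and a hash per piece.

```python
import hashlib


def make_torrent_metadata(file_name: str, data: bytes, tracker_url: str,
                          piece_size: int = 256 * 1024) -> dict:
    """Build a simplified stand-in for a .torrent file (illustrative only)."""
    piece_hashes = [
        hashlib.sha1(data[i:i + piece_size]).hexdigest()
        for i in range(0, len(data), piece_size)
    ]
    return {
        "announce": tracker_url,       # the tracker coordinating distribution
        "info": {
            "name": file_name,         # e.g. "kaos_DVDrip.avi" (hypothetical)
            "length": len(data),
            "piece length": piece_size,
            "pieces": piece_hashes,    # one hash per piece, for verification
        },
    }


# The seeder would then send this tiny metadata file to one or more tracker sites.
meta = make_torrent_metadata("example.bin", b"a" * 300000,
                             "http://tracker.example/announce")
```

Note how small the torrent is relative to the artefact: it carries only names, lengths and hashes, never the content itself, which is why index sites can honestly claim to store no artefacts.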
Circulation of artefacts is enabled through specific websites, which do not store the actual artefacts, but contain indexes of torrent files. These indices contain elements of order and disorder, often relating to how well they are monitored and maintained by their human guardians—the sys admins and others. These websites index and describe the artefacts, and, depending on the system, may or may not contain other information on trackers.
A tracker “maintains lists of the clients currently participating in the torrent. Alternatively, in a trackerless system (decentralized tracking) every peer acts as a tracker.”10 There are numerous such facilitating sites which range from the general (an analogy could be a public library) like mininova.org, demonoid.com and isohunt to the niche (specialty bookshop) like secret-cinema.net and thebox.bz. Such sites, typically run by volunteers and deriving income from advertising, often also are home to discussion forums, technical FAQs, search clouds, socialising areas, and meta-data on users/members. This meta-data varies from site to site, and can include information on members' (avatars') interests, and lists of artefacts they have downloaded.
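The tracker's bookkeeping role can be reduced to a toy sketch: a mapping from each torrent to the set of peers currently announcing themselves for it. Real trackers speak HTTP or UDP and also record upload/download statistics; none of the names here come from any real implementation.

```python
from collections import defaultdict


class Tracker:
    """Toy tracker: lists of the peers currently participating in each torrent."""

    def __init__(self):
        self.swarms = defaultdict(set)  # torrent id -> set of peer addresses

    def announce(self, info_hash: str, peer_addr: str) -> list:
        """A peer announces itself and receives other peers to connect to."""
        others = [p for p in self.swarms[info_hash] if p != peer_addr]
        self.swarms[info_hash].add(peer_addr)
        return others

    def depart(self, info_hash: str, peer_addr: str) -> None:
        """A peer leaving the swarm is dropped from the list."""
        self.swarms[info_hash].discard(peer_addr)


tracker = Tracker()
tracker.announce("kaos", "peerA")        # first peer finds an empty swarm
found = tracker.announce("kaos", "peerB")  # second peer is told about the first
```

In a "trackerless" system this same lookup job is spread across the peers themselves (a distributed hash table), rather than held on one coordinating machine.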
This aspect of P2P also requires a high level of order to function smoothly, as these sites are essentially libraries with vast repositories. Trackers must be kept online and named correctly; indexing of material into categories and sub-categories helps downloaders find what they are searching for; moderated discussion forums enable people to place requests for specific artefacts and to comment on those already in circulation; and comments sections attached to each artefact facilitate the exchange of information amongst peers on the technical quality of each rip. Disorder in these libraries manifests in unwelcome and destructive ways. A major problem is the insertion of malware, viruses and trojans into seemingly innocent digital files. Password-protected files often signal such danger. It is rumoured that agents acting on behalf of the big content owners, a cadre of dark rippers, are responsible for much of this disorder.
On the other hand, the seeding of files is a paradigm of positive disorder. Although some P2P softwares like Vuze have visualisation functions, where diagrams of the seed dynamically portray the file and its seeders, the animation tantalises without satisfying.
Seeding involves strangers sharing with strangers, each one motivated by a mix of self-interest, principle and altruism. Seeding is one big series of questions, and very few answers. You never know who you are seeding to, how long they will stay online for, if they will like what you are sharing with them, how they might make use of it, and so on. The whole system is built upon anonymity, randomness, and chance. And this is its strength, because compared to earlier software programs that enabled file sharing through systems of identification and storage of complete files, the later BitTorrent programs depend on fragmentation and limited identification. From this apparently disordered base, this rejection of naming and taming, a meshwork of nodes and networks involving millions of people around the world has evolved.
Leeching, like ripping, tends more towards order. Even if a peer has no intention of giving back to the imaginary community by seeding their downloads, a certain amount of order in file management is required. If a downloader is too haphazard in what they are grabbing, the chances of disaster increase: corrupted files, trackers going offline, no seeders for that half-downloaded television series, running out of hard drive storage for all those semi-downloaded things.
Socio-economic class, a ubiquitous top-down form of ordering individuals and communities, is a consideration in any P2P discussion. Surplus value is added to pirated products, explains Jonas Andersson (2006: 69) "through labour which...appears entirely unpaid: the late-night tinkering of crackers, encoders, subtitlers, administrators, seeders, leechers. This is a mode of labour which....thrives on mutual recognition, informal systems of meritocracy, and just plain, sheer fun and curiosity." As the "gift economy" underpinning digital file-sharing originated in the "rich West," Andersson continues, this peer labour "is dependent on already established prosperity; it is a form of ‘free’ labour which one can afford, given that one has got the required material setup as well as the time, skill, and intellectual capacities."11 Looking at P2P only from a Western perspective, the phenomenon doesn't seem to challenge well-established hierarchies of power.
However, using examples of non-digital sharing of cultural knowledge and artefacts from India (public holdings of seed patents) and Asia (the counterfeit economy), Andersson presents a broader look at the notion of "affordance" and its challenges to capitalism, from within capitalism, like P2P. The "collective gain" in copying of digital artefacts is such that even those people "with modest margins of sustenance can afford to share that which is only multiplied and never reducible: culture, ideas, knowledge, information, software" (ibid.). If we accept this argument, then it suggests that the old class orders themselves could be threatened by the constituent outcome of general processes of file-sharing, that "aggregated strategic entity — the network" (ibid.).
Our discussion started with the process of file-sharing in a cultural and social sense, the swapping of artefacts over networks. We then widened the definition of P2P to include the softwares and websites which enable this artefact exchange. Finally, we considered P2P in the broadest sense of a techno-social machine, an abstraction developed from the material roles and processes of exchange. Now it is time to pull back from this macro view to the micro level, the underlying computer code and communication protocols which are at the heart of peer to peer. A fascinating interplay between order and disorder exists even at this micro level.
Peer to peer software essentially operates upwards and outwards from individual internet-connected personal computers. The software creates a peer to peer computer network that "uses diverse connectivity between participants in a network and the cumulative [internet] bandwidth of network participants."12 This P2P network represents a fundamentally different communications paradigm to traditional forms of electronic data exchange, in which a small number of centralised computer 'servers' transmit data to a relatively much larger number of distributed 'client' machines.
An example of the conventional client/server network in action would be a web designer using FTP software to upload an entire website from her work computer (client) to a specific location within a remote central bank of computers (server rack). The website is now permanently stored on the server, with its own unique web address (URL). This permanent address enables the website to be accessed by a large number of people from their own computers (clients). In visual terms, we can imagine a cone, with the server at the apex, and a ring of clients at the base. In this paradigm, the mute and obedient server is responsive to requests from chatty active clients.
This highly ordered set of relations does not make a scale-free network. On the contrary, the system is quite vulnerable. For example, if too many clients attempt to simultaneously access the server, they can cause it to crash. This limitation of the client/server relationship has been exploited by hacktivists via coordinated 'DOS' or 'Denial of Service' attacks since at least 1998, as a form of 'electronic civil disobedience' to highlight various social and political issues in the emerging public sphere of the internet.13
In contrast, a P2P network is formed by the interconnections between 'peer nodes' that "simultaneously function as both 'clients' and 'servers' to the other nodes on the network" (Wikipedia, op. cit.). The structure is horizontal, heterarchical, and decentralised. An example of a P2P network in action is the set of processes involved in file-sharing. A content ripper uploads a tiny torrent file to a tracker site such as The Pirate Bay. This file points to the location of a specific digitised artefact seeded on the ripper's computer. Interested peers start downloading the artefact, packet by packet, to their own computers via a BitTorrent client (software).
As each peer receives a packet of data onto their own computer, this data is available to be automatically seeded to any other peer connected in the active download 'swarm' associated with that particular artefact.
Thus all downloaders become active or potential uploaders, as the artefact proliferates throughout the temporary P2P network which has formed around a desire for it.
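This downloader-becomes-uploader dynamic can be illustrated with a toy swarm, assuming a hypothetical four-piece file; the point is simply that a piece obtained from anyone is immediately on offer to everyone.

```python
class Peer:
    """Toy swarm member: holds the set of piece numbers it has so far."""

    def __init__(self, name, pieces=None):
        self.name = name
        self.pieces = set(pieces or [])

    def pieces_to_offer(self, other):
        """Pieces this peer can upload that the other peer still lacks."""
        return self.pieces - other.pieces

    def download_from(self, other, piece):
        """Receive one piece; it is now available to the rest of the swarm."""
        assert piece in other.pieces
        self.pieces.add(piece)


# the initial seeder has the whole (here four-piece) file; newcomers have nothing
seeder = Peer("seeder", pieces=range(4))
a, b = Peer("a"), Peer("b")
a.download_from(seeder, 0)
b.download_from(a, 0)  # b gets piece 0 from a, not from the original seeder
```

Peer b never touches the seeder, yet ends up with piece 0: the artefact proliferates sideways through the temporary network rather than radiating from a central server.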
Once all packets of the file have been downloaded onto a peer's computer, they can watch/listen to/read/play the artefact. Often peers make copies of the artefacts on other media, such as audio CDs, DVDs, USB sticks, or portable hard drives, in order to be able to further share the cultural material with friends, family and colleagues.
This offline recirculation of shared material has not really been researched, yet it is, I suspect, a highly significant aspect of the affective dimension of file-sharing.
If the peer is inclined, they will continue to share the file online for a period of time, at least until they have reached a ratio of 1.0 for that artefact.
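The ratio in question is simply upload divided by download, computed per artefact or per account depending on the site; a sketch:

```python
def share_ratio(uploaded_bytes: int, downloaded_bytes: int) -> float:
    """Ratio of data given back to data taken; 1.0 means a peer has uploaded
    exactly as much as it downloaded. An initial seeder, who downloaded
    nothing, has (by convention here) an infinite ratio."""
    if downloaded_bytes == 0:
        return float("inf")
    return uploaded_bytes / downloaded_bytes


ratio = share_ratio(700 * 1024**2, 700 * 1024**2)  # 700 MB each way -> 1.0
```

Private communities often enforce minimum ratios as a formal ordering of the otherwise voluntary norm of reciprocity.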
Context for this text
These are notes I prepared for an informal workshop presentation I gave at the first IT & Disorder workshop held at the University of Technology, Sydney (UTS), in November 2008. The IT & Disorder workshop was co-ordinated by Jonathon Marshall, a UTS post-doctoral fellow who is in the early stages of a five-year project to investigate "the production of disorder in everyday life through Information and Communication Technology." As the project website states, "Despite continual claims of increased efficiency, administration, when distributed through ICT, routinely seems out of control and unpredictable even to those expecting to hold power. Even good software can unexpectedly produce disorder. If this is a common experience, then it cannot be ignored or taken as unusual or as unimportant."
My presentation covered three topics—Fluxus, the early British punk phenomenon, and P2P—exploring them in terms of DIY and grass roots participatory cultural production. A second workshop was held in March 2009, in which I further explored P2P, mainly in terms of current legal and regulatory developments.
It's pretty much like pulling teeth, trying to rework these notes into an actual text. But now I am having fun getting diverted with histories of Baa Baa Black Sheep and other detours. This is where The Next Layer can be fun - I can play with images, which is not much of an option with my 90,000 word thesis (okay, it will have some screen shots of software interfaces but that will probably be it - no woolly lambs, no sire-ee!)
I suppose my aim is to use my workshop ideas as the basis for Open Code & Open Culture, which will be the final chapter of my thesis.
- 1. Here I am thinking of repugnant net uses (creepy paedophile exchanges using tightly secured networks), annoying uses (dissemination of spam, adware, pointless e-petitions, chainmail), and malicious uses (trojans, malware, phishing).
- 2. But this time it is protocols and software clients based on BitTorrent that enable much of the world's P2P activity, rather than the Hypertext Markup Language, or HTML, that enabled the creation of the World Wide Web.
- 3. Palle Torsson and Rasmus Fleischer, The Grey Commons - strategic considerations in the copyfight, Speech transcript to 22C3, Berlin, December 2005. Source
- 4. Forking occurs when different developers take the software in divergent directions, resulting in unnecessary duplication of labour and incompatible versions of the same program. It could be considered a form of technical and social disorder produced when two orders, one existent, the other in a state of becoming, emerge in the same spatio-temporal field.
- 5. Napster, developed in 1998 by student Shawn Fanning, was fundamentally different software to BitTorrent, as it involved direct file exchanges between peers. According to Kurt Fritz, "Napster was not a true P2P network because the site maintained a central server with information on where music files resided in the network" (Fritz, K, 'Playing a Different Tune', Information Today, December 2008). The Kazaa case is a more recent example.
- 6. Source
- 7. This ratio is easily verifiable by comparing the number of downloads of a particular file with the number of people actually seeding it.
- 8. “Programmer Bram Cohen designed the protocol in April 2001 and released a first implementation on 2 July 2001. It is now maintained by Cohen's company BitTorrent, Inc. Usage of the protocol accounts for significant Internet traffic, though the precise amount has proven difficult to measure. There are numerous BitTorrent clients available for a variety of computing platforms.” Source: Wikipedia.
- 9. Ibid.
- 10. Ibid.
- 11. Jonas Andersson, 'The Pirate Bay and the ethos of sharing,' published in Hadzi, Adnan et al. (2006) Deptford.TV Diaries, London: Openmute Publishing.
- 12. Source: Wikipedia (find better definition from a book or paper)
- 13. The Italian Strikenet, followed by EDT's Floodnet, are the first documented DOS actions.