ISPs against Net Neutrality: The P2P User’s Stealth Enemy?
The RIAA and its various international affiliates have been prosecuting people since September 2003, to precious little effect, even though more than 15,000 people have been targeted so far. Simultaneously they have sought to wrest back the trade in media files by licensing their archives to iTunes et al. and running publicity campaigns to ‘correct’ the hoi polloi’s acquired taste for the free copies available online. Joined by the MPAA and the games manufacturers, the recording industry has another, somewhat more capricious, ally in the form of at least a sub-section of the Internet Service Provider industry, many of whom have unilaterally initiated policies against all their users simply to make their own lives easier, and more profitable. The key term is traffic shaping, or as users experience it, throttling, and the result is degraded performance of targeted protocols. Have you checked your ISP lately?
Behind this practice lie a variety of intentions. The first is the desire to restrict the amount of data traffic: as is well known, p2p is responsible for probably a majority of network traffic, and some estimates claim that BitTorrent (BT) alone accounts for 50% of all data exchanged. Furthermore, (well behaved) p2p users upload large amounts of data relative to other users, which is costly for service providers, whose model is built essentially on a client-server rather than a peer-based architecture, and whose subscription pricing presumes that download will exceed upload by a factor of four. Where the ISP doesn’t have adequate bandwidth, this can cause quality of service (QoS) problems for other applications.
Traffic shaping is implemented using three techniques. The first is the restriction of data transfer speeds over ports commonly used by p2p applications, such as 6881–6889 for BT. This can be easily circumvented by changing the port settings in the client’s preferences and updating the virtual server on the router. Increasingly, however, more sophisticated methods are being used which involve inspection of the data packets. This doesn’t require storing or examining the contents of the data to identify it, merely looking at the header and making an ad hoc decision as to how to treat it. If packets I’m sending to peers are being dropped or lost, this will affect how I’m treated by other peers in the network, reducing my download speed. This technique can be thwarted in some cases by encrypting the data (as is possible on clients such as Azureus and uTorrent). Lastly, where encryption and port randomization are used to evade identification, the solution is simply traffic analysis and heuristics: if I have dozens of TCP connections open to numerous IP addresses over a sustained period, well, it’s not impossible that I’m getting blocks of the same file from them. Products capable of executing such functions are much in demand today; check out Cisco’s Network Based Application Recognition (NBAR) or Allot’s NetEnforcer.
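To make the last of those techniques concrete, here is a toy sketch of a traffic-analysis heuristic of the kind described: flag a subscriber whose traffic shows many simultaneous connections to distinct remote hosts over a sustained period, the profile typical of a BitTorrent swarm. This is purely illustrative, not any vendor’s actual algorithm; the thresholds and the `likely_p2p` function are invented for the example.

```python
from collections import defaultdict

# Thresholds chosen arbitrarily for illustration.
MIN_PEERS = 30       # distinct remote IPs before we get suspicious
MIN_DURATION = 300   # seconds the pattern must persist

def likely_p2p(flows, min_peers=MIN_PEERS, min_duration=MIN_DURATION):
    """flows: iterable of (subscriber_ip, remote_ip, start_ts, end_ts).

    Returns the set of subscriber IPs whose connection pattern matches
    the sustained many-peer profile: lots of distinct remote hosts,
    observed over a long enough window. No payload inspection needed,
    which is why encryption alone doesn't defeat this approach.
    """
    peers = defaultdict(set)                       # subscriber -> remote IPs seen
    span = defaultdict(lambda: [float("inf"), 0])  # subscriber -> [first, last] timestamp
    for sub, remote, start, end in flows:
        peers[sub].add(remote)
        span[sub][0] = min(span[sub][0], start)
        span[sub][1] = max(span[sub][1], end)
    return {
        sub for sub in peers
        if len(peers[sub]) >= min_peers
        and span[sub][1] - span[sub][0] >= min_duration
    }
```

A real deep-packet-inspection box would of course combine this with header signatures and per-flow byte counts, but even this crude connection-counting shows why a BT user stands out from someone merely browsing and reading mail.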
As mentioned above, there are many instances where these techniques are driven by cost and QoS considerations. However, this is not always the case. For example, Clearwire – a WiMax operator in Canada, Ireland and elsewhere – restricts the use of Voice over IP services because it is marketing its own (it also makes p2p services more or less unusable). 3G operators exclude VoIP for the same reason (they’re not interested in cannibalizing their mobile phone business model). In the future we’re going to see a lot more of this, particularly as ISPs try to position themselves as the delivery point for services such as video on demand – at which point their interest in blocking the untolled distribution of movies will be transparent.
The irony in all this is that ISPs with their own infrastructure effectively received a wealth transfer from the media industries from Napster onwards. Access to free content was a key driver behind the uptake of broadband subscriptions – the point has been made ad nauseam – but 2–8Mb download speeds are completely otiose for those whose ambition doesn’t exceed browsing the web and using mail. Indeed the media industries continually rammed the point down the throats of anyone who would listen: fierce legislative battles were fought to determine ISP liability for the data carried, concluding in the notice and takedown provisions of the DMCA and EUCD, which grant service providers legal shelter from their users’ actions. This certainly facilitated the development of file-sharing culture to the point whereby, irrespective of what happens prospectively, the media industry has a massive legacy problem due to the sheer quantity of material already available out there, for free. So the ISPs are screwing people at both ends: users and media corporations.
Secondly, all of this occurs without ISP users being given any explanation; there is no transparency. Traffic shaping systems are introduced on the sly and performance degrades from one day to the next, causing confusion and a massive waste of time on the part of users who are convinced that they have done something wrong. There are unusually honest companies, such as the Australian Exetel, but they are the exceptions. Under cover of creative interpretations of the Terms and Conditions in subscribers’ contracts, ISPs can legally get away with changing the service delivered, but it speaks volumes about the contempt they reserve for their predominantly non-technical user base.
Lastly, my motivation for this rant is that six years ago, together with many others, I realized the potential of distributed media platforms built on p2p architectures. The ambition was not really to get free Hollywood and boy-band bumf, but rather to promote and distribute independent material produced according to another economic logic. And empirically it is beyond question that the amount of material authorised for exchange is proliferating rapidly (Creative Commons, GPL, the Free Culture Definition etc.). Perhaps it sounds like net-utopianism, vintage 1995, but this is the first time in communications history that there exists an infrastructure of immediately global scope, capable of being assembled in a modular, grassroots manner by the users themselves, who can decide what they want to help push. Irrespective of one’s side in the various debates around media concentration, this is a good thing. In the absence of such shared, scalable infrastructures, the impact of ‘disruptive technologies’ on communications freedom is reduced to the YouTube fallacy: you can feel free to upload your content, but only they have the deep pockets to finance reliable streaming servers that can satisfy demand, and thus they ultimately control the conduit. Over the next period there will be efficient p2p-based streaming solutions, but if people have become accustomed to the return of the dreary client-server relations epitomised by web 2.0/iTunes/rapidshare, the benefits will have been squandered. For some time now there have been legislative initiatives in different jurisdictions to guarantee Net Neutrality; as yet none have managed to become law.