In recent days it has been announced that the EU wants all scientific research papers funded through its programs to be released under Open Access by 2020*. Newspaper coverage has credited the combined efforts of the Dutch government and EU Commissioner for Research, Innovation and Science, Carlos Moedas, for the initiative. Moedas caught my attention in April due to a speech he gave on ‘open science’ which began with a reference to Alexandra Elbakyan and the controversy around Sci-Hub. He went on:
“Elbakyan’s case raises many questions. To me the most important one is: is this a sign that academic journals will face the same fate as the music and media industries? If so – and there are strong parallels to be drawn − then scientific publishing is about to be transformed.
So, either we open up to a new publishing culture, with new business models, and lead the market… Or we keep things as they are, and let the opportunity pass us by. As I see it, European success now lies in sharing as soon as possible, because the days of “publish or die” are disappearing. The days of open science have arrived.”
Until recently it would have been impossible to imagine an institutional figure using Sci-Hub as a springboard for a positive vision rather than an occasion for vitriol.
While legal action by Elsevier against Sci-Hub and Elbakyan grinds on in the US, it has succeeded only in generating large amounts of positive publicity for both – Elsevier’s attack on a library that once existed in the shadows has ended up biting the behemoth in the ass. Despite a court decision in their favor, the website remains online and usable.
Seeking to disable free access to scientific articles otherwise available only through overpriced subscriptions was never going to be a winning PR strategy. That Elsevier made an operating profit of 34% in 2014 doesn’t help their case, nor does the fact that the authors are not paid. Commentators have instead treated the liberation of academic work from copyright restrictions as an enlightenment gesture in favor of universal access to knowledge (which it is). There is sympathetic coverage all over, from Science to Le Monde ** – it’s all a far cry from the quiet annihilation of Library.nu/Gigapedia in 2012.
Academic work is special…
Such an outpouring was never going to emerge from the cases against Napster or the Pirate Bay – the shared objects at the centre of those trials were seen merely as trifling entertainment commodities. This is odd given how important shared cultural works are for shaping our identities, but somehow they are tainted by their association with pleasure and fun. Academic papers, on the other hand, are no terrain of indulgence; they are the stuff of seriousness, discipline, painful memories of homework…
Risk and Reward
With all this attention, use of Sci-Hub and Library Genesis is booming, and the collection is presumably growing. Given that the entire collections are available for download via torrent, it will be interesting to see whether services emerge on top of the corpus – text mining etc. Until now such techniques have been the preserve of database owners or companies like Google, with the resources for both the mass scanning effort and the sustained legal defense involved in their Library/Book project. So let’s see the unauthorised repositories become the substrate for experiment, analysis, and additional layers of meaning.
Elbakyan is now carrying a lot of personal risk and is owed our support. Aside from the injunction against her there are claims under the Computer Fraud & Abuse Act (the same law used to prosecute and intimidate Aaron Swartz). She is cagey about her location and is concerned about the threat of extradition to the US. But this must be weighed against what she has achieved: assembling and stewarding a system of self-provision for all those with inadequate access to literature, wherever they are. Right now it is important that all those who believe in LibGen/Sci-Hub state that support openly. Later it may also mean stepping up to support her materially.
*There are caveats, however: exceptions are enabled for reasons of security… and intellectual property rights – an exception which could utterly undermine the rule depending on how it is interpreted.
**See also: Justin Peters’ article critiquing Science’s defence of their business model; a piece from Aaron Swartz’s former colleagues at The Baffler; the very useful bibliography regarding Sci-Hub/LibGen maintained by Stephen Mclaughlin.
On November 2nd the Southern District Court of New York granted Elsevier a preliminary injunction against Library Genesis for copyright infringement. The site is an online repository of texts, mostly but not uniquely of educational character, accessible to all. The defendant, Alexandra Elbakyan, never appeared in court but did submit a letter to the Judge explaining the reasons for the site; it’s worth reading. Library Genesis remains active for now and for technical reasons will be more difficult to kill than the last target of knowledge prohibition, Library.nu, which was shut down in February 2012.
Supporters and advocates of free and open access have issued a statement in support of LibGen which is also a manifesto of sorts.
# In solidarity with Library Genesis and Sci-Hub
In Antoine de Saint Exupéry’s tale the Little Prince meets a businessman who accumulates stars with the sole purpose of being able to buy more stars. The Little Prince is perplexed. He owns only a flower, which he waters every day. Three volcanoes, which he cleans every week. “It is of some use to my volcanoes, and it is of some use to my flower, that I own them,” he says, “but you are of no use to the stars that you own”.
There are many businessmen who own knowledge today. Consider Elsevier, the largest scholarly publisher, whose 37% profit margin[^1] stands in sharp contrast to the rising fees, expanding student loan debt and poverty-level wages for adjunct faculty. Elsevier owns some of the largest databases of academic material, which are licensed at prices so scandalously high that even Harvard, the richest university of the global north, has complained that it cannot afford them any longer. Robert Darnton, the past director of Harvard Library, says “We faculty do the research, write the papers, referee papers by other researchers, serve on editorial boards, all of it for free … and then we buy back the results of our labour at outrageous prices.”[^2] For all the work supported by public money benefiting scholarly publishers, particularly the peer review that grounds their legitimacy, journal articles are priced such that they prohibit access to science to many academics – and all non-academics – across the world, and render it a token of privilege[^3].
Please read the rest at their site.
If I had known earlier that Denny Chin was to deliver his decision on the fair use question in the Google Books case, I would have made my way to Madison Avenue and lurked outside the office of the Authors Guild, the plaintiffs. There I might perhaps have heard a pitiful wailing and gnashing of teeth, sounds no doubt echoed in many a lawyer’s chamber around the city. For Denny Chin dropped the bomb on their hopes, and found an affirmative fair use defense for Google’s scanning project. That the result was pronounced in the Federal District Court of what has historically been the centre of the US publishing industry is also noteworthy. But this has been a never-ending saga of litigation, so first let’s recap, check the reasoning, and lastly ponder the consequences.
1. Google Books comprises three classes of texts from a legal point of view: public domain works which can be made available in their entirety; books which are made available to preview through partner agreements between Google and publishers; and books which were scanned by Google without permission, the searching of which produces small ‘snippets’ of the text as results. This court case concerns the final group of books.
2. The Authors Guild and Association of American Publishers launched their legal action in 2005. In 2008 a settlement was announced by Google; it would subsequently be amended, but the substance was to (a) make a payment to affected authors, (b) pay the plaintiffs’ lawyers and (c) fund the establishment of a Book Rights Registry. This settlement was eventually rejected on multiple grounds by Denny Chin in early 2010. At this time he was a Federal District Court Judge in New York. Chin was subsequently promoted to the Court of Appeals for the 2nd Circuit, but was able to hold onto several cases from his previous post – including the Google Books case.
3. While the various parties involved attempted to reach a modified agreement which would be acceptable to the courts, Chin set a schedule for litigation of the original copyright infringement action. As the Authors Guild were to put the case for all the authors whose works were copied, they had to get ‘certification’ of the class – basically a decision from the court that it is appropriate that the plaintiff represent all members of the class and has the means to do so. Certification was issued by Chin in May and then appealed by Google in July. Obviously Chin did not hear the appeal of his own decision. The Court of Appeal sent the case back to Chin at the District Court to make a determination on the fair use defense to the charge of copyright infringement, as a decision in Google’s favour would make the certification issue irrelevant.
i. Google got access to the books from participating libraries, who received a digital copy of each of their books in exchange. All texts are processed for optical character recognition (OCR) so that a full word index can be constructed to enable search.
ii. Much emphasis was placed on the restrictions on access to those books scanned without permission, of which only ‘snippets’ are displayed. Each snippet is one eighth of a page and only three snippets are ever returned in the results field. In addition to this limitation, one out of the eight snippets is never displayed, and no snippets are available from one in ten pages. The upshot of all this is that the full text of the book is never displayed to users, even in fragmentary fashion over long periods of time.
A. Chin found in favour of Google in the fair use determination. He analyzed the facts against the four factors of the fair use test codified in the law, but did so in the shadow of his interpretation of copyright’s ‘very purpose’: “Copyright law seeks to achieve that purpose by providing sufficient protection to authors and inventors to stimulate creative activity, while at the same time permitting others to utilize protected works to advance the progress of the arts and sciences.” (page 16)
B. He then stressed that a key issue was whether the alleged infringement is ‘transformative’:
that is, whether the new work merely “supersedes” or “supplants” the original creation, or whether it instead adds something new, with a further purpose or different character, altering the first with new expression, meaning, or message; it asks, in other words, whether and to what extent the new work is “transformative.” (page 18)
In the recent past this approach has been used to provide the fair use imprimatur for the basic technology of search, the cases Kelly v Arriba and Perfect 10 v Google.
C. He then applied the four factors in turn (pages 19-25).
- ‘the purpose and character of the use’; Chin found the use to be highly transformative, as (a) its cross-corpus index of words in books had quickly become crucial for research as well as (b) making possible whole new types of research, such as text and data mining based on the quantitative analysis enabled, whilst (c) the service did not offer a competing way to actually read the books. Given all this it was of less import that Google is a commercial enterprise and undertook the project motivated by profit.
- ‘the nature of the copyrighted work’; most of the books scanned were non-fiction works whereas ‘works of fiction are entitled to greater copyright protection’
- ‘amount and substantiality of the portion used’; Google copies the entirety of the work, and whilst the making of full copies does not exclude the possibility of a fair use finding, this is the only point which Chin felt went against a fair use finding.
- ‘Effect of Use Upon Potential Market or Value’; this is often the decisive part of the analysis. Here the plaintiffs claimed that the value of their works was being undermined, but Chin disagreed. He argued that given that Google was not selling the scans it produced as part of building the library, what it was effectively doing was helping to build potential sales by making it easier to discover forgotten, lost or neglected works.
The Fair Use analysis is followed by a summary of the social benefits of the service:
In my view, Google Books provides significant public benefits. It advances the progress of the arts and sciences, while maintaining respectful consideration for the rights of authors and other creative individuals, and without adversely impacting the rights of copyright holders. It has become an invaluable research tool that permits students, teachers, librarians, and others to more efficiently identify and locate books. It has given scholars the ability, for the first time, to conduct full-text searches of tens of millions of books. It preserves books, in particular out-of-print and old books that have been forgotten in the bowels of libraries, and it gives them new life. It facilitates access to books for print-disabled and remote or underserved populations. It generates new audiences and creates new sources of income for authors and publishers. Indeed, all society benefits. (page 26)
As far as Chin is concerned the same analysis applies to objections to the libraries’ use of their scanned copies. And that’s that: a knock-out for Google and the libraries in the Southern District of New York.
Momentous as it is, for now this is just a District Court judgement; endorsement by a higher court will be necessary before its full impact is felt. In the short term the decision will surely be appealed. How willing will the 2nd circuit be to reverse one of its own judges, and one who has been sleeping with this litigation for so many years? Does that mean it will go to the Supreme Court?
More broadly, the fact that this went to court meant that this defense is now open/applicable to others as well. A huge concern with the Google books settlement was that it was a private agreement granting them exclusive shield from liability with regard to the corpus of books – the path is now open to others to do the same, like the Internet Archive perhaps. Furthermore the concept of transformative use comes out of this emboldened, and available potentially to others working with different forms of archives, such as moving images for example.
Of course the problem for those who would follow in their footsteps is that the rules are different for Google. Not only do they have the money to fight infinite legal battles, but they have the reach into our habits such as to make their tools ‘useful’ and ‘socially beneficial’. They benefit from a presumption of legitimacy because of our reliance upon their services. Should this decision survive the coming challenges, the real test for it will be whether it provides a shelter for the next technologists developing tools that upset an incumbent industry.
Apparently now is a time of reckoning for the ‘one click’ hosting services which have come to dominate filesharing since around 2005. While attention has focussed on supersized Megaupload operator Kim Dotcom and his bizarre universe, other more discreet circuits have also been closed. Library.nu, an enormous collection ranging from bestsellers to truly arcane academic titles in all formats, yesterday announced its own epitaph.
Books have always been available online; when I first got access to Usenet in 1992, some of the first things I came across were Bruce Sterling’s ‘The Hacker Crackdown’ and Hakim Bey’s ‘Temporary Autonomous Zone’, cult titles amongst early internet users. Books were input laboriously via keyboard and posted as .txt, first on Usenet, then the web and FTP. Scanners were still in short supply at that point, and OCR software underdeveloped, but as they dispersed and improved the number of works mushroomed. But the delivery method was inconvenient, requiring the reader to remain at their screen or print to dead tree. Other larger collections were assembled, such as textz.com, which eventually ended up in legal wrangles with rightsholders.
With the growth of file-sharing into a mass phenomenon in the middle of the last decade, dedicated book sites appeared, sometimes linked explicitly to complaints about access and cost. This was the case with the Danish vidensdeling.nu, founded in August 2005 to provide a platform for students to share course books. Publishers immediately shut the site down. A similar site in the US, Textbook Torrents, launched in 2007, was closed in the summer of 2008 after an article in the Chronicle of Higher Education led to threats of legal action against its creator.
Ebook Readers, Meet Direct Downloads
Prior to the release of mass market book readers, the mainstream publishing industry felt relatively unaffected, but they understood that as the devices made their way into users’ hands they would find themselves losing control in a replay of the music and film sectors. As torrent sites came under sustained pressure, and their users were targeted with legal action, many closed or became private clubs. Direct downloads filled the gap left in their wake: requiring no software installation they were simple to use, and due to their client-server structure their users were not connected to a network transparent to monitoring and potential identification. Whilst these sites limited the quantity non-paying users could access, the small size of books vis-à-vis movies made such sites playgrounds for book fans. As Amazon ramped up marketing and volume on the Kindle, and then tablets like the iPad took off in popularity, the bumpiness in the user experience of digital text diminished, and the protective buffer around the publishers receded.
With this in mind the German Boersenverein developed a strategy in winter 2008 which was subsequently circulated to publishers’ organisations internationally the following spring. Here they outlined an approach which combined political lobbying with stigmatization of unauthorised copying of books. Parallel to this they proposed to increase the availability of authorised ebooks, and to instigate a legal campaign against “systematically ‘suitable’ services”, one-click hosts in particular. In this manner the demand then flowing towards pirate sites could be intercepted and rerouted by an industry doing a better job at supply.
To this end a relationship was established with the Lausen legal practice in Munich. The first target was Rapidshare: in 2009 they campaigned to have the site blocked by German ISPs. Unsuccessful on this score, a group of national and international publishers initiated legal proceedings, represented by Lausen. In February 2010 an injunction was obtained from a court in Hamburg ordering the removal of 148 works from Rapidshare (many of them also text books) and further monitoring to ensure that the works did not reappear. As some titles continued to be available, the plaintiffs brought Rapidshare back to court, where it was fined 150,000 euros in December of the same year for failure to comply with the terms of the injunction and for not having introduced adequate filtering mechanisms.
Curtains for Library.nu
In 2011 Lausen and the publishers turned their attention to library.nu, a site providing a central register of books available for download from a series of direct download sites, active since 2006. An article published in the Sunday Times in mid-December last reported that the operators of the site had been traced to Galway, Ireland, and that one of the addresses provided to the domain registrar was the headquarters of Anglo Irish Bank (the administrator obviously has some sense of humour, as Anglo was the biggest crap-out of the property bubble collapse).
Between Christmas and New Year the publishers successfully applied for a series of court orders at the Landesgericht in Munich. Apparently the cease and desist orders were then passed to Ireland in the last week. The plaintiffs are claiming that library.nu was a massive commercial piracy operation making eight million pounds a year, an improbable figure given that virtually all of its income derived from advertising and donations. According to an article in TorrentFreak, premium membership was only introduced for purchase in November last, which didn’t leave them much time to make hay.
Whilst the library.nu domain has not been seized, the operators have decided themselves to take it offline. According to a press release from the American Association of Publishers, the operators will now be pursued:
One positive outcome from this complicated process is that the platform operators themselves are now being held responsible as perpetrators for the copyright infringements on their sites and will therefore not merely be liable for the illegal conduct of their users. All four copyright chambers at the LG of Munich I who dealt with this issue and who promptly issued the 17 interim injunctions were in agreement on this matter.
Although how this is being dealt with jurisdictionally remains unclear.
A Blip or the End?
The tiny size of contemporary epubs makes them incredibly easy to store and distribute. As is the case with much online enforcement activity this is more about the show than the substance, intended to scare other operators and send a message to errant users. It is true that as long as these sites are structured in a centralized manner they will have a limited half-life. One would expect the recent closures to lead to a renewed interest in distributed and even quasi-anonymised systems, such as i2p.
Centralisation constitutes a honey-pot for profit-focussed pirates: without it there is no audience whose attention can be sold to advertisers, nor a fixed infrastructure on which a toll can be charged for access or better performance. It is a great irony that what began as a campaign against p2p has now had the unforeseen consequence of creating a market for a client-server system of unauthorised media distribution, thereby offering significant incentives for a particular type of entrepreneur. This client-server architecture is the very negation of the potential of the net, returning users to the role of passive customers.
On a final note, the case of library.nu is significant because the demand for the works offered there demonstrates that filesharing is not just about pop music, porn and cams of action movies, but also those forms and sources of knowledge whose acquisition is ritually celebrated within ‘enlightenment’ culture. Many of those whose works were offered derive income not from royalties, but from related activities such as teaching and research. Such people were themselves an important component of library.nu’s user base. Some have other means to access the same materials; others, especially those in countries with weaker education infrastructures and more emaciated library budgets, do not. Outside of formal education, the millions of online autodidacts may be denied access to material, seriously impinging on their lives and possibilities. When one considers the cost of text books and more especially scholarly articles, that is no hyperbole, and applies not only to the global south but the post-industrial north as well, awash in its dreams of knowledge economies and human capital.
But maybe such a concern is sheer melodrama, given the likelihood of the same works becoming freely available elsewhere. Time will tell.
“Obsession with present-mindedness precludes speculation in terms of duration and time…. That process is due to causes which affect the mental temper as a whole, and pour round us an atmosphere that enervates our judgment from end to end, not more in politics than in morality, and not more in morality than in philosophy, in art, and in religion.”
Harold Innis, The Bias of Communication, 1954
Elizabeth Eisenstein, in her extraordinary work on the social impact of the printing press (1), emphasizes how the full ramifications of its invention did not reveal themselves immediately, but instead set in chain a series of changes that were apparent only over the longue durée. She remarks that for the hundred years after Gutenberg’s Psalter came off the press, a contemporary would not have noticed any great change. Print and scribal culture co-existed initially, and the former had to await refinement of printing techniques, changes in paper manufacture and other developments before ousting the hand-copied word. And triumphing in the battle for supremacy over the form of the word was just the beginning: the changes ushered in by print required the slow development of social practices emanating from divergent sources: theological dissent, the networking of dispersed enthusiasts of the arts and sciences, the self-interested drive of map-makers and commercial printers. Eisenstein argues that these forces combined to create the material conditions which were a precondition for the Reformation and the emergence of modern science. If such dramatic consequences were not anticipated by Gutenberg and Fust’s contemporaries in the fifteenth century, it is understandable; after all, this was the first communications revolution. Those witnessing the events of the (public) internet’s first twenty-five years don’t have that excuse.
(1) Elizabeth Eisenstein, The Printing Press as an Agent of Change, 1979, Cambridge University Press
Last Thursday the European Court of Justice issued its decision in the Commission’s case against Ireland regarding the transposition of Directive 92/100/EEC of 19 November 1992 on rental right and lending right. Irish law exempted public, educational and academic institutions to which members of the public have access from having to pay remuneration to copyright owners. The Irish based their defence on the derogation set out in Article 5(3), allowing exemptions of certain categories of lending establishments based on the needs of national cultural policies.
Alas, the ECJ gave short shrift to the article, insisting the effect of the law as transposed was to deny authors exactly the remuneration the Directive was intended to provide. The result of course is that the public will have to cough up and pay the collecting societies out of the tax-kitty, even if the pain will pass unperceived, as it will be a matter for the Department of the Environment, Heritage and Local Government and the County/City Councils who run the library system.