I recently visited the Wikipedia page of a left-wing German politician. She had been hit with a pie by a critic of her views on migration. Wikipedia linked to a report on Russia Today containing a video of the incident uploaded to YouTube by RT’s European unit, Ruptly. At the YouTube URL most of the videos displayed in the related content sidebar were about migration; few were from Ruptly, and many were strongly anti-immigrant. I clicked on one in which an old woman was interviewed about her fears, feelings and hostility towards immigrants, and closed the video after a couple of minutes. The next time I opened YouTube, seven of the ten videos recommended to me concerned immigration. Five of them were clearly produced by right-wing media activists, and this flavour of curation extended to the videos in the right-hand column as I browsed.
This experience captures what Eli Pariser characterised as a ‘filter bubble’ in a book of the same name: I was suddenly thrust into a media universe imagined for me on the basis of one or two clicked links, and it felt weird. Encountering world-views contrary to my own doesn’t bother me – in fact I enjoy the conflict – but the skimpy basis for subjecting me to this flood of ideological personalisation is bothersome. If viewers are uninterested in politics and have no knowledge of the mechanism selecting the stories presented to them, what are they to make of such goings on? In this universe old ladies, innocent blond-haired teenagers, and middle-aged men are at one in insisting that migrants are criminals who should be deported – is that the normcore position? Sure, viewers aren’t going to swallow propaganda whole; they’ll cross-reference it with their own experience and knowledge, and apply some critical thinking. But the persistence of these recommendations for about a week did make me feel as if I were surrounded. I had fallen into the filter sewer and was being sprayed with a fire-hose of horseshit.
Google would argue that if I logged in to YouTube they would know more about me, and so I would not have ended up in the sewer. But I don’t want my media consumption tracked or personalised. Never logging in to Google is the best I can do to minimise the tracking, short of systematically using a VPN or Tor, and because I want to have some idea of what the general experience of the web is, I will not do that. I do use anti-tracking tools such as Privacy Badger, Disconnect and uBlock Origin, but none of them can fully protect you from ‘the Google’.
Of course our media environments have always had their ‘bias’; that was the case long before the internet. Journalists wax lyrical about objectivity and balance, but there have always been ideological assumptions and frameworks: the basic credibility of government statements and explanations of its actions, the virtues of capitalism and liberal democracy, etc. – the world inside the Overton window. Because Pariser wrote the book in the internet era, and focused on the results of algorithmic filtering, it was understood (perhaps unfairly?) as arguing that the problem was new, when it was actually an evolved iteration of an older phenomenon. [It reminds me of the fear that adblocking will wreck journalism – yet newspapers were in crisis already in the 1990s as the industry became more concentrated and the new owners expanded advertising sales whilst sacking journalists, a phenomenon chronicled by Ben Bagdikian in his classic, The New Media Monopoly.]
Old media was also driven by advertising logic – demographic targeting and so on – but the difference lies as much in the ease of individualised distribution as in the availability of algorithmic engines. In the newspaper age you could tell a lot about a person politically and socially from the newspaper they read; their choice was also a filter, and the advertisers who bought slots chose silos for their campaigns. But the paper still had to appeal to a mass market, so the silo was big, somewhat diversified and had to cover a range of subjects. Not so today.
Pariser’s book is actually about personalisation, but he must have thought ‘filter bubble’ was a catchier term. Individualised customisation of information flows is heralded as the compass for navigating a sea of excess information, but this has mostly meant that users surrender control to machinic decision-making whose logic is opaque. If the past once allowed room for the illusion that this could work out well, that illusion is now over: we’ve seen the future and it’s not so rosy. Information filters are needed, but only as tools under the control of the user. Such user sovereignty, however, is not a tendency the economic forces of the web want to foster. In the web of 2016 the user is the object of a system designed to shape them, rather than a subject to be supported in their own self-development.
In The Daily You, Joseph Turow outlined how the idea of the powerful consumer is promoted whilst advertisers and marketers engage in ever more intrusive information-gathering processes which lead to the separation of consumers into targets and ‘waste’. And if the consumer is so potent, surely they don’t need to be protected by regulation? In the adtech world the real end goal of personalisation is revealed: collect every last bit of data so as to eventually facilitate the encounter between consumer desire and business operation. This intrusion is presented as a means of giving you ‘what you want’ and clothed in the innocuous language of ‘relevance’. But the user is never asked what they want, nor given the means to control the data and advertising flows around them – the answers are to be found by spying on them.
All Our Yesterdays…
In its early days the web was embraced by media critics as a formal remedy to the ills of the mass media – newspapers, television, radio, and film. The net/web was to undermine the tyranny of intermediaries and enable a direct dialogue between individuals and groups. It was not to be. The human decision-makers have had their wings clipped, only to be replaced by tech moguls (unwilling to acknowledge their editorial power) and opaque machinic processes cast as agents of divine right.
If algorithms are the new monarchs, a renewed republicanism needs to dethrone them and their owners. Users do need tools to master the data flow, but those tools must be under the users’ control, transparent in their logic and designed to nurture their autonomy.
“We fight any requests that we deem unclear, improper, overbroad, or unlawful,”
Ron Bell, Yahoo General Counsel
Oh Yahoo, what have you gone and done now? You strange company, whose services I have rarely had occasion to use, save for the occasional casual email account useful for keeping commercial spam away from my real address and the odd photo uploaded to Flickr. And yet I cannot help but feel disappointed, because behind that Yahoo octopus, whose ink barely obscures a huge advertising-surveillance system, I actually thought there were individuals serious about defending their users’ privacy, at least vis-à-vis the state. This belief was not without foundation: in September 2014 documents were released chronicling Yahoo’s fight at the FISA court against the NSA’s mass surveillance program. They were alone in this legal resistance. Google, who like to see and portray themselves as the user’s friend, never challenged the government in court.
I had this on my mind in the autumn of 2014, when I was getting increasingly fed up with Google search and looking for an alternative. This was driven by disgust at their relentless data harvesting and disregard for user privacy, but also by the sense that Google’s results seemed to be getting noisier, including a lot of trash and click-bait pages designed solely to exploit the modalities of the algorithm. I thus embarked on an exploration of the alternatives: first Bing, then Yahoo…
This was, I know, an eccentric decision – Yahoo has if anything a worse policy regarding retention of search queries than Google. The results themselves were OK, and the key discovery I made as I test-drove the other engines is that 80% of our queries can be resolved by any of them. It is only when you are searching for an exact phrase or a rarefied subject that the distinctions emerge. Basically Google spiders more of the web, has a better index, and has a better chance of unearthing the obscure. But I did enjoy the apostasy of using Yahoo, and bragging about it; I remember a dinner with a Google engineer in SF who stared at me in amazement when I told him of my search-engine heresy and explained my motivation (on that point, why are so many at Google in denial about the fact that it is an advertising company, rather than a vocation to make the world better through engineering?).
Truth be told, however, this dalliance didn’t last long. After three months I had shifted again, to DuckDuckGo, where I have stayed. There are wrinkles to this too: DDG buy search results from Yahoo, Bing and Yandex, which they then combine with other sources and reprocess. But DDG are sound on privacy: they never track users and they’ve adopted the EFF’s Do Not Track policy, a document close to my heart. I resort to Google only as needed, in pursuit of the esoteric and arcane, and what I thereby disclose offers such a marginal (and bizarre) view into my head and habits that I can live with it. Firefox has all the major engines in its search box, so switching involves no overhead, and I run Opera in parallel.
So it was just a fling with Yahoo, but enough to make me sick when I read that they had taken a child pornography and malware filter and repurposed it to search the entirety of the mail passing through the @yahoo.*** system. (Incidentally, the journalist who broke the story, Joseph Menn, is the author of the excellent All the Rave, which tells the story of Shawn Fanning and Napster – most enjoyable.) It made me think how maddening, how insanely inconsistent, Yahoo is. Corporate Beelzebubery comes as no surprise; it’s the wild shenanigans that get to me. That’s what I intended to write about before the rant above took shape, so here are some examples which come to mind.
Search Query Retention Times
Back in 2007 the Article 29 Working Party, an entity which drafts opinions on data protection/privacy in the EU intended to guide the actions of the Data Protection Authorities, started to breathe down the necks of the search companies about how long they were retaining user query data. At the time Yahoo held the data for 13 months, Microsoft for 18 months, and Google started ‘making it less identifiable’ after 9. In December 2008, Yahoo announced that they would start de-identifying the data after 3 months. Bravo!
Then in April 2011, Yahoo announced that they needed to retain the data whole for… eighteen months! Otherwise they couldn’t compete! By this point Google were saying that they wouldn’t go below eighteen months either; only Microsoft’s Bing had adopted 6 months.
Do Not Track
In March 2012 Yahoo announced that they would be implementing support for the Do Not Track signal that users can enable in their browsers to tell sites that they don’t want to be tracked. This is not a message which advertising companies are pleased to receive, and they have wasted a lot of people’s time at the W3C and elsewhere trying to make the subject more complex than necessary, basically as a means of stalling and sabotaging. No details were ever provided about what this Yahoo implementation would consist of; the sceptical might wonder whether it was ever anything but air.
In April 2014 Yahoo announced that they would no longer honor DNT signals, because they believed that the default web should be ‘personalized’ i.e. tailored for you based on knowledge of what you’ve been up to; personalized thus joins relevant and interest-based as synonyms (and alarm bells) for surveillance-based advertising and content selection.
But Yahoo wasn’t finished: following a deal in which they bought themselves the default search box on Firefox, they announced in November 2015 that they would be honoring DNT requests for Firefox users. Mmmmh. Why only Firefox users – did Mozilla make them sign up to that? Perhaps because Mozilla was one of the birthplaces of DNT? And what would ‘honor’ mean exactly? It hardly matters, as Yahoo may well change their position again once their takeover is complete. Or perhaps they’ll claim that they couldn’t do anything for the last five years because they were waiting for agreement at the W3C. Yawn.
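For all the drama, the mechanics of DNT are almost comically simple, which is worth seeing in order to appreciate how little an announcement of ‘support’ commits anyone to. A minimal sketch in Python (all names mine, not any real implementation) of what honoring the signal amounts to on the server side:

```python
def should_track(headers):
    """Honor the Do Not Track signal.

    DNT is just a single HTTP request header, 'DNT: 1', set by the
    browser (and exposed to scripts as navigator.doNotTrack). Whether
    anything changes server-side is entirely up to the operator --
    which is why an announced 'implementation' can be pure air.
    """
    return headers.get("DNT") != "1"

# A site that honors the signal gates its tracking machinery on it:
request_headers = {"DNT": "1"}  # what a privacy-conscious browser sends
if should_track(request_headers):
    pass  # set tracking cookies, fire analytics beacons, etc.
```

The signal itself is trivial to read; everything contentious lies in what a company promises to do, or not do, once it has read it.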
So now to the most delicious irony of all. After the uproar surrounding the Snowden revelations one of big tech’s responses was to implement encryption at various points in the network. The aspect of this closest to users was Google’s project to develop an end-to-end encryption plug-in for gmail. This was an open-source project and Yahoo declared that they would make it available for their webmail system as well. This was good for users but it would also involve a cost for the companies as both sell advertising based on scanning users’ email to select ‘relevant’ ‘personalized’ ads. If the mails are encrypted this type of analysis is not possible, resulting in lower revenues. But the NSA revelations hurt a lot of people’s pride and made the tech industry as a whole look compromised, poodles of the US government’s PRISM program, so some notional loss could be stomached.
Alex Stamos, then head of security at Yahoo, set about recruiting programmers and engineers to move it on. In March 2015 the system was ready to demo and was unveiled at SXSW. Right around then Yahoo had been requested to search their entire email traffic for a specified identifier. This was implemented secretly and without consultation with Stamos and the security team, so that when they uncovered it they mistook it for a hostile insert placed by an intruder. The rest is well known: Stamos left Yahoo shortly afterwards to become head of security at Facebook. The Chrome extension for end-to-end encryption of Yahoo mail was never officially completed and launched, although one of the lead developers says it’s basically good to go. (Incidentally, what happened to Google’s much-trumpeted efforts in this regard?)
Yahoo has many other sins uncatalogued here, but what astonishes me is how erratic and capricious they are. What would you trust them with? Better, as the Intercept suggests, to just delete your account.
An Austrian court issued an interesting judgment this week. A leftist film collective, Filmpiraten, brought a case against the far-right Freedom Party of Austria (FPÖ) for copyright-infringing reuse of material published on YouTube under a Creative Commons license. The video at issue documented antifascist protests against the Viennese Akademikerball, an annual event held by the FPÖ which has been the target of demonstrators for many years.
Filmpiraten publish their work on their website and on YouTube under a BY-NC-SA license. This means that others are free to use the material without permission, provided that the use is non-commercial, that the work is attributed to them, and that whatever work is created downstream using it is distributed under the same licensing conditions.
The FPÖ operate their own YouTube channel, which includes a program called FPÖ-TV, published as a work in which copyright is claimed. The court case thus concerned a violation of the Creative Commons licensing terms under which Filmpiraten had made their work available. Where a would-be user of material available under a CC license does not accept the licensing conditions, they must make a licensing agreement with the copyright holder in the usual way, unless they are using it on the basis of one of the statutory exceptions (criticism, commentary etc.).
In any case Filmpiraten were successful in the Viennese court, so this is a significant decision for anyone interested in the treatment of CC licenses in the courts. The FPÖ will appeal. The newspaper report from Der Standard is available here (German).
Hard to believe that only four or five years ago the Pirate Party (PP) were enjoying a German honeymoon, winning large numbers of votes and entering four regional parliaments. In the Berlin election in 2011 their results were so strong that they did not have enough candidates to fill all the seats won; candidates who ran with little hope of getting into district assemblies were instead elected to the major-league Senate – the citywide parliament. But this unexpected triumph was to be their zenith; thereafter the party formed a circular firing squad.
During the five years of the Berlin Senate the PP parliamentary group had five chairs and co-chairs, of whom four are no longer members of the party (although all continue to sit as part of the Pirate group) – Alexander Spies is the last of this band carrying a party card. Two of these former chairs were among 35 former Berlin Pirates who published an open letter in January announcing their defection to Die Linke (the Left party), while another flirts with joining the SPD. Three other PP members elected to the Senate have also departed. This means that, having started the parliamentary session with 15 representatives, they now have 8.
A further twist to the current Berlin election is that the former national chairperson of the Pirates, Bernd Schlömer, is running as a leading candidate for the FDP (Liberals), having joined them last October. This is less surprising than it may seem, as both the FDP and Die Linke (as well as the Greens and the Pirates) once participated in the Freiheit statt Angst! (Freedom Not Fear!) demonstrations, an annual field day of the forces opposed to mass surveillance/social control which used to take place in Berlin each September.
Berlin Election 2016
Polling currently puts the PP on 3%, well below the 5% threshold required to be allocated any seats in the parliament. As in 2011 they are running an eye-catching campaign focused on issues where they have campaigned effectively: housing, the investigation into the billion-euro airport scandal, and against racism. But the nature of their public meltdown at both national and local level after 2012 has wrecked their credibility. (If one wants to vote for a neo-Dadaist anti-party, Berlin already has one, Die PARTEI, who also have an MEP!)
The departure of former members for other parties also undermines their position as self-appointed interpreters of the magic powers of technology. This should not be underestimated: until 2012 they were effectively identified as the ‘party of the internet’, the people who wanted to usher in a streamlined tomorrow, the epitome of progress and forward thinking. But this stranglehold on the tech-dream is over.
The Berlin PP was regarded as representing the party’s left wing, and some of its votes will now return to Die Linke or move to the Greens. Meanwhile, populist discontent has shifted decisively right after the controversy over refugee policy met the gunpowder of the sexual assaults in Cologne on New Year’s Eve. Electorally this means pay dirt for the Alternative für Deutschland (AfD), a toxic brew of xenophobes, alienated conservatives, economic liberals and populists, who will almost certainly enter the city parliament this month.
In recent days it has been announced that the EU wants all scientific research papers funded through its programs to be released under Open Access by 2020*. Newspaper coverage has credited the combined efforts of the Dutch government and EU Commissioner for Research, Innovation and Science, Carlos Moedas, for the initiative. Moedas caught my attention in April due to a speech he gave on ‘open science’ which began with a reference to Alexandra Elbakyan and the controversy around Sci-Hub. He went on:
“Elbakyan’s case raises many questions. To me the most important one is: is this a sign that academic journals will face the same fate as the music and media industries? If so – and there are strong parallels to be drawn − then scientific publishing is about to be transformed.
So, either we open up to a new publishing culture, with new business models, and lead the market… Or we keep things as they are, and let the opportunity pass us by. As I see it, European success now lies in sharing as soon as possible, because the days of “publish or die” are disappearing. The days of open science have arrived.”
Until recently it would have been impossible to imagine an institutional figure using Sci-Hub as a springboard for a positive vision rather than as an occasion for vitriol.
While legal action by Elsevier against Sci-Hub and Elbakyan grinds on in the US, it has succeeded only in generating large amounts of positive publicity for both – Elsevier’s attack on a library that once existed in the shadows has ended up biting the behemoth in the ass. Despite a court decision in their favor, the website remains online and usable.
Seeking to disable free access to scientific articles otherwise available only through overpriced subscriptions was never going to be a winning PR strategy. That Elsevier made an operating profit of 34% in 2014 doesn’t help their case, nor does the fact that the authors are not paid. Commentators have instead treated the liberation of academic work from copyright restrictions as an enlightenment gesture in favor of universal access to knowledge (which it is). There is sympathetic coverage all over, from Science to Le Monde ** – it’s all a far cry from the quiet annihilation of Library.nu/Gigapedia in 2012.
Academic work is special…
Such an outpouring was never going to emerge from the cases against Napster or the Pirate Bay – the shared objects at the centre of those trials were seen merely as trifling entertainment commodities. This is odd given how important shared cultural works are for shaping our identities, but somehow they are tainted by their association with pleasure and fun. Academic papers, on the other hand, are no terrain of indulgence; they are the stuff of seriousness, discipline, painful memories of homework…
Risk and Reward
With all this attention, use of Sci-Hub and Library Genesis is booming, and presumably their holdings are growing too. Given that the entire collections are available for download via torrent, it will be interesting to see whether services are built on top of the corpus – text mining etc. Until now such techniques have been the preserve of the database owners or companies like Google, which has the resources for both the mass scanning efforts and the sustained legal defense involved in its Library/Book project. So let’s see the unauthorised repositories become the substrate for experiment, analysis, and additional layers of meaning.
Elbakyan is now carrying a lot of personal risk and is owed our support. Aside from the injunction against her there are claims under the Computer Fraud & Abuse Act (the same law used to prosecute and intimidate Aaron Swartz). She is cagey about her location and is concerned about the threat of extradition to the US. But this must be weighed against what she has achieved: assembling and stewarding a system of self-provision for all those with inadequate access to literature, wherever they are. Right now it is important that all those who believe in LibGen/Sci-Hub state that support openly. Later it may also mean stepping up to support her materially.
*There are caveats, however, enabling exceptions for reasons of security… and intellectual property rights – an exception which could utterly undermine the rule depending on how it is interpreted.
**See also: Justin Peters’ article critiquing Science’s defence of their business model; a piece from Aaron Swartz’s former colleagues at The Baffler; and the very useful bibliography regarding Sci-Hub/LibGen maintained by Stephen McLaughlin.
Spotted this today in Treptow, it reads:
“Germany’s freedom is also defended in cyberspace. Do what really counts.”
As you may have guessed, it is part of an advertising campaign launched by the German army to recruit people with IT training.
Critical discussions about tracking, targeted advertising and surveillance capitalism seem to stray easily onto the terrain of paranoia and speculation. Cookies are associated with an inchoate but rather mild evil, and almost no-one can explain how they produce their odious effects.
Cookies, however, are only the first hurdle in understanding the much more opaque universe of AdTech. This is a poorly understood world in part because it is rather new and has yet to assume a stable shape. The industry is dominated by companies which are far from being household names, and they describe themselves in terms of roles which are not easily grasped – demand side platforms, data management platforms, ad exchanges etc. The jargon accompanying the recreation of the advertising pipeline for real-time delivery is just the surface manifestation of a complex technical system. Little wonder then that most people in the advertising industry itself don’t get it, never mind us mortals (aka ‘targets’ and ‘waste’) who are being bought and sold billions of times a day.
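Since almost no-one can explain the mechanism, here is a toy sketch in Python (all domains and names invented for illustration) of how a third-party cookie produces its effects: any tracker embedded on two unrelated sites receives its own cookie on both visits, and can therefore join them into one profile.

```python
# tracker.example's server-side view of its visitors, keyed by the ID
# stored in the cookie it set the first time any embedding page loaded.
profiles = {}

def on_third_party_request(cookie_id, referring_page):
    """Called whenever a page embedding tracker.example content loads.

    The browser automatically attaches tracker.example's own cookie to
    the request, and the embedding page leaks via the Referer header --
    that is the whole trick.
    """
    profiles.setdefault(cookie_id, []).append(referring_page)

# The same browser visits two unrelated sites that both embed the tracker:
on_third_party_request("uid-42", "site-a.example/politics")
on_third_party_request("uid-42", "site-b.example/shopping")
# tracker.example now knows uid-42 reads politics on one site and shops
# on another -- without either site knowing anything about the other.
```

Everything after this (segmenting, auctioning, ‘personalising’) is built on top of this humble join.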
The trade press is a great source of information, as are company blogs, and even the mainstream media occasionally does something decent, but mostly it’s fragmented. Of course there’s a copious academic literature, mostly coming out of computer science, if you want to get into the detail. But for an overview one could do a lot worse than to look at a report produced by the Norwegian Data Protection Authority last December, titled “The Great Data Race” (mercifully in English). The first half of the report (particularly pages 10-29) provides a good breakdown of the new division of labour, examines how data is collected, and breaks down the actual process of ‘programmatic buying’ and real-time bidding.
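To make ‘programmatic buying’ slightly less abstract, here is a toy second-price auction of the kind such reports describe – the core mechanism of real-time bidding, stripped of all protocol detail. Everything here (the DSP behaviour, the prices, the segment label) is invented for illustration, not any real API:

```python
def run_auction(bid_request, dsps):
    """The exchange broadcasts a bid request describing the impression
    (and the user behind it), collects bids from demand-side platforms,
    and sells to the highest bidder at the second-highest price."""
    bids = sorted((dsp(bid_request) for dsp in dsps), reverse=True)
    if len(bids) < 2 or bids[0] <= 0:
        return None  # no sale
    return {"winning_bid": bids[0], "price_paid": bids[1]}

# Two hypothetical DSPs: one pays a premium for a profiled 'target',
# one bids a flat rate for any impression at all.
dsp_a = lambda req: 2.50 if "in-market-auto" in req["segments"] else 0.10
dsp_b = lambda req: 1.20

result = run_auction({"user": "uid-42", "segments": ["in-market-auto"]},
                     [dsp_a, dsp_b])
# dsp_a wins the profiled user but pays only dsp_b's price
```

All of this happens in the hundred-odd milliseconds while the page loads, billions of times a day, which is why the data describing the user is the whole game.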
somebody hears you. you know that. you know that.
somebody hears you. you know that inside.
someone is learning the colors of all your moods, to
(say just the right thing and) show that you’re understood.
here you’re known.
leave your life open. you don’t have. you don’t have.
leave your life open. you don’t have to hide.
someone is gathering every crumb you drop, these
(mindless decisions and) moments you long forgot.
keep them all.
let our formulas find your soul.
we’ll divine your artesian source (in your mind),
marshal feed and force (our machines will
to design you a perfect love—
or (better still) a perfect lust.
o how glorious, glorious: a brand new need is born.
now we possess you. you’ll own that. you’ll own that.
now we possess you. you’ll own that in time.
now we will build you an endlessly upward world,
(reach in your pocket) embrace you for all you’re worth.
is that wrong?
isn’t this what you want?
On November 2nd the Southern District Court of New York granted Elsevier a preliminary injunction against Library Genesis for copyright infringement. The site is an online repository of texts, mostly but not uniquely of educational character, accessible to all. The defendant, Alexandra Elbakyan, never appeared in court but did submit a letter to the judge explaining the reasons for the site; it’s worth reading. Library Genesis remains active for now, and for technical reasons will be more difficult to kill than the last target of knowledge prohibition, Library.nu, which was shut down in February 2012.
Supporters and advocates of free and open access have issued a statement in support of LibGen which is also a manifesto of sorts.
# In solidarity with Library Genesis and Sci-Hub
In Antoine de Saint Exupéry’s tale the Little Prince meets a businessman who accumulates stars with the sole purpose of being able to buy more stars. The Little Prince is perplexed. He owns only a flower, which he waters every day. Three volcanoes, which he cleans every week. “It is of some use to my volcanoes, and it is of some use to my flower, that I own them,” he says, “but you are of no use to the stars that you own”.
There are many businessmen who own knowledge today. Consider Elsevier, the largest scholarly publisher, whose 37% profit margin[^1] stands in sharp contrast to the rising fees, expanding student loan debt and poverty-level wages for adjunct faculty. Elsevier owns some of the largest databases of academic material, which are licensed at prices so scandalously high that even Harvard, the richest university of the global north, has complained that it cannot afford them any longer. Robert Darnton, the past director of Harvard Library, says “We faculty do the research, write the papers, referee papers by other researchers, serve on editorial boards, all of it for free … and then we buy back the results of our labour at outrageous prices.”[^2] For all the work supported by public money benefiting scholarly publishers, particularly the peer review that grounds their legitimacy, journal articles are priced such that they prohibit access to science to many academics – and all non-academics – across the world, and render it a token of privilege[^3].
Please read the rest at their site.
The documentary maker Adam Curtis was at the Hebbel am Ufer theatre in Berlin this weekend. There were screenings of his films Bitter Lake and The Century of the Self (the selection prefigured the arguments he was to make), but the main events were a lecture and two public dialogues, one of them with the left-wing critic Mark Fisher. Contrary to what one might expect, there isn’t much online of Curtis speaking about his work, so I went to check it out.
From the outset he insisted on positioning himself as a journalist rather than a filmmaker, and he consistently emphasized the value of narrative and the importance of stories, especially as regards political movements’ capacity to inspire and shape the materialization of new worlds in times of crisis (i.e. opportunity). Questions focused on the more formal aspects of documentary production were pooh-poohed: filmmaking choices were tersely explained as being either a matter of personal preference, an intentionally self-evident result of the propaganda approach, or simply more economical to produce.
It turned out that what Curtis wanted to talk about was the failure of liberals and the liberal left (amongst whom he counts himself) to achieve ‘real change’, their inability to imagine another type of future, as embodied in the defeat of the Tahrir Square and Occupy rebellions. Instead he described the descent into ‘oh dearism’: the posture of impotently observing one disaster after another with no idea of how to intervene, to end or ameliorate the situation. He links this to the end of the era of mass democracy, when organizations made alliances and formed blocs capable of confronting embedded power structures meaningfully, and the failure to find any analog in a time where the basic unit of politics is not the collective but the individual.
This segues nicely into the thesis of The Century of the Self, whose second half tells how the defeat of the new left/counterculture of 1968 led to a retreat by that generation into technologies of the self and a turning away from society. Curtis curses the new left for painting all politicians as corrupt – a judgement he sees as both a simplification of the facts and an abiding impediment to the organization of meaningful political action, and a precondition for the refusal of politics wholesale by what he calls ‘hippies’. Later he remarked how radical it would be to make a series about the ‘nobility of politicians’, as a necessary upending of this cynical attribution of corrupt motives to all of them.
The Century of the Self chronicles the emergence of a new type of social actor/subject, whose sense of their own centrality represents a decisive break with the collective subject of the era of mass democracy. Now individuals are said to require that they be addressed in a more personal manner; they grant inflated importance to self-expression, and seek their own personal utopias – as one interviewee characterizes it, their aim is ‘socialism in one person’.
Curtis sees this personality type as representing the vital battlefield for political struggle in this time. He condemns the ‘left’ for failing either to appreciate it or to find ways to appeal to it. His prescription is always the same: the crucial failure is the inability to imagine a future and convey it in a form which this new type of individual can find compelling and persuasive. What the form of this storytelling might be was left almost entirely unspecified, but we were told that it was to exclude economics, because it was ‘boring’; the mere mention of collateralized debt obligations would make people’s eyes roll in a stupefied mixture of bafflement and tedium. Simultaneously with this rejection of ‘wonk-ery’, however, he repeatedly decried the tendency towards simplification and worried that were a major crisis to occur, not only would the political vision be found wanting but individuals would find themselves confronted with a level of complexity so unfamiliar as to be irresolvable.
Today the EFF announced the adoption of its Do Not Track (DNT) policy by the adtech company Adzerk. Adzerk is the first advertising company to sign up to a meaningful DNT policy, and its involvement will have two immediate consequences.
1. Companies have claimed that the technical obstacles to implementing DNT in the ad environment are insurmountable; they no longer have this alibi. On a more positive note, there is also reason to believe that other ad companies will emulate Adzerk’s example.
2. It puts in place another piece of scaffolding for those publishers considering DNT adoption but unsure how it can be implemented. Offering a version of the site where users are not tracked means reviewing all the third parties used on the site, many of which gather user data: analytics, embedded video hosts, social network ‘like’ buttons, and of course *ads*. These sources of data leakage to third parties need to be disarmed rather than removed entirely (something users’ expectations will not allow). Adzerk doesn’t supply ads itself, but it provides the infrastructure for their delivery. As more publishers adopt DNT, it will become easier to convince advertisers that this is an audience worth addressing.
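The mechanics of honouring a visitor's preference are simple: browsers send the header `DNT: 1`, and a publisher serving a tracker-free variant simply omits the data-leaking embeds for those requests. A minimal sketch, assuming a WSGI-style dict of request headers; the embed URLs and snippet names here are purely illustrative, not any real publisher's configuration:

```python
# Illustrative tracker-free page serving based on the DNT header.
# The header name "DNT" and value "1" follow the W3C Tracking
# Preference Expression draft; everything else is hypothetical.

TRACKING_SNIPPETS = {
    "analytics": '<script src="https://analytics.example/a.js"></script>',
    "like_button": '<script src="https://social.example/like.js"></script>',
}

def dnt_enabled(headers):
    """True if the visitor has expressed a Do Not Track preference."""
    return headers.get("DNT", "").strip() == "1"

def third_party_snippets(headers):
    """Return only the embeds appropriate for this request."""
    if dnt_enabled(headers):
        # DNT user: serve the page with every third-party embed disarmed.
        return {}
    return dict(TRACKING_SNIPPETS)
```

The point of the sketch is that the hard work is not the header check but the audit behind `TRACKING_SNIPPETS`: knowing which third parties a page actually loads, and having non-tracking substitutes ready for each.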
Whilst a lot of attention has been given to online tracking, the responses have so far been ineffective. The relevant W3C working group failed to reach a compromise that would change industry practice voluntarily, whilst regulators appear unwilling to take on a sector which has grown during an otherwise lackluster economic period. Where legislation has been tried, the results have been feeble (e.g. the ‘Cookie Directive’ in Europe). The EFF’s DNT effort aims to construct an alternative ecology where privacy protection and informed user choice are the design imperatives behind modified services, and to overcome the engineering obstacles to that objective a step at a time.
When I first studied law, my interest in technology was entirely separate and parallel. On just one occasion they intersected, due to the requirement of a note from one’s tutor stating that the requested email address/shell account was necessary for purposes of scholarly activity; in those days emails were issued automatically only to maths and computer science students, and everyone else had to demonstrate that they needed one.
There followed many all night sessions in the computer labs (the only buildings open 24 hours!) and conversations with nerds who began to drop in to the bookshop where I worked. Sometimes this just meant riffing about the exotic ideas encountered on Usenet (Ireland was seriously theocratic and very insular), but inexorably discussion would return to speculation on the political consequences of the new medium in two areas: copyright and surveillance/political control.
So when I later decided to return to law, it was natural to focus on these conflicts. My emphasis was originally on cryptography. In retrospect I guess this is because that moment was a kind of peak of political absurdity. Encryption technologies were still classed as dual use technologies by government, meaning that they had both civilian and military applications, and were thus subjected to a special regulatory regime limiting their export. At the same time the encryption software PGP (Pretty Good Privacy) was available for download from the net in flagrant breach of US export controls – the International Traffic in Arms Regulations (ITAR). Daniel Bernstein was challenging the constitutionality of these arrangements in the US while Phil Karn was filing requests with the US State Department to check whether a book, Applied Cryptography, and its accompanying floppy disk were subject to export restrictions; it turned out the book wasn’t and the floppy was (I got copies from Amazon and never used either!).
Investigative journalist Duncan Campbell had already uncovered the first bits of information about a surveillance dragnet called Echelon. Meanwhile the US government had spent years trying to embed compromised encryption hardware in the public’s computers and phones via its Clipper Chip proposal. This would have provided law enforcement with a side-door entrance to encrypted communications on foot of a warrant obtained as part of an investigation, but required that the secret keys necessary for this be stored at a location accessible to the police. Were they to be excluded from access to plaintext, we were told, the consequences would be dire: the four horsemen of the infocalypse – terrorists, drug dealers, paedophiles and money-launderers – would ride forth unleashing their villainy on the innocent. A little later there was an international scandal involving a shady Swiss firm called Crypto AG, who were supplying compromised encryption systems to governments. When the exploit was revealed the Vatican was the first ‘user’ to change its system. … In short, these were exciting times, the rock and roll period of the so-called crypto wars.
Absurdly, it was still possible then to imagine a field of ‘computer technology and the law’: the number of users was still small; the legal disputes actually reaching a judge were few; even the range of devices was limited. I gobbled it all up: digital signatures, data protection and copyright. Then I came across articles about Digital Rights Management systems and realized that where I had imagined a politically mobilized populace embracing PGP to engage in oppositional politics, it was more likely that users would encounter encryption as a lock preventing them from having access to the media cookie jar. Whereas the inability of governments to prevent civilian access to strong cryptography was foretold, the copyright and allied industries (mostly in the patent and trademark sectors) were well-organised, and had achieved considerable success in rewriting the law at both domestic (DMCA, European Copyright Duration Directive) and international levels (GATT-TRIPS, WIPO-WCT). Thus in the United States the DMCA made it an offense both (a) to produce and distribute tools for the circumvention of DRM access controls on media and (b) to engage in the act of circumvention itself – irrespective of whether any breach of copyright occurred.
But the copyright industry’s victory turned out to be easier at the level of lobbying and legislation than it was in reality once these technologies were released into the wild. The dream of perfect technological control turned out to be a mirage. Worse, the internet ensured that once an access control technology was defeated, it was effectively defeated everywhere, as the developers of the protection systems for DVDs and digital music formats were to discover at some expense. In 1999, just as the means to neutralize the DRM on DVDs was being made public, Napster, the first p2p system, appeared on the scene.
Thus at the turn of the millennium the struggle for public access to strong cryptography seemed to have been won, and the copyright industry’s efforts to retain control of distribution seemed to be skidding on the black ice of technological history. Such was the mood in January 2001 when Steven Levy’s celebratory account, Crypto, was published, with the unfortunate subtitle ‘How the Code Rebels Beat the Government – Saving Privacy in the Digital Age’. By year’s end that tune would appear mistaken.
To be continued.
Since December the law office of Urmann + Collegen has become notorious in Germany due to its action against alleged users of Redtube – a streaming site dedicated to pornography. Copyright enforcement has hitherto been limited to users of file-sharing systems and the operators of streaming sites such as Kino.to. Pursuit of those merely using streaming facilities would represent a new escalation. Major questions about the plausibility of the offense, the manner of the evidence collection, and the bona fides of the plaintiffs are tied up in this litigation. Hopefully the affair will help discredit the current system.
First a word about one of the protagonists: this is not the first time the spotlight has fallen on Urmann + Collegen. In 2011 I wrote about how they attempted to sell off the right to pursue alleged copyright infringers for compensation and legal costs under the abmahnung procedure (these are letters which demand that the recipient desist from specific behaviour and pay both the costs of the letter’s production and some compensation). By 2012 they were threatening to publish the names of all those unwilling to cough up the amount demanded in the abmahnung for downloading porno movies using bittorrent. The German Data Protection office had other ideas.
1. Origins of the Redtube Affair
In the most recent episode abmahnungen were mailed to ten thousand users whose names and addresses were acquired following an order by the Civil Court in Cologne (historically especially amenable to copyright owners’ requests). They were alleged to have infringed copyright by viewing porno movies on the streaming website Redtube. The action was launched on behalf of The Archive AG, a Swiss-registered company purportedly the owner of films being made available on the Redtube website. Multiple chambers of the court granted the plaintiff’s request to require ISPs to identify the users behind IP addresses collected on behalf of the owners of the infringed copyrights.
Daniel Sebastian, a lawyer representing The Archive AG, said that the IP data had been collected by a company called ‘itGuard’ using a piece of software called ‘GLADII 1.1.3’. This company was registered in Delaware in March 2013 but claims to be based in California. It turns out that The Archive AG’s website was registered that same month and is hosted on the same webserver as itGuard’s.
The Archive AG claimed to have purchased the rights to the infringed films in July of 2013. Around the same time the domain retdube.net was registered. Such sites are often registered in order to capitalise on typing mistakes, or are used to draw traffic via phishing/spam emails. One hypothesis is that this site (whose owners remain unidentified in a Panamanian registry) was set up to trap and track website users. The dates of the alleged infringements are consistent with this timeline.
The legal process began in August 2013 when Sebastian submitted a request for identifying information.
The plaintiff’s request for subscriber identification information was granted in September. The first letters went out in early December. There followed a flurry of actions, including one undertaken by Redtube itself: on December 19th they obtained a decision from a Hamburg court ordering that no further abmahnungen be issued to Redtube users. However the real turning point came as the result of an appeal by four alleged Redtube users in mid-January. They argued that their information had been wrongly provided to the plaintiffs, and in late January the Cologne court upheld their appeal. For the moment this brings the substance of the case to a close. The flawed original decision by the various chambers of the Cologne Civil Court was based on numerous errors which it is worth itemising.
2. Confusion in Court: Streaming and Reproduction
Irrespective of the relationship between itGuard and The Archive AG, it appears that the Cologne court which ordered that subscriber data be divulged was either confused or misled. It appears to have believed that Redtube was a filesharing system rather than a streaming service. Submissions to the court by the plaintiff’s lawyer, Daniel Sebastian, reinforced this impression by referring to downloads rather than streams.
In a decision announced on January 27th the Court upheld an appeal by one of the recipients of the letters. It stated that it had been confused by the use of the term ‘download’ in the original application, and that streaming had not been found to constitute an act of reproduction.
3. More Confusion: Acquiring the IP Addresses
In his original submissions to the Court in Cologne, Sebastian included a document drawn up by a Munich patent attorney from the firm Diehl & Partner, verifying the proper functioning and reliability of the GLADII software. Nowhere in this twelve-page document is there any explanation of how the software actually interacts with the target site to collect the user data.
When the Cologne Court issued its statement connected to the successful appeal by one of the abmahnung recipients, the Judge raised again the troublesome mystery of how the GLADII software functions and noted that requests for further information had gone unanswered:
“even after indication from the Court, the question remains unanswered as to how the software program can access a two-sided communication.”
4. Doubts about Ownership
The Archive AG claimed that it had purchased the rights to ‘Amanda’s Secret’ and other clips from a Berlin firm, Hausner Productions, who supposedly bought them from their original producer, a Spanish firm, Serrato Consultants. But Hausner Productions does not exist, and Serrato never produced these films, which were shot by a company in California that continues to commercialize them.
As each day passes the affair unwinds further. Urmann is now facing an action taken by a Berlin firm on behalf of abmahnung recipients alleging extortion and fraud. Meanwhile at The Archive AG it’s all go: they have moved their HQ to a Swiss village called Weisselingen, and their director, the German Phillip Wiik, has been replaced by a certain Djengue Nounagnon Sedjro Crespin, a native of Benin. Oh, and their phone number no longer functions and the website is offline. Apparently the Swiss authorities have started an investigation into the directors for fraud. A reader of the German magazine Telepolis visited the office address of the software developer ‘itGuard’ in San Jose and found only a supplier of office services who had rented a letterbox to a company of that name.
Amusing as the details of this scam are, and unpleasant as some of the characters in this story may be, the real issue here is the mindless machinations of a copyright enforcement industry. By the end of 2012 this apparatus had produced more than four million abmahnungen: it is a crazed monster, out of control. On the basis of sketchy evidence, possibly gathered illegally, multiple chambers of the Cologne Civil Court ordered the disclosure of tens of thousands of users’ identities to a firm that did not have to prove it owned the rights – this is evidence of institutional dementia.
Lawyers have cranked up this apparatus because the business model produces a lot of money for them in fees, far more than is earned by any notional rightsholder. Thomas Urmann didn’t even bother checking whether his clients actually owned the rights they claimed; he just sent out the 20,000 letters and waited for the cash to roll in. In early January he was promising further letters in relation to other streaming sites. And if there are further ‘issues’? No problem, he says, ‘we’ve got full liability insurance’.
For years now there has been discussion of reform to eliminate such abuse, but in the SPD/CDU Coalition agreement there is no commitment to do anything other than investigate how the current system functions. Until the next time folks.
2013 will be remembered as the year when Edward Snowden hauled the debate about state surveillance into the conditions of the 21st century. His revelations constitute a vast canvas made up of interconnecting elements, and the combination of scale and detail makes it difficult to fully find one’s bearings. It has often made me think of Plato’s famous Allegory of the Cave which he recounts in the Republic.
Plato used the allegory of the cave to illustrate the place of philosophers in society. He told of prisoners whose knowledge of the world was derived from the shadows of moving people and objects cast on a wall by firelight. One of the prisoners is freed and the illusion is revealed to him. When he looks at the fire it hurts his eyes. He sees the sun and it takes time for his sight to adjust, but it does, and he can see the objects and people who were formerly only shadows.
Before May of this year we had some inkling of what was going on. After all, it was sixteen years since the Science and Technology Options Assessment (STOA) office of the European Parliament commissioned two reports touching on global communications interception (including Echelon). This led eventually to a Parliamentary procedure in 2001. But those investigations were based on piecing together and inference, not documentary corroboration. Now we are confronted with the flow charts, slide-shows, and even doodles of the undertaking – a collision with the plumbing of modern power. Time is needed to take it all in.
Sovereign power is back on display, its capability multiplied by the rise of the data harvesting industries and the centralisation of data on their servers. Trust in these companies – Facebook, Google, Yahoo, Microsoft et al. – has been injured and the wound will fester, both amongst users and non-US governments. Meanwhile ‘users’ drift virtually naked in a sea of insecure communications and with precious little data that is still ‘personal’ …. That’s the bad news. The good news is that it’s going to get better from here, because now people know and will begin to respond through litigation, agitation in the public sphere and tool development.
There is much to say but for now I’ll recommend some other voices: if you haven’t followed Eben Moglen’s lecture series, Snowden and the Future, then take the time to read or listen to his four lectures and absorb his analysis of the broad picture. Those interested in an accessible presentation of the technical aspect should watch his dialogue with security expert Bruce Schneier. Good background on the recent expansion of the surveillance culture in the US is contained in Ryan Lizza’s article State of Deception from the New Yorker. Finally Glenn Greenwald, who broke the story with Laura Poitras, gave the keynote at the Chaos Communication Congress last week, check it out here.