I recently visited the Wikipedia page of a left-wing German politician. She had been hit with a pie by a critic of her views on migration. Wikipedia linked to a report on Russia Today containing a video of the incident uploaded to YouTube by RT’s European unit, Ruptly. At the YouTube URL, most of the videos displayed in the related-content sidebar were about migration; few were from Ruptly, and many were strongly anti-immigrant. I clicked on one in which an old woman was interviewed about her fears, feelings and hostility towards immigrants, and closed the video after a couple of minutes. The next time I opened YouTube, seven of the ten videos recommended to me concerned immigration. Five of them were clearly produced by right-wing media activists, and this flavour of curation extended to the videos in the right-hand column as I browsed.
This experience captures what Eli Pariser characterised as a ‘filter bubble’ in his book of the same name: I was suddenly thrust into a media universe imagined for me on the basis of one or two clicked links, and it felt weird. Encountering world-views contrary to my own doesn’t bother me – in fact I enjoy the conflict – but the skimpy basis for subjecting me to this flood of ideological personalisation is bothersome. If viewers are uninterested in politics and have no knowledge of the mechanism selecting the stories presented to them, what are they to make of such goings-on? In this universe old ladies, innocent blond-haired teenagers and middle-aged men are at one in insisting that migrants are criminals who should be deported – is that the normcore position? Sure, viewers aren’t going to swallow propaganda whole; they’ll cross-reference it with their own experience and knowledge, and apply some critical thinking. But the persistence of these recommendations for about a week did make me feel as if I was surrounded. I had fallen into the filter sewer and was being sprayed with a fire-hose of horseshit.
Google would argue that if I logged in to YouTube it would know more about me, so I would not have ended up in the sewer. But I don’t want my media consumption tracked or personalised. Never logging in to Google is the best I can do to minimise the tracking, short of systematically using a VPN or Tor, and because I want to have some idea of what the general experience of the web is, I will not do that. I do use anti-tracking tools such as Privacy Badger, Disconnect and uBlock Origin, but none of them can fully protect you from ‘the Google’.
Of course our media environments have always had their ‘bias’; that was the case long before the internet. Journalists wax lyrical about objectivity and balance, but there have always been ideological assumptions and frameworks: the basic credibility of government statements and explanations of its actions; the virtues of capitalism and liberal democracy, etc. – the world inside the Overton window. Because Pariser wrote the book in the internet era, and focused on the results of algorithmic filtering, it was understood (perhaps unfairly?) as arguing that the problem was new, when it was actually an evolved iteration of an older phenomenon. [It reminds me of the fear that adblocking will wreck journalism – yet newspapers were already in crisis in the 1990s, as the industry became more concentrated and the new owners expanded advertising sales whilst sacking journalists, a phenomenon chronicled by Ben Bagdikian in his classic, The New Media Monopoly.]
Old media was also driven by advertising logic – demographic targeting and so on – but the difference lies as much in the ease of individualised distribution as in the availability of algorithmic engines. In the newspaper age you could tell a lot about a person politically and socially from the newspaper they read; their choice was also a filter, and the advertisers who bought slots chose silos for their campaigns. But the paper still had to appeal to a mass market, so the silo was big, somewhat diversified and had to cover a range of subjects. Not so today.
Pariser’s book is actually about personalisation, but he must have thought ‘filter bubble’ a catchier term. Individualised customisation of information flows is heralded as the compass to navigate a sea of excess information, but in practice this has mostly meant that users surrender control to machinic decision-making whose logic is opaque. If the past once allowed room for the illusion that this could work out well, we have now seen enough of the future to know it is not so rosy. Information filters are needed, but only as tools under the control of the user. Such user sovereignty, however, is not a tendency the economic forces of the web want to foster. In the web of 2016 the user is the object of a system designed to shape them rather than a subject to be supported in their own self-development.
In The Daily You, Joseph Turow outlined how the idea of the powerful consumer is promoted whilst advertisers and marketers engage in ever more intrusive information-gathering processes, which lead to the separation of consumers into targets and ‘waste’. And if the consumer is so potent, surely they don’t need to be protected by regulation? In the adtech world the real end goal of personalisation is revealed: collect every last bit of data so as to eventually facilitate the encounter between consumer desire and business operation. This intrusion is presented as a means of giving you ‘what you want’ and clothed in the innocuous language of ‘relevance’. But users are never asked what they want, nor given the means to control the data and advertising flows around them – the answers are to be found by spying on them.
All Our Yesterdays…
In its early days the web was embraced by media critics as a formal remedy to the ills of the mass media – newspapers, television, radio, and film. The net/web was to undermine the tyranny of intermediaries and enable a direct dialogue between individuals and groups. It was not to be. The human decision-makers have had their wings clipped, but have merely been replaced by tech-moguls (unwilling to acknowledge their editorial power) and opaque machinic processes cast as agents of divine right.
If algorithms are the new monarchs, a renewed republicanism needs to dethrone them and their owners. Users do need tools to master the data flow, but they must be under their control, transparent in their logic and designed to nurture their autonomy.