Below is the talk I gave today at the Virtual Gender conference. It’s another take on the colliding worlds theme, this time aimed at a feminist rather than a digital rights audience.
ETA 30/09/13: There is now partial video of this talk, which you can see below. Huge thanks to @drcabl3 for this!
When Worlds Collide
I’m a feminist of the queer, sex-positive, and intersectional kind. I value allies to my causes and I believe in trying to be the best ally I can to other, less privileged people. One of the causes I’m engaged with is LGBT domestic abuse; another is violence against women – I’m an abuse survivor. But for longer than I’ve been a feminist, I’ve been a digital rights activist.
I grew up in a communist country where freedom of expression, freedom of association and political activism didn’t exist. There was one version of the truth, it was the Party’s version, and you lived with it. And so I believe the freedoms we have in this society – the freedoms to have conferences like this one, to imagine a better world, to meet and work with like-minded people on common goals – are to be cherished and protected. I also believe that, in protecting those freedoms, the internet gives us both unprecedented opportunities and previously unimagined threats.
The internet allows us to organise and exchange ideas beyond the narrow confines of geographic proximity. Where in the past we may have thought we were alone, the internet now opens a window into the world. It allows us to find many other people like us, no matter who we are. It gives even the most marginalised and oppressed groups a voice. It allows disabled people to fight government cuts. It enables trans* people to speak out against media portrayals that are often phobic and downright vicious. It allows sex workers to share their experiences – good and bad – and fight for rights the rest of us take for granted.
At the same time, inherent in both the technology and our use of it is the potential for censorship and state surveillance. When most of us rely on Google to access the information and spaces we need, all it takes to consign an issue or a group of people to eternal obscurity is for Google to stop displaying results for certain search terms. When most of our communications rely on a relatively small number of undersea cables, it is easy for agencies of the state to monitor everything we say. When each and every one of us carries a tracking device in our pocket virtually day and night, reconstructing our movements from the data our mobile operators hold about us is trivial. As politically active feminists, I am sure we will all agree that these are clear threats to civil society, freedom of expression and freedom of association. They are threats to women and to feminism.
At a high, abstract level, issues of censorship and surveillance of the internet seem like a no-brainer to political activists of pretty much any flavour. Yet often when it comes to the intersection between feminism and digital rights, what was a no-brainer five minutes ago is suddenly a deeply divisive and contentious issue. On more than one front today, superficially feminist arguments are being made to justify censorship and surveillance of digital spaces.
I want to talk to you today about the importance of looking past those superficial arguments; about the dangers of trying to solve social problems with technical measures; about our duty as political campaigners to understand the technology we use for our campaigns, and not to let our causes be used to harm that technological infrastructure. And I want to talk to you about how we can best engage with issues at the intersection between feminism and digital rights in ways that lead to constructive solutions that actually work.
All of us in this room know that being female on the internet can be a less than pleasant experience. A recent example of this is the case of Caroline Criado Perez, the campaigner behind the initiative to put more women on banknotes. The abuse she and other prominent women received on Twitter over the course of several days in July included rape threats and death threats. It was vicious, violent and despicable. It was highly organised and intended to intimidate and silence. Having said that, the vast majority of these threats were not credible in the sense that most of the men behind them were unlikely to leave the safety of their own bedrooms to do real physical harm.
The immediate, knee-jerk fix demanded here by many feminist activists was for Twitter to implement an “Abuse” button – an instant way to flag tweets or users as abusive that would lead to the quick and automatic suspension of accounts. I can see where these calls are coming from. I can sympathise with them; I have been on the receiving end of similar abuse myself. But the digital rights activist in me, the one who believes that freedom of speech is sacrosanct, balks at the idea.
The irony here is, of course, that the first use an Abuse button would be put to is silencing exactly the same people who were previously receiving the abuse. Because let’s face it, if the abusers are organised enough to sustain a campaign of threats in shifts over several days – and they are – they are also organised enough to hit the Abuse button until an activist’s account is suspended. What’s even worse is that the more vulnerable and marginalised a group is, the more disproportionately affected they would be by such campaigns. Sex workers and trans* activists in particular expressed serious concerns about the proposed Abuse button, and as an ally to those groups, as well as a digital rights activist, I cannot in good conscience support those proposals.
This is not a simple issue, and knee-jerk reactions will not solve the problem. We need to look at the different facets. As digital rights activists we need to recognise what we already know as feminists: that campaigns of misogynist online abuse are a free speech issue in and of themselves. It doesn’t matter if it’s the state doing the censoring, or Facebook, or a bunch of trolls who make you feel unsafe about speaking out – the effect is the same. As feminists we need to acknowledge what we already know as digital rights activists: that automated censorship is open to abuse and tends to create more problems than it solves.
And we need to use that knowledge to find solutions that work. One blogger has suggested a “Panic” button that restricts the mentions a user can see to those from people they follow. This way the user is not silenced by having to make their account private or take a break from Twitter entirely, and they are not subjected to the distress of having to see the abuse in their timeline. I would add to that a way to identify credible threats – for instance the publication of personal details like address and telephone number – and enable the user to report those to the police. Distributed block lists are another way of dealing with this issue. Ultimately what matters here is finding solutions that address the real issues, not implementing a quick fix that may look good but does more harm than good.
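To make the mechanics of these suggestions a little more concrete, here is a minimal sketch (in Python) of what panic-mode mention filtering combined with a shared block list might look like. The function name and data structures are my own illustration for this post, not Twitter’s actual API or implementation.

```python
# Hypothetical sketch of "panic mode" mention filtering. The names and data
# structures here are illustrative only, not Twitter's actual implementation.

def filter_mentions(mentions, following, blocklists=()):
    """Return only the mentions a user should see while panic mode is on.

    mentions   -- iterable of (author, text) pairs addressed to the user
    following  -- set of accounts the user follows
    blocklists -- optional shared/distributed block lists (sets of accounts)
    """
    blocked = set().union(*blocklists) if blocklists else set()
    return [
        (author, text)
        for author, text in mentions
        if author in following and author not in blocked
    ]


# With panic mode on, only mentions from followed, non-blocked accounts
# reach the user's view; everything else is simply hidden, not deleted.
mentions = [("friend", "solidarity!"), ("troll42", "abusive message")]
print(filter_mentions(mentions, following={"friend"}, blocklists=[{"troll42"}]))
```

The point of the sketch is that the user stays in control: nothing is removed from the platform and no account is suspended, the abusive mentions simply never reach the person being targeted.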
Let me give you another example: David Cameron’s proposals to filter the web in the name of “protecting children”. In his speech at the NSPCC in July, Cameron proposed three measures:
- Default-on web filters at ISP level, filtering out pornography and other “harmful content”;
- Forcing search engines to not return results for keywords commonly associated with child sex abuse material;
- And a ban on the possession of visual depictions of simulated rape.
I know many anti-porn feminists welcome these measures. But as a feminist, as a digital rights activist, and as a survivor of child sex abuse, I find them deeply objectionable. None of them will do anything to tackle real issues, like the fact that many children do receive their sex and relationships education from hardcore pornography. The way to tackle that is to provide mandatory, high-quality SRE in schools – something this government voted against nearly unanimously. Instead, the proposed filters are highly likely to restrict young people’s access to vital materials on sexual health, pregnancy and abortion advice, and LGBTQ issues.
But what is even worse is how open to abuse these measures are. Let’s say the NSA and GCHQ don’t want us discussing their programmes of mass internet surveillance. Google already has the technology to filter search results, and the PM is about to strengthen the legal powers to require it. Internet Service Providers are already filtering pornography, self-harm websites and “esoteric material”. It doesn’t take much to add “internet surveillance” to the list without anyone noticing. Think that sounds unlikely? If on the 5th of June, the day before the Snowden revelations, I had told you that the legal and technical framework enabling the NSA’s PRISM programme existed, would you have called me a conspiracy theorist? The potential for abuse is there – it’s only a matter of time until it happens.
These are the issues where my worlds collide. They’re the issues where, for me, the intersection between digital rights and feminism becomes really, really difficult. What I hope you take away from this is that difficult is good. Being able to see more than one side of an argument is good. Being able to see past the knee-jerk reaction that will invariably cause more problems than it solves is good.
And I also hope that this has convinced you that it is vital for feminists to engage with digital rights issues. It is vital for us to understand technologies as well as their social impacts. It is vital to examine the motives behind proposed technological fixes and the effects they will have on different groups. We as feminists get intersectionality, we get oppression. We have a responsibility to ensure technology is not used for oppression, particularly not in our name.