Too far for Facebook? How the site decides what to ban

Image caption: The creator of the page has called the critics "small-minded" (screenshot of the Facebook page)

A controversial Facebook page that jokes about rape has caused a strong reaction with nearly 5,000 people in the UK and 176,000 in the US signing an online petition - yet Facebook is refusing to take it down.

It says the page, called "You know she's playing hard to get when your chasing her down an alleyway", does not break rules on posting hateful content or inciting violence.

The page currently has 194,000 likes.

But exactly where does the site draw the line and how does it decide on what to take down?

"You will not post content that: is hateful, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence," states Facebook's statement of rights and responsibilities, external.

The light-touch approach aims to keep censorship of Facebook's 800 million users to a minimum, giving them the freedom to create all manner of random and risqué groups.

"We sometimes find people discussing and posting about controversial topics," the site told the BBC in August.

"It is very important to point out that what one person finds offensive another can find entertaining.

"Just as telling a rude joke won't get you thrown out of your local pub, it won't get you thrown off Facebook."

Despite the controversy, Facebook is maintaining its position.

A further statement said: "We want Facebook to be a place where people can openly discuss issues and express their views, while respecting the rights and feelings of others.

"Now, with more than 30 million people in the UK expressing varying opinions and ideals using Facebook as a place to discuss and share things that are important to them, we sometimes find people discussing and posting about controversial topics.

"Groups or pages that express an opinion on a state, institution, or set of beliefs - even if that opinion is outrageous or offensive to some - do not by themselves violate our policies.

"These online discussions are a reflection of those happening offline, where conversations happen freely in people's homes, in cafes and on the telephone."

'Duty of care'

The campaign group Women's Views on News (WVN) says the "rape page" is not the only one promoting violence against females.

For example, a page about "kicking sluts in the face" has attracted 88,000 Facebook likes and is still available to view.

"It is becoming clear that Facebook managers are completely dismissing the concerns of nearly 200,000 people from both sides of the pond who have signed petitions against pages that advocate rape and violence against women," said Jane Osmond from WVN.

"The powers that be who run Facebook make a lot of money off the back of its subscribers through advertising so surely it has a duty of care towards its users - especially those who are aged 13, which is the minimum age to join Facebook and also to half of its audience, women?"

The creator of the rape page also posted to his wall this week to defend the joke.

"I did not make this page to support rape," said the post.

"This group was a joke between me and my friends and you people that think this is about rape do not get this joke because they are small-minded people."

'No nipples'

The decision not to take down the rape page may seem strange at first glance, considering Facebook has previously barred pictures posted by breastfeeding mothers.

In that case, in 2009, it said the problem was not the act itself but the sight of an exposed nipple. Online protests and petitions by angry mums followed the decision, but the social media giant stuck to its guns.

Image caption: Facebook is not against breastfeeding but its rules don't allow nudity (Reuters)

"Whether it's obscene, art or a natural act — we'd rather just leave it at nudity and draw the line there," said spokesman Barry Schnitt at the time, who added the policy was in place to protect the site's younger users.

The issue came up again in January 2011 when breastfeeding support group The Leaky Boob, which has 31,000 members, had its profile deleted before being reinstated.

Facebook was also criticised for not automatically removing Holocaust denial pages in 2009.

Despite the pages having only a few hundred members, critics said the basic concept of denying the Holocaust was hateful and that the pages contained blatantly anti-Semitic postings.

Facebook investigated and later removed those it decided were breaking its rules.

A difficult call

The point where edgy discussion and dodgy jokes become hateful may sometimes be tricky to pin down, but often the decision is more straightforward.

Facebook has been quick to remove pages such as "I Hate Muslims in Oz" and the "Isle of Man KKK" group, started by a small group of school children, because they contain what it calls "explicit statements of hate".

And when it comes to child protection - removing or investigating postings and profiles, for example - the site is also quick to act.

The Child Exploitation and Online Protection Centre (Ceop) said it had a "very good working relationship" with Facebook and that the site had always been co-operative.

Facebook told the BBC that reports of offensive content are handled by its User Operations Team in Dublin, which measures all content against the Statement of Rights and Responsibilities.

A spokesman said the decision was not just made by "one person in a cubicle" and that difficult cases were often decided by discussion with its Global Content Policy Team.

He said the vital distinction remained between controversial discussion and genuine attempts at humour on one hand, and obvious hate speech on the other, but that content targeting a particular minority would always be removed.
