Facebook hires 3,000 to review content
Facebook is hiring 3,000 people for its "community operations team", to help stop hate speech, child abuse and self-harm being broadcast on the website.
Chief executive Mark Zuckerberg said it had been "heartbreaking" to see people "hurting themselves and others" in videos streamed live on Facebook.
He added he would make reporting problematic videos easier.
The move follows cases of murder and suicide being broadcast live on the social network.
In April, a man was killed in a video streamed live on Facebook. Later in the same month, a Thai man killed his baby daughter and then himself in a live stream.
Mr Zuckerberg said the additional staff, joining the 4,500 existing people on the community operations team, would help the company respond more quickly when content was reported.
In a post on his Facebook profile, he said the company would develop new tools to manage the millions of content reports it received every week.
"We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help," he said.
The post suggested Facebook's moderators would contact law enforcement, rather than contacting members directly if they were at risk of harm.
"Just last week, we got a report that someone on Live [video] was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself", said Mr Zuckerberg.