Time limit 'protects YouTube moderators' from disturbing content
Workers employed to sift through and remove disturbing content on YouTube will be limited to doing so for four hours per day.
The change was made to protect their mental health, YouTube chief executive Susan Wojcicki told the South by Southwest conference in Austin, Texas.
It comes as the video-sharing platform faces scrutiny over how it deals with unsuitable and violent content.
It has pledged to hire 10,000 extra people to address the issue.
Ms Wojcicki said: "This is a real issue, and I myself have spent a lot of time looking at this content over the past year. It is really hard."
The workers would also receive "wellness benefits", she said.
Ms Wojcicki also announced YouTube would introduce "information cues" to debunk videos promoting conspiracy theories, with links to fact-based articles on Wikipedia.
It comes in the wake of criticism levelled at YouTube for showing a hoax video about David Hogg, one of the pupils who survived the Parkland school shooting in Florida.
Psychological toll
The workers used by YouTube and other social media platforms to moderate content are often employed on a contract basis and paid relatively low wages.
Many leave within a year of starting the job, partly due to the psychological toll it takes on them.
Videos can include images of child sexual abuse, violence to animals, murder and suicide.
Some have spoken publicly about how viewing illegal videos and posts has affected their mental health.
A woman employed as a moderator by Facebook told the Wall Street Journal in December that she had reviewed as many as 8,000 posts a day with little training on how to handle the distress.
Professor Neil Greenberg from the Royal College of Psychiatrists told the BBC that people taking on jobs as moderators can become traumatised by what they see and can, in some cases, develop post-traumatic stress disorder (PTSD).
"This is a positive first move from YouTube - a recognition that this material can be toxic," he told the BBC.
"Supervisors needs mental health awareness and peers should be trained to look for problems. Those developing issues need early intervention and treatments such as trauma-focused cognitive therapy," he added.
Supervisors, he said, should also be able to spot when an individual needs intervention from a mental health professional.
YouTube is increasingly turning to machine-learning algorithms to help root out such content.
In December it said it had removed 150,000 videos for violent extremism since June, and that 98% of them had been flagged by algorithms.
It did not clarify whether humans had viewed the videos after they had been flagged.
The BBC contacted YouTube to clarify other issues, including:
- how many moderators it employed in total
- whether all of them would benefit from the new time limits
- whether this would affect current pay
- what the wellness benefits would include
YouTube responded, saying it had "nothing more to share at this time".