“Facebook’s Content Moderation Centre In Kenya Will Create Employment”, read the headlines when the company announced that it was opening a moderation centre in Kenya to ensure a safe online environment for all Facebook users.
“We want Facebook to be a place where people can express themselves and freely discuss different points of view while ensuring that it remains safe for everyone,” said Fadzai Madzingira, Public Policy Associate for content, Facebook, during the announcement.
This is not the first moderation centre that Facebook has opened. There are quite a number of them across the world, employing over 30,000 people, and the social media giant has been opening more of them in an effort to keep the streets of Facebook timelines clean of any moral filth.
A noble job if you ask anyone, and quite a necessary task for Facebook if it is to maintain sanity on its platform. If that is the case, then why should we be worried about these moderation centres?
First, it is important to understand that Facebook does not itself run these moderation centres. Instead, it subcontracts third-party firms to do the job, an arrangement believed to save costs as well as to shield the company from unending legal battles over the moderation of content.
In Kenya, the company has subcontracted Samasource to take up this task. Samasource has said publicly that it plans to hire over 100 moderators whose job will be to filter content in the region that goes against Facebook’s community guidelines.
Moderating a platform with over 2.3 billion monthly users is not an easy task, as the content posted on the platform can quickly go from innocent sharing of daily happenings in people’s lives to graphic murders, suicides, hate speech, propaganda and sexual videos.
Moderating Facebook has been labelled “the impossible job” by Jason Koebler and Joseph Cox of Motherboard.
Recently, the online publication The Verge ran an exposé highlighting the alarming conditions these moderators work under, with a good number of them suffering from post-traumatic stress disorder (PTSD) and other mental health issues as a result of the nature of the job.
“The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed,” reads a part of the exposé.
The exposé focused on one of the third-party firms, Cognizant, in Phoenix, USA, but subsequent reports by various media publications quickly showed that the problems cut across moderation centres.
As per the exposé, moderators are subjected to low salaries, limited breaks while at work (two 15-minute breaks and one 30-minute lunch), unhealthy micromanagement, strict policies and only passive employee counselling after exposure to disturbing content ranging from videos of murders and suicide to sex, racism and conspiracy theories. All of this takes a long-term toll on the moderators’ mental health.
Even prior to The Verge’s report, there are records of one of Facebook’s moderators filing a lawsuit against the company, citing PTSD and psychological trauma caused by the job. The former moderator accused Facebook of failing to provide a safe workplace for its moderators, the very people entrusted with providing the safest environment possible for Facebook users.
“Moderators cope with seeing traumatic images and videos by telling dark jokes about committing suicide, then smoking weed during breaks to numb their emotions. Moderators are routinely high at work,” writes The Verge’s Casey Newton. The Verge also notes that Cognizant’s employees coped with the stress of the job through sex: “Employees have been found having sex inside stairwells and a room reserved for lactating mothers, in what one employee describes as ‘trauma bonding.’”
The full report goes into much deeper detail on employee mistreatment and toxic working conditions, and it is worth finding the time to read. If the various online reports are anything to go by, there is something to worry about.
Interestingly, these reports come after Facebook had publicly stated that it does take measures to protect its moderators, both in-house and outsourced: “We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources. Facebook employees receive these in-house and we also require companies that we partner with for content review to provide resources and psychological support, including on-site counselling and other wellness resources like relaxation areas at many of our larger facilities,” said Facebook.
Will the moderation centre in Kenya do things differently, or will it carry on the seemingly toxic culture of Facebook’s moderation centres, creating zombie-like human beings in the name of moderating the world’s largest social media platform?
After the lava spill, Facebook came out in its defence, saying it will do more to oversee the content moderation centres: “We are putting in place rigorous and regular compliance and audit process for all of our outsourced partners to ensure they are complying with the contracts and care we expect… This will include even more regular and comprehensive focus groups with vendor employees than we do today,” writes Justin Osofsky, Facebook’s vice president of global operations.
On top of this, the company also said that it will now perform resiliency tests (a measure of a moderator’s ability to bounce back from seeing traumatic content and continue doing the job) on all of its moderators.
We reached out to Samasource seeking assurance on the measures they have put in place to protect their moderators from such a work environment. At the time of going to press, Samasource was yet to respond.
So for now, all we have is Facebook’s word, an ocean of negative reports regarding these moderation centres, and this: “One day I can be really happy and doing really good. The next day, I’m more or less of a zombie. It’s not that I’m depressed. I’m just stuck… I don’t think it’s possible to do the job and not come out of it with some acute stress disorder or PTSD,” said one of Cognizant’s former employees.