In a blog post on Thursday, Facebook shared substantial details about the thousands of content reviewers and moderators who look at the millions of posts reported by users on the social networking platform. Reported content can include anything from hate speech and terrorism-related material to nudity. While Facebook maintains its stance that the reviewers should remain anonymous, it has revealed certain statistics and training mechanisms that help ensure content moderation standards remain consistent across regions and languages.
In the blog post, titled Hard Questions, Facebook explains that its team working on safety and security will be doubled this year to 20,000. The current pool of 7,500 content reviewers at Facebook includes a mix of full-time employees, contractors, and partner companies, together covering all time zones and more than 50 languages. They come from a wide range of backgrounds and professional experiences.
Language proficiency is an important prerequisite for the job. “If someone reports a Tagalog-language post in the middle of the night in the Philippines, for instance, there will always be a Tagalog-speaking reviewer – either locally or based in another time zone – that the report can be routed to for quick review,” the blog states. Nudity, however, remains a category of content that can be moderated by any reviewer across the globe.
Apart from that, Facebook also assesses how well a reviewer can handle looking at disturbing imagery for extended periods. Background checks are conducted, and all applicable local employment laws and requirements are complied with.
After being hired, reviewers go through pre-training, hands-on learning, and ongoing coaching to keep them up to date with policy changes and any decisions that may affect their job. Before they get to the real deal, a minimum of 80 hours is set aside for moderators to practise their tasks on a replica of the live system.
As for the process, Facebook says, “Once something is reported, it’s automatically routed to a content review team based on language or the type of violation. This way the team that has specific training in the relevant policy area reviews the report – and, if needed, can escalate it to subject matter experts on the Community Operations Escalations or the content policy teams.”
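The routing logic Facebook describes – matching a report to a team by language and violation type, with escalation to specialists – can be pictured with a minimal sketch. This is purely illustrative: the team names, queue structure, and function are assumptions for the example, not Facebook's actual system.

```python
from dataclasses import dataclass

# Hypothetical review-team routing table; team names are invented
# for illustration and do not reflect Facebook's real infrastructure.
REVIEW_TEAMS = {
    ("tl", "hate_speech"): "tagalog_hate_speech_team",
    ("en", "hate_speech"): "english_hate_speech_team",
    # Per the post, nudity reviews are not language-specific.
    ("any", "nudity"): "global_nudity_team",
}

ESCALATION_TEAM = "community_operations_escalations"


@dataclass
class Report:
    language: str        # e.g. ISO 639-1 code of the reported post
    violation_type: str  # e.g. "hate_speech", "nudity"
    needs_expert: bool = False


def route_report(report: Report) -> str:
    """Route a report to a review team by language and violation type,
    escalating to subject-matter experts when flagged."""
    if report.needs_expert:
        return ESCALATION_TEAM
    key = (report.language, report.violation_type)
    if key in REVIEW_TEAMS:
        return REVIEW_TEAMS[key]
    # Fall back to a language-agnostic team, or escalate if none exists.
    return REVIEW_TEAMS.get(("any", report.violation_type), ESCALATION_TEAM)


print(route_report(Report("tl", "hate_speech")))  # tagalog_hate_speech_team
print(route_report(Report("de", "nudity")))       # global_nudity_team
```

A Tagalog hate-speech report lands with the Tagalog-speaking team, while a nudity report in any language goes to a global queue – mirroring the behaviour the blog post describes.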
To take care of the reviewers themselves, Facebook has made four clinical psychologists available across three regions, responsible for designing, delivering, and evaluating resiliency programmes. In addition, all reviewers have access to healthcare benefits.