A report has found major flaws in Facebook's method of filtering content. Facebook's content moderators depend on a patchwork of PowerPoint slides, riddled with inaccuracies and outdated information, to decide what content is allowed on the social network. Since Facebook gained worldwide prominence, it has attracted severe criticism for undermining democracy and for instigating conflicts in communities both small and large.
Facebook has over 7,500 moderators, who are given a rulebook of more than 1,400 pages to guide their decisions. The employee who leaked the rulebook feared that the company was exercising too much power with too little oversight, and making numerous mistakes as a result. Analysis of the documents revealed loopholes, bias, and glaring errors. For instance, because of a paperwork mistake, an extremist group in Myanmar accused of carrying out genocide was allowed to remain on Facebook for many months. The employees who set the rules try to distill complicated matters into simple yes-or-no answers, while the moderators themselves are often unskilled workers hired through outsourced call centers.
Many moderators must therefore rely on Google Translate to approve or reject hundreds of posts on the platform every day. The translations are often inaccurate, adding to the confusion. Facebook executives have responded to these serious allegations, saying they are doing their best to keep harmful posts off the platform. However, Sara Su, a senior engineer, remarked that Facebook is not a platform for teaching people to use dignified language; rather, its aim has always been to uphold community standards on the network.
In the US, Facebook has banned the Proud Boys, an extremist pro-Trump group. The company has also blocked a provocative ad about Central American migrants that was created by Trump's political team.