Cover image credit: Daniel LEAL-OLIVAS / AFP

Every day their task is to analyze and evaluate content: "I am constantly in contact with vulgar texts and insults, sentences full of hatred, and often obscene images." This is the hard work of Facebook's moderators: scrutinizing, sifting, and selecting, day after day, what may stay on the world's most popular social network and what may not.

In an investigation, the British newspaper the Guardian interviewed a group of employees who worked for years in the social network's moderation center at its Berlin office.

Workers are asked to work quickly and to work well. For eight hours a day they must scrutinize, within a few minutes each, graphic content full of violence, nude images, and frequent episodes of bullying. Shifts also cover nights and weekends.

The workers must speak anonymously because they have signed non-disclosure agreements with Facebook.

According to some employees, the most chilling material is private chats between adults and minors, which the moderators found themselves analyzing because an algorithm had flagged them as likely to contain elements of sexual exploitation.

In most of these conversations – described as "disturbing" by the moderators – the dynamic is always the same: wealthy white men from Europe or the United States write to children in the Philippines, seeking sexual photos in exchange for a few dollars.

The employees know they are part of a team doing a new kind of job, one still in an experimental phase. The goal itself is noble: to protect users of the giant created by Mark Zuckerberg from abuse, hatred, and racial prejudice. But it is also important to protect the workers, whose job is to act as a "buffer" between individual users and the platform.

The effects of the work on the moderators are sometimes devastating: one employee said he caught a colleague buying a Taser online because he felt increasingly frightened, worried about walking the streets at night surrounded by foreigners.

Some also said that the hatred they confront every day influences their political ideology: many realized they had become more conservative.

Facebook's moderators are aware that much of the hateful content they handle on a daily basis is fake news designed to spread very particular political opinions.

In February, the site The Verge had already conducted a similar investigation in the United States, producing one of the first behind-the-scenes reports on Facebook's moderation.

The results were very similar to those of the Berlin investigation: the American moderators, too, reported that the videos and memes they saw every day gradually led them to embrace more extremist views, and one former moderator slept with a gun by his side after being traumatized by a stabbing video.

Other US moderators, like some of their colleagues in Germany, turned to drugs and alcohol. The American moderators also complained about the psychological help provided to them.

The counselors who were supposed to provide on-site support were largely passive. The same happened in Berlin, where the psychological counseling service merely suggested seeking medical care elsewhere.

"Some colleagues went to the counselor, and when they turned out to have real problems, they were asked to find a suitable psychologist outside the company," said the Berlin employees.

"We have to share our stories, because people know nothing about us, about our work, about what we do to make a living," the Facebook employees of the Berlin office told the Guardian.

After the report published by The Verge, Facebook seemed to have taken measures to protect its employees. But hatred on social networks remains a plague that is difficult to stem.
