Behind the Screen: What Happens to TikTok's Content Moderators?


Luis, a 28-year-old Colombian student, works through the night moderating videos for TikTok. He tries to sleep during the day, but the videos sometimes haunt his dreams.

He recalls a video, recorded at a party, showing two people carrying what appeared to be pieces of flesh. When they turned around, it looked as though they were holding flayed human facial skin and gristle. “The scariest part was that they were using the human faces as masks in their games,” he said.

Luis listed the kinds of content he regularly encounters: “Murder, suicide, paedophilic content, pornographic content, accidents, and cannibalism.”

Carlos, a former TikTok moderator, had nightmares about a video depicting child sexual abuse. He said it showed a girl of five or six years old, filmed so close up that she appeared to be dancing with her back to the camera.

As a father himself, he explained, it hit him extremely hard. He pressed pause, stepped outside to smoke a cigarette, and returned to the video queue a few minutes later.

Sifting through horrifying videos like these is part of the daily duties of TikTok moderators in Colombia. They told the Bureau of Investigative Journalism of pervasive workplace stress and scant psychological support, demanding or unattainable performance targets, punitive wage deductions, and excessive surveillance. Their ongoing attempts to unionize for better working conditions have been met with opposition.

With an estimated 100 million users in Latin America, TikTok has hired hundreds of moderators in Colombia to fight a never-ending battle against disturbing content. They work day and night shifts, six days a week, and are paid as little as £235 a month, compared with around £2,000 a month for content moderators in the United Kingdom.

The workers interviewed by the Bureau were contracted through Teleperformance, a global services outsourcing firm with more than 42,000 employees in Colombia, making it one of the country's largest private employers. The nine moderators would speak only anonymously, for fear of losing their jobs or harming their future employment prospects.

Neither TikTok nor Teleperformance responded to the specific complaints raised in this article. Both issued statements expressing their commitment to the wellbeing of their employees.

A traumatizing job

TikTok's recommendation system is widely regarded as one of the most successful applications of artificial intelligence (AI) in the world. It learns with almost unnerving precision what each user finds amusing or appealing, then shows them more content they are likely to enjoy.

But TikTok's AI has its limits. The company uses both humans and artificial intelligence to keep its platform free of harmful content. And when content moderators at TikTok and other platforms flag a piece of content for removal, they do not simply delete it: they also record exactly which policies it violates, information that can be used to train the platform's machine-learning models to recognize similar content in the future.
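
To make that concrete, here is a minimal, purely illustrative sketch of how moderators' policy labels might become training data. TikTok's actual pipeline is not public, and every name below is hypothetical:

```python
# Illustrative sketch only: turning human moderation decisions with
# policy labels into training examples for a multi-label classifier.
from dataclasses import dataclass

@dataclass
class ModerationDecision:
    video_id: str
    removed: bool
    violated_policies: list[str]  # e.g. ["graphic_violence", "minor_safety"]

def to_training_examples(decisions: list[ModerationDecision],
                         policy_labels: list[str]) -> list[tuple[str, list[int]]]:
    """Convert human decisions into (video_id, multi-hot label) pairs
    that a classifier could later be trained on."""
    examples = []
    for d in decisions:
        labels = [1 if p in d.violated_policies else 0 for p in policy_labels]
        examples.append((d.video_id, labels))
    return examples

# One flagged video, labelled against a small hypothetical policy taxonomy.
policies = ["graphic_violence", "minor_safety", "adult_nudity"]
decisions = [ModerationDecision("v123", True, ["graphic_violence"])]
print(to_training_examples(decisions, policies))
# [('v123', [1, 0, 0])]
```

The point is that every removal produces labelled data, which is why moderators are asked to record the exact policy breached rather than just hitting delete.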

Some social media companies struggle even with relatively simple tasks, such as recognizing re-uploads of terrorist videos that have already been removed. The job becomes far harder when platforms must immediately take down content that no one has ever seen before. “The human brain is the most effective instrument for identifying harmful material,” said Roi Carthy, the chief marketing officer of L1ght, an artificial intelligence company specializing in content moderation. Humans become especially important when harmful content appears in novel formats and contexts that AI may not recognize.
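
The duplicate-recognition step can be sketched in a few lines. Real systems rely on perceptual hashes that survive re-encoding and cropping; this simplified, hypothetical version uses an exact SHA-256 digest as a stand-in, which is enough to show why re-uploads are easy to catch while never-before-seen content is not:

```python
# Simplified sketch of hash-based duplicate detection against a blocklist
# of already-removed videos. Exact hashing stands in for the perceptual
# hashing that production systems would actually use.
import hashlib

banned_hashes: set[str] = set()  # digests of videos already removed

def fingerprint(video_bytes: bytes) -> str:
    """Return a digest identifying this exact file."""
    return hashlib.sha256(video_bytes).hexdigest()

def register_removed(video_bytes: bytes) -> None:
    """Add a removed video's fingerprint to the blocklist."""
    banned_hashes.add(fingerprint(video_bytes))

def is_known_duplicate(upload_bytes: bytes) -> bool:
    """Exact re-uploads match; novel content never will, which is
    why new material still needs human review."""
    return fingerprint(upload_bytes) in banned_hashes

register_removed(b"bytes of a removed video")
print(is_known_duplicate(b"bytes of a removed video"))  # True
print(is_known_duplicate(b"never-seen footage"))        # False
```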

“Nobody understands how to address content moderation comprehensively, period,” Carthy said. “This does not exist.”

Carthy said that the existence of a low-paid, precarious global workforce may be compounding the problem. Video is more complex than photos or text and demands far greater computational capacity, which makes developing AI for video moderation very costly.

“From a financial standpoint, content moderation AI cannot compete with $1.80 per hour,” Carthy added, referring to the average hourly wage of content moderators in the global south. “If that's the only factor you consider, then no AI content moderation company can compete with you.”

