Content moderators for Big Tech unite to tackle mental trauma


Content moderators from the Philippines to Turkey are uniting to push for greater mental health support to help them cope with the psychological effects of exposure to a rising tide of disturbing images online.

The people tasked with removing harmful content from platforms run by tech giants such as Meta Platforms and TikTok report a range of damaging health effects, from loss of appetite to anxiety and suicidal thoughts.

“Before I would sleep seven hours,” said one Filipino content moderator who asked to remain anonymous to avoid problems with their employer. “Now I only sleep around four hours.”

Workers are gagged by non-disclosure agreements with the tech platforms or companies that do the outsourced work, meaning they cannot discuss exact details of the content they are seeing.

But videos of people being burned alive by the Islamic State, babies dying in Gaza and gruesome pictures from the Air India crash in June were given as examples by moderators who spoke to the Thomson Reuters Foundation.

Social media companies, which often outsource content moderation to third parties, are facing increasing pressure to address the emotional toll of moderation.

Meta, which owns Facebook, WhatsApp and Instagram, has already been hit with workers’ rights lawsuits in Kenya and Ghana, and in 2020 the firm paid a $52 million settlement to American content moderators suffering long-term mental health issues.

The Global Trade Union Alliance of Content Moderators was launched in Nairobi in April to establish worker protections for what its members dub “a 21st-century hazardous job”, comparable to the work of emergency responders.

The alliance’s first demand is for tech companies to adopt mental health protocols, such as exposure limits and trauma training, across their supply chains.

“They say we’re the ones protecting the internet, keeping kids safe online,” the Filipino worker said. “But we are not protected enough.”

Globally, tens of thousands of content moderators spend up to 10 hours a day scrolling through social media posts to remove harmful content, and the mental toll is well-documented.

“I’ve had bad dreams because of the graphic content, and I’m smoking more, losing focus,” said Berfin Sirin Tunc, a content moderator for TikTok in Turkey employed via Canada-based tech company Telus, which also does work for Meta.

In a video call with the Thomson Reuters Foundation, she said the first time she saw graphic content as part of her job she had to leave the room and go home.

While some employers do provide psychological support, workers say it is often just for show, amounting to little more than advice to count numbers or do breathing exercises.

Therapy is limited to either group sessions or a recommendation to switch off for a set number of “wellness break” minutes. Actually taking those breaks is another matter.

“If you don’t go back to the computer, your team leader will ask where are you and (say) that the queue of videos is growing,” said Tunc. “Bosses see us just as machines.”

In emailed statements to the Thomson Reuters Foundation, Telus and Meta said the well-being of their employees is a top priority and that employees should have access to 24/7 healthcare support.

Moderators have seen an uptick in violent videos. A report by Meta for the first quarter of 2025 showed a rise in the sharing of violent content on Facebook, after the company changed its content moderation policies in a commitment to “free expression.”

However, Telus said in its emailed response that internal estimates show that distressing material represents less than 5% of the total content reviewed.

Adding to the pressure on moderators is a fear of losing jobs as companies shift towards artificial intelligence-powered moderation.

Meta, which invested billions and hired thousands of content moderators globally over the years to police extreme content, scrapped its U.S. fact-checking programme in January, following the election of Donald Trump.

In April, 2,000 Barcelona-based workers were sent home after Meta severed a contract with Telus.

A Meta spokesperson said the company has moved the services previously performed in Barcelona to other locations.

“I’m waiting for Telus to fire me,” said Tunc, “because they fired my friends from our union.” Fifteen workers in Turkey are suing the company over dismissals that they say came after they organised a union and attended protests this year.

A spokesperson for Telus said in an emailed response that the company “respects the rights of workers to organise”.

Telus said a May report by Turkey’s Ministry of Labour found the terminations were based on performance and that it could not be concluded they were union-related.

The Labour Ministry did not immediately respond to a request for comment.

Moderators in low-income countries say that low wages, productivity pressure and inadequate mental health support could be remedied if companies signed up to the Global Alliance’s eight protocols.

These include limiting exposure time, setting realistic quotas and providing 24/7 counselling, as well as living wages, mental health training and the right to join a union.

Telus said in its statement that it was already in compliance with the demands, and Meta said it conducts audits to check that companies are providing required on-site support.

New European Union rules, such as the Digital Services Act, the AI Act and supply chain regulations that require tech companies to address risks to workers, should give stronger legal grounds for protecting content moderators’ rights, according to labour experts.

“Bad things are happening in the world. Someone has to do this job and protect social media,” said Tunc.

“With better conditions, we can do this better. If you feel like a human, you can work like a human.”

Published – July 04, 2025 09:52 am IST