Dutch ministry to check best practices for online content moderation
The investigation comes after video sharing application TikTok said it will close its Amsterdam office.
The Dutch interior ministry is conducting a study to discover the best methods for content moderation on very large online platforms and search engines, in the wake of redundancies by video sharing app TikTok in the Netherlands, state secretary Zsolt Szabó has announced.
“The government believes that too many social media platforms have failed to make sufficient investments in context-specific, linguistic and cultural human expertise in the Netherlands,” Szabó said in an answer to parliamentary questions.
The results of the study will be available in the autumn of 2025 and will also be shared with the European Commission.
Szabó said that the ministry was in touch with TikTok in the Netherlands to address last month's closure of its Dutch Trust and Safety office in Amsterdam, which employed some 300 people.
He added that the company promised the layoffs “will not affect the total number of moderators in the EU”, that there “will be no major changes” to the number of moderators per language group, and that it has no plans to significantly reduce its human content moderation workforce.
“However, if it turns out that TikTok’s information is incorrect and there are significantly fewer Dutch moderators, as well as increased systemic risks, it is up to the European Commission, as supervisory authority, to further investigate this,” Szabó added.
According to TikTok’s latest Digital Services Act (DSA) transparency report, the company has some 160 content moderators responsible for Dutch-language content. These employees are not necessarily based in the Netherlands and may work from offices elsewhere.
A spokesperson for the video sharing app told Euronews that the company has “over 6,000 people supporting EU content moderation and these changes [the closing of the Dutch Trust and Safety operations] will not reduce this overall number.”
Human moderation
TikTok’s content moderation is automated by default; when the system detects a violation, human moderators can add context and nuance and provide feedback that helps improve the platform’s machine learning tools, the company said in its DSA report.
Euronews reported in October that the number of people moderating TikTok content in the EU had risen: the company employed some 6,354 content moderators in the first half of this year, up from 6,287 in the period October-December 2023, its report said.
The platform now has some 1,498 English-language content moderators, down significantly from 2,334 in the previous reporting period. The number of non-language-specific moderators, those who review profiles or photos, has risen to 1,508 from 413 last year.
However, the platform still has no moderators dedicated to lesser-spoken languages such as Maltese and Irish, and just a few for Estonian (six people), Croatian (eight, who also review Serbian posts), and Latvian and Lithuanian (ten apiece).
The company claims to have “language capabilities covering at least one official language for each of the 27 European Union Member States, consistent with previous reporting periods”.