4 Feb, 2023
1 min read

The firm states it aims to increase transparency in its moderation process for the benefit of impacted content creators.

TikTok has updated its moderation system with account strikes, similar to YouTube's, to improve clarity and prevent exploitation. The system assigns a strike to an account when its content, such as a video or comment, is removed for violating a community guideline. Strikes expire after 90 days, and separate strikes can be issued for specific features or policy areas.

Accumulating enough strikes in any category can lead to a permanent ban, with the exact threshold depending on the potential harm to the community. Severe violations, such as child sexual abuse material or threats of violence, bypass the strike system entirely and result in an immediate ban.

TikTok is introducing an upgrade to its Safety Center which will allow creators to view and challenge strikes and provide users with a warning before reaching a permanent ban. The app is also testing a feature that informs users if their video won't be displayed on the For You page, with an explanation for the ineligibility. The move is aimed at increasing transparency in moderation decisions and clarifying rules for creators who unintentionally violate them, while cracking down on repeat offenders.

The previous enforcement system, which included temporary bans and restrictions, was sometimes confusing for creators. The company has faced criticism for a lack of transparency and accountability in its content recommendations, and the threat of a broader ban looms in the US as TikTok is blocked from a growing number of government-owned devices.