
29 Dec, 2025

Chinese authorities have tightened regulations on artificial intelligence, citing concerns that the rapid development of chatbots and generative models could threaten political stability and the authority of the Communist Party.

The Wall Street Journal reports that while Beijing is moving to strengthen oversight, it is also trying to avoid excessively restricting the industry in order to remain competitive with the United States in the global technology race. New rules introduced in November require AI companies to train their models exclusively on “politically safe” data, and all AI-generated content must now be clearly labeled.

Developers are also required to conduct pre-deployment testing. Chatbots must reject at least 95% of prompts that could undermine the state system, promote discrimination, or incite violence.

The regulatory framework was developed in collaboration with major market players, including Alibaba and DeepSeek. Companies are required to review thousands of training-data samples for each content format (text, images, and video), of which at least 96% must be deemed safe. Content classified as unsafe includes politically sensitive topics, calls for violence, and the use of images without permission.

According to regulators, around 960,000 pieces of illegal or harmful AI-generated content were removed over a three-month period. Artificial intelligence has also been added to China’s national emergency response framework, alongside epidemics and natural disasters.

At the same time, authorities stress that they do not intend to place excessive pressure on the industry. Beijing acknowledges that AI offers unprecedented opportunities for development but also introduces new risks; balancing the two is expected to be a key challenge for the state in the coming years.