China is pushing ahead of the European Union and the United States with its new synthetic-content regulations. New draft provisions would place more responsibility on platforms to preserve social stability, at a potential cost to online freedoms. They show that the Chinese Communist Party is prepared to protect itself against the distinctive threats posed by emerging technologies.
On 28 January 2022, China’s State Internet Information Office (the Cyberspace Administration of China) released the “Provisions on the Administration of Deep Synthesis Internet Information Services (Draft for solicitation of comments)”1. The proposal (henceforth the Provisions) is a draft regulation for deep synthesis technology, an umbrella term covering “text, images, audio, video, virtual scenes, or other information” created with generative models (Article 2). Also known as synthetically generated content, this includes deepfakes (photos, videos or audio that depict a person doing or saying things they have never been recorded doing or saying), generated text such as that produced by OpenAI’s GPT-3, advanced image-enhancement techniques, and the construction of virtual ‘scenes’, akin to the immersive virtual worlds depicted in the novel Ready Player One. Once enacted, the Provisions will put China ahead of the USA and the EU in deepfake regulation and mark a considerable advance in the Chinese government’s efforts to control content on its domestic Internet and, more broadly, to preserve social stability.