AI-generated content that mimics human voices and visual likeness poses an escalating global risk.
More than 300 experts from diverse fields, including technology, artificial intelligence, digital ethics, child safety, entertainment, and academia, have released an open letter urging immediate action from government leaders to combat the escalating threat posed by deepfake content.
The call comes in the wake of a surge in harmful deepfakes involving sexual imagery, fraud, and political disinformation.
Prominent figures endorsing the letter include US politician Andrew Yang, MIT researcher Joy Buolamwini, British computer scientist Stuart Russell, and psycholinguist Steven Pinker.
The letter, entitled “Disrupting the Deepfake Supply Chain”, highlights the inadequacy of current laws in addressing deepfake production and dissemination. It supports ongoing legislative efforts and sets out key recommendations for holding the entire deepfake supply chain accountable.
Key recommendations include the full criminalisation of deepfake child sexual abuse material, even when the depictions involve only fictional children. The letter also advocates criminal penalties for anyone who knowingly creates or spreads harmful deepfakes.
Finally, the letter proposes making software developers and distributors responsible for preventing their products from generating harmful deepfakes.