- Age-old biases are being hard-wired into new-age tech in predictive and profiling AI systems
- Indian police have also begun adopting AI-based technology in efforts to control crime
Civil society organizations believe AI systems in law enforcement, especially predictive and profiling tools, discriminate against the most marginalized in society, infringe on liberty and fair trial rights, and reinforce structural discrimination.
“Age-old discrimination is being hard-wired into new age technologies in the form of predictive and profiling AI systems used by law enforcement and criminal justice authorities. Seeking to predict people’s future behaviour and punish them for it is completely incompatible with the fundamental right to be presumed innocent until proven guilty. The only way to protect people from these harms and other fundamental rights infringements is to prohibit their use,” said Griff Ferris, legal and policy officer at Fair Trials, in a statement.