Artificially intelligent robot perpetuates racist and sexist prejudice
A virtual robot run by artificial intelligence conforms to toxic stereotypes when asked to pick the faces that belong to criminals or homemakers

A robot running an artificial intelligence (AI) model carries out actions that perpetuate racist and sexist stereotypes, highlighting the risks that arise when technology learns from data sets containing inherent biases.

Andrew Hundt at the Georgia Institute of Technology in Atlanta and his colleagues set up a virtual experiment using a robot running CLIP, a neural network developed by OpenAI and trained by pairing images taken from the internet with accompanying …

Read more: https://www.newscientist.com/article/2326129-artificially-intelligent-robot-perpetuates-racist-and-sexist-prejudice/