A new study provides concerning insight into how robots can demonstrate racial and gender biases when built with flawed AI models. In the study, a robot operating with a popular internet-based AI system consistently gravitated toward the racial and gender biases present in society.

The study was led by researchers at Johns Hopkins University, the Georgia Institute of Technology, and the University of Washington. It is believed to be the first to show that robots loaded with this widely used and accepted model operate with significant gender and racial biases.
The new work was presented at the 2022 ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT).
Andrew Hundt is an author of the research and a postdoctoral fellow at Georgia Tech. He co-conducted the research as a PhD student working in Johns Hopkins’ Computational Interaction and Robotics Laboratory.
“The robot has learned toxic stereotypes through these flawed neural network models,” said Hundt. “We’re at risk of creating a generation of racist and sexist robots, but people and organizations have decided it’s OK to create these products without addressing the issues.”
When AI models are built to recognize humans and objects, they are often trained on large datasets freely available on the internet. However, the internet is full of inaccurate and biased content, meaning algorithms built with these datasets can absorb the same issues.
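To make the mechanism concrete, here is a minimal, hypothetical sketch of how a model that learns associations from scraped web data simply reproduces whatever skew that data contains. This is not the study's actual system or dataset; the co-occurrence pairs below are invented for illustration, and the "model" is a bare-bones frequency counter standing in for a trained network.

```python
from collections import Counter, defaultdict

# Stand-in for a scraped web corpus: invented (role, descriptor)
# co-occurrences whose skew mimics biased captions found online.
scraped_pairs = [
    ("doctor", "man"), ("doctor", "man"), ("doctor", "woman"),
    ("nurse", "woman"), ("nurse", "woman"), ("nurse", "man"),
]

# "Training": record co-occurrence counts from the corpus.
model = defaultdict(Counter)
for role, descriptor in scraped_pairs:
    model[role][descriptor] += 1

# "Inference": the model's top association is just the corpus skew,
# so a biased corpus yields biased predictions.
for role in ("doctor", "nurse"):
    prediction, _ = model[role].most_common(1)[0]
    print(f"{role} -> {prediction}")  # reflects dataset bias, not reality
```

A real vision-language model is vastly more complex, but the failure mode is the same: nothing in the training objective distinguishes a genuine pattern from a prejudiced one, so skewed data produces skewed outputs.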