Cybersecurity

ARE AI ALGORITHMS MISOGYNIST? IS TECH TURNING AGAINST WOMEN?

How can AI algorithms be racist and sexist? Are women being discriminated against even in the digital world?

Even though the first person to write an algorithm was a woman, back in the 19th century, artificial intelligence may now be discriminating against women. AI can inherit, or even amplify, the biases of its creators, and most AI systems have not caught up with the global feminist movement. AI researchers have issued a stark warning against the use of race- and gender-biased algorithms for making critical decisions.

AI bias occurs when a model's results cannot be generalized widely. We often think of bias as resulting from preferences or exclusions in training data, but bias can also be introduced by how data is collected, how algorithms are designed, and how AI outputs are interpreted. In 2018, researchers found that popular facial-analysis services from Microsoft, IBM, and Face++ can discriminate based on gender and race: Microsoft's service misclassified darker-skinned women roughly 21% of the time, while IBM's and Face++'s failed on darker-skinned women in roughly 35% of cases. Gender bias in AI is widespread today, and it affects the daily lives of all women: from job searches to security checkpoints at airports.
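
The kind of audit described above can be sketched in a few lines: group the model's predictions by demographic group and compare per-group error rates. The records below are hypothetical illustrative data (with error rates chosen to mirror the disparities reported in the 2018 study), not output from any real service.

```python
# Minimal sketch: auditing a classifier for disparate error rates
# across demographic groups. All records here are hypothetical.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_label, predicted_label)."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit set: gender predictions for two groups.
audit = (
    [("lighter-skinned men", "male", "male")] * 99
    + [("lighter-skinned men", "male", "female")] * 1
    + [("darker-skinned women", "female", "female")] * 65
    + [("darker-skinned women", "female", "male")] * 35
)

rates = error_rates_by_group(audit)
# Errors concentrated on one group are the signal of bias.
print(rates)
```

Disaggregating error rates like this, rather than reporting a single overall accuracy, is exactly what exposed the disparities in the commercial services: a model can look accurate on average while failing badly for one group.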

How do AI algorithms discriminate against women?
  • An employer advertised a job opening in a male-dominated industry on a social media platform. The platform's ad algorithm pushed the job almost exclusively to men, in order to maximize the number and quality of applicants.
  • A white man passing through an airport security checkpoint clears quickly, while a woman with dark skin is left waiting in a long line.
  • In job recruitment, because the industry is male-dominated, the majority of the résumés used to train the AI came from men, which ultimately led the system to discriminate against recommending women.
  • Facebook showed ads for better-paid jobs to white men, while women and people of color were shown ads for less well-paid jobs.
  • Face-analysis AI programs display gender and racial bias: they make few errors when determining the gender of lighter-skinned men, but many when determining the gender of darker-skinned women.
  • Voice-activated systems in cars are tone-deaf to women's voices.
  • Google searches for 'black girls' produced sexist and pornographic results.
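
The résumé-screening failure above has a simple mechanism: a model scores the words on a résumé by how often those words appeared on résumés of past hires, and when past hires were mostly men, words associated with women inherit a low score. The sketch below illustrates this with entirely hypothetical résumés and hiring outcomes.

```python
# Minimal sketch of how a screener trained on historically
# male-dominated hiring data learns to penalize words associated
# with women. All resumes and outcomes here are hypothetical.
from collections import defaultdict

def token_hire_rates(history):
    """history: list of (resume_tokens, was_hired) past decisions."""
    hired = defaultdict(int)
    seen = defaultdict(int)
    for tokens, was_hired in history:
        for t in set(tokens):
            seen[t] += 1
            hired[t] += was_hired
    return {t: hired[t] / seen[t] for t in seen}

# Hypothetical history: past hires were mostly men, so the token
# "women's" (e.g. "women's chess club") co-occurs with rejection
# far more often than neutral technical terms do.
history = (
    [(["python", "engineering"], 1)] * 8
    + [(["python", "engineering", "women's", "chess"], 0)] * 3
    + [(["python", "engineering", "women's", "chess"], 1)] * 1
)

scores = token_hire_rates(history)
# "women's" inherits the low historical hire rate of the resumes
# it appears on, even though it says nothing about qualification.
```

The bias here comes entirely from the data, not from any explicit rule about gender, which is why such systems can discriminate even when gender is never an input feature.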

Published by
Veille-cyber

