ARE AI ALGORITHMS MISOGYNISTIC? WILL TECH TURN AGAINST WOMEN?

How can AI algorithms be racist and sexist? Are they biased against women even in the digital world?

Even though the first person to write an algorithm was a woman in the 19th century, artificial intelligence may now be discriminating against women. AI can inherit, or even amplify, the biases of its creators, and most AI systems have not caught up with the global feminist movement. Researchers in artificial intelligence have issued a stark warning against the use of race- and gender-biased algorithms for making critical decisions.

AI bias occurs when a system's results cannot be generalized widely. We often think of bias as resulting from preferences or exclusions in training data, but it can also be introduced by how data is collected, how algorithms are designed, and how AI outputs are interpreted. In 2018, researchers discovered that popular facial recognition services from Microsoft, IBM, and Face++ can discriminate based on gender and race. Microsoft's service misclassified darker-skinned women 21% of the time, while IBM and Face++ failed on darker-skinned women in roughly 35% of cases. Many such instances of AI gender bias affect the daily lives of women, from job searches to security checkpoints at airports.
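The audit described above boils down to comparing misclassification rates across demographic groups. The following minimal sketch shows how such a per-group error-rate comparison can be computed; the function name and the sample predictions are illustrative assumptions, not data from the 2018 study.

```python
# Minimal sketch of a per-group bias audit for a gender classifier.
# The records below are hypothetical, not real benchmark results.

def error_rate_by_group(records):
    """Compute the misclassification rate for each demographic group.

    records: iterable of (group, true_label, predicted_label) tuples.
    Returns a dict mapping each group to its error rate.
    """
    totals, errors = {}, {}
    for group, true_label, predicted in records:
        totals[group] = totals.get(group, 0) + 1
        if predicted != true_label:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

# Hypothetical predictions from a face-analysis model:
predictions = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "female", "male"),    # misclassified
    ("darker-skinned female", "female", "female"),
]

rates = error_rate_by_group(predictions)
print(rates)
```

A large gap between the resulting per-group rates (here 0.0 for one group versus 0.5 for the other) is exactly the kind of disparity the audits cited above measured at real-world scale.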

How do AI algorithms discriminate against women?
  • An employer advertised a job opening in a male-dominated industry via a social media platform. The platform's ad algorithm pushed the job only to men, in order to maximize the number and quality of applicants.
  • A white man passing through an airport will clear security quickly, while a woman with dark skin may be left waiting in a long line.
  • In recruitment for a male-dominated industry, most of the résumés used to train the AI came from men, which ultimately led the AI to discriminate against recommending women.
  • Facebook showed ads for better-paid jobs to white men, while women and people of color were shown ads for less well-paid jobs.
  • Face-analysis AI programs display gender and racial bias: they perform worse for women with dark skin, showing low error rates when determining the gender of lighter-skinned men but high error rates for darker-skinned women.
  • Voice-activated systems in cars are often tone-deaf to women's voices.
  • Google searches for 'black girls' produced sexist and pornographic results.

Published by Veille-cyber