Cybersecurity

ARE AI ALGORITHMS MISOGYNISTIC? IS TECH TURNING AGAINST WOMEN?

How can AI algorithms be racist and sexist? Are they biased against women even in the digital world?

Even though the first person to write an algorithm was a woman, back in the 19th century, artificial intelligence may now be discriminating against women. AI can inherit, or even amplify, the biases of its creators, and most AI systems have not caught up with the global feminist movement. Researchers in artificial intelligence have issued a stark warning against using race- and gender-biased algorithms to make critical decisions.

AI bias occurs when a model's results cannot be generalized widely. We often think of bias as resulting from preferences or exclusions in the training data, but bias can also be introduced by how data is obtained, how algorithms are designed, and how AI outputs are interpreted. In 2018, researchers discovered that popular facial recognition services from Microsoft, IBM, and Face++ could discriminate based on gender and race: Microsoft's service failed on darker-skinned women 21% of the time, while IBM's and Face++'s failed on darker-skinned women in roughly 35% of cases. Many such gender biases are at work today, and they affect the daily lives of all women: from job searches to security checkpoints at airports.
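The kind of disparity those audits reported can be made concrete with a small sketch (a toy illustration with hypothetical data, not the original study's methodology): given a classifier's predictions labeled by demographic group, compute the error rate separately for each group and compare.

```python
from collections import defaultdict

def error_rate_by_group(records):
    """Per-group error rate of a classifier.

    records: iterable of (group, true_label, predicted_label) tuples.
    Returns {group: fraction of wrong predictions for that group}.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if pred != truth:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit records: (group, true gender, gender predicted by the service)
audit = [
    ("lighter-skinned men", "M", "M"),
    ("lighter-skinned men", "M", "M"),
    ("lighter-skinned men", "M", "M"),
    ("lighter-skinned men", "M", "M"),
    ("darker-skinned women", "F", "F"),
    ("darker-skinned women", "F", "M"),  # misgendered
    ("darker-skinned women", "F", "M"),  # misgendered
    ("darker-skinned women", "F", "F"),
]
print(error_rate_by_group(audit))
# -> {'lighter-skinned men': 0.0, 'darker-skinned women': 0.5}
```

An overall accuracy number would hide this gap entirely, which is why the 2018 audits reported metrics disaggregated by skin type and gender.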

How do AI algorithms discriminate against women?
  • An employer advertising a job opening in a male-dominated industry via a social media platform found that the platform's ad algorithm pushed the ad only to men, in order to maximize the number and quality of applicants.
  • A white man at an airport passes quickly through security, while a woman with dark skin waits in a long line.
  • In job recruitment, because the industry is male-dominated, the majority of the résumés used to train the AI came from men, which ultimately led the AI to discriminate against recommending women.
  • Facebook showed ads for better-paid jobs to white men, while women and people of color were shown ads for less well-paid jobs.
  • Face-analysis AI programs display gender and racial bias: they work worse for women with dark skin, showing low error rates when determining the gender of lighter-skinned men but high error rates for darker-skinned women.
  • Voice-activated systems in cars are tone-deaf to women's voices.
  • Google searches for "black girls" produced sexist and pornographic results.
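The résumé-screening example above can be sketched in a few lines (purely hypothetical data and a deliberately naive scoring rule): a model that simply learns hiring frequencies from historical decisions will reproduce whatever imbalance is baked into that history.

```python
from collections import defaultdict

def learn_hire_rates(history):
    """Learn P(hired | gender) from past hiring decisions.

    history: iterable of (gender, hired) tuples. A model trained on
    biased history turns that bias into a candidate "score".
    """
    hires = defaultdict(int)
    seen = defaultdict(int)
    for gender, hired in history:
        seen[gender] += 1
        hires[gender] += int(hired)
    return {g: hires[g] / seen[g] for g in seen}

# Hypothetical history from a male-dominated industry: candidates are
# equally qualified, but past recruiters hired far more men.
history = ([("M", True)] * 8 + [("M", False)] * 2 +
           [("F", True)] * 2 + [("F", False)] * 2)
scores = learn_hire_rates(history)
print(scores)  # -> {'M': 0.8, 'F': 0.5}: the historical bias is now "in the model"
```

Nothing in the data says women are less qualified; the model merely encodes who was hired before, which is exactly how a résumé-screening AI trained on male-majority résumés ends up down-ranking women.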

Published by
Veille-cyber
