
ARE AI ALGORITHMS MISOGYNISTIC? WILL TECH TURN AGAINST WOMEN?

How can AI algorithms be racist and sexist? Does discrimination against women persist even in the digital world?

Even though the first person to write an algorithm was a woman in the 19th century, artificial intelligence may now be discriminating against women. AI systems can inherit, and even amplify, the biases of their creators, and most AI has not caught up with the global feminist movement. Researchers in artificial intelligence have issued stark warnings against using race- and gender-biased algorithms to make critical decisions.

AI bias occurs when a model's results cannot be generalized widely. We often think of bias as resulting from preferences or exclusions in training data, but bias can also be introduced by how data is obtained, how algorithms are designed, and how AI outputs are interpreted. In 2018, researchers found that popular facial-recognition services from Microsoft, IBM, and Face++ discriminated by gender and race: Microsoft's service misclassified darker-skinned women 21% of the time, while IBM's and Face++'s failed on darker-skinned women in roughly 35% of cases. Many such gender biases are at work right now, and they affect the daily lives of all women: from job searches to security checkpoints at airports.
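Findings like these come from disaggregated evaluation: instead of reporting one overall accuracy figure, the error rate is computed separately for each demographic group, which is what exposes the gap between lighter-skinned men and darker-skinned women. Here is a minimal sketch of such an audit; the records below are invented placeholders, not data from the 2018 study:

```python
from collections import defaultdict

def error_rate_by_group(records):
    """Per-group error rates from (group, predicted, actual) triples."""
    errors, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Invented gender-classifier outputs, tagged by demographic group.
records = [
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
    ("darker-skinned women", "female", "female"),
    ("darker-skinned women", "male", "female"),   # misgendered
    ("darker-skinned women", "male", "female"),   # misgendered
]

for group, rate in sorted(error_rate_by_group(records).items()):
    print(f"{group}: {rate:.0%} error rate")
```

A single aggregate accuracy over these records would look acceptable; only the per-group breakdown reveals that all the errors fall on one group.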

How do AI algorithms discriminate against women?
  • An employer advertised a job opening in a male-dominated industry on a social media platform. The platform's ad algorithm pushed the job to men only, because doing so maximized the return on the number and quality of applicants.
  • At an airport checkpoint, a white man passes through quickly, while a woman with dark skin waits in a long line.
  • In job recruitment, the industry was male-dominated, so the majority of the resumes used to train the AI came from men, which ultimately led the model to stop recommending women (a toy illustration follows this list).
  • Facebook showed ads for better-paid jobs to white men, while women and people of color were shown ads for lower-paid jobs.
  • Face-analysis AI programs display gender and racial bias: they make few errors when determining the gender of lighter-skinned men but many errors when determining the gender of darker-skinned women. If you are a woman with dark skin, the software simply works worse.
  • Voice-activated systems in cars are often tone-deaf to women's voices.
  • Google searches for 'black girls' have produced sexist and pornographic results.
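The recruitment example above shows how a skewed training set becomes a discriminatory model. The toy sketch below uses fully synthetic data, loosely inspired by the widely reported case of a recruiting tool trained on resumes from a male-dominated workforce. A logistic regression is fit to "historical hires" that were mostly men; the token "womens", standing in for phrases like "women's chess club", ends up with a negative weight, so otherwise similar resumes are penalized for containing it:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Synthetic training set: past hiring decisions skewed toward men.
# The token "womens" acts as a proxy for the applicant's gender.
resumes = [
    "python engineer chess club captain",
    "java developer sailing team",
    "python engineer debate team",
    "java engineer chess club",
    "python developer womens chess club captain",
    "java engineer womens sailing team",
]
hired = [1, 1, 1, 1, 0, 0]  # labels mirror the male-dominated history

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight on the proxy token is negative: a resume is
# downgraded simply for mentioning "womens".
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(f"weight for 'womens': {weights['womens']:.2f}")
```

The point is not the specific classifier: any model fit to biased historical decisions will reproduce them unless the proxy features are found and neutralized.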

