The generative AI industry will be worth about A$22 trillion by 2030, according to the CSIRO. These systems – of which ChatGPT is currently the best known – can write essays and code, generate music and artwork, and have entire conversations. But what happens when they’re turned to illegal uses?
Last week, the streaming community was rocked by a headline that traces back to the misuse of generative AI. Popular Twitch streamer Atrioc issued a teary-eyed apology video after being caught viewing deepfake pornography featuring the superimposed faces of fellow women streamers.
The “deepfake” technology needed to Photoshop a celebrity’s head onto a porn actor’s body has been around for a while, but recent advances have made the results much harder to detect.
And that’s the tip of the iceberg. In the wrong hands, generative AI could do untold damage. There’s a lot we stand to lose, should laws and regulation fail to keep up.
Last month, generative AI app Lensa came under fire for allowing its system to create fully nude and hyper-sexualised images from users’ headshots. Controversially, it also whitened the skin of women of colour and made their features more European.
The backlash was swift. But what’s relatively overlooked is the vast potential to use artistic generative AI in scams. At the far end of the spectrum, there are reports of these tools being able to fake fingerprints and facial scans (the method most of us use to lock our phones).