In the video record of the Ukraine war, a clumsy attempt to “deepfake” Ukrainian President Volodymyr Zelensky sits alongside critical on-the-ground video evidence of abuses, pervasive misinformation spread for grift and attention, and Russian false flag operations.
These scenes from the war provide a glimpse into a future where, alongside existing forms of manipulation and misattribution, deepfake technology — images that have been “convincingly altered and manipulated to misrepresent someone doing or saying something that was not actually done or said” — will be more readily employed. More false videos will be forged, and the ‘liar’s dividend’ will be invoked to cast doubt on authentic ones.
One set of solutions to these current and future problems proposes to better track where media comes from, what has been synthesized, edited or changed, and how. This ‘authenticity and provenance’ infrastructure deserves close attention to its possibilities, as well as preventative work on its risks.
In January, the Coalition for Content Provenance and Authenticity (C2PA), led by the BBC, Microsoft, Adobe, Intel, Twitter, TruePic, Sony and Arm, proposed the first global technical standards for better tracking which content is authentic and which is manipulated. The specifications provide a way to follow the origins of, and changes to, a piece of media content from capture on a camera, through editing, to distribution by major media outlets or on a social media feed.
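To make the idea concrete, here is a minimal, hypothetical sketch of how a provenance chain of this kind can work: a hash of the media is recorded at capture, each edit appends a new assertion, and a verifier checks the file in hand against the last recorded state. This is not the actual C2PA manifest format or API; the field names and functions below are illustrative assumptions, and the cryptographic signing a real implementation requires is omitted.

```python
# Conceptual provenance-chain sketch (NOT the real C2PA manifest format).
# Real C2PA manifests are signed binary structures; this illustrative
# version only chains content hashes to show the basic idea.

import hashlib
import json
from datetime import datetime, timezone


def content_hash(data: bytes) -> str:
    """Hash the media bytes so any later change can be detected."""
    return hashlib.sha256(data).hexdigest()


def new_manifest(media: bytes, producer: str) -> dict:
    """Record the media's origin at capture time (hypothetical structure)."""
    return {
        "producer": producer,
        "created": datetime.now(timezone.utc).isoformat(),
        "content_hash": content_hash(media),
        "edits": [],  # each edit appends an entry, forming the provenance chain
    }


def record_edit(manifest: dict, edited_media: bytes, action: str, tool: str) -> dict:
    """Append an edit assertion binding the new bytes to the chain."""
    manifest["edits"].append({
        "action": action,  # e.g. "crop", "color-correct"
        "tool": tool,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "content_hash": content_hash(edited_media),
    })
    return manifest


def verify(manifest: dict, media: bytes) -> bool:
    """Check that the media we hold matches the last recorded state."""
    expected = (manifest["edits"][-1]["content_hash"]
                if manifest["edits"] else manifest["content_hash"])
    return content_hash(media) == expected


# Usage: capture -> edit -> verify.
original = b"...raw camera bytes..."
m = new_manifest(original, producer="camera-serial-1234")
edited = original + b" (cropped)"
m = record_edit(m, edited, action="crop", tool="photo-editor")
print(json.dumps(m, indent=2))
print("current file matches chain:", verify(m, edited))    # True
print("stale file matches chain:", verify(m, original))    # False
```

In the real standard, each step in such a chain is cryptographically signed, so a viewer can check not only that the content is unchanged since the last recorded edit but also who attested to each step.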