Buried in the National Defense Authorization Act for Fiscal Year 2022 (NDAA), recently signed into law by President Joe Biden, are two of the most consequential pieces of artificial intelligence (A.I.) legislation ever enacted: the Artificial Intelligence Capabilities and Transparency (AICT) Act and the Artificial Intelligence for the Military (AIM) Act.
For the first time, Congress has signaled that the federal government is finally moving toward defining A.I. ethics as a core requirement of U.S. national strategy, while also asserting that traditional American values must be integrated into government and Department of Defense (DOD) A.I. use cases.
While this legislation falls far short of the European Union-style regulation that many in the A.I. ethics community have called for, it plants the seeds of a thoughtful and inevitable A.I. ethics regulatory regime.
Overall, the AICT and AIM Acts seek to accelerate the federal government's and DOD's ability to respond to the geopolitical reality of China's (and, to a lesser extent, Russia's) attempts to use artificial intelligence in ways that threaten the national security and economic interests of the United States.
Notably, the AICT Act defines A.I. ethics as “the quantitative analysis of artificial intelligence systems to address matters relating to the effects of such systems on individuals and society, such as matters of fairness or the potential for discrimination.”