Biden AI Policy
About the author: Susan Ariel Aaronson is the director of the Digital Trade and Data Governance Hub and a senior fellow at the Centre for International Governance Innovation.
The Biden administration’s new “Blueprint for an AI Bill of Rights” is simultaneously a big step forward and a disappointment. Released last week, the blueprint articulates a set of principles that could address some of the major concerns about artificial intelligence design and deployment. But policymakers will need to do more to achieve an elusive objective: trust in AI.
AI’s trust problems have been apparent for some time. In 2021, the National Institute of Standards and Technology published a paper examining the relationship between artificial intelligence systems and the consumers and firms that rely on them to make decisions. Because AI systems are complex, unpredictable and lack moral or ethical capacity, users must extend trust to them, which turns the dynamic between user and system into a relationship. So if AI designers and deployers want AI to be trusted, they must build systems that behave in trustworthy ways, not merely ask users to trust them.
Why is trust in this technology so important? AI is now essential to U.S. national security, economic development, productivity and innovation. It also underpins other new technologies such as virtual reality, and it holds great promise to help mitigate wicked problems such as climate change.