AI could benefit society, but it could also become a monster. To guide the way, we need leadership and understanding
We’re at a Frankenstein moment.
An artificial intelligence boom is taking over Silicon Valley, with hi-tech firms racing to develop everything from self-driving cars to chatbots capable of writing poetry.
Yet AI could also spread conspiracy theories and lies even more quickly than the internet already does – fueling political polarization, hate, violence and mental illness in young people. It could undermine national security with deepfakes.
In recent weeks, members of Congress have sounded the alarm over the dangers of AI, but no bill has been proposed to protect individuals or to halt the development of AI's most threatening applications.
Most lawmakers don’t even know what AI is, according to Representative Jay Obernolte, the only member of Congress with a master’s degree in artificial intelligence.
What to do?
Many tech executives claim they can simultaneously look out for their company’s interests and for society’s. Rubbish. Why should we assume that their profit motives align perfectly with the public’s needs?
Sam Altman – the CEO of OpenAI, the company responsible for some of the most mind-blowing recent advances in AI – believes no company, including his, should be trusted to solve these problems. The boundaries of AI should be decided, he says, not by “Microsoft or OpenAI, but society, governments, something like that”.