A group of leading experts, including Geoffrey Hinton and Yoshua Bengio, has highlighted the pressing need for stronger regulatory action and increased funding for AI safety research to address the risks posed by rapid advances in AI technology.

On 20 May 2024, a group of 25 experts, including Geoffrey Hinton and Yoshua Bengio, widely recognized as "godfathers" of artificial intelligence (AI), published a paper in the journal Science highlighting the world's lack of preparedness for rapid advances in AI technology. The experts, who also include Nobel laureate Daniel Kahneman and AI researchers Dawn Song and Sheila McIlraith, argue that current government safety frameworks are insufficient to address the risks associated with the technology's rapid development.

The paper, released ahead of a two-day AI Safety Summit in Seoul, underscores the need for robust regulatory action and increased funding for AI safety research. Its recommendations include establishing government safety frameworks that tighten in response to advancing AI capabilities, mandating comprehensive risk assessments by tech firms, and restricting the deployment of autonomous AI systems in critical societal roles.

The call to action builds on last year's AI Safety Summit at Bletchley Park, UK, which secured voluntary safety commitments from major tech companies such as Google, Microsoft, and Meta. Regulatory initiatives, including the EU's AI Act and a recent White House executive order, have also introduced new AI safety requirements.

The experts warn that while AI can offer significant benefits, such as improving healthcare and raising living standards, it also poses risks of social instability, malicious use, and loss of human oversight, particularly given the industry's shift toward autonomous AI systems. The paper highlights the potential for AI systems to be used in cybercrime, warfare, and other harmful activities, and urges immediate, comprehensive regulatory measures to mitigate these risks.