Open Letter Calls for Superintelligence Development Ban
A diverse coalition of over 1,000 leaders urges a halt to AI superintelligence development, citing risks including economic disruption, loss of civil liberties, and potential human extinction.
- On Wednesday, the Future of Life Institute released a concise statement calling for a prohibition on developing superintelligence until broad scientific consensus and strong public buy-in.
- Organizers say tech giants Google, OpenAI and Meta Platforms are racing toward superintelligence without sufficient guardrails, raising risks ranging from economic displacement to national security threats and possible human extinction.
- The roster mixes celebrities and policymakers, including Prince Harry and Meghan, Duke and Duchess of Sussex, alongside AI researchers Geoffrey Hinton, Yoshua Bengio and Stuart Russell.
- A new Future of Life Institute poll shows most Americans favor robust oversight before pursuing superintelligence, with 73% supporting regulation, 64% wanting proven safety, and only 5% backing unregulated development.
- After earlier pauses were ignored, the industry continued rolling out advanced models such as GPT‑4o and GPT‑5, while many major AI executives did not sign and OpenAI issued subpoenas against FLI last week.
222 Articles
‘What’s our AI strategy?’ is the wrong question for agency leaders
Since the Trump administration unveiled its AI Action Plan in July, federal agencies have been processing the artificial intelligence innovation and infrastructure policy recommendations amid their struggles to navigate the influx of new opportunities to streamline ways of working. AI’s return on investment continues to be mixed or downright disappointing. According to a 2023 report from IBM, enterprise-wide AI initiatives achieved an ROI of on…
MIT professor’s 4 critical steps to stop AI from hijacking humanity
Artificial superintelligence is still a hypothetical, but we're inching closer every day. What happens when we finally create a digital beast that vastly surpasses human intellect in all domains? MIT physics professor Max Tegmark warns that if that day comes, we'll be in deeper trouble than we can imagine. Despite the evident dangers and widespread hesitation, people like OpenAI CEO Sam Altman, a leading figure in the AI boom, are determined to se…
Prominent Personalities Sign Letter Seeking Ban On 'Development Of Superintelligence'
Prominent Personalities Sign Letter Seeking Ban On 'Development Of Superintelligence' Authored by Andrew Moran via The Epoch Times, Hundreds of people, from conservative commentators to prominent tech executives, have signed a letter seeking a ban on “the development of superintelligence.” This year, leading technology firms such as Google, Meta Platforms, and OpenAI have accelerated efforts to build artificial intelligence (AI) systems capable …
'Godfathers of AI' and Steve Wozniak Join 850 Others in Call for Ban on 'Superintelligence' AI Development
A diverse group of influential individuals, from tech pioneers to politicians, has signed a statement urging a pause in the creation of artificial intelligence that surpasses human cognitive abilities. The list, which includes the legendary "Godfathers of AI" and Apple co-founder Steve Wozniak along with figures like Richard Branson, believes that superintelligent AI could lead to "human economic obsolescence and disempowerment, losses of freedom,
From Bannon to Prince Harry, 800 against AI superintelligence - Science & Technology
Over 800 scientists, politicians, entrepreneurs, artists, and celebrities have signed a declaration calling for a halt to the development of artificial superintelligence, a yet-to-be-reached stage of AI that could pose a threat to humanity. (ANSA)
Prince Harry, Meghan add names to letter calling for ban on development of AI 'superintelligence'
Prince Harry and Meghan have joined a diverse group, including Steve Bannon and Glenn Beck, to call for a ban on AI "superintelligence" that threatens humanity.
Coverage Details
Bias Distribution
- 57% of the sources are Center