
Open Letter Calls for Superintelligence Development Ban

The Future of Life Institute-led petition urges a halt on AI superintelligence development until safety and public consensus are ensured, citing risks like human extinction and economic disruption.

  • A group of prominent figures has called for a prohibition on developing superintelligence, a form of AI that would surpass human intellect, until safety and public support are ensured.
  • Concerns raised include economic obsolescence, loss of freedom, and potential human extinction, as highlighted in the letter.
  • The petition states the moratorium should continue until there is broad scientific consensus on safe development.
  • Many leading AI companies aim to build superintelligence, AI that would outperform humans on essentially all cognitive tasks, within the next decade.


Over 850 people in technology, science, and politics, including Apple co-founder Steve Wozniak and Virgin Group founder Richard Branson, have signed a statement published Wednesday requesting that any efforts to create "superintelligence", a form of artificial intelligence capable of surpassing humans across all cognitive tasks, be suspended, CNBC reports.


Bias Distribution

  • 53% of the sources are Center


understandingai.org broke the news on Wednesday, November 15, 2023.