AI firms urged to calculate catastrophe odds like Oppenheimer
5 Articles
AI firms warned to calculate threat of superintelligence or risk it escaping human control
Artificial intelligence companies have been urged to replicate the safety calculations that underpinned Robert Oppenheimer’s first nuclear test before they release all-powerful systems. Max Tegmark, a leading voice in AI safety, said he had carried out calculations akin to those of the US physicist Arthur Compton before the Trinity test and had found a 90% probability that a highly advanced AI would pose an existential threat. The US government we…
Artificial intelligence companies have been urged to replicate the safety calculations used in Robert Oppenheimer's first nuclear test before launching their highly capable systems. Max Tegmark, a leading expert on AI safety, stated that he performed calculations similar to those performed by American physicist Arthur Compton before the Trinity test and found a 90% probability that AI would…
AI Companies Caution: Assess the Risks of Superintelligence or Face the Consequences of Losing Human Control
Prior to the deployment of the omnipotent system, AI companies are encouraged to replicate the safety assessments that formed the basis of Robert Oppenheimer’s initial nuclear test. Max Tegmark, a prominent advocate for AI safety, conducted analyses akin to those performed by American physicist Arthur Compton before the Trinity test, indicating a 90% likelihood that [...]
Coverage Details
Bias Distribution
- 100% of the sources lean Left
