Experts in AI are warning it poses a “risk of extinction” akin to nuclear war

Sam Altman, CEO of OpenAI, testified in Washington earlier this month.

In an open letter, dozens of industry experts have warned that artificial intelligence could lead to an extinction event. The letter was published on May 30 by the Center for AI Safety (CAIS), an advocacy group that aims to reduce societal-scale risks from AI, according to its website. It compares the potential effects of AI to pandemics caused by deadly diseases and nuclear warfare.

Sam Altman—the billionaire co-founder of ChatGPT maker OpenAI—signed the letter, along with the CEOs of AI firms Google DeepMind and Anthropic.


It was also signed by Dr. Geoffrey Hinton and Dr. Yoshua Bengio, two of the computer scientists often described as the “godfathers of AI” for their extensive work developing the field of deep learning. Along with Dr. Yann LeCun, who did not sign the statement, they won the prestigious Turing Award in 2018 for their efforts.

In addition to those industry heavyweights, a diverse range of celebrities and professors signed the letter, including the singer Grimes (who has previously used AI for creative exploration) and the popular podcaster and neuroscientist Sam Harris.

Quotable: The (brief) statement

“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war,” the one-sentence statement said. Its brevity and relatively broad scope allowed for signatories with a wide range of viewpoints.

What does more regulation of AI mean to Sam Altman?

Altman—one of the most recognizable faces in AI—has alternated between saying he’s “scared” of AI and championing it as “the greatest technology humanity has yet developed” (and that was in the same interview).

The OpenAI CEO met with US President Joe Biden and testified before the Senate Judiciary Committee earlier this month, asking lawmakers for increased regulation of his industry.

“My worst fear is we cause significant harm to the world,” Altman said during the testimony. “If this technology goes wrong, it can go quite wrong.”

Altman outlined what the new regulations would look like in a blog post published last week with OpenAI’s two other co-founders. He called for three major reforms, including increased coordination among AI developers around the world and the development of technical safeguards capable of reining in a potential “superintelligence.”

He also encouraged the formation of a global regulatory group for AI technology similar in structure to the International Atomic Energy Agency, with powers to inspect systems, require audits, and test for compliance with safety standards.

Related stories:

🇪🇺 OpenAI’s Sam Altman threatened to pull out of the EU if he doesn’t like its ChatGPT regulation

🇮🇹 Italy has banned ChatGPT, but will its clampdown work?

🇨🇳 China wants to require a security review of AI services before they’re released
