Biden's AI order is massive, but far from enough to address the technology's risks

The Biden administration on Monday unveiled its most comprehensive effort to regulate powerful artificial intelligence technologies yet, issuing a sweeping executive order. Its aim: ensuring American leadership in AI while preventing AI abuses that could threaten Americans’ civil rights and safety.

The order is part of the administration’s attempt to ensure government rules keep pace with the rapidly evolving technology. Experts have praised portions of the executive order, saying its scale and scope should prove helpful as companies continue to develop new forms of AI.

"One of the reasons that it's a helpful step is it's an all-of-government approach rather than just thinking siloed about either one specific industry or one substantive area of the law," explained Aaron Cooper, vice president of global policy at BSA, a software industry trade group. "It’s taking a holistic approach."

But despite the wide-ranging scope of the executive order, other experts said it fails to address a number of issues, from consumer privacy and security to competition, and says little about how AI could be used to help solve societal problems.

"This [plan] is silent on AI and democracy. … This is silent about the fact that we could use AI to promote citizen engagement," said Beth Simone Noveck, director of the Burnes Center for Social Change at Northeastern University in Boston. "There is nothing in here about public consultation. There's nothing in here about engaging with citizens to develop any of these things that are going to come next."

Congress, a new agency, and social risks

Congressional legislation is the biggest tool the government has to regulate AI and foster its growth.

But despite holding a number of hearings, and despite various lawmakers proposing different bills, Congress is still far from passing any kind of federal legislation related to AI.

Good but not great? President Joe Biden signs an executive order on artificial intelligence in the East Room of the White House, Monday, Oct. 30, 2023, in Washington. Vice President Kamala Harris looks on at right. (AP Photo/Evan Vucci)

Without that, US states and municipalities could pass a hodgepodge of overlapping AI laws that confuse the industry and do little to help Americans.

The same thing has already happened with privacy legislation: in the absence of federal rules, individual states have passed their own regulations, creating a patchwork of laws across the country.

According to Cooper, the executive order also falls short of requiring AI developers and those deploying high-risk AI systems to perform impact assessments and operate risk management programs. Congressional legislation, he explained, is the only way to get that kind of regulation in place.

New rules alone, however, might not be enough. 

Bruce Schneier, security technologist and lecturer at Harvard Kennedy School, told Yahoo Finance that AI is the type of technological innovation that will require the creation of an entirely new government agency.

"If this is actually as revolutionary a technology as we think it is, and it probably is, it needs a new government agency," he explained. "Think of the technologies that have caused the formation of new government agencies. Trains did. Radio did. Nuclear power did. These technologies were ... believed to be so fundamental that we needed a government agency to deal with societal change."

The order also doesn’t go far enough in addressing broader social challenges, whether that’s getting more people involved in the democratic process, enhancing literacy, or making improvements to healthcare.

"The big thing that I wanted to see more of [was] … how the public will be engaged in developing all of this," said Noveck. "There hasn't been enough public conversation around the benefits and risks of AI. The convening sort of happened to have been, you know, corporate CEOs in largely closed-door meetings."

A solid start

While the executive order lacks a number of provisions, what it does include is still winning praise.

That includes immigration reforms geared toward attracting workers with AI expertise from other countries and provisions attempting to prevent bias in AI algorithms. The order also keeps a focus on helping new AI businesses grow.

Still, as the technology continues to change, it’s clear the government will need to work especially fast, and carefully, to get the right rules in place in time. Because there’s no turning back now.

Daniel Howley is the tech editor at Yahoo Finance. He's been covering the tech industry since 2011. You can follow him on Twitter @DanielHowley.

