UK urged to take swift action on AI regulation to avert potential existential threat | Future Focus

On this week's episode of Yahoo Finance Future Focus, our host Brian McGleenon sat down with Lord Chris Holmes, a prominent advocate for the ethical use of technology, to discuss his proposed Artificial Intelligence Regulation Bill. Lord Holmes emphasised the urgent need for regulatory action to mitigate the significant risks posed by rapid advancements in AI. While forecasts predict the UK AI market will surpass $1 trillion by 2035, Lord Holmes cautioned that without proper oversight, AI could lead to catastrophic consequences, potentially even threatening the existence of humankind. He highlighted the importance of fostering the development of "ethical AI" based on principles such as trust, transparency, and accountability. Lord Holmes stressed the need for public engagement and warned against the deployment of AI in pursuit of the competing interests of nation-states, particularly on the battlefield.

Video Transcript

[MUSIC PLAYING]

BRIAN MCGLEENON: On this week's episode of "Yahoo Finance Future Focus," we are delighted to welcome back into the studio Lord Chris Holmes. Lord Holmes has proposed an artificial intelligence regulation bill, and he's put it to UK lawmakers. Lord Holmes, welcome to "Yahoo Finance Future Focus."

CHRIS HOLMES: Thank you very much for the invitation, Brian.

BRIAN MCGLEENON: Why does the UK need regulation around artificial intelligence?

CHRIS HOLMES: In many ways, AI is already all around us. The prime minister hosted a really successful AI safety summit last November at Bletchley Park. And that really went to the frontier risks, the existential risks associated with AI, incredibly important. But having done that, it makes it even more important to look to all of the other elements of AI where it's already impacting people's lives.

Just to take one obvious example, let's look at recruitment. When it comes to shortlisting and candidate selection, there's already a great deal of AI being used, probably unknown to most, if not all, of those candidates in any particular selection process.

BRIAN MCGLEENON: And just touching upon the existential threats, you know, what are we talking about in that regard?

CHRIS HOLMES: Potentially, the complete annihilation of humankind. So in terms of risks, it's probably up there in the top slot. I remember when I was on the AI Select Committee in 2018 and we did our report, and though we put together a very well-thought-through and nuanced report on the whole AI landscape, most of the newspaper headlines said things like, "Lords predict human extinction from artificial intelligence."

And there's no question that AI potentially deployed on the battlefield is a risk that we should all be fully aware of. That's why part of the underpinning of my bill, and indeed of the select committee report that we did, is ethical AI. We understand the principles needed to make a success of this. Having that ethical AI bedrock beneath any regulatory approach has to make sense.

BRIAN MCGLEENON: Where does the UK stand currently with the development of AI? Is it a world leader?

CHRIS HOLMES: The UK is in a great place when it comes to the development of AI in so many different areas. We have superb startups and scale-ups. But in terms of the legislative and regulatory arena, we need to and we must act. We have the opportunity to do something unique in the UK, not just because of that financial ecosystem, our geography, our time zone, the City, our tech startups, the university sector, all of those incredibly important and impressive, but because of the greatest gift we have of all: English common law, and the ability to construct something on a common law basis, with the certainty and stability that common law brings.

It's why common law is used for contracts all around the world. Basing something on common law means it can then develop over time through case law and precedent, whilst also being interoperable with other regulatory approaches, such as the EU's.

BRIAN MCGLEENON: On legal frameworks, does the contrast between the UK's flexibility and the EU's approach give the UK an advantage when it comes to whatever regulatory framework it develops?

CHRIS HOLMES: It does if we choose to take it. A really quick example from last year: the Electronic Trade Documents Act is, in effect, a blockchain act that never mentions blockchain. It sets out criteria, but it never names a specific technology. So not only is it technology-neutral, it's also, as far as one can be, technology future-proofed.

And this is the same approach that we can take with the AI regulation bill to set out principles, concepts, values, ethics, things that we understand. We know how to make a success of this. It's why I believe we should and we must act to legislate now.

BRIAN MCGLEENON: There are existing regulatory bodies. Now, should they be handling this, or do we need a new singular AI authority to oversee this rapidly advancing technology?

CHRIS HOLMES: The government's position is that the existing regulators, such as the FCA, Ofcom, Ofgem, and so on, should regulate AI in their own sectors, in their verticals. My sense is that we should go somewhat beyond that. The first clause of my bill proposes an AI authority, not to be the AI regulator, and not to be a huge, cumbersome, bureaucratic, constantly expanding regulator at all, but to be light, agile, and horizontally focused. It would look across all of those existing regulators and assess their competency to address the challenges and the opportunities of AI, and where the gaps may be. And it would look across all of the current relevant legislation, in consumer protection, for example, and assess its competency to address those same challenges and opportunities. In no sense light-touch regulation, but, when we're at our best, right-touch regulation.

BRIAN MCGLEENON: Lord Holmes, it's always a pleasure to speak to you. Thank you for coming on this week's episode of "Yahoo Finance Future Focus."

CHRIS HOLMES: Pleasure. Thank you very much for the opportunity.

[MUSIC PLAYING]
