Altman, cofounder of the startup that last year kicked off the
generative AI boom, was abruptly fired by OpenAI’s board last
week, sending shockwaves through the tech world and prompting
employees to threaten mass resignation.
Across the Atlantic, the European Commission, the European
Parliament and the EU Council have been hashing out the fine
print of the AI Act, a sweeping set of laws that would require
some companies to complete extensive risk assessments and make
data available to regulators.
In recent weeks, talks have hit stumbling blocks over the extent
to which companies should be allowed to self-regulate.
Brando Benifei, one of two European Parliament lawmakers leading
negotiations on the laws, told Reuters: “The understandable
drama around Altman being sacked from OpenAI and now joining
Microsoft shows us that we cannot rely on voluntary agreements
brokered by visionary leaders.
“Regulation, especially when dealing with the most powerful AI
models, needs to be sound, transparent and enforceable to
protect our society.”
On Monday, Reuters reported that France, Germany and Italy had
reached an agreement on how AI should be regulated, a move
expected to accelerate negotiations at the European level.
The three governments support "mandatory self-regulation through
codes of conduct" for those using generative AI models, but some
experts said this would not be enough.
Alexandra van Huffelen, Dutch minister for digitalisation, told
Reuters the OpenAI saga underscored the need for strict rules.
She said: “The lack of transparency and the dependence on a few
influential companies in my opinion clearly underlines the
necessity of regulation.”
Meanwhile, Gary Marcus, an AI expert at New York University,
wrote on social media platform X: "We can’t really trust the
companies to self-regulate AI where even their own internal
governance can be deeply conflicted.
"Please don't gut the EU AI Act; we need it now more than ever."
(Reporting by Martin Coulter and Supantha Mukherjee; Editing by
Susan Fenton)
© 2023 Thomson Reuters. All rights reserved. This material may
not be published, broadcast, rewritten or redistributed. Thomson
Reuters is solely responsible for this content.