AI industry leaders create forum to regulate big machine learning models


[July 26, 2023]  (Reuters) - OpenAI, Microsoft, Alphabet's Google and Anthropic are launching a forum to regulate the development of large machine learning models, the artificial intelligence industry leaders said on Wednesday.

Google, Microsoft and Alphabet logos and AI Artificial Intelligence words are seen in this illustration taken, May 4, 2023. REUTERS/Dado Ruvic/File Photo

The group will focus on ensuring safe and responsible development of what are called "frontier AI models," which exceed the capabilities present in the most advanced existing models.

They are highly capable foundation models that could have dangerous capabilities sufficient to pose severe risks to public safety.

Generative AI models, like the one behind chatbots such as ChatGPT, draw on large amounts of data at high speed to generate responses in the form of prose, poetry and images.

While the use cases for such models are plentiful, government bodies including the European Union and industry leaders including OpenAI CEO Sam Altman have said appropriate guardrail measures would be necessary to tackle the risk posed by AI.

The industry body, the Frontier Model Forum, will work to advance AI safety research, identify best practices for deployment of frontier AI models and work with policymakers, academics and companies.

But it will not engage in lobbying with governments, an OpenAI spokesperson said.

"Companies creating AI technology have a responsibility to ensure that it is safe, secure and remains under human control," Microsoft President Brad Smith said in a statement.

The forum will create an advisory board in the coming months, arrange funding with a working group and establish an executive board to lead its efforts.

(Reporting by Chavi Mehta in Bengaluru and Jeffrey Dastin in Palo Alto, Calif; Editing by Arun Koyyur)

[© 2023 Thomson Reuters. All rights reserved.]

