Leading AI companies join new US safety consortium: Biden administration
Leading artificial intelligence (AI) companies have joined a new safety consortium to support the safe development of generative AI, the Biden administration announced Thursday.
Microsoft, Alphabet’s Google, Apple, Facebook-parent Meta Platforms, OpenAI and others have joined the AI Safety Institute Consortium, a coalition focusing on the safe development and deployment of generative AI.
The newly formed consortium also includes government agencies, academic institutions and other companies like Northrop Grumman, BP, Qualcomm and Mastercard.
“The U.S. government has a significant role to play in setting the standards and developing the tools we need to mitigate the risks and harness the immense potential of artificial intelligence,” Commerce Secretary Gina Raimondo said in a statement.
The group will operate under the umbrella of the U.S. AI Safety Institute and will work toward goals set out in President Biden’s executive order, issued in late October, that focused on ensuring safety in AI development while preserving data privacy. The group will develop guidelines for “red-teaming, capability evaluations, risk management, safety and security and watermarking synthetic content.”
Biden’s executive order directs federal agencies to accelerate the development of techniques for training AI systems while ensuring data privacy throughout that training.
Generative AI can create videos, photos and text from user-entered prompts. Its rapid development has stirred both alarm and enthusiasm, with some fearing that the technology could displace a large share of jobs or cause disruptions during elections.
“President Biden directed us to pull every lever to accomplish two key goals: set safety standards and protect our innovation ecosystem,” Raimondo said. “That’s precisely what the U.S. AI Safety Institute Consortium is set up to help us do.”
Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.