The Biden administration's proposal joins hundreds of other
guidelines and policy frameworks released by tech companies,
industry associations and other government agencies over the
past few years.
Like the others, the White House version suggests numerous
practices that developers and users of AI software should
voluntarily follow to prevent the technology from unfairly
disadvantaging people.
In some cases, algorithms for administering healthcare have not
prioritized the needs of Black patients, and facial recognition
has been deployed for policing in schools despite its potential
to underperform on darker skin tones.
"These technologies are causing real harms in the lives of
Americans, harms that run counter to our core democratic values,
including the fundamental right to privacy, freedom from
discrimination and our basic dignity," a senior administration
official told reporters.
Some companies have put ethical safeguards into practice.
But administration officials said they were concerned that new
problems with AI are emerging.
The Biden administration's move comes as the European Union
advances toward regulating high-risk AI systems, while the
United States remains far from enacting a comprehensive law to
regulate AI.
Tuesday's announcement did not include proposals for new laws.
Instead, officials said individual regulators including the
Federal Trade Commission would continue to apply existing rules
to cutting-edge systems.
The White House's proposal says everyone in America should be
protected from unsafe or ineffective systems, discrimination by
algorithms and abusive data collection, and should be entitled
to notice and explanation concerning the AI programs they
encounter.
The Bill of Rights also asks companies, government agencies and
others adopting AI to conduct significant testing and oversight
and to publicize the results, so all stakeholders can understand
what a "reasonable starting point" is for action on these
issues, senior administration officials said.
(Reporting by Paresh Dave and Nandita Bose; Editing by Leslie
Adler)