US Proposes Tougher AI Guidelines To Prevent Exploitation, Discrimination And Abuse

The White House proposed a non-binding Artificial Intelligence (AI) Bill of Rights intended to help parents, patients, and workers avert harm from the increasing use of automation in education, health care, and employment.

  • The White House version suggests numerous practices that AI software developers and users should voluntarily follow to prevent the technology from unfairly exploiting people.

  • In some cases, algorithms for administering health care did not prioritize the needs of Black patients. Facial recognition has also been deployed for policing in schools despite its potential to underperform on darker skin tones, Reuters reports.

  • A September study found that pulse oximeter errors could have delayed COVID-19 treatment for Black patients by 4.5 hours.

  • In August, Meta Platforms Inc (NASDAQ: META) fired 60 randomly selected employees using an algorithm.

  • A senior official emphasized that these technologies have caused actual harm in the lives of Americans, threatening fundamental rights including privacy, freedom from discrimination, and basic dignity.

  • The officials said individual regulators, including the Federal Trade Commission, would continue to apply existing rules to cutting-edge systems.

  • The proposal sought to protect people from unsafe or ineffective systems, algorithmic discrimination, and abusive data collection. It also said people should be entitled to notice and an explanation of the AI programs they encounter.

  • The proposal also asked companies, government agencies, and others adopting AI to conduct rigorous testing and oversight and to publicize the results.

Price Action: META shares traded higher by 1.9% at $141.16 in the premarket on the last check Tuesday.


© 2022 Benzinga does not provide investment advice. All rights reserved.