
A Bill in Congress Would Limit Uses of Facial Recognition


This week IBM, Amazon, and Microsoft all said they would halt sales of facial recognition to US police and called on Congress to impose rules on use of the technology.

A police reform bill introduced in the House of Representatives Monday by prominent Democrats in response to weeks of protest over racist policing practices would do just that. But some privacy advocates say its restrictions aren’t tight enough and could legitimize the way police use facial recognition today.

“We’re concerned,” says Neema Guliani, senior legislative counsel for the ACLU in Washington, DC, citing evidence that many facial recognition algorithms are less accurate on darker skin tones. “There should be a ban on use of facial recognition.” Last year several cities passed such bans, including San Francisco.

The proposed Justice in Policing Act would, among other things, tighten the definition of police misconduct and ban chokeholds like the one that killed George Floyd in Minneapolis last month. It is sponsored by senators Cory Booker (D-New Jersey) and Kamala Harris (D-California), and representatives Karen Bass (D-California) and Jerrold Nadler (D-New York). A five-page summary of the bill’s main provisions doesn’t mention that it includes what could become the first federal restrictions on facial recognition technology.

But none of the bill's facial recognition rules would directly limit what a sheriff's office or city police department could do with the technology.


One part of the bill requires that federal uniformed officers, such as FBI agents, wear bodycams and use dashcams in marked vehicles. It states that facial recognition cannot be built into these devices or used to scan bodycam video in real time, for example to spot persons of interest in a crowd. To apply facial recognition to bodycam footage, federal agents would need to secure a warrant after convincing a judge the information is “relevant to an ongoing criminal investigation.”


Another provision specifies that police departments using federal grants to buy or rent bodycams must adopt policies on the use of facial recognition on footage from the devices, including securing a judge’s approval and only deploying it in cases of “imminent threats or serious crimes.”

Jameson Spivack, a policy associate at Georgetown’s Center on Privacy and Technology, says those restrictions wouldn’t affect many of the ways facial recognition is used by US law enforcement. The technology is more commonly applied to footage from sources other than body or dash cams, such as surveillance cameras, sometimes solicited from private citizens or businesses. “If Congress passes this legislation that barely touches facial recognition at all, companies could go right back to selling to the police and not much will change,” Spivack says.

Civil rights groups that campaign on surveillance and facial recognition say that would be concerning because the technology is unreliable and expands police powers—effects that burden communities of color most of all.

“I’m very disappointed that Congress would take this sort of regulatory approach,” says Albert Fox Cahn, founder of the nonprofit Surveillance Technology Oversight Project and a fellow at NYU School of Law. “This is incredibly biased technology that will put Americans of color at higher risk of wrongful arrest than white Americans.”

IBM and Amazon didn’t respond to requests for comment on the Justice in Policing Act; Microsoft declined to comment. IBM has halted sales of facial recognition permanently. Microsoft and Amazon have paused sales to US police, but not to other customers, until federal regulation is in place, with Amazon saying its hiatus will last 12 months.


Facial recognition providers have come under growing scrutiny from researchers and privacy advocates in recent years over evidence that the technology often makes more errors on darker skin tones than lighter ones.


