September 20th, 2024

Federal civil rights watchdog sounds alarm over Feds' use of facial recognition

The U.S. Commission on Civil Rights criticized federal agencies' use of facial recognition technology, citing a lack of standardization and oversight and the risk of civil rights violations, particularly for women and people of color.

The U.S. Commission on Civil Rights (USCCR) has raised significant concerns about the use of facial recognition technology (FRT) by the Department of Justice (DOJ), the Department of Homeland Security (DHS), and the Department of Housing and Urban Development (HUD). A report released by the commission highlights a lack of standardization, transparency, and oversight, which can lead to wrongful arrests and discrimination, particularly against women and people of color. While DOJ and DHS have adopted interim policies for FRT use, HUD has no formal guidelines. The report also notes the absence of public databases tracking FRT usage, which makes potential civil rights violations difficult to monitor. Customs and Border Protection deploys FRT at various locations, while HUD uses it in public housing without adequately tracking its impact. The report recommends that federal agencies disclose their FRT usage, train personnel, and ensure that FRT is never the sole basis for an arrest. It also urges Congress to direct the National Institute of Standards and Technology (NIST) to evaluate FRT systems and report on their accuracy across demographic groups. The report underscores the need for greater accountability and oversight as emerging technologies are deployed in public services.

- USCCR report criticizes the use of facial recognition technology by federal agencies.

- Lack of standardization and oversight raises concerns about civil rights violations.

- Women and people of color are disproportionately affected by misidentification.

- Recommendations include better training, transparency, and public accountability for FRT use.

- Congress is urged to direct NIST to evaluate FRT systems and report their accuracy across demographic groups (a minimal sketch of such a per-group evaluation follows).
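
For a sense of what the recommended per-demographic accuracy reporting involves, the sketch below tallies a false match rate (FMR) and false non-match rate (FNMR) separately for each group at a fixed similarity threshold. The data, group labels, and threshold are invented for illustration only; they are not drawn from the report or from NIST's FRVT benchmark.

```python
# Minimal sketch: per-group false match rate (FMR) and false non-match rate (FNMR)
# at a fixed similarity threshold. All data, group labels, and the threshold are
# hypothetical; a real FRVT-style evaluation uses millions of comparisons.
from collections import defaultdict

# Each comparison: (demographic_group, is_same_person, similarity_score)
comparisons = [
    ("group_a", True, 0.91), ("group_a", True, 0.62), ("group_a", False, 0.40),
    ("group_a", False, 0.78), ("group_b", True, 0.88), ("group_b", True, 0.55),
    ("group_b", False, 0.35), ("group_b", False, 0.82),
]
THRESHOLD = 0.70  # score >= threshold means the system declares a match

stats = defaultdict(lambda: {"fm": 0, "impostor": 0, "fnm": 0, "genuine": 0})
for group, same_person, score in comparisons:
    s = stats[group]
    if same_person:
        s["genuine"] += 1
        if score < THRESHOLD:   # genuine pair wrongly rejected
            s["fnm"] += 1
    else:
        s["impostor"] += 1
        if score >= THRESHOLD:  # different people wrongly accepted as a match
            s["fm"] += 1

for group, s in sorted(stats.items()):
    fmr = s["fm"] / s["impostor"]
    fnmr = s["fnm"] / s["genuine"]
    print(f"{group}: FMR={fmr:.2f}, FNMR={fnmr:.2f}")
```

A gap in FMR or FNMR between groups at the same threshold is the kind of demographic differential the commission wants reported publicly.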

Related

When Facial Recognition Helps Police Target Black Faces

Karl Ricanek, an AI engineer, reflects on the moral implications of facial recognition technology. His work evolved from US Navy projects to commercial use despite his early awareness of the technology's biases, and real-world misidentifications underscore the need for ethical consideration.

You can opt out of airport face scans

Passengers can opt out of airport facial recognition for US domestic flights by following specific steps. Difficulties encountered during opt-out prompted the Algorithmic Justice League's "Freedom Flyers" campaign. Concerns include data security, bias, and the normalization of surveillance; opting out is presented as a way to advocate for biometric rights.

DHS plans to collect biometric data from migrant children "down to the infant"

The U.S. DHS plans to collect facial images of migrant children to improve facial recognition technology, raising concerns about privacy, consent, and ethics; the plan's implementation status remains unclear even though funding has been confirmed.

Americans Are Uncomfortable with Automated Decision-Making

A Consumer Reports survey shows 72% of Americans are uncomfortable with AI in job interviews, and 66% with its use in banking and housing, highlighting concerns over transparency and data accuracy.

Access your brain? The creepy race to read workers' minds (2023)

The use of neurotechnology and AI in hiring raises ethical concerns, potentially increasing racial disparities and disadvantaging neurodiverse candidates. Regulations are needed to protect mental privacy and ensure informed consent.

AI: What people are saying
The comments reflect a range of opinions on the use of facial recognition technology and its implications for privacy and civil rights.
  • Concerns about mass surveillance and the legality of government use of facial recognition technology are prevalent.
  • There is skepticism regarding the effectiveness of regulatory bodies like the U.S. Commission on Civil Rights.
  • Many commenters express alarm over the normalization of facial recognition in public spaces, such as airports.
  • Debates arise over the balance between privacy rights and the potential benefits of facial recognition technology.
  • Some argue that if government use is restricted, private companies will exploit facial recognition instead.
10 comments
By @AlbertCory - 7 months
The issue with governments implementing mass warrantless surveillance is not training or standards, NIST or otherwise.

It should be straight-out illegal. Governments do not have "free speech" rights.

As for private citizens doing it: I think there are already sufficient laws about recording people without their consent. You can hire someone to stand on a corner and watch for a specific person to walk by, but a law prohibiting you from recording everyone who walks by is most likely going to withstand a court review. It's a question of scale.

By @blackeyeblitzar - 7 months
I was appalled to see TSA facial recognition scanners at airports recently, where instead of checking your ID and boarding pass they scan your face. Almost everyone simply accepted the new process instead of opting out. I’m not sure how the eventual forced violation of biometrics can be stopped when most people don’t care.
By @Animats - 7 months
The article mentions NIST standards for face recognition. NIST evaluates face matching systems.[1] They've gotten considerably better in the last decade, as would be expected.

[1] https://pages.nist.gov/frvt/html/frvt11.html

By @doctorpangloss - 7 months
Facial recognition, and the lack of regulation of it: a victim of conflating privacy in the sense of limiting government powers and privacy in the sense of whether or not a piece of data is sensitive or embarrassing.
By @chrismeller - 7 months
> Civil rights watchdog

Well, not quite. They keep getting money every year, but officially their mandate expired in the '90s and has never been renewed. As crazy as that is by itself, they certainly don't have the authority to do anything.

By @samarthr1 - 7 months
Genuine question: are the ethics of "anonymous facial recognition", where the system keeps track of a particular face but does not remember the face and does not correlate it to an identity, comparable to a social-credit-like system?
By @fsndz - 7 months
Soon enough we will have the Machine, like in POI.
By @luxuryballs - 7 months
Ah yes, the watchdog alarm, a.k.a. the "nothing will be done, but now we feel like someone other than us is working on it" alarm (manufactured-consent alarm?).
By @beaglesss - 7 months
You have no right to privacy in public. If you make this illegal, private companies will do it instead, and the First Amendment makes that impossible to stop.