"In 16 'undisclosed locations' across northern Los Angeles, digital eyes watch the public. These aren’t ordinary police-surveillance cameras; these cameras are looking at your face. Using facial-recognition software, the cameras can recognize individuals from up to 600 feet away. The faces they collect are then compared, in real-time, against 'hot lists' of people suspected of gang activity or having an open arrest warrant.
Facial-Recognition Software Might Have a Racial-Bias Problem
"In 16 'undisclosed locations' across northern Los Angeles, digital eyes watch the public. These aren’t ordinary police-surveillance cameras; these cameras are looking at your face. Using facial-recognition software, the cameras can recognize individuals from up to 600 feet away. The faces they collect are then compared, in real-time, against 'hot lists' of people suspected of gang activity or having an open arrest warrant.
Considering arrest and incarceration rates across L.A., chances are high that those hot lists disproportionately implicate African Americans. And recent research suggests that the algorithms behind facial-recognition technology may perform worse on precisely this demographic. Facial-recognition systems are more likely either to misidentify or fail to identify African Americans than other races, errors that could result in innocent citizens being marked as suspects in crimes. And though this technology is being rolled out by law enforcement across the country, little is being done to explore—or correct for—the bias."
"In 16 'undisclosed locations' across northern Los Angeles, digital eyes watch the public. These aren’t ordinary police-surveillance cameras; these cameras are looking at your face. Using facial-recognition software, the cameras can recognize individuals from up to 600 feet away. The faces they collect are then compared, in real-time, against 'hot lists' of people suspected of gang activity or having an open arrest warrant.