London Underground’s facial recognition system shows bias against Black people


How London Underground used facial recognition during the King’s coronation, and why it is dangerous.

At a meeting of the UK Parliament’s Science, Innovation and Technology Committee, it was stated that the facial recognition system used by the London Underground during the King’s coronation can show racial bias at certain thresholds. The coronation of King Charles III took place in London on May 6, 2023.

Dr. Tony Mansfield, principal scientist at the National Physical Laboratory, told MPs that the NEC-based system used by the country’s largest police force tends to be biased against Black people on the test data created for his research.

He found that when the system is run at low matching thresholds, it begins to show bias against Black men and women alike. Mansfield believes the Underground does not operate the system at such thresholds.

According to his report, false positive identifications increase at low match thresholds (0.58 and 0.56) and “begin to show a statistically significant imbalance between demographic groups: blacks have more false positives than Asians or whites.”

At higher thresholds (0.64 and above), however, the NEC Neoface system produced no false positive identifications and therefore showed no bias. At a medium threshold (0.60–0.62), the system produced 8 false positives but no statistically significant demographic bias.
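The threshold behaviour described above can be illustrated with a short sketch. The scores and helper below are entirely invented for illustration and are not NEC Neoface data or its actual algorithm; they only show how raising a similarity threshold reduces false positive matches against a watchlist.

```python
# Hypothetical illustration: how a match threshold trades off false positives.
# All similarity scores below are invented, not real Neoface output.

def count_false_positives(comparisons, threshold):
    """Count pairs that are NOT the same person but whose similarity
    score still meets or exceeds the threshold (false positives)."""
    return sum(
        1
        for is_same_person, score in comparisons
        if not is_same_person and score >= threshold
    )

# (is_same_person, similarity) pairs for imagined probe/watchlist comparisons
comparisons = [
    (True, 0.91), (True, 0.72), (False, 0.59), (False, 0.61),
    (False, 0.57), (False, 0.63), (True, 0.80), (False, 0.50),
]

# Low, medium, and high thresholds, loosely mirroring the report's ranges
for t in (0.56, 0.60, 0.64):
    print(f"threshold {t}: {count_false_positives(comparisons, t)} false positives")
```

With these made-up scores, the low threshold flags four non-matches, the medium threshold two, and the high threshold none, mirroring the pattern in the report: fewer false positives (and hence less room for demographic imbalance among them) as the threshold rises.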

A spokesperson for the Underground said the system has been used three times since the study was published, including in parts of London and during the King’s coronation.

Police operations involving six police vans resulted in four alerts and no false positives. There were two arrests; for the remaining alerts, police decided an arrest was not required, although the identifications were correct.

For each operation, the police created a separate location-based “watch list” and an “intelligence file” for the operation. After each operation, the watchlist was deleted. No data from the real-time facial recognition operations was stored, according to the Underground spokesperson.

Source: www.securitylab.ru