According to an annual report released by U.S. Customs and Border Protection (CBP), more than 23 million people were “processed” using facial recognition technology at entry, exit and preclearance locations (airports, seaports, pedestrian crossings and the like) in 2020, up 21% from 2019. Despite the increase, however, the system caught no imposters traveling through airports and fewer than 100 pedestrian imposters, OneZero’s Dave Gershgorn reported.
At face value, this seems like a good thing: people are not trying to illicitly sneak through airports posing as other people, and now we’ve got the technology to A) prove it and B) dissuade future perpetrators. But in actuality, studies have shown that facial recognition is markedly less accurate for certain demographics, namely women and people with darker skin tones, which raises a fundamental question about its use: How can authorities justify implementing a potentially biased technology if it’s not even helping them identify any criminals?
In 2019, The New York Times reported that, according to a study by the National Institute of Standards and Technology, facial recognition systems falsely identified African-American and Asian faces 10 to 100 times more frequently than Caucasian faces. The systems also misidentified women more often than men, and older adults more often than younger adults.
“The program’s implementation has been met with skepticism from the Government Accountability Office (GAO),” Gershgorn wrote. “In late 2020, the oversight organization lambasted CBP over lackluster accuracy audits, poor signage notifying the public the technology is being used, and little information offered to the public on how its systems worked.”
For its part, CBP has maintained that the system has shown “virtually no measurable differential performance in results based on demographic factors,” but it has thus far failed to provide sufficient evidence to substantiate those claims, according to Gershgorn, who also pointed out that CBP suffered a data breach in 2019, when a rogue contractor downloaded nearly 200,000 facial-recognition images and leaked some of them into the public domain.
Over the past two years, facial recognition has helped capture just seven imposters traveling through U.S. airports and 285 at land crossings, numbers that make it difficult not to question the program’s application and overall effectiveness. CBP is adamant that the technology will be a critical component of restoring the travel industry to its pre-pandemic glory, though it is not immediately clear how or why that would be true.
The post Facial Recognition Technology at Airports Isn’t Even Working appeared first on InsideHook.