Home Office admits it knew passport photo checker had issues with dark skin
10 October 2019, 12:44 | Updated: 10 October 2019, 12:46
The system was deployed even though the Home Office knew it struggled with very light or very dark skin tones.
An online passport photo checking system was launched by the Home Office despite it being aware that the technology struggled with very light or very dark skin tones, it has emerged.
The development comes after the PA news agency revealed how a black man had issues when trying to renew his passport because facial recognition technology falsely flagged his lips as an open mouth.
Joshua Bada, a 28-year-old from west London, was forced to explain to the system, “My mouth is closed, I just have big lips” in order to proceed with the image.
The automated facial detection system tells people when it thinks an uploaded photo may not meet strict requirements, which include a neutral expression and a closed mouth, though users can override the outcome if they believe the system is wrong.
A freedom of information (FOI) request reported by New Scientist has since shown that the Home Office knew the facial recognition technology failed for some ethnic groups when testing was carried out before the system went live in 2016, but decided it worked well enough to be deployed.
“User research was carried out with a wide range of ethnic groups and did identify that people with very light or very dark skin found it difficult to provide an acceptable passport photograph, however, the overall performance was judged sufficient to deploy,” the department said in its response to the request, submitted by Sam Smith, of campaign group MedConfidential.
“We are constantly gathering customer feedback and carrying out further user testing to enable us to work alongside our supplier to keep refining the algorithm.”
A Home Office spokeswoman said: “We are determined to make the experience of uploading a digital photograph as simple as possible, and will continue working to improve this process for all of our customers.”
Experts believe the problem could be a result of algorithmic bias, meaning the data fed into the system may not have been large or diverse enough to represent the full range of users.
Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield, said the Home Office should “hang its head in shame”.
“Inequality, inequality, inequality needs to be stamped on when it raises its ugly head,” he said.