Hanwang, the facial-recognition company that has placed 2m of its cameras at entrance gates across the world, started preparing for the coronavirus in early January.
Huang Lei, the company’s chief technical officer, said that even before the new virus was widely known about, he had begun to get requests from hospitals at the centre of the outbreak in Hubei province to update its software to recognise nurses wearing masks.
“We wouldn’t wait until something explodes to act. If three or five clients ask for the same thing . . . we’ll see that as important,” said Mr Huang, adding that its cameras previously only recognised people in masks half the time, compared with 99.5 per cent accuracy for a full face image.
Since then, demand has soared, from police stations, railway stations and all the office towers that use Hanwang’s cameras to screen employees, and Mr Huang reassigned teams of people to work on the challenge.
The company now says its masked facial recognition program has reached 95 per cent accuracy in lab tests, and even claims that it is more accurate in real life, where its cameras take multiple photos of a person if the first attempt to identify them fails.
“The problem of masked facial recognition is not new, but belongs to the family of facial recognition with occlusion,” Mr Huang said, adding that his company had first encountered similar issues with people with beards in Turkey and Pakistan, as well as with northern Chinese customers wearing winter hats that covered their ears and face.
Counter-intuitively, training facial recognition algorithms to recognise masked faces involves throwing data away. A team at the University of Bradford published a study last year showing they could train a facial recognition program to accurately recognise half-faces by deleting parts of the photos they used to train the software.
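The data-deletion idea can be illustrated with a toy sketch (not the Bradford team's actual pipeline): training images have their lower halves blanked out, so the model learns to identify people from the upper face alone. The function name and the 4x4 "face" are illustrative.

```python
import numpy as np

def occlude_lower_half(image, fill=0):
    """Blank out the lower half of a face image, simulating a mask.

    A toy version of the data-deletion idea: the recogniser is
    trained on images whose lower portions have been deleted.
    """
    occluded = image.copy()
    h = image.shape[0]
    occluded[h // 2:, :] = fill  # erase everything below mid-face
    return occluded

# A fake 4x4 grayscale "face" stands in for a real photo.
face = np.arange(16, dtype=float).reshape(4, 4)
half_face = occlude_lower_half(face)
```

In practice the occluded copies would be mixed into the training set alongside the originals, so the network cannot rely on mouth and chin features.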
When a facial recognition program tries to recognise a person, it takes a photo of the person to be identified, and reduces it down to a bundle, or vector, of numbers that describes the relative positions of features on the face.
But as it is applied against a larger population, there is a greater chance of misidentifying a masked face, since there is less information to work with and there may be multiple people with similar features around the eyes and nose.
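Why database size matters can be shown with a back-of-the-envelope calculation: if each comparison against a non-matching face carries some small chance of a spurious match, the chance of at least one false match grows rapidly with the number of enrolled faces. The per-comparison error rate below is purely illustrative, not a figure from Hanwang; the database sizes are the 50,000-face office setting and 1.2bn-person ID database mentioned by Mr Huang.

```python
def false_match_chance(per_comparison_rate, database_size):
    """Probability of at least one false match when a probe face is
    compared against every enrolled face, assuming independent errors."""
    return 1 - (1 - per_comparison_rate) ** database_size

# With a hypothetical 1-in-100,000 per-comparison error rate:
office   = false_match_chance(1e-5, 50_000)         # office-sized database
national = false_match_chance(1e-5, 1_200_000_000)  # national ID scale
```

At office scale the false-match chance is already substantial (roughly 40 per cent under this toy error rate), and at national scale it is a near certainty, which is why a system tuned for 50,000 faces does not simply scale up to 1.2bn.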
In Hanwang’s case, Mr Huang said that the company’s devices were designed to work in office settings with a database of up to 50,000 employee faces. He said the system was able to use photos taken from the Chinese police’s identification card database of some 1.2bn people, but that it was not built to work on such a huge system.
When a facial recognition system has calculated its vector of facial features, it compares it to the vectors of the faces in the database, declaring a match only if the two are similar beyond a set threshold.
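The matching step can be sketched as a nearest-neighbour search over those vectors. This is a generic illustration, not Hanwang's implementation: the cosine-similarity measure, the 0.8 threshold and the three-dimensional vectors are all stand-ins (real systems use vectors with hundreds of dimensions).

```python
import numpy as np

def best_match(probe, database, threshold=0.8):
    """Compare a probe vector against enrolled face vectors using
    cosine similarity; return the best-scoring identity above the
    threshold, or None if nothing is similar enough."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    best_name, best_score = None, threshold
    for name, vec in database.items():
        score = cosine(probe, vec)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical enrolled employees and a probe photo's vector.
enrolled = {
    "alice": np.array([0.9, 0.1, 0.3]),
    "bob":   np.array([0.1, 0.8, 0.5]),
}
probe = np.array([0.85, 0.15, 0.35])
match = best_match(probe, enrolled)  # → "alice"
```

Raising the threshold cuts false matches but makes the system more likely to reject a genuine employee, which is the trade-off that masked faces, with less information in the vector, make harder.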
Hanwang’s system tries to guess what all the faces in its existing database of photographs would look like if they were masked.
To improve its guesses, Hanwang asked its 2,000 or so employees for photos of themselves wearing masks, but also created another database of 6m photos of people with artificially generated masks — with different styles to reflect the ones commonly worn in China.
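Generating such artificially masked photos can be sketched crudely as compositing a mask region onto each face image, once per mask style. This is a simplified stand-in for whatever rendering Hanwang actually used: painting a flat colour over the lower face rather than blending a realistic mask texture.

```python
import numpy as np

def add_synthetic_mask(face, mask_colour):
    """Paint a flat-coloured 'mask' over the lower half of a face image,
    a crude stand-in for compositing a realistic mask style."""
    masked = face.copy()
    h = face.shape[0]
    masked[h // 2:, :] = mask_colour
    return masked

# One face, several masked variants — one per (stand-in) mask style.
rng = np.random.default_rng(0)
face = rng.random((8, 8))
styles = [0.2, 0.5, 0.9]  # placeholders for different mask colours/styles
variants = [add_synthetic_mask(face, c) for c in styles]
```

Applied across a database of face photos, a loop like this yields many masked variants per person, which is how a few thousand real masked photos can be stretched into millions of training examples.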
Police forces around the world are trialling facial recognition. When asked about its software being used to identify protesters in Hong Kong, Mr Huang said he had heard of the need for “anti-terrorist measures” in the region.
Mr Huang said that Hanwang’s market was mostly focused on entrance gates. “The market for the situation [you describe] in Hong Kong is too small,” he said. Hong Kong-based SenseTime, which also claims to have a solution to recognise masked faces, said it had no plans to apply its technology to public spaces.
Additional reporting by Nian Liu in Beijing and Qianer Liu in Shenzhen