How well can algorithms recognize your masked face?
There’s a scramble to adapt to a world where people routinely cover their faces.
Facial-recognition algorithms from Los Angeles startup TrueFace are good enough that the US Air Force uses them to speed security checks at base entrances. But CEO Shaun Moore says he’s facing a new question: How good is TrueFace’s technology when people are wearing face masks?
“It’s something we don’t know yet because it’s not been deployed in that environment,” Moore says. His engineers are testing their technology on masked faces and are hurriedly gathering images of masked faces to tune their machine-learning algorithms for pandemic times.
Facial recognition has become more widespread and accurate in recent years, as an artificial intelligence technology called deep learning made computers much better at interpreting images. Governments and private companies use facial recognition to identify people at workplaces, schools, and airports, among other places, although some algorithms perform less well on women and people with darker skin tones. Now the facial-recognition industry is trying to adapt to a world where many people keep their faces covered to avoid spreading disease.
Facial-recognition experts say that algorithms are generally less accurate when a face is obscured, whether by an obstacle, a camera angle, or a mask, because there’s less information available to make comparisons. “When you have fewer than 100,000 people in the database, you will not feel the difference,” says Alexander Khanin, CEO and cofounder of VisionLabs, a startup based in Amsterdam. With 1 million people, he says, accuracy will be noticeably reduced and the system may need adjustment, depending on how it’s being used.