<https://www.vice.com/en/article/k78ygn/researchers-create-master-faces-to-bypass-facial-recognition>
Researchers Create 'Master Faces' to Bypass Facial Recognition
According to the paper, their findings imply that facial recognition
systems are “extremely vulnerable.”
Researchers have demonstrated a method to create "master faces":
computer-generated faces that act like master keys for facial
recognition systems, each able to impersonate several identities with
what the researchers claim is a high probability of success.
In their paper <https://arxiv.org/pdf/2108.01077.pdf>, researchers at
the Blavatnik School of Computer Science and the School of Electrical
Engineering in Tel Aviv detail how they successfully created nine
"master key" faces that together can impersonate almost half the
identities in a benchmark dataset, as tested against three leading face
recognition systems. The researchers say their results show these master
faces can successfully impersonate over 40 percent of the population in
these systems without any additional information or data about the
people being impersonated.
The paper cites previous research that demonstrated a similar method for
creating master fingerprints
<https://ieeexplore.ieee.org/document/7893784>. According to the paper,
their findings imply that facial recognition systems are “extremely
vulnerable.”
The "master key" faces tended to be older, and didn't have glasses or
facial hair.
The researchers tested their methods against three deep face recognition
systems: Dlib, FaceNet, and SphereFace. Lead author Ron Shmelkin told
Motherboard that they used these systems because they are capable of
recognizing “high-level semantic features” of the faces that are more
sophisticated than just skin color or lighting effects.
The researchers used a StyleGAN to generate candidate faces, then used
an evolutionary algorithm paired with a neural network to optimize the
candidates and predict their success. The evolutionary strategy produces
iterations, or generations, of candidates with varying success rates,
and the most promising candidates from each generation are used to train
a neural network to predict a candidate's success. That predictor, in
turn, steers the search toward candidates with a higher probability of
passing as multiple identities.
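The loop described above — generate candidates, score each one by how many enrolled identities it matches, and evolve the best scorers — can be sketched in miniature. The following toy Python sketch is an illustrative assumption, not the paper's actual setup: random vectors stand in for StyleGAN latents, and a simple distance threshold stands in for a real face recognition system's similarity check.

```python
import random

# Toy stand-ins for the paper's components: LATENT_DIM plays the role of
# a StyleGAN latent dimension, and the "gallery" of random vectors stands
# in for a set of enrolled face embeddings. All values are illustrative.
LATENT_DIM = 8
POPULATION = 30
GENERATIONS = 40

random.seed(0)
gallery = [[random.uniform(-1, 1) for _ in range(LATENT_DIM)]
           for _ in range(50)]

def coverage(candidate, threshold=2.0):
    """Fraction of gallery identities the candidate 'matches' -- a crude
    stand-in for querying a face recognition system."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return sum(dist(candidate, g) < threshold for g in gallery) / len(gallery)

def evolve():
    # Random initial generation of candidate "faces".
    pop = [[random.uniform(-1, 1) for _ in range(LATENT_DIM)]
           for _ in range(POPULATION)]
    for _ in range(GENERATIONS):
        # Select the most promising fifth of the generation...
        parents = sorted(pop, key=coverage, reverse=True)[:POPULATION // 5]
        # ...then breed the next generation by mutating the survivors.
        pop = parents + [
            [x + random.gauss(0, 0.2) for x in random.choice(parents)]
            for _ in range(POPULATION - len(parents))
        ]
    return max(pop, key=coverage)

best = evolve()
print(f"best candidate covers {coverage(best):.0%} of the toy gallery")
```

In the paper, the trained neural-network predictor replaces many of the expensive queries to the real face recognition system; in this toy version the fitness function is cheap enough to evaluate directly at every step.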
“We are interested in further exploring the possibility of using the
master faces generated by our method in order to help protect existing
facial recognition systems from such attacks,” Shmelkin told Motherboard.
The researchers even predict that their master faces could be animated
using deepfake technology to bypass liveness detection, which is used to
determine whether a biometric sample is real or fake.
The paper also notes that white males over the age of 60 in the
University of Massachusetts’ Labeled Faces in the Wild (LFW) dataset
tended to be less varied compared to younger groups, so much of that
group could be covered by a single older master face. Additionally, only
two of the nine master faces created were female, which the paper notes
matches the “much lower frequency” of female faces in the LFW dataset
(22 percent).
These findings underscore how flawed and biased facial recognition
software can be. Its continued use by law enforcement has resulted
in multiple wrongful arrests
<https://www.vice.com/en/article/xgx5gd/man-wrongfully-arrested-by-facial-recognition-tells-congress-his-story>,
even though the technology has proven easy to fool
<https://www.vice.com/en/article/m7e9qv/hackers-fool-facial-recognition-into-thinking-im-mark-zuckerberg>.