
How to Spot Synthetic Faces Online — the Clue Is in the Eyes

Fake faces created by neural networks do not have realistic pupils, say computer scientists.

(Credit: Oleg Gekman/Shutterstock)


Computer-generated faces have recently become so good that they are hard to distinguish from the real thing. That makes them a useful tool for malicious operators on the internet who can use them, for example, to create fake profiles for nefarious social media accounts.

So computer scientists have been looking for ways to spot these images quickly and easily. Now Hui Guo at the State University of New York and colleagues have found a way to expose fake faces. Their weakness is their eyes, they say.

The technology behind synthetic face generation is a form of deep learning based on generative adversarial networks. The approach is to feed images of real faces into a neural network and then ask it to generate faces of its own. These faces are then tested against another neural network which tries to spot the fakes, so that the first network can learn from its mistakes.

The back and forth between these “adversarial networks” quickly improves the output to the point where the synthetic faces are hard to distinguish from real ones.
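
To make that back and forth concrete, here is a minimal, hedged sketch of adversarial training in PyTorch. It uses tiny fully connected networks on random data rather than the large convolutional face generators (such as StyleGAN) behind photorealistic faces; the layer sizes, learning rate and batch size are illustrative assumptions, not the setup used to create the images studied here.

```python
# Minimal adversarial-training sketch (assumed PyTorch; toy sizes, random data).
import torch
import torch.nn as nn

latent_dim, img_dim, batch = 64, 32 * 32, 16   # toy dimensions, not face resolution

# Generator: turns random noise into a flat "image" vector.
generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, img_dim), nn.Tanh(),
)
# Discriminator: outputs the probability that its input is a real image.
discriminator = nn.Sequential(
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

for step in range(100):
    real = torch.rand(batch, img_dim) * 2 - 1   # stand-in for a batch of real faces
    fake = generator(torch.randn(batch, latent_dim))

    # Discriminator learns to label real images 1 and generated images 0.
    d_loss = bce(discriminator(real), torch.ones(batch, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(batch, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator learns to make images the discriminator mistakes for real.
    g_loss = bce(discriminator(fake), torch.ones(batch, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```

The two networks improve against each other; the same loop, with convolutional models and real face photos in place of the toys above, underlies photorealistic face generators.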

Spot the fake (Source: arxiv.org/abs/2109.00162)

But they are not perfect. For example, generative adversarial networks have trouble accurately reproducing facial accessories such as earrings and glasses, which are often different on each side of the face. But the faces themselves seem realistic, making it hard to spot them reliably.

Now Guo and co say they have found a flaw: generative adversarial networks do not produce faces with regular pupils (shapes that are circular or elliptical), and this provides a way to expose them.

Guo and co developed software that extracts the shape of the pupil from facial images and then used it to analyze 1000 images of real faces and 1000 synthetically generated faces. Each image was scored according to the regularity of the pupils.
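
The scoring idea is to measure how closely the segmented pupil matches a fitted ellipse. As a rough, hedged approximation of that step, assuming a binary pupil mask has already been extracted and using plain mask-versus-ellipse overlap rather than whatever exact measure the authors use, the idea might look something like this in Python with OpenCV:

```python
# Hedged sketch of pupil-regularity scoring: fit an ellipse to a binary pupil
# mask and score how well the mask matches it (intersection over union).
# The mask extraction itself (a segmentation model in the paper) is assumed.
import cv2
import numpy as np

def pupil_regularity(mask: np.ndarray) -> float:
    """mask: uint8 binary image where pupil pixels are 255."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return 0.0
    contour = max(contours, key=cv2.contourArea)
    if len(contour) < 5:                      # cv2.fitEllipse needs >= 5 points
        return 0.0
    ellipse = cv2.fitEllipse(contour)

    # Render the fitted ellipse and compare it with the actual pupil mask.
    ellipse_mask = np.zeros_like(mask)
    cv2.ellipse(ellipse_mask, ellipse, 255, thickness=-1)
    inter = np.logical_and(mask > 0, ellipse_mask > 0).sum()
    union = np.logical_or(mask > 0, ellipse_mask > 0).sum()
    return float(inter) / float(union) if union else 0.0

# Toy usage: a clean circular pupil scores near 1.0; a ragged one scores lower.
clean = np.zeros((64, 64), np.uint8)
cv2.circle(clean, (32, 32), 12, 255, -1)
print(pupil_regularity(clean))
```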

Eye Spy

“Real human pupils have strong elliptical shapes,” say the team. “However, the artifacts of irregular pupil shapes lead to significantly lower scores.”

This is the result of the way that generative adversarial networks work, with no inherent knowledge of the structure of human faces. “This phenomenon is caused by the lack of physiological constraints in the GAN models,” say Guo and co.

That’s an interesting result that provides a quick and easy way to spot a synthetic face—provided the pupils are visible. “With this cue, a human can visually find whether the face is real or not easily,” say the researchers. Indeed, it would be straightforward to create a program to do the job.

But this immediately suggests a way for malicious operators to beat such a test: all they have to do is circularize the pupils in the synthetic faces they create, a trivial task.
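
For illustration only, here is a hedged sketch of that counter-move, assuming the adversary already has a binary pupil mask for the generated face: fit an ellipse to the ragged pupil and paint the clean shape back over it. The function name and the flat mean-colour fill are illustrative choices, not anything described in the paper.

```python
# Hypothetical counter-move: replace an irregular pupil with a fitted ellipse.
# Assumes OpenCV and a pre-computed binary pupil mask.
import cv2
import numpy as np

def circularize_pupil(image: np.ndarray, pupil_mask: np.ndarray) -> np.ndarray:
    """image: BGR face crop; pupil_mask: uint8 mask with pupil pixels set to 255."""
    contours, _ = cv2.findContours(pupil_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return image
    contour = max(contours, key=cv2.contourArea)
    if len(contour) < 5:                       # cv2.fitEllipse needs >= 5 points
        return image
    ellipse = cv2.fitEllipse(contour)

    # Paint a smooth elliptical pupil over the ragged one, using its average colour.
    fixed = image.copy()
    pupil_color = cv2.mean(image, mask=pupil_mask)[:3]
    cv2.ellipse(fixed, ellipse, pupil_color, thickness=-1)
    return fixed
```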

And therein lies the challenge in the cat-and-mouse game between the creators of fake images and those who try to spot them. This battle is far from over.

Ref: Eyes Tell All: Irregular Pupil Shapes Reveal GAN-Generated Faces: arxiv.org/abs/2109.00162
