For years, researchers have developed ways to bypass biometric scanners and impersonate other people. Now, there’s a new technique: disguising yourself as Elvis Costello to fool a facial recognition system.
Late last month, researchers at Carnegie Mellon unveiled a new attack on facial recognition systems that involves glasses with ‘geek’ frames. It used patterned paper costing 22 cents to print, which could be stuck to the rims of existing eyeglasses to throw the computers off. The patterns didn’t resemble other people’s eyebrows and noses, because that might attract unwanted attention when you’re trying to enter a top secret government facility. Instead, they used seemingly random patterns designed to confuse a particular kind of technology used for facial recognition: neural networks.
Neural networks are a form of artificial intelligence that learn to make decisions based on lots of historical data. Show one a picture of Clive Owen, and then a thousand pictures of someone who isn’t Clive Owen, and it will happily recognize him, if not comment on his excellent acting in dystopian sci-fi thriller Children of Men.
It turns out that you can often fool these networks by changing just a little of the input data. In this case, overlaying the area around the eyes with carefully crafted patterns disrupts the artificial intelligence algorithm’s ability to recognize faces.
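The core idea can be sketched with a deliberately tiny stand-in for a face recognizer. This is not the researchers’ actual system: the ‘model’ here is a four-weight logistic regression, and the weights and feature values are invented for illustration. What it shows is the principle: small, targeted nudges to the input, steered by the model’s own weights, collapse a confident recognition.

```python
import math

# Toy stand-in for a face recognizer: logistic regression over a
# 4-number feature vector (think crude measures of eye spacing,
# nose-bridge width, etc. -- weights and features are invented).
WEIGHTS = [1.5, -2.0, 1.0, 0.5]
BIAS = -0.2

def is_target(features):
    """The model's confidence that these features belong to the target person."""
    score = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return 1.0 / (1.0 + math.exp(-score))

def sign(v):
    return 1.0 if v > 0 else -1.0 if v < 0 else 0.0

# A face the model confidently recognizes as the target.
face = [1.0, -1.0, 1.0, 0.5]
print(is_target(face))    # high confidence, close to 1.0

# Fast-gradient-style perturbation: nudge each feature slightly in the
# direction that lowers the score. For this model that direction is just
# -sign(w) for each weight w. The glasses attack applies the same idea,
# but restricts the changed "features" to the pixels under the frames.
EPS = 1.0
dodged = [x - EPS * sign(w) for w, x in zip(WEIGHTS, face)]
print(is_target(dodged))  # confidence collapses below 0.5
```

A real attack does the same search over image pixels using the network’s gradients, but the lever is identical: the model’s own arithmetic tells you which small changes hurt it most.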
Dodging and impersonating
The team showed how this could be used in two ways. The first involved ‘dodging’, where the glasses prevented a legitimate target from being recognized. Generating false negatives like this would be useful if you wanted to avoid being identified by facial scanners in a crowd, say.
The second involved outright impersonation. In their tests, the researchers used printed eyeglass rims to successfully impersonate not only the suave male English star of The International, but also Milla Jovovich and Carson Daly.
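Impersonation is the targeted flavour of the same trick. In the toy logistic-regression setting below (again, weights and feature vectors invented for illustration, not the researchers’ system), instead of pushing the score down to dodge, the attacker pushes an impostor’s features in the direction that raises the target’s score:

```python
import math

# Toy logistic-regression recognizer standing in for a real face-matching
# network; weights and feature vectors are invented for illustration.
WEIGHTS = [1.5, -2.0, 1.0, 0.5]
BIAS = -0.2

def is_target(features):
    score = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return 1.0 / (1.0 + math.exp(-score))

def sign(v):
    return 1.0 if v > 0 else -1.0 if v < 0 else 0.0

# An impostor the model correctly rejects.
impostor = [-0.5, 0.5, -0.5, 0.0]
print(is_target(impostor) < 0.5)   # True -- rejected

# Targeted ("impersonation") step: move each feature in the direction
# that *raises* the target's score, i.e. +sign(w) per weight.
EPS = 1.5
disguised = [x + EPS * sign(w) for w, x in zip(WEIGHTS, impostor)]
print(is_target(disguised) > 0.5)  # True -- accepted as the target
```

Dodging and impersonation are thus the same optimization run in opposite directions: one minimizes the match score for your own identity, the other maximizes it for someone else’s.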
These tests ran against neural networks developed by the research team, but also against commercial facial recognition system Face++, where it worked 100% of the time (although this test used digital images rather than live people wearing the glasses – your mileage may vary).
This isn’t the first successful attack on biometrics. Check out this video from the Chaos Computer Club (translated from German) which shows successful attacks against Apple’s Touch ID fingerprint sensor, and against facial recognition systems designed to test for live subjects rather than photographs by watching for blinks. It’s an hour long, but worth watching all the way through.
Neither is it the first research that enables people to subvert biometric systems while also making a bold fashion statement. Artist and designer Adam Harvey developed a unique way to cope with this threat. His solution, CV Dazzle, is a makeup system inspired by the dazzle painting used to obscure the movement and direction of ships during the world wars. It uses large splotches of colour and avant-garde hairstyling designed to break up the key features that facial recognition systems use to identify you, such as the bridge of your nose and your eyes.
This will help you to dodge specific facial recognition systems that Harvey calls out on his web page, such as the detection that Apple uses in its iPhoto software. This is great for protecting you from systems that want to track your visage. Unfortunately, it also leaves you looking like a refugee from an eighties New Romantic album cover.
What does this tell us about biometrics? Using ‘something you are’ to verify your identity can increase security compared to the use of passwords, say, but it also means that if someone figures out how to compromise a system by impersonating your unique biometric data, you can’t change it very easily.
How can we solve that problem? First, biometrics vendors will keep developing more robust ways of scanning a user’s features. 3D imaging and more complex facial pattern analysis can both help to thwart attacks.
Second, using multiple forms of verification raises the bar for attackers. Maybe they can simulate your face, but have they also reproduced your fingerprint to fool your laptop’s built-in reader, or grabbed that two-factor authentication token you’re also using? As the battle to protect identities goes on, the more weapons you have at your disposal, the better.