Life

Here's Exactly How The New iPhone X Face ID Feature Will Work

by Kaitlyn Wylde

Over the next few weeks, you're going to be hearing a lot about Face ID and the iPhone X, and you're definitely going to want to know how facial recognition works, not to mention how the technology behind it is going to change the way we use our phones, think about our security, and interact with our other apps. While it might sound like some wacky feature from the future — you know, the kind of thing we'd see in a sci-fi movie with people in space, sitting in front of their see-through computers at their spaceship desks — it's actually a technology that we've had for quite a while, one that we use every day without even knowing it.

Facial recognition is a biometric identification process that sounds a lot scarier and more scientific than it actually is. The term "biometric" refers to the process of collecting enough biological data to technically recognize an individual. The data collected to identify a person's face is called a "faceprint." It's made up of roughly 80 nodal points on the face. These points typically include the distance between the eyes, the width of the nose, the depth of the eye sockets, the shape and height of the cheekbones, and the length and shape of the jawline. Together, those facial "landmarks" create a distinctive picture of what you look like, and that information is stored and can be used in a variety of ways. And while our only reference point for that kind of technology might be detective shows and thrillers, it's actually something that's already in our phones. Yeah, it's nothing new; it's just that we're about to start using it in a whole new way.
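
If you're curious what that looks like under the hood, here's a tiny, simplified Python sketch of the faceprint idea. The landmark coordinates and the three measurements below are made up purely for illustration (this isn't Apple's algorithm, which relies on far more data and depth sensing), but it shows the basic trick of turning facial landmarks into a set of numbers that can be compared later.

import math

# Toy "faceprint": turn a handful of facial landmarks into a few measurements.
# The coordinates are invented for this example; a real system captures far
# more data points than the handful used here.

def distance(a, b):
    """Straight-line distance between two (x, y) landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def faceprint(landmarks):
    """Bundle a few landmark-to-landmark distances into a comparable tuple."""
    return (
        distance(landmarks["left_eye"], landmarks["right_eye"]),    # eye spacing
        distance(landmarks["nose_left"], landmarks["nose_right"]),  # nose width
        distance(landmarks["nose_tip"], landmarks["chin"]),         # lower-face length
    )

# Two captures of the same person should yield nearly identical measurements,
# which is what lets a system recognize a face it has seen before.
sample = {
    "left_eye": (30, 40), "right_eye": (70, 40),
    "nose_left": (45, 60), "nose_right": (55, 60),
    "nose_tip": (50, 62), "chin": (50, 95),
}
print(faceprint(sample))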

The types of facial recognition that we're used to might be much less advanced than the kind of technology that Face ID uses in the iPhone X, but it's a similar concept. You know how when you use Snapchat or Instagram Stories, the filters just somehow know how to find your face and fit you perfectly? Even when you move your head around and talk, the filter stays with you? And with some filters, opening your mouth or raising your eyebrows triggers an animation? That's because the applications are taking a biometric assessment of your face and using the data to adjust the filter to you. The biometric system knows how to find your face (the target) and hold on to it, constantly adjusting to stay locked on. Snapchat and Instagram Stories have trained their systems to recognize what the borders of lips look like, how to find the shapes of eyes, the placement of the nose, and the length of ears, and then they take that information to build a digital mask and shift it to match your face, to which they can apply the augmented reality, or filter.
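
For the technically curious, that "find the target and hold on to it" loop is something you can approximate at home. The sketch below uses OpenCV's stock face detector to re-find a face on every webcam frame and pin a box to it; it's not Snapchat's or Instagram's actual pipeline, just a simplified stand-in for the tracking step described above.

import cv2

# Load OpenCV's bundled frontal-face detector (a stand-in for the far more
# sophisticated landmark models that filter apps actually use).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

camera = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Re-detect the face on every frame so the overlay keeps following you
    # as you move your head around and talk.
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # A real filter warps a digital mask onto lip, eye, and nose landmarks;
        # here we just draw a rectangle where that mask would be anchored.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("toy face tracker", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

camera.release()
cv2.destroyAllWindows()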

Two-dimensional facial recognition has been around for even longer. Facebook stores data on individuals' faces so that once you've tagged someone, it knows how to suggest a tag for them the next time you upload a picture of that person. Plus, many phones already on the market use 2D facial recognition to unlock the screen — despite the fact that it leaves them vulnerable to spoofing, as anyone can hold a picture of the phone's owner up to the camera and unlock the system. So even though we've never used 3D facial recognition to unlock our phones or purchase apps or songs, the technology has been there, and there's no need to freak out that Apple's going all Minority Report. Basically, it's just advancing and repurposing a technology that we've mostly used for fun into something a bit more functional.