A Woman In China Claims That Her iPhone X Was Unlocked By A Coworker’s Face, & It’s Raising Questions About Diversity In Tech


A woman in the city of Nanjing, China, claimed that her colleague was able to unlock her iPhone X using Face ID, according to the South China Morning Post. The woman, identified only by her surname, Yan, reportedly returned the iPhone X after her colleague unlocked it with the Face ID feature. She was given a replacement iPhone X, and the same thing allegedly happened again, she told the Jiangsu Broadcasting Corporation. An Apple spokesperson tells Bustle that the company can't confirm the details of the original story, but notes that there is roughly a 1 in 1,000,000 chance that a random person from the population could unlock someone else's iPhone X using Face ID, compared with a 1 in 50,000 chance for Touch ID.
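Taken at face value, those figures mean Touch ID is 20 times more likely than Face ID to accept a random stranger. A quick back-of-the-envelope check (the two rates are Apple's; the arithmetic and the sample size of 100 strangers are ours, purely for illustration):

```python
# Apple's cited false-match rates for the two unlock methods.
face_id_false_match = 1 / 1_000_000
touch_id_false_match = 1 / 50_000

# How much more likely Touch ID is to match a random stranger.
ratio = touch_id_false_match / face_id_false_match
print(f"Touch ID is {ratio:.0f}x more likely to match a random stranger")

# Probability that at least one of n random strangers could unlock
# a given phone via Face ID (assuming independent attempts).
n = 100
p_at_least_one = 1 - (1 - face_id_false_match) ** n
print(f"Chance at least 1 of {n} strangers matches: {p_at_least_one:.6f}")
```

Which makes the odds of what Yan describes happening twice in a row by pure chance vanishingly small, if Apple's numbers hold.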

An Apple white paper states, "The probability of a false match is different for twins and siblings that look like you as well as among children under the age of 13, because their distinct facial features may not have fully developed." Additionally, the Apple spokesperson tells Bustle that Face ID continues to learn as it's used, becoming able to recognize subtle changes in appearance such as facial hair or makeup. Despite these acknowledged possibilities for a false match, the story is raising concerns on the internet about the importance of diversity in tech.

Some people on Twitter are suggesting that Face ID's recognizing both Chinese women's faces has to do with bias on the part of the tool's programmers. In a Twitter thread that has since gone viral, with over 20,000 likes, TC Ivy, CEO of the marketing company V3 Inbound, said, "Devices can't be biased, but if the creators don't account for their own biases it shows up in things like Asian women being indistinguishable to iPhones and black hands not triggering sensors in soap machines."

Despite this incident, facial recognition has been used successfully in China for the past few years, according to the South China Morning Post, which reported that both China Merchants Bank and the Agricultural Bank of China have used the technology in their ATMs since 2015 without any reported problems.

According to the Post, "Chinese authorities have used the technology to identify law breakers and jaywalkers, while companies have made 'face swiping' part of their retailing and services, from paying for ice cream at KFC outlets to boarding flights."

It's no secret that lack of diversity in tech is an ongoing issue. "A new Government Accountability Office (GAO) report details a lack of racial diversity among technology firms," Ali Breland wrote for The Hill. "The report also noted that female, black and Hispanic individuals make up a smaller proportion of the technology workforce than they do the U.S. workforce at large."

The iPhone X has drawn other criticism from parents who say their children have been able to unlock their phones using the facial-recognition feature. According to WIRED, a 10-year-old boy was able to unlock his mother's phone using Face ID. "It was funny at first," Attaullah Malik, the boy's father, told WIRED. "But it wasn't really funny afterward. My wife and I text all the time and there might be something we don’t want him to see. Now my wife has to delete her texts when there's something she doesn’t want [their son] to look at."


The Atlantic reported that some facial-recognition features across different platforms, not just the iPhone, sometimes fail to identify people of color or to account for varying skin tones, which could be one of the reasons for facial-recognition failure. "Research suggests that the improving accuracy rates are not distributed equally. To the contrary, many algorithms display troubling differences in accuracy across race, gender, and other demographics," Clare Garvie and Jonathan Frankle reported for The Atlantic in 2016. "A 2011 study, co-authored by one of the organizers of [the National Institute of Standards and Technology's] vendor tests, found that algorithms developed in China, Japan, and South Korea recognized East Asian faces far more readily than Caucasians. The reverse was true for algorithms developed in France, Germany, and the United States, which were significantly better at recognizing Caucasian facial characteristics."

Additionally, Recode reported earlier this year that during an artificial intelligence panel at the World Economic Forum Annual Meeting, MIT Media Lab director Joichi Ito said facial-recognition failures likely stem from the fact that many engineers are white, and the people used to test facial-recognition technology are also white. "A lack of diversity in the training set leads to an inability to easily characterize faces that do not fit the normal face derived from the training set," graduate researcher Joy Buolamwini told Recode. To cope with those limitations in one project involving facial-recognition technology, she said, she had to wear a white mask so that her face could "be detected in a variety of lighting conditions."
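Buolamwini's point about training sets can be shown with a deliberately simplified simulation. Nothing below reflects how Face ID or any real system works; the synthetic data, the two groups, and the toy "detector" are all made up to illustrate one mechanism: a model fit mostly to one group can perform far worse on another.

```python
# Illustrative only: how an unbalanced training set yields unequal error rates.
import random

random.seed(0)

def sample(center, n, spread=1.0):
    """Draw n synthetic 2-D 'face feature' points around a group's center."""
    return [(random.gauss(center[0], spread), random.gauss(center[1], spread))
            for _ in range(n)]

# Two groups whose (made-up) feature distributions differ.
group_a = sample(center=(0.0, 0.0), n=1000)
group_b = sample(center=(4.0, 4.0), n=1000)

# The training set is 95% group A, 5% group B -- the imbalance in question.
train = group_a[:950] + group_b[:50]

# The toy "model" learns only the mean of its training data and accepts
# any point within a fixed distance of that mean as "a face".
cx = sum(p[0] for p in train) / len(train)
cy = sum(p[1] for p in train) / len(train)
THRESHOLD = 3.0

def detected(point):
    return ((point[0] - cx) ** 2 + (point[1] - cy) ** 2) ** 0.5 < THRESHOLD

rate_a = sum(detected(p) for p in group_a) / len(group_a)
rate_b = sum(detected(p) for p in group_b) / len(group_b)
print(f"Detection rate, group A: {rate_a:.0%}")
print(f"Detection rate, group B: {rate_b:.0%}")
```

Run it and group A is detected almost every time while group B rarely is, even though the model saw some examples of both: the 5% minority barely moves the learned "normal face." Real systems are vastly more complex, but the direction of the effect is the same one Buolamwini describes.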

Recode reported that the flaws in facial-recognition technology are likely due in part to the pre-written code developers share so they don't have to create everything from scratch. "While this is a temporary solution, we can do better than asking people to change themselves to fit our code," Buolamwini said. "Our task is to create code that can work for people of all types."

Diversity in tech is key to ensuring that everyone is equally represented when developing new technology like facial recognition. This latest controversy surrounding Face ID is bringing up crucial questions about why inclusion is not just a suggestion, but a necessity.