Why Don't iPhones Autocorrect the Word 'Abortion'?
Why won't your iPhone autocorrect the word 'abortion'?
On Tuesday, The Daily Beast's Michael Keller published an investigation revealing which words iPhones refuse to autocorrect. The words tend to have a common thread: they're tinged with controversy. If you mistype "abortion," "rape," "ammo," or "bullet" on your iPhone, autocorrect won't kick in—and spellcheck won't even suggest their correct spellings.
Apple's abortion-aversion is nothing new. In December 2011, a blogger discovered that Siri wouldn't tell users where they could find an abortion clinic. In extreme cases in Washington, D.C., Siri actually directed them to pro-life clinics. While Apple did say that these omissions were unintentional, Keller's findings throw those claims into question. (Particularly when you consider words like "effleurage" and "osteogenesis" were given correct suggested spellings when misspelled.)
Bustle talked to Michael Keller to learn more.
Bustle: What led you to do this research?
Keller: We did a package of stories about the Roe v. Wade fortieth anniversary, mapping the geography of access to abortion services in the country. I evidently am a bad speller when it comes to the word “abortion,” and so I noticed that when I misspelled it, and I would ask for suggestions [on my iPhone], they wouldn’t show up.
So, it led me to the question, is it just that one word? Is it just my phone? I got some friends who donated their old iPhones, and we upgraded their software, factory restored their settings, and tested it. We tested the full corpus of words to see what other ones it might not accurately correct.
What was your most surprising finding?
"Abortion" stands out. "Murder," "rape," and "virginity" were interesting ones. The vast majority of the words that came up were these kind of technical, jargon words.
And you had the comparison with all those words that you would probably never use in a message, like "abiogenesis," "abomasum," "edutainment," "electrophysiology," and "lymphadenopathy," that all autocorrect. How did you gather all that data?
I started with a corpus of words. I excluded words that were shorter than three characters and filtered out dashed words and words with periods in them.
The hard part was creating the different misspellings. Essentially, I wrote a script to programmatically create 14 different misspellings of each of these words. Once we had those misspellings, I wrote an iOS app that would serve up those words on the iOS simulator.
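As a rough illustration, the corpus filtering and misspelling generation Keller describes might look like the following Python sketch. The function names and mutation rules here are my assumptions, not his actual code; his script produced 14 misspellings per word, while this toy version generates only a handful of simple one-edit variants:

```python
# Hypothetical reconstruction of the filtering and misspelling steps
# described in the interview; names and mutation rules are assumptions.

def filter_corpus(words):
    """Keep words of three or more characters, dropping dashed
    words and words containing periods."""
    return [w for w in words if len(w) >= 3
            and "-" not in w and "." not in w]

def misspellings(word):
    """Generate simple one-edit misspellings: drop a letter,
    double a letter, or swap two adjacent letters."""
    variants = set()
    for i in range(len(word)):
        variants.add(word[:i] + word[i + 1:])        # drop one letter
        variants.add(word[:i] + word[i] + word[i:])  # double one letter
    for i in range(len(word) - 1):
        # swap adjacent letters
        variants.add(word[:i] + word[i + 1] + word[i] + word[i + 2:])
    variants.discard(word)  # a swap of identical letters recreates the word
    return variants

print(filter_corpus(["abortion", "no", "e.g.", "rain-bow", "bullet"]))
# → ['abortion', 'bullet']
print(sorted(misspellings("ammo")))
```

Each misspelled variant would then be fed to the device's spellchecker to see whether the original word comes back as a suggestion.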
The application showed a misspelled version of the word. Then another application would simulate double-clicking on the word, navigating through the menus, and picking a suggestion for the word. If the word wasn't corrected at least once, it went onto a list of words to investigate and double-check with an actual iPhone. We went through that list, and on these factory-restored iPhones, we typed the words in and double-checked that the software didn't correct them.
There’s a video on the Tumblr that shows the simulated user running through the script on iOS.
What are the consequences of companies like Apple having a certain level of control over how we communicate?
For me, speaking as a journalist, I feel like it’s our job to show these things and then readers will either care or not care. We never know if people are actually going to care about what we find.
It’s similar to the idea of privatization of censorship. Obviously, they’re a private company, and they can do whatever they want. This would be a different issue if it were the government. No one is forcing us to use Apple products, and I think people would say that if people are really bothered by this type of filtering, then market forces will arise, and there will be a competitor that doesn’t do that, if there’s sufficient demand.
So, I think it’s really difficult to say what will come of it. In general, people like to know what their technology does.