New York’s “Textalyzer” Law Is A Huge Problem For Women’s Privacy
Driving while texting is unquestionably a bad, stupid, no-good idea. The Centers for Disease Control and Prevention estimates that every day in America, nine people are killed and more than 1,000 injured in crashes involving a distracted driver. But the practice is being met with pushback from lawmakers worldwide. In the UK, being caught texting while driving means six penalty points on your license and a £200 (about $258) fine, which translates to an automatic disqualification if you've held your license for two years or less. And now a law being proposed in New York would allow police to use a "textalyzer" (like a breathalyzer, but for texting) to crack down on texting while driving. Opponents, however, say it could be a huge privacy issue, and for women in particular.
Distractions can take a lot of forms, but texting while driving is a big one:
Across the United States, laws against texting while driving are fairly uniform at the state level. Only Arizona, Missouri, and Montana lack bans that cover every driver on the road; Missouri, for instance, forbids only drivers under 21 from texting behind the wheel, a notable gap considering that 42 percent of high school students who drove reported having texted while driving. But some states are pushing for stronger consequences for people who text and drive, and the "textalyzer" is seen as a way to help police accomplish just that, critics argue, at the cost of women's privacy.
What A "Textalyzer" Law Actually Means
The law, proposed by Republican senator Terrence Murphy, seems sensible on the surface. It proposes "field testing" of phones by police officers in the event of a crash or incident, to check whether the phone was being used as the crash occurred and whether the person involved could have been distracted by it. The testing would be done via software called a "textalyzer" (named, of course, after the breathalyzer), which could, in theory, analyze what people were doing on their phones at the time of the incident and pinpoint whether there was human activity.
The impetus behind the initiative is understandable. It has been propelled by a man named Ben Lieberman, who lost his 19-year-old son in a head-on collision that was proven in court to have been caused by the driver's texting. Lieberman's efforts are designed to make that information easier to obtain, and he has partnered with the technology company Cellebrite in an effort to bring purpose-built technology into New York law. Cellebrite claims on its website that it has the capability to produce software that will help investigators in the field:
The company also says that it can "directly extract passwords, disable or bypass user locks and decode data from more than 1,500 mobile applications in minutes", while claiming to "preserve privacy" by copying only the information used, so that witnesses and victims can be shown what's being read and nothing else.
Red flags, anyone?
Why This Is Bad For Women's Privacy
The American Civil Liberties Union has already spoken out against the proposed law on the basis that a textalyzer would be a violation of privacy. There's a reason, they explain, that legally obtaining mobile phone logs and other information is supposed to be difficult: The ways we use our phones these days, from personal photographs and social media to mobile banking, mean that cell phones often contain extremely private information. What the ACLU particularly objects to is that, like a breathalyzer, the textalyzer would be used in cases where nobody has been arrested; but instead of just having their alcohol levels checked, potentially innocent people could be exposing personal details to strangers.
The ACLU also points out a flaw in Cellebrite's technology itself: It isn't fully developed yet, and might not be until after the law passes, if it does. That, they explain, makes actually testing its viability, and how well it protects privacy, extremely difficult:
Cellebrite has told the press that all the software will be able to do is track whether or not the phone was physically being used at the time of the incident. Even if the law passes and the technology works as intended, the design would hardly be foolproof. Passengers could claim they weren't using the phone, or that the phone had software issues, or that processes that appeared to be done by human hands were in fact automated. And that's not the only potential flaw in the design.
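Cellebrite hasn't published how its tool actually works, but the ambiguity critics point to is easy to illustrate. As a purely hypothetical sketch (the log format, event names, and function below are invented for illustration, not Cellebrite's), a naive time-window check over a phone's event log cannot tell a hand-typed text from automated background traffic:

```python
from datetime import datetime, timedelta

# Hypothetical event log: (timestamp, event type). A "send_sms" entry
# alone can't distinguish a hand-typed text from a scheduled message,
# auto-reply, or background sync.
crash_time = datetime(2017, 7, 1, 14, 30, 0)
window = timedelta(seconds=30)

events = [
    (datetime(2017, 7, 1, 14, 29, 50), "send_sms"),       # human or automated?
    (datetime(2017, 7, 1, 14, 29, 55), "calendar_sync"),  # clearly automated
]

def flagged(events, crash_time, window):
    """Return every event near the crash that a naive check
    might attribute to the driver's own hands."""
    return [e for e in events if abs(e[0] - crash_time) <= window]

for timestamp, kind in flagged(events, crash_time, window):
    print(timestamp, kind)
```

In this toy example both entries fall inside the window, so a simple timestamp check would flag the automated calendar sync right alongside the text, which is exactly the kind of false positive opponents of the law worry about.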
A piece of software that can be used to get past safety protocols like passwords and screen locks, and has at present no publicly tested way of telling "relevant" information from "irrelevant" information, is, by definition, not safe. How is the information going to be used and stored by the police? How will the textalyzer decide what is or isn't pertinent, and guarantee that personal details aren't made available to police officers or others involved in an investigation? What protections would the software offer to make sure that nobody except an authorized person could use it, and how would they guarantee that unscrupulous users wouldn't be able to manipulate the software to do something problematic (copy a woman's phone number, for instance, or view the photographs of a minor)?
Privacy issues surrounding technology and what it makes available to others have already become a news story this year, with privacy activists and legal experts concerned about the rights of children coming out to protest the existence of Snapmaps, the Snapchat feature that tracks and maps your precise location when the app is used. While Cellebrite's software, and the law mandating it, look fairly innocuous, without being able to see how the tool actually works and whether it does indeed protect privacy, it's going to be a tough sell.