Driving while texting is unquestionably a bad, stupid, no-good idea. The Centers for Disease Control and Prevention estimates that every day in America, nine people are killed and over 1,000 injured in car crashes caused by distracted driving. But the practice is being met with pushback from lawmakers worldwide. In the UK, being caught texting while driving means six points on your license and a £200 ($US258) fine, and six points means an automatic disqualification if you've held your license for two years or less. Now, in New York, a proposed law would allow police to use a "textalyzer" (like a breathalyzer, but for detecting phone use) to crack down on texting while driving. Opponents, however, say it could be a huge privacy issue, and for women in particular.
Distractions can take many forms, but texting while driving is a big one.
Across the United States, laws against texting while driving are fairly uniform at the state level. Only Arizona, Missouri, and Montana lack bans that cover every driver on the road; Missouri's law forbids only drivers under 21 from texting behind the wheel, a notable gap considering that 42 percent of high school students who drove reported having texted while driving. But some states are pushing for stronger consequences for people who text and drive, and the "textalyzer" is seen as a way to help police accomplish just that, at the cost of women's privacy.
Why This Is Bad For Women's Privacy
The American Civil Liberties Union has already spoken out against the proposed law on the basis that a textalyzer would violate privacy. There's a reason, the organization explains, that legally obtaining mobile phone logs and other data is supposed to be difficult: the ways we use our phones these days, from personal photographs and social media to mobile banking, mean that cell phones often contain extremely private information. What the ACLU particularly objects to is that, like a breathalyzer, the textalyzer would be used in cases where nobody has been arrested; but instead of just checking alcohol levels, it could expose potentially innocent people's personal details to strangers.
The ACLU also points out a flaw in the technology itself: the software, made by the forensics company Cellebrite, isn't fully developed yet, and might not be until after the law passes, if it does. That, the organization explains, makes it very difficult to test its viability and how well it actually protects privacy:
"The legislation is premised on the availability of technology that will do certain things in certain guaranteed ways, but that technology won’t be available for anyone to look at until the passage of legislation authorizing its use ... Legislators are being asked to vote for a new law based on unproven assurances that textalyzer software can be developed that will analyze cellphone usage without invading privacy."
Cellebrite has told the press that all the software will be able to do is detect whether or not the phone was physically in use at the time of the incident. Even if the law passes and the technology works as intended, that design would hardly be foolproof: drivers could claim that a passenger was using the phone, that the phone had software issues, or that activity that appeared to be done by human hands was in fact automated. And that's not the only potential flaw in the design.
A piece of software that can be used to get past safety protocols like passwords and screen locks, and has at present no publicly tested way of telling "relevant" information from "irrelevant" information, is, by definition, not safe. How is the information going to be used and stored by the police? How will the textalyzer decide what is or isn't pertinent, and guarantee that personal details aren't made available to police officers or others involved in an investigation? What protections would the software offer to make sure that nobody except an authorized person could use it, and how would they guarantee that unscrupulous users wouldn't be able to manipulate the software to do something problematic (copy a woman's phone number, for instance, or view the photographs of a minor)?
Privacy issues surrounding technology, and what it makes available to others, have already made news this year, with privacy activists and legal experts concerned about the rights of children protesting the existence of Snapmaps, the Snapchat feature that tracks and maps your precise location when the app is in use. While Cellebrite's software, and the law authorizing it, may look fairly innocuous, without a way to see how the technology actually works, and whether it does indeed protect privacy, it's going to be a tough sell.