Reporting Harassment On Twitter Just Got Way Simpler, And It's About Time
It's about time. On Tuesday, Twitter unveiled a simpler way to report harassment on the site — an important step from a company that has had consistent problems with people who take things far beyond trolling. Along with streamlining the reporting process by asking for less information upfront, Twitter has promised faster response times for reported or flagged activity and profiles.
After a person reports some sort of harassment, they will be asked about the nature of the exchange, with options for physical threats and violence. As TechCrunch pointed out, the faster response time will likely be tied to those options — the more serious the harassment, the more likely the complaint will leap to the front of the line. The new process also allows users not directly involved in the harassment to report it, which Del Harvey, Twitter's vice president of trust and safety, told The Verge could help bystanders become better watchdogs:
We're also working to take advantage of more behavioral signals — including reports from bystanders — and using those signals to prioritize reports and speed up our review process.
The company has also updated its system for blocking users by adding a blocked accounts page. This allows you to manage the profiles you've already blocked and makes your profile invisible to anyone you've put on the list.
Twitter has made some more advanced blocking updates available to a small group of users for now, and it plans to roll out more controls over the next few months. The Verge reports that Twitter is working with advocacy groups and other organizations to make sure it prevents abuse without stepping on freedom of expression.
Previously, Twitter required users to fill out a cumbersome questionnaire to report abuse. With mobile-friendly updates and a more streamlined process, the change will hopefully encourage not only victims but also bystanders to report harassment and abuse.
Twitter seems to be responding well to the criticism about how it handled widespread abuse directed at female game designers and gamers during the horribly misogynistic #GamerGate debacle. Certainly, the company can't be responsible for all of the 500 million tweets sent every day, but it does need to bear at least some of the burden.
It's nice to see a company not only recognizing the potential for harm in its service, but also taking meaningful steps to mitigate those risks.