
What +Size Women Think About IG Banning #Curvy Tag

by Amanda Richards

"It's like Instagram is being run by two 15-year-old dudes in a basement," responds one Facebook commenter to the news that Instagram blocked searches of the hashtag #curvy. Coupled with the deletions of fat-positive photos and accounts, these actions are causing many to wonder, "Does Instagram discriminate against fat people?"

After BuzzFeed discovered that #curvy was no longer searchable on the site, Instagram released a statement claiming that they banned the word because too many people were using the tag on pornographic images, in direct violation of their policies. However, BuzzFeed also pointed out that plenty of other tags (and corresponding pornographic photos) are still available, including #clitoris and #dildo.

I did my own search, and discovered that despite Instagram's attempt to protect users from pornography (presumably because there is no age requirement to join the social media channel), the tags #pornos, #pornostar, and #pornvideo are still alive and well and, you know, full of porn. More on that later.

For many people in body-positive and fat-positive communities, Instagram's allegedly discriminatory practices have reached a boiling point, and they want answers. The problem is that Instagram makes answers incredibly hard to find. Its policies are vague, seemingly benign photos are deleted without explanation, and, as several users have told me, messages to Instagram's support email routinely bounce back from a full mailbox.

The worst part? This is old news. Article upon article has been written about Instagram's unfair censorship practices, but there are no signs that Instagram is changing. Samantha Newman, a 20-year-old student from Ohio, saw her story go viral after Instagram deleted her account for posting photos of herself in a bikini.

(Samantha Newman)

"One of the reasons I was forced to go to the media with my story about my Instagram being unfairly removed is because there was no official or efficient way to contact Instagram and appeal what happened," Newman claims to Bustle via email. "I think [my account] got removed because there are extreme fatphobic biases at work, whether within their algorithm or within their staff [...] there is absolutely no excuse not to explain why my photo was removed and why millions of others are not."

While Instagram apologized and reinstated Newman's account, her question is one that's still being asked by many women, particularly women with certain body types who dare to share photos of themselves in bikinis and various other states of undress. Megan Hodge, a 30-year-old waxing trainer from Cleveland, Ohio, has a similar story. She posted the photo below, showing herself in a bra, only to see it promptly removed.

(Megan Hodge)

"It's absolutely a bigger body problem," she claims to Bustle in an email, alluding to the fact that straight size women (celebrities like Kim Kardashian included) more commonly get away with violating nudity clauses on IG than plus size women. "The numerous accounts [of] smaller-breasted women I see in their bathing suits and/or underwear that nobody takes offense to or reports. They are conventionally attractive, so [it seems] no one is singling them out."

Twenty-two-year-old student Courtney Paige grew similarly suspicious after one of her photos was deleted. Paige is plus size and regularly uploads self-portraits to the site. The image yanked from her feed was arguably even more innocuous than the first two.

"I was topless in the image, but you could only vaguely see side boob," she writes. "I think it got removed because it happened to cross the path of an unhappy policy troll who found it distasteful. An unlucky draw, so to speak. I think their editing process is random and inconsistent. I have stumbled across many an account and/or image that is really offensive and still remains active for quite some time."

Armed with these three stories, I managed to get ahold of a public relations representative from Instagram, who asked me to send her my questions. I asked five short ones: who decides if and when a photo is deleted; whether a team of Instagram employees scans photos with certain hashtags or is only alerted to a potentially infringing image after it is flagged a certain number of times; whether community guidelines have been updated to add or remove rules based on the content users are posting; whether Instagram has any plans to change their policies on nudity (especially considering movements like #freethenipple that have gone viral); and how users can contact Instagram with their concerns or appeal a removal.

I thought I'd get actual answers, but was instead forwarded to another PR rep, who sent me an explanation, along with the request that I not name him directly.

All of it seems pretty straightforward, at least if you don't think about it too much. However, none of the women I talked to were really that nude in their photos, and even a high-profile offender like Justin Bieber deleted the photo of his "fully nude buttocks" on his own, worried about what his younger fans might think (seemingly not due to any prompting on the platform's end). Instagram still can't or won't answer specific questions about how their policies are applied in practice, or why certain types of photos get deleted more often than others.

"When my photo was removed, I was told that their algorithm simply targets cases which seem most urgent via the number of reports made on the particular photo. But there are holes in that story big enough for me to fit through," Newman writes. "At the time my photo was removed, and many others, we have maybe a few hundred followers, if only a few thousand. How can they expect me to believe that my account with 500 followers (at the time) elicited more reports for my photos than an account with 500,000 followers? What about five million? It doesn’t add up."

In addition to the deleted images and suspended accounts, Instagram users also see popular hashtags suddenly disabled. You can no longer search for images with the #curvy tag using Instagram's search function; users can still add the tag to their photos, but can't actually find other images that use it. While thousands of confident fat women used #curvy to connect with one another, Instagram claimed that the tag was being inundated with pornography, and therefore had to go. All the while, tags like #fatbitch, #fatcunt, and #fatwhore still exist. There sure are quite a few #pornos left untouched, too.

This alone should prompt much more specific and detailed responses than the ones Instagram is offering. I'd hoped that these women's stories would suggest some answers, but instead, I was left with even more questions than when I started. Why are the policies applied so inconsistently? Does an algorithm alert Instagram staff to certain images and not others? Who makes the call? Is there a deletion bot running amok, or an overworked staff of mods trying to apply vague rules? Or are users' biggest fears actually true: is Instagram simply another mechanism by which marginalized voices are silenced and subjected to even more prejudice?

Images: Amanda Richards; Courtesy Megan Hodge, Samantha Newman; peach_e_paige/Instagram