Women Are Happier With Their Weight When They're Told Men Like Heavier Women, Because Patriarchy is Real

1950s scale with a nude woman looking at her weight; screen print from a photograph, 1958. (Photo by GraphicaArtis/Getty Images)

It turns out there's one seemingly surefire way to help women feel better about their bodies — and it still kind of sucks. According to science, women's body image improves if they think men prefer larger body types. Thanks, science. Thanks a lot. I mean, it's not your fault the patriarchy is real, but still, this is bumming me out. 

In a study published in Social Psychological and Personality Science, researchers investigated how women's perceptions of their own bodies changed when their perceptions of which body types are considered "ideal" were changed. Over the course of three independent experiments, participants were shown images of models who didn't meet the current thin ideal. Some women were told that men found the models attractive; others were told that men preferred thin women; others were told that women found the models attractive; and some were given no information at all. The researchers found that the women who were told men liked the heavier models demonstrated greater satisfaction with their own weight; women in the other conditions did not. 

In other words, women's self-image and perceptions of our own bodies are dependent on how we think we will be seen by the male gaze. So that sucks. 

Now, in general, I'm a fan of anything that makes women feel good about themselves, given that we live in a society that is almost always more invested in ways to tear us down than to build us up. So if telling women that men don't prefer the ultra-skinny body type currently held up as "ideal" helps women, then I suppose I'm glad. 

But. But. But. 

But that doesn't change the fact that this study is really showing that women have been taught by society to defer to male opinion and the male gaze so much that changing our understanding of what men think changes how we think of ourselves. And that does not make me glad, not at all. 

In an ideal world, one in which society did not conflate being fat with being of lesser value as a human, people's feelings about their bodies would be based on health, on how their bodies actually felt, or maybe on highly personalized visual preference. We don't live in that ideal world, but the fact that we are this far from it is depressing. I mean, this basically says that women don't even base our ideas of our own bodies on society's ideal but on what society tells us is men's ideal.

And I really don't think it's unreasonable to say that women should be able to have our own independent opinions of ourselves, instead of being taught that we have to value the opinion of men so much that it becomes part of our own self-evaluation. 
