From the way sex happens to the way we perceive the act of sex and those having it, pop culture has a warped view when it comes to making the beast with two backs. For starters, no one ever accidentally does a cough/sneeze/moan hybrid (think wounded bush pig) while having sex on screen, but that shit happens in real life. Even in "real" depictions of sex in mainstream movies and TV, everything is so perfectly curated you'd think sex was some kind of weird naked beauty pageant for two. "To make the world a better place, I vow never to have a weird pimple on my back, an ingrown hair on my bikini line, or drool on my partner's head by accident when I'm on top," the contestants say.
Sex in pop culture is generally just a lot of beautiful people bumping together in pretty beautiful ways, and a lot of what we see on screen can be damaging to our everyday attitudes. SHOCKING, I know. And even when shows like Girls and Sex and the City think they're being subversive, they're really not: white people having sex in three pretty standard positions (missionary, cowgirl, doggy) isn't really as revolutionary a depiction of sex on screen as it would like to be.
Hollywood depictions of sex are almost exclusively relegated to heterosexual pairings; movies and TV story lines might include non-straight couplings, but they almost never show sex that isn't between a man and a woman. The almost-assuredly-straight sex scenes are gracefully choreographed: she flips her hair over her naked shoulders and groans like she's the winner of American Idol: Sex Noises; his abs glisten in the moonlight as they orgasm in unison. And then we all walk away wondering why we're a conflicted jumble of sexual insecurities. The source is pretty clear: TV and movies teach us some really damaging things about sex.
1. "ALL ADVENTUROUS WOMEN DO"
No, Hannah Horvath, all adventurous women absolutely do not. When it comes to sexually transmitted infections, pop culture is the worst. Unless you're watching a very serious movie or program about AIDS, STIs are generally treated like they "ain't no thang." Dudes make bro jokes about them. In Girls (in what could have been a truly revelatory TV moment), when the protagonist gets HPV (which can lead to life-threatening cervical cancer), she simply brushes it off as a symptom of sexual enlightenment and dances around her room to Robyn. Because you know, that's what people in real life do when they catch a nasty disease from having sex.
2. YOU HAVE TO LOSE YOUR VIRGINITY BY A CERTAIN AGE OR YOU SUCK
The age-old teen movie cliche is that you have to lose your virginity in high school or everything after that will suck forever because you're the biggest loser ever. An American Pie peen left dry becomes The 40-Year-Old Virgin. It's an attitude that's entirely responsible for actual humans thinking they need to get laid ASAP, by any means necessary, because there is literally nowhere else to get this ridiculous idea from other than Hollywood.
3. THERE'S NO SUCH THING AS SAFE SEX
Have you ever noticed that everyone in movies and on TV just starts having sex immediately after kissing for a second? Even when it's with a new partner for the first time, no one ever stops to get a condom. Skirt is lifted, wang goes in (think Patrick Swayze and Kelly Lynch in Road House). Whatever happened to good old-fashioned not wanting to get herpes or babies? Or waiting until you're exclusive and both regularly tested for STIs before you start raw dogging? Remember, kids: no glove, no love.
4. THE ONLY KIND OF SEX IS PRETTY NORMAL SEX
In mainstream media, even when it's trying to come off as "wild," sex is pretty normal. When things get weird (for instance, when John Slattery wants to pee on Carrie in Sex and the City), everyone is vaguely disgusted and it becomes a big "thing." In fact, when sex starts deviating from "normal" positions, there's always some kind of taboo attached to it. To use Sex and the City as an example again, Samantha's sex life is considered incredibly deviant. But in real life, most of us are doing at least half the sex things (or equivalents thereof) that she does on the regular. People in real life get freaky, and when it's between two consenting adults, individual sexual proclivities are officially not weird, despite what pop culture might have you thinking.
5. ONLY VERY DEVIOUS WOMEN RECEIVE ORAL SEX
Why is it that women never receive oral sex in pop culture unless both her body and the man's are completely covered by the duvet (doesn't it get hot in there?)? The women you associate with openly receiving oral sex are women like Samantha Jones (who, we've already established, is a total perv) and, most recently, Rosamund Pike's Amy in Gone Girl, who obviously is a murderous femme fatale. In reality, some very nice women definitely enjoy receiving oral sex. Above the covers, too!
6. SEX ALWAYS LOOKS REALLY SEXY
Sex on screen never looks sweaty, squishy, makeup-smeared, queefy, period-bloody, or anything else that isn't completely windblown sexiness. I'm LOLing for infinity. Real sex isn't always sexy; what feels good isn't necessarily what looks best. Skin and fat roll into weird shapes when you're twisted into different positions, and that's totally normal, so don't pay any attention to the skinny attractive people having clean sex on your TV.