It can be hard for women in the workplace to get across how deeply gender — or more accurately, cultural expectations tied to gender — influences their careers. Unless someone is out there experiencing it every day, taking home lower paychecks and justifying things at work that men never have to, sexism is easily brushed off as a thing of the past. After all, women can become CEOs these days, and at least we're not living in the Mad Men era of housewifery and being asked to sit on our bosses' laps, right? ...Right?
It's safe to say nobody (or at least, most of us, hopefully) wants to return to the inflexible gender norms of the mid-20th century, but to claim we've moved beyond sexism simply means you're not paying attention. Workplace sexism is a particular problem; not only does it affect women's financial independence, but it also tends to be systemic and, by extension, harder to root out.
As any woman can tell you (and as last year's presidential campaigns showed), women are expected to walk a tightrope between femininity and masculinity. If we show too much ambition, we're power-hungry; if we express an interest in having children, we're passed over for promotions on the assumption that we'll give up our careers. All of this adds up to a decidedly unbalanced playing field. Research has shown that small acts of sexism, like sexist jokes or judging a woman's appearance, are just as harmful as more overt manifestations like sexual harassment or handing better assignments to male employees.
With that in mind, here are seven things women are expected to justify at work that men simply aren't.