13 Fantastic Feminist Moments From 2014 Entertainment, Because Hollywood Finally Realized Equal Rights Are a Beautiful Thing
The word "feminism" was a definite buzzword of 2014. While some celebrities were busy announcing to various magazines and news outlets that they aren't feminists — clearly because they didn't quite understand the definition — others were coming out of the girl power closet to declare that they are. In fact, the concept was such a hot-button topic that Time even made the absurd suggestion of removing the word from the lexicon. But enough of Hollywood is on board with the crazy idea that men and women should have equal rights, and feminism is here to stay. By the way, that's what feminism means, y'all: equal rights for men and women. For the record. Just in case you're still confused.
And if this kind of behavior continues into 2015, I wouldn't be mad about it. Consider this slideshow a helpful guidebook for celebrities using their star power for good, because standing up for what you believe in really does change people's minds — and their lives.