5 Reasons We Still Need Feminism
Although women's rights have come a long way in the past 100 years, the idea that women are now completely equal and thus no longer need feminism just isn't true. Yes, women have more social, political and economic rights than ever before, but we still contend daily with the harmful side effects of gender inequality: pay inequality, body-shaming, sex-shaming, slut-shaming, mansplaining, victim-blaming and the constant erosion of our reproductive rights here in the U.S. Additionally, women living in developing nations are still forced to cope with harsh, gender-specific health threats like female genital mutilation and to fight for basic rights like education. So despite what non-feminists might say, it's painfully evident that the global struggle for women's rights is far from over, and we definitely still need feminism.
If you've been having your doubts about whether we still need feminism, or if you're just sick of hearing people say that feminism has outlived its usefulness, then this list is for you. Check out these five reasons that feminism's work is far from done — and the next time a coworker, classmate, friend or anyone else tells you that we're past the point where anyone needs to care about feminism, hit them with these facts.
1. Because Gender Inequality Is Literally Bad For Women's Health
Women are twice as likely as men to suffer from anxiety and depression — and while biology plays a part, the societal pressures women face because of sexism and gender inequality are also a huge contributor to this awful statistic. Furthermore, since poor mental health often leads to physical health problems such as diabetes, COPD and heart disease, it's undeniable that feminism is still relevant and necessary to our society — not just for the advancement of women's rights, but for the overall state of women's health and well-being.
2. Because We Deserve Equal Pay For Equal Work
How can anyone argue that feminism is irrelevant when women in the U.S. earn an average of 78 cents for every dollar their male coworkers make for doing the exact same job? And that pay gap gets even larger when you look at the salaries of women of color in this country. This state of affairs isn't just ridiculous — it's unjust. Until the wage gap is closed (and even after that), we're going to need feminism — because I don't want my nieces (or yours) to work their butts off all through school and endure the misery that is post-grad job hunting, just so they can spend decades being paid less than their male coworkers for the same damned job.
3. Because Reproductive Rights Are At Stake
The recent war on women's reproductive rights — which includes the financial attacks on Planned Parenthood that led the House of Representatives to pass a bill attempting to freeze funding for the non-profit, as well as the extreme difficulties that many women still must cope with just to obtain birth control — may be the biggest reason yet that we still need feminism.
It is not OK that women have to jump through so many hoops to exercise their reproductive rights that the simple act of trying to get birth control pills can make women feel like they're embarking on an ancient mythological quest.
4. Because Our Culture Hasn't Stopped Excusing Rape
There is no excuse for rape. Ever. But we live in a rape culture where women are incessantly slut-shamed and frequently held responsible for their own rapes — or at the very least asked what they were doing/wearing/saying to provoke their attackers. The unacceptable practice of victim blaming remains common, even in an age when we are told feminism's work is complete — we're often encouraged to feel pity for a man accused of rape rather than demand justice for the victim.
And instead of making sure our young men truly understand what does and doesn't qualify as sexual consent, we more often tell our young women not to dress in revealing clothes, flirt too much, drink too much or go out alone at night. This kind of advice isn't just victim-blaming — it won't actually keep anyone safe from sexual assault. Until women are no longer held even partially responsible for the actions of violent men, we will still need feminism.
5. Because Women & Girls In Developing Nations Are At An Extreme Disadvantage
It's easy for some Westerners to claim that the movement has achieved all its goals, because they fail to see the troubling bigger global picture of women's rights. But it's impossible to pretend that the state of women's rights in many developing nations is anything but dire.
According to the United Nations Population Fund, an organization that works to protect reproductive rights and support equality in developing countries, women make up two thirds of all illiterate adults worldwide, as well as 60 percent of the world's poorest people. Education is a fundamental human right — and pretty essential to building a good life, too — but many non-Western women and girls (like Malala Yousafzai) have had to literally risk their lives to get it. And this is just the tip of the iceberg when it comes to things that women the world over must struggle with, simply because they were born women.
So until women everywhere are granted equal political, economic and social rights, feminism should have a place and purpose in all of our lives.