Disclaimer: I hate to discuss “social issues” on my blog because a lot of the time they are emotional, personal issues intertwined with faith & religion. However, I feel very strongly about this issue, particularly since I feel it’s making my gender look bad.
Women, I know we love to dwell on stuff. But seriously, can we quit this talk of a “War on Women”?
First of all, it’s non-existent; we’re making a mountain out of a molehill. Some politicians said some things that were taken out of context. No one in their right mind is going to take away birth control. Yes, there is talk of defunding Planned Parenthood. So what if they do? It will still exist, and you can always go to your county health clinic and receive the same level of care. If you aren’t too proud to get free birth control at PP, then your local county health clinic shouldn’t bother you either. As for insurance providers opting out of providing it, so what? That’s their prerogative and their business decision. If ObamaCare didn’t exist, this wouldn’t even be an issue. That’s a downside of a government-run system: limited options for you to make the right decisions for yourself and your family.
Second of all, conservatives aren’t the ones degrading women; liberals and other women are. Yes, Rush Limbaugh used a derogatory word, and he apologized. Anyone listening to him that day, as I was, knows he was using absurdity to show how ridiculous the actual situation was. Seriously, we have more pressing issues, like the economy, jobs & some wars. Yet Democrats & liberals in Congress wish to drag out an old issue, such as women’s access to health care, in order to stall.
Also, wasn’t it degrading to women when Monica Lewinsky was found underneath the president’s desk? Have we not progressed enough in the professional world that we no longer have to perform sex acts to advance our careers? Or how about Bill Maher continually calling Sarah Palin & Michele Bachmann derogatory names and running them down? Aren’t they women too? Or because they’re conservative, don’t they count?
I consider myself a feminist because my gender has never stopped me from doing anything I wanted to do. I have never felt less of a human being because I am a woman, nor have I ever let anyone make me feel that way. Ladies, quit letting politicians and other mouthpieces, liberal or conservative, make you feel less than what you are.
The “War on Women” is what you make it. I say don’t make it anything. Go out into the world and be bold; go be the CEO of a company, and show any actual woman-haters out there what we are really capable of.
Some others’ opinions: