Just by entering the word "feminism" on YouTube one day, I got an array of videos and comments: people saying that women need to stay in the kitchen, that women are only good for sex, and that feminists are destroying masculinity. Now, I have only been studying feminism and feminist theory for one semester, but none of these things is ever mentioned in my women's studies courses. In fact, my women's studies teachers are the only ones who encourage you to voice your own opinion, whether the teacher agrees with you or not. Never have I read that men should not be masculine; rather, Andrea Dworkin argues that feminists believe in the good of men and that maybe one day we will come to an agreement. Non-feminists argue that men will always be pigs and will never change. What is more male-bashing than that?

The more research I do into feminism and what it is, the stronger a feminist I become. Why? Because I believe that feminism can change the world and make a difference. I have a daughter, and I want her to critically analyze everything, as feminist thought encourages, and to make her own informed decisions.

People out there still have the "F-word" all wrong! Feminism is social justice! It is wanting the best for everyone. There is not one thing about feminism that I disagree with, because to me, being a feminist is being pro-love, pro-understanding, pro-helping people, pro-children, pro-stable loving relationships, and most of all pro-human. What is feminism to you?