
The TR Company


9/5/19 Feminism

Feminism is defined as “the belief that men and women should have equal rights and opportunities.” It encompasses the social, political and economic equality of the sexes. However straightforward this notion might seem, there is considerable misunderstanding about what the term actually means.
“Feminism comprises the belief that women are superior”: Feminism has often been misread as a celebration of womanhood, but that would defeat the purpose of feminism, which is to challenge the notion of “womanhood” altogether. Identifying as a feminist does not mean believing that women are better than men. Rather, it’s saying that being a woman or gender non-conforming person should be as good as being a man.
“Feminism encourages negative views of men”: Since feminism opposes gender stereotypes, generalizations like “men are bad” would actually fall under the category of “anti-feminist,” since such a claim is itself a gender stereotype. It is true that in the current state of the world men hold more power, and that is precisely what feminism aims to dismantle: not in order to hurt men, but to free people from stereotypes and expectations that can be harmful.
To put it simply: if you believe women and men should have the same political, economic, cultural, personal and social rights, then congratulations, you are a feminist!
