
Originally Posted by Elfren Eve
It's true; however, in terms of history, it's quite ironic.
Back when different civilizations were developing, the majority of societies were male-dominated (and believed in patriarchy), and women had almost no rights.
It's funny how, in the 21st century, we complain that women have more rights, when we as men have always had authority and dominance over women (this goes far back in history).
I hate to say it, but that's how our society shapes itself.
Not all countries are like the United States, Australia, Canada, or other economically stable countries.
Cambodia, Pakistan, Afghanistan, etc. all have very harsh conditions for women's rights: no education, only labor, marital rape, and so on. It's really tough for women to live in countries like these.
From the perspective of a man living in an economically developed country, of course I believe that we should aim for equality with women.
But don't be so narrow-minded; this is a very sensitive topic.