There is something I have wondered about for a while now, and I'll get right to the point.
Why is it that throughout history, the majority of cultures have been patriarchal or at least egalitarian? Rarely, if ever (I know of no examples) has there been a culture in which women dominate to the extent that men did, and in some areas still do.
Why is this? I can't think of any real reason why this should be the case, but it is and has been, all across the globe, in cultures that have had little to no contact with each other.
Thoughts?