Brawndo said:
Many traditionally male roles and traits are no longer celebrated or instilled in today's youth. In my opinion, pop culture has largely contributed to the feminization of men and boys. For example, one thing that particularly bothers me is how popular it is on TV shows to portray husbands and fathers as bumbling idiots who are easily controlled by their wives and children.
And let's not forget the hypersensitive nanny-state parents that don't let boys be boys. I worked as a camp counselor at my local YMCA with 12-14 year olds, and I have never seen such a group of sissies, hypochondriacs, and whiners. How are these boys going to grow up to be leaders of men and protectors of women?
EDIT: Since people on this forum never seem to read past the OP:
From Post #19: Never once did I suggest or condone a return to a time of women being "barefoot and pregnant" in the kitchen while the manly men went out to hunt bears. I like that women work and men have shared responsibility with children. I don't think the father/husband should hold a dictatorship over his household and beat his wife and kids.
But I do think that men and women have certain innate traits that make them better suited for different things. When I'm feeling sad and I need a sympathetic ear, I call my mother or a female friend, because women are generally better at empathy. And every girlfriend I've ever had enjoyed feeling safe in my presence, even if, pragmatically, there isn't much danger a cop couldn't protect her from. But boys and male teens today are increasingly turning into overly sensitive delicate flowers.
I'm not sure if "concerned" is the appropriate term, but I definitely agree with the gist of this. It's becoming increasingly unpopular for a man to act as a man, and that just doesn't sit well with me. I don't really care about the "Gender Roles" per se, but I definitely think we should be teaching male children to be confident, assertive, and protective. It's not so much a gender thing as it is a maturity thing. The way we're going, we'll have whole generations insistent that someone else is at fault for every single problem they've ever had, and who refuse to take any sort of personal responsibility. It's depressing as all hell, and we need to fix that if we're ever going to move forward as a society.
I'd also list out what to teach women, but I still don't understand how they think, so I really can't. I would assume (and these are generalizations based on personal experience and stereotypes) that to reach an appropriate level of maturity, women should generally be taught compassion, empathy, and various other "maternal" traits. That's just a guess, though.