What is a "man"? Simply an adult male, or does it also imply social responsibility?
Back in the day (either the Good Old Days or The Outdated Patriarchal Regime, depending on your Feminist leanings), being a "man" implied a degree of independence, strength and responsibility - as the (presumed) head of the family unit, the man would be breadwinner, decision-maker, moral guardian and ultimately responsible for his children's religious and academic education. He'd also be the one expected to enlist in times of war.
But now? Just as changes to the workplace mean that people are no longer defined by their profession, social changes mean that men's and women's social roles aren't necessarily defined by their gender. Still, many gender-role expectations remain. Some of these, I feel, are a natural reflection of indisputable gender differences - men should make up more of the armed forces and frontline emergency services than women, and women should be expected to care for their own children. When the Feminist revolution comes I expect I'll be first up against the wall for those comments, but the irony is that many modern women with ostensibly Feminist leanings still expect men to be strong, financially stable, chivalrous, and so on.