It seems that in today's society you practically HAVE to belong to some flavor of Christianity to be socially acceptable. For example, many people have said that in an election they'd vote for the opposite party if that candidate were Christian and their own party's candidate were atheist or agnostic. As an agnostic, all I can do is sigh. While the person in my example might argue that "America is a Christian nation," the fact is that many of the founding fathers were deists at best (Benjamin Franklin, for example, said "Lighthouses are more useful than churches"), and religion should have no part in policy making. At the same time, I've noticed I've come under a lot of fire for my lack of faith. Many people seem to think I'm somehow less of a person because I don't follow a religion.
Why do you think people feel religion is a necessity? This may look like a generational trend, but plenty of young people share the mindset too. Also, atheists/agnostics: how do you respond to this sort of logic?