western country in the world (and among the most irreligious/atheist/agnostic countries in the world too).
This honestly freaks me out (pardon my French!).
Are religion and moral values becoming less important in western society? Is this a good or bad thing? What do you guys think of religion, Christianity, atheism, and all that sort of thing? Should the U.S. have a talk with Australia about its rapid decline in religion?
And has anyone else noticed the weird coincidence that Australia is the least religious western country, and that Australians are known by us Americans to be quite lazy and, in some places, very backwards?
God bless their souls, I guess.