Is the US a Christian nation?
Despite what the religious right would like you to believe, the US was founded on principles that are in direct opposition to core Christian doctrine. If someone claims that the US is a Christian nation, then they either don't understand what the founders of this country intended, or they don't understand Christianity.