Is the US a Christian nation (still)?

We were founded as a Christian nation. Today we are post-Christian.

See:

Freedom Ribbon

Response to comment [from a Catholic]: "Hissium Serpentum, [HisServant] I know you posted here cause I saw your name next to the thread list but I can't read your posts..."

We are here for debate (Isa 1:18).

Response to comment [from a Christian]: "[S]o let me get this straight.....people left a Christian nation (England) to form a Christian nation..."

Where were you during Saturday morning cartoons as a kid? Not watching T.V.?

Tea Party - Schoolhouse Rock - No More Kings

"...England had a State sponsored religion (Church of England) which is what people coming here were trying to get away from."

That's correct.

"The issue of religious freedom has played a significant role in the history of the United States and the remainder of North America. Europeans came to America to escape religious oppression and forced beliefs by such state-affiliated Christian churches as the Roman Catholic Church and the Church of England. That civil unrest fueled the desire of America’s forefathers to establish the organization of a country in which the separation of church and state, and the freedom to practice one’s faith without fear of persecution, was guaranteed. That guarantee was enshrined in the First Amendment to the Constitution (text) as, “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof...” Full text:
History of Religion in America. Religion, 1500s-Present.

Also see:

The Christian's Responsibility in a Pagan Society, Part 1 by John MacArthur
