America the Christian Nation?

Today we celebrate the signing of the Declaration of Independence. There’s a lot of debate surrounding the intentions of our Founding Fathers in establishing this great nation. So, what do you think: Was America established as a Christian nation or not?

About Dave Dunham
  • The Dane

    Not if the forefathers were serious about that whole Liberty thing. One can have liberty or one can have a Christian nation, but one can’t have both.


  • NickO

    In the Constitution, the founding fathers specifically forbade the establishment of religion. So it could be said that the U.S. was not founded to be a Christian nation.

    However, IIRC, 94% of the citations in the Federalist Papers are to the Bible. Most of the argumentation for a constitutional republic put forth even by “atheists” such as Jefferson assumed that the U.S. would be a nation of law-abiding people whose guiding light was the Bible.

    So it is not a stretch to say that the U.S. was founded on principles articulated in the Christian Bible. Implicitly, therefore, the U.S. was a Christian nation at its founding, largely because its population was overwhelmingly Christian. (For instance, all but two of the signers of the Declaration of Independence were evangelical Christians.)

    The founding fathers obviously did not consider the possibility that the state itself would become a god and politics would devolve into religious cults warring over how to serve their god.