Thought of the Day 10.14.09

Is America a Christian nation?

I used to think this was a terribly important question. After all, if America was founded on Christian principles, then the abandonment of Christianity would eventually mean the loss of key American ideas. But then I found that people are irrational. Even though they do not recognize God as the source of those ideas, they still believe strongly in the freedoms and rights our system protects.

So then I started thinking about whether it's accurate to say America is a Christian nation now. Certainly, if you look at the way we treat the poor, allow divorce, wage war, celebrate sexual deviance, hoard our money, and protect abortion, you'd have to admit that, at the very least, we aren't a very good Christian nation. Besides, the popular view of America entails freedom of religion, or freedom from it, as the case may be.

My task is to serve God by serving my country. I’m just not sure anymore whether the most effective way to do that is by persuading people that America is or ever was a Christian nation.

2 comments:

Stan said...

I don't know that a nation can be classified as "Christian." Only individuals can be "Christian."

That being said, if we let go of the idea that our country was established with Christian ethics and perspectives in mind, we lose the ability to hearken back to our roots: "The Founders never intended this." We've seen how that works with things like sexual purity and marriage. Give up the original roots, and the whole thing slips away until the original notions are considered archaic and foolish. I'm not sure I like that plan.

Andrew Tallman said...

Stan, didn't you just repeat what I said in the first paragraph?

Let's admit the obvious. People in America who don't believe in God, or at least in Christianity, are still robustly attached to the idea of personal rights and freedoms. Yes, this is an irrational disconnect between the conclusions and the premises of their worldview, but it simply is their reality. So it's not clear that losing the foundation means losing the result.

But look at it in reverse. Considering all the effort that has gone into calling people to "hearken back to our roots," can you honestly assess that as a successful strategy for persuading them of anything meaningful?

All it seems to do is reinforce us conservatives in our pride at being the "true Americans," an arrogance that further alienates us from the culture we're trying to reach.

But here's one more thing to consider. The "founding fathers move" is an attempt to claim the epistemological high ground in the debate, right? But what do you say if your country doesn't have that particular luxury? Wouldn't we be aiming for the same things even if our country were China, Iran, or Russia? Calling those countries back to their roots might well be a very bad thing. Roots aren't always right.

And that's the point. If you do manage to convince non-Christians that the origins of this country were Christian, they're just as likely to say, "And good riddance, then," as they are to say, "Oh, gosh, you're right, we must return to that ASAP."

I'm looking at this approach purely as a public policy debating strategy, and I'm increasingly thinking that it's not a useful one.