Matthew Baugh
A Conscientious Objector in the Culture Wars


A Christian Nation?
I was at a church conference recently. One presenter was making a pitch for churches to use better communication skills. That sounded like a good thing to me, but she kind of lost me with the next part. In an urgent voice she said...

"We're not even considered a Christian nation anymore."

I'm sure she wanted to impress upon us just how important it was for us to be more effective at what we do. While I certainly agree with that, the way she said it made me wonder:

1. When were we a 'Christian Nation'?
2. What exactly does it mean to be a 'Christian Nation'?
3. Who is in charge of certifying nations for Christianity?
4. When did the certification run out for America?

Many American Christians look back with a sense of nostalgia to a time when all American citizens were practicing Christians with a strongly held personal faith in God. The thing is, I don't believe there ever was such a time in U.S. history.

It's true that for much of American history church membership has been a cultural norm. People went to church for a variety of reasons: because of faith, because it was a good place to make business contacts, because it was a good place to meet someone to marry, and because it was expected of them. Christianity was inextricably linked in many people's minds to American patriotism, to American-style democracy, to free-market capitalism, and to Manifest Destiny.

In other words, a lot of people were going to church, but many of them were going for reasons that had nothing to do with genuine faith.

I am a Christian, but in no way do I want to go back to that. A nation where everyone feels compelled to go to church and is convinced that American national interest is automatically God's will is a long way from what I would call 'Christian.'

I would much rather see a nation where Christians were committed to living out Christ-like values. That would mean that...

- Non-Christians would be treated with compassion and respect.

- Help would be offered to the poor and suffering, without being filtered through what group they belonged to.

- The right of all people to the essentials of life and to human dignity would be respected.

- The faithful would feel compelled to speak out against the government when its policies were cruel, selfish, or unjust.

To answer my own questions: I don't know that America (or any nation) has ever really been a Christian nation. Some have adopted that title, but that's just a superficial label. It's what's in the hearts of the people that matters.

I don't think that the label of being a 'Christian nation' is a worthwhile goal. I think a better goal for American Christians is to strive to remain true to the values of our faith and let that push us to make a positive difference in our nation.