Over the years I’ve heard a term used to describe American culture from many orthodox Christians: post-Christian.
The more I’ve thought about it in recent days, the more I’ve become convinced that this is a pernicious term that deserves to die.
The term serves two purposes. The first is to make us think that the good old days were here and have passed away, never to return…
“America was a Christian nation, but now that’s all gone; now it’s completely given over to postmodernists, etc., etc.”
Or it’s used to lower our expectations.
“Look, this is about as good as we can expect living in a post-Christian nation.”
I won’t deny there’s a certain reality underlying these statements. Certainly, America has turned from God in many ways and is no longer the nation it once was, but to speak of a “post-Christian nation” is to speak contrary to what Scripture teaches us.
The post-Christian rhetoric spurs us on to expecting, and living in, a state of perpetual defeat and decline that we are helpless to do anything about. It invites us to nostalgia, to a perpetual wake where we bemoan what once was and embrace hopelessness as the status quo.
Yet the reality is that many nations have been through this cycle of spiritual death and then returning to God. As Ecclesiastes 1:9 says, “There is nothing new under the sun.”
The perpetual gloom we tend to choose when we embrace a “post-Christian” understanding of our country is not the hope we’ve been called to. Here’s a thought: if we want to say anything about the state of our country, why don’t we say we’re living in a pre-Christian country, as every nation on Earth truly is, or a spiritually cold one?
Yes, there’s a lot of prayer and hard work ahead, and yes, we’ve sinned greatly as a nation, but there’s no reason to give in to the sourness of “post-Christian” thought.