Not much of a surprise, since many people hold a watered-down definition of what it means to be “Christian” (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
