Not a real surprise, since many have held a watered-down definition of what it means to be “Christian” (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
