Not really a surprise, since many have held a watered-down definition of what it means to be “Christian” (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
