Not a real surprise, since many have held a watered-down definition of what it means to be “Christian” (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
