Not a real surprise, since many have held a watered-down definition of what it means to be "Christian" (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
