Not a real surprise, since many have held a watered-down definition of what it means to be “Christian” (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
