Not a real surprise, since many have held a watered-down definition of what it means to be “Christian” (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
