Not a real surprise, since many have held a watered-down definition of what it means to be “Christian” (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
