Not a real surprise, since many have held a watered-down definition of what it means to be “Christian” (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
