Not a real surprise, since many have held a watered-down definition of what it means to be "Christian" (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
