Not a real surprise, since many have held a watered-down definition of what it means to be “Christian” (American = Christian).

Barna reports:

For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.

Read the rest.
