Not much of a surprise, since many Americans have long held a watered-down definition of what it means to be “Christian” (American = Christian).
For much of America’s history, the assumption was that if you were born in America, you would affiliate with the Christian faith. A new nationwide survey by The Barna Group, however, indicates that people’s views have changed. The study discovered that half of all adults now contend that Christianity is just one of many options that Americans choose from and that a huge majority of adults pick and choose what they believe rather than adopt a church or denomination’s slate of beliefs. Still, most people say their faith is becoming increasingly important as a source of personal moral guidance.