Most US Christians Call Nation ‘Christian Country’

A majority (67%) of self-identified Christians in the U.S. affirm that "historically, the United States has been a Christian country," according to a Barna Group report published June 30. This is a four-point increase from 2019, when 63% affirmed the view. In 2021,...