Tag: Christianity in America

The decline of Christianity in America, and why I’m hopeful

Christianity is declining in America. New research from the Pew Research Center looks at the data and paints a picture of an even less Christianized future for America. In looking at religious trends and various […]