A couple of interesting headlines today regarding religion in America, from these two articles:
– “25% of us have left childhood religions” (Source: San Francisco Chronicle)
– “Protestants Verging on Becoming Minorities” (Source: US News and World Report)
I thought both of these articles were really interesting. In several conversations I’ve had with friends, it seems that a lot of people see this as a good and healthy sign: religion is no longer a vice or crutch that people in America turn to – we are “evolving” out of it.
I would take issue with some of those sentiments, but the sentiments themselves, along with the articles above, signal some kind of major cultural shift away from “America as a Christian nation.” As nice as it would be, I don’t think there’s an “easy answer” for the church other than simply remaining faithful (and looking at church history, that doesn’t seem so easy!). Part of me is excited by the idea of living in a culture that is “post-Christian” (or “post-Christian-majority,” if you don’t like the term post-Christian), because I think it forces us to move forward with a great deal more humility as we tread unknown waters.
So…interesting stuff. Any thoughts?