Is This 'The End of White Christian America'?
White Christians have set the tone for this country since its founding. But that is changing in profound ways. For one thing, white Christians no longer make up a majority of the nation. As the cultural and religious ground shifts beneath them, we'll look at how their influence is changing.