Is Now 'The End of White Christian America'?

Aug 5, 2016

White Christians have set the tone for this country since its founding. But that is changing in some profound ways. For one thing, white Christians no longer make up a majority of the nation. As the cultural and religious ground shifts under them, we'll look at how their influence is changing.

Guest: