
Is America a Post-Christian Nation? (The Atlantic)

In this 2016 video news segment, The Atlantic discusses the implications of a 2014 study from the Public Religion Research Institute finding that the United States is no longer a White Christian nation. Examining demographic, political, and cultural shifts, the video features Robert P. Jones, author of The End of White Christian America, who explains how the power of White Christians has declined precipitously in recent years, and how different groups of White Christians are responding to the fact that they can “take one seat at the table” and stop “pretending that they own the whole table.”

Discussion

What does it mean that the U.S. was considered to be a White Christian nation?

What are some of the reasons that the U.S. is no longer considered a White Christian nation, according to the news piece?

What are some of the ways that this video illustrates the decline in power that White Christians are experiencing?

Why is this shift significant? According to the video, what are the different ways that White Christians are reacting to the fact that they are no longer the dominant group in the U.S.?
