White Christian America Ended in the 2010s

Like the tumultuous adolescent years of human development, the changes during the teen years of the 21st century disrupted American identity as we’ve known it. And as with teenagers, those changes have created a great deal of anxiety and fear about the future.