Why do many people today assert that the United States is (or should be) a Christian nation, and what do they mean by this? What are the implications of thinking about Christianity in this way? And what are the implications for political and social life in the U.S. moving forward?