Not only are college history courses falsely alleging that America is a fundamentally racist nation, but the decline of Western Civilization has also resulted in the decline of Americans’ basic civics knowledge.