Confronting U.S. History
Many American schoolchildren in the southern United States are still taught that the Civil War was caused by northern aggression, in which Union soldiers attacked the South, raped its women, and stole its land.
The German philosopher Georg Hegel told us, “The only thing we learn from history is that we learn nothing from history.”
How could this turn out any differently, when we deliberately lie about our past? What’s the matter with telling the truth?