What the Classroom Didn’t Teach Me About the American Empire
By Howard Zinn
With an occupying army waging war in Iraq and Afghanistan, with military bases and corporate bullying in every part of the world, there is hardly a question any more of the existence of an American Empire. Indeed, the once fervent denials have turned into a boastful, unashamed embrace of the idea.
However, the very idea that the United States was an empire did not occur to me until after I finished my work as a bombardier with the Eighth Air Force in the Second World War, and came home. Even as I began to have second thoughts about the purity of the “Good War,” even after being horrified by Hiroshima and Nagasaki, even after rethinking my own bombing of towns in Europe, I still did not put all that together in the context of an American “Empire.”
I was conscious, like everyone, of the British Empire and the other imperial powers of Europe, but the United States was not seen in the same way. When, after the war, I went to college under the G.I. Bill of Rights and took courses in U.S. history, I usually found a chapter in the history texts called “The Age of Imperialism.” It invariably referred to the Spanish-American War of 1898 and the conquest of the Philippines that followed. It seemed that American imperialism lasted only a relatively few years. There was no overarching view of U.S. expansion that might lead to the idea of a more far-ranging empire — or period — of “imperialism.”
[For the rest of this article, and for the intro by Tom Engelhardt, go here to TomDispatch.com.]