The other day I was talking about the decline of the American Empire (i.e. trillion-dollar deficits, a weakened global military apparatus, a narcissistic home populace) and the implications, as well as the consequences, its collapse would have on the world. I was talking about what a horrible effect that could plausibly have when my friend responded that it would likely be a good thing, highlighting the end of British Imperialism as a triumph for humanity. Really? Has humanity really benefited through the ages from the demise of Western empires like ancient Greece, Rome, and Imperial Britain? I mean, many nations may have achieved their autonomy from an empire's ending, but historically they have in turn suffered horrible poverty because of their immature and weak economic institutions... not to mention the often volatile and anarchic political situations. It seems like academia has waged a war against Imperialism for over a century, but when we look at nations like India and China, what is the bright beacon of light in otherwise dark circumstances? To me, it is quite clearly the liberal, secular influence of the West (which has had far more influence in India than in China).
Now, I don't mean to beat the drums too much, and I understand and recognize the cost, both in lives and treasure, that Imperialism imposes on the home nation, not to mention the often negative cultural and social effects it brings upon the victim country. I also don't want to come off as arrogant, nor do I want to disregard the fruits of free trade and the fact that some influence spreads naturally, making its exertion by force of arms unnecessary.
I foresee responses that highlight the usual points of Imperialism=Bad in general, but I'm more curious about people's ACTUAL perception of America's global military apparatus, rather than some philosophical discussion of Imperialism's moral implications (whether good or bad). What is the concrete evidence that America is a malevolent force in the world? Has the democratic peace theory not largely succeeded, thanks largely to Western nations? Do countless nations not live off American aid and/or under the shadow of her tree of liberty (heh), allowing the US military to fight wars that their own militaries would otherwise be forced to fight?
(Note: keep in mind that when I talk of America as an Empire, I mean it in a distinctly American sense of the term, as used by Thomas Jefferson when he wrote of the "Empire of Liberty" that America would become.)