I'm not asking about your personal opinion of exporting democracy. Nor am I asking for a partisan bitchfest about whether the Obama or Bush administrations specifically supported democracy. I'm asking whether you believe the US government, regardless of the administration, has traditionally supported democracy in other nations.
I think this is one of the biggest myths we Americans hold about our own history, because I just don't see any historical evidence to support it. Sure, we support democracy in places where the incumbent authoritarian regime is hostile to us...but then, so do Russia and China, so it's hard to claim any moral high ground there. By the same token, we've never had any qualms about cozying up to dictators who wanted to be our friends.
What do you think? Has the United States historically displayed a track record of supporting democratic movements abroad?