Is the world truly Americanized? Or is it Britishized? Or is it possibly Romanized? I think a fair argument could be made in any of these directions. American civilization is less authoritarian and less invasive than the extensive colonization efforts of the British Empire.
Maybe America does have a little bit of its own flavor and is not so dependent upon the British. The question is: Why am I arguing with myself?