To answer your question: No, America is not the greatest nation on Earth. I don't believe there is such a thing, because even when people get nostalgic for the 1950s, they're remembering it from the perspective of the white man, like my grandfather, who ruled his house without question. Women, children, and minorities were objects to be molded and manipulated by men like him. And even with the social progress made since that time, women still are not paid at an equal level to men, minorities are still targeted and abused more than whites, and our laws have grown so draconian in response to terrorist attacks that Americans' rights are hollow caricatures, and it is only getting worse. What saddens me most is that things could be so much better for all of us if enough people at the top got behind the effort to really change the way America is run. What is especially saddening is that it isn't going to happen. The powers of influence in Washington far outweigh any empathy for average, everyday Small Town USA folks like me.