I think that history will look back on America as one of the greatest countries on the planet. America has spread its ideals more than any other country ever has. It has led the world in technology, art, and freedom, and in the past it has tried its best to export that freedom of expression to other peoples. However, America has not been the same since WW2. Since WW2 there has been a deliberate attempt by the liberals to destroy this country. This disease that has infected this country comes directly from Russian influence. They have infiltrated our country and worked to destroy it, from the feminist/Marxist movement to the anti-war/anti-America movement. Americans are completely ignorant of this, but as someone who can see it from the outside, it is quite obvious what is going on. The feminist/anti-war movement is a product of Russian influence meant to destabilize our country.