Has anyone else noticed that Republicans don't think much of America? To hear them tell it, the government is corrupt, evil, incompetent, useless, creating a nation of welfare bums, killing opportunity, taxing everyone to death, spendthrift, and generally tyrannical. Of course, they only think that when a Democrat is in office, but still. They scream that if they don't win, America will cease to exist, etc., etc.
It seems they have absolutely no confidence in the strength of the average American. They have no confidence in a government that has been a beacon of liberty and freedom to the world, a government that created the environment and opportunity for America to become the greatest, strongest nation on earth. They apparently view the government as so fragile that it could collapse within a single presidential term; as something that should have nothing to do with education, the environment, Wall Street regulation, unions, healthcare, or social programs; as unable to afford investing in the nation's infrastructure; as being on the verge of foreclosure. In their view it isn't even good at expediting wars (as long as a Democrat is leading it), which also seems to be the sole reason most righties think the government should exist at all.
When did republicans stop being proud of America?
You righties have a terrible, pessimistic view of one of the most important pillars of American greatness. You constantly put it down and criticize it, wrap yourselves in the flag and the Constitution (when it's convenient), and hate science while expecting the fruits of its application.
When and why did you guys start hating America so much?