Bush, by far. He dragged the US into two failing wars, started a Global War on Terror that only spurred more terror, ignored Katrina, and introduced perhaps the most destructive bill in human history, the PATRIOT Act.
What has Obama done? Social reform, promotion of peace, socialised healthcare, welfare, all these amazing policies (everyone on the damn planet loves these, except you Americans). And what do you have to say about it? That it's damaging your country? >.>