Is it possible that America, and the West as a whole, have fallen into oligarchy? More and more it seems government legislation is being directed by the market, with regulatory measures being watered down or outright removed by big oil, banking, the health care industry, or any other private enterprise with enough wealth. Do Americans agree with the idea of a corporation having the same rights as an individual? It seems that no matter who is elected, the pandering to the wealthy and big business is unavoidable.
There is the unwillingness of banks to allow the public any influence over their actions, even though they are "too big to fail". There are corporations buying politicians so that they no longer advocate for the interests of the people, and wealthy politicians spending millions of their personal fortunes on campaigns. The list of ways this political system can be corrupted seems endless.
Do Americans care that their politicians are being bought by corporate interests? Or is that all a figment of the left wing's imagination? A paranoia trip? It is hard to argue, though, that the ever-greater concentration of wealth in the hands of fewer and fewer people is not happening. The bailout was the single largest transfer of wealth from the middle class to the wealthiest. Is this welfare for the rich? Will the public bail out the rich next time? Being "too big to fail" seems like a pretty solid deal. Do Americans believe that taking that power away from these soft-power rulers is possible?
What say you all on this?