I was raised to believe the world doesn't owe you a thing, so get out there and work for what you want. Others seem to have been raised to feel they're owed something simply because they were born. This debate over health care is a good example. My feeling is that if you want health care, get a job and buy insurance; others think you deserve health care just because. I don't get it.