College in America was never meant to leave kids with a lifetime of insurmountable debt. Fire all of the liberal college professors and let them try to survive in the real world. Hire real-world professors who teach actual professions rather than indoctrinating students with liberal nonsense. Is there really any point in continuing minority and women's studies programs, beyond creating jobs for the people who teach them?