It seems that America has shifted liberal, especially on social issues. From a liberal perspective, there's always room for improvement... but short of a cataclysm where society has to rebuild itself, is it safe to say the culture war is over? It seems the heavy lifting is done, and it's now all momentum.
What are your thoughts?