I don't think it was necessary for every American to be Christian; I'm not much of one myself. But I do believe everyone benefited from Christianity being the faith of 90% of the country for many years.
I also believe that Christianity helped create the high standards of living in the US and Europe that made people from Asia, the Mideast, and elsewhere want to move there.
What, specifically, about Christianity made the U.S. better than, say, Vietnam?