Again, however, they never really found teeth until the Christian Church came along and began enforcing them as mandatory, objective aspects of social morality, rather than optional "philosophies" a person could take or leave on their own initiative. There is also little denying that Western society, as a whole, was better off for it.
In any event, the fact remains that concepts like "just war" and "human rights" originated in Christian cultures, and really nowhere else in any meaningful fashion. You can do with that knowledge what you will.
However, the correlation seems strong enough to suggest something more than mere coincidence, in my opinion.