What Does Religion Say About War?
Most Christians believe that war should be avoided if possible, and should only be undertaken if all efforts to resolve an issue by peaceful means have failed. Many Christians see war as the result of a failure to live by God's standards.