What Was The Manifest Destiny And Westward Expansion?
Manifest Destiny was a popular belief in the mid-to-late 19th century. Its proponents claimed that the United States had the divine right to expand westward—meaning that U.S. expansion was the will of God. … Manifest Destiny continued as a key American philosophy until after World War I.