What Did US Foreign Policy Return To After WWI?
President Woodrow Wilson called WWI “the war to end all wars.” After the war, the United States returned to its isolationist foreign policy. … This event marked the end of American isolationism and neutrality and the beginning of a foreign and defense policy of intense internationalism.