What Ended World War 1?

In 1918, the infusion of American troops and resources into the Western Front finally tipped the scale in the Allies' favor. Germany signed an armistice agreement with the Allies on November 11, 1918. World War I was known as the "war to end all wars."