Did the War of 1812 Increase American Patriotism?

“Victory” in the War of 1812 unleashed a wave of American patriotism after 1815, ironically emphasizing the triumph of the American Revolution more than the split decision of the “Late War.” The glories of the latter struggle—such as they