What Was Happening In America In The 20th Century?
In the early 20th century, America was flexing its economic and political muscle on the international stage. The era was defined by the temperance movement, Progressive-era activism, the sinking of the Titanic, and World War I.

What major events happened in the 20th century?