What Is Meant By The Term New South?

The term “New South” refers to the economic shift from an exclusively agrarian society to one that embraced industrial development. … Alabama’s natural resources, however, gave the state an advantage over some of its neighboring states in attracting investment and industry.