What Happened To The Native American When The Settlers Went West?
As white settlers moved into the American West, Native Americans were pushed off their ancestral lands and confined to reservations. These reservations typically consisted of marginal lands that could not support them, particularly after the buffalo herds had been destroyed.