How Did The United States Gain Control Of West Florida?
In 1810, American settlers in West Florida rebelled against Spanish rule and proclaimed the independent Republic of West Florida. Within months it was annexed by the United States, which claimed the region as part of the Louisiana Purchase of 1803. In 1819 the United States negotiated the purchase of the remainder of West Florida and all of East Florida in the Adams–Onís Treaty, and the territories were formally transferred to American control in 1821.