What Country Claimed Florida Before Then?

Florida was under Spanish colonial rule from the 16th century to the 19th century, with a brief period of British rule in the 18th century (1763–1783), before becoming a territory of the United States in 1821.

What Countries Have Claimed Florida?

Florida Became a British Colony

In 1763, France, Britain, and Spain signed the Treaty of Paris, ending the Seven Years' War; under its terms, Spain ceded Florida to Britain.