What Year Did Football Start In America?
The sport of American football itself was relatively new in 1892. Its roots stemmed from two sports, soccer and rugby, which had long enjoyed popularity in many nations around the world. On November 6, 1869, Rutgers and Princeton played what was billed as the first college football game.