At What University Did American Sociology In The United States Emerge?

American sociology in the United States emerged at the University of Chicago. Its foundations can be traced to the rise of the Chicago School in the early decades of the twentieth century. The first professional sociology journal in the United States, the American Journal of Sociology, was founded there in 1895.