What Do You Learn In American Studies?
American Studies majors acquire a broad understanding of American history and learn to think analytically and systematically about American structures and institutions, as well as about how various cultural groups are represented in art, literature, and popular culture.

What is an American studies major?

American studies majors explore the colorful