How Did Hollywood Become The Center Of The Entertainment Industry?
Hollywood had become the center of the American film industry by 1915 as more independent filmmakers relocated there from the East Coast. For more than three decades, from early silent films through the advent of "talkies," figures such as D.W. Griffith, Samuel Goldwyn, Adolph Zukor, William