New Hollywood refers to a transformative era in American cinema that began in the late 1960s and continued into the 1980s, marked by a shift toward director-driven films and more personal storytelling. The period saw the emergence of young filmmakers who broke away from traditional Hollywood norms, producing innovative narratives and bold subject matter that often reflected the social upheavals of the time. These directors became the new auteurs, blending commercial appeal with artistic expression and reshaping the landscape of American cinema.