John Wayne was an iconic American actor and filmmaker known for his distinctive voice, rugged masculinity, and roles in Western films. He became a symbol of American culture and values during the mid-20th century, shaping the Western genre and the way masculinity was portrayed on screen.