John Wayne was an iconic American actor and filmmaker, best known for his roles in Western films during the mid-20th century. His rugged persona and distinctive voice made him a symbol of masculinity and American ideals; in Westerns he often portrayed strong, stoic characters who embodied honor and bravery.