American art refers to the visual arts produced in the United States, reflecting the country's diverse cultures, history, and landscapes. It evolved markedly during the era of Manifest Destiny, as artists sought to capture the spirit of westward expansion and a changing national identity. Themes of exploration, nationalism, and the romanticized depiction of the West were central to American art in this period.