Increased American expansionism refers to the growing desire of the United States in the early 19th century, and the actions it took, to extend its territory and influence across North America. This drive was rooted in the belief, later termed Manifest Destiny, that Americans had a right to expand westward, and it carried significant implications for neighboring regions, including Canada, particularly in the context of the War of 1812.