African American History – Before 1865
Manifest destiny was the 19th-century belief that the United States was divinely destined to expand its territory across North America. The concept was used to justify westward expansion, often at the expense of Native American lands and neighboring nations, and it promoted a sense of American exceptionalism along with the idea that expansion was both inevitable and beneficial.