U.S. encroachment refers to the gradual expansion of American influence and territorial control over other regions, particularly in the Americas and the Pacific, during the 19th and early 20th centuries. The concept is closely tied to Manifest Destiny, the belief that the United States had a divine right to expand its territory and spread its values, often at the expense of indigenous populations and foreign nations.