The British colonies in America were a series of territories established by England (and, after the 1707 Acts of Union, Great Britain) along the eastern seaboard of North America from the early 17th century until the late 18th century. These colonies played a crucial role in shaping the economic, cultural, and political landscape of what would become the United States, a role that grew especially pronounced during conflicts such as the Seven Years' War.