AP US History
The American colonies were territories established in North America by European powers, chiefly England, France, and Spain, from the late 16th through the 18th centuries. These colonies shaped early American society, economy, and politics, laying the groundwork for later independence and nationhood.