West African colonies were territories in the region subject to European colonial rule from the late 19th century until the mid-20th century. Established primarily by France and Britain, these colonies underwent significant political, social, and economic transformations under colonial policies and practices; these transformations ultimately fueled movements for independence and decolonization across the region.