American education reforms refer to the movements and initiatives aimed at improving the quality, accessibility, and structure of education in the United States, particularly during the 19th century. Driven by the belief that an educated citizenry is essential to a democratic society, these reforms sought to establish public schooling, promote literacy, and address social inequalities within the educational system.