Hegemony refers to the dominance and control exercised by one group, nation, or social class over others. It is the capacity to shape a society's norms, values, and worldviews through cultural, political, and economic means, typically without resort to overt force.