Theories of International Relations
Legitimacy refers to the recognized right of an authority, typically a governing body or institution, to exercise power and make decisions. It is critical to maintaining stability and order within societies, and it can be derived from several sources, including legal frameworks, cultural norms, and popular consent. The concept is essential for understanding how institutions operate effectively and how they gain the support of the populace.