Intro to American Government
Legitimacy refers to the justification and acceptance of an authority's or institution's right to exercise power and make decisions on behalf of a group or society. It is a crucial concept for understanding the foundations of government and its role in a political system.