Cognitive biases show up across nearly every unit of AP Psychology. You'll encounter them in social psychology (how we judge others), attitude formation (why stereotypes persist), decision-making (why smart people make irrational choices), and learning (how prior knowledge shapes new information). The exam regularly tests whether you can identify which bias is operating in a scenario and explain why that shortcut leads to a predictable error.
These biases aren't random glitches. They're systematic patterns that reveal how our brains handle limited time, cognitive load, and emotional pressure. The AP exam will ask you to distinguish between biases that affect how we judge others versus how we judge ourselves, and between memory-based shortcuts versus reasoning errors. Don't just memorize definitions. Know what psychological mechanism each bias demonstrates and be ready to apply them to real-world scenarios in both multiple-choice and FRQ questions.
Our brains constantly use mental shortcuts called heuristics to make quick decisions without exhaustive analysis. These shortcuts are efficient, but they're systematically flawed. They produce predictable errors when the shortcut doesn't match the actual situation.
**Availability heuristic:** You judge how likely something is based on how easily examples come to mind. If you can quickly recall plane crashes from the news, you'll overestimate the danger of flying.

**Representativeness heuristic:** You judge probability by how closely something matches a prototype or stereotype. If someone reads poetry and wears glasses, you might assume they're a professor rather than a farmer, even though farmers vastly outnumber professors.

**Anchoring bias:** The first piece of information you encounter disproportionately influences your subsequent judgments. An initial price tag sets your expectations for what counts as "reasonable," even if that starting number was arbitrary.
Compare: Availability heuristic vs. Representativeness heuristic: both are mental shortcuts that bypass careful reasoning, but availability relies on memory retrieval ease while representativeness relies on similarity matching. FRQs often present scenarios where you must identify which shortcut is operating.
Attribution is how we explain the causes of behavior, both our own and others'. These biases reveal a fundamental asymmetry in human social cognition: we judge ourselves differently than we judge other people.
**Fundamental attribution error:** When explaining someone else's behavior, you tend to overemphasize their personality (dispositional factors) while underestimating situational factors. A rude waiter? You assume they're a rude person, not that they're stressed and understaffed.

**Self-serving bias:** When explaining your own outcomes, you attribute successes to internal factors and failures to external factors. "I aced the test because I'm smart; I failed because the test was unfair."

**Actor-observer difference:** This bias captures the full asymmetry: you see your own behavior as situationally driven but others' behavior as dispositionally driven. You're late because of traffic; they're late because they're irresponsible.
Compare: Fundamental attribution error vs. Self-serving bias: both distort causal explanations, but FAE applies to judging others (overweighting their personality) while self-serving bias applies to judging ourselves (protecting our ego). If an FRQ asks about attribution, specify which direction the bias operates.
Humans are motivated reasoners. We don't process information neutrally; we actively protect existing beliefs. These biases explain why changing someone's mind is so difficult and why attitudes persist despite contradictory evidence.
**Confirmation bias:** You selectively seek, interpret, and remember information that supports what you already believe. If you think a certain diet works, you'll notice success stories and dismiss failures.

**Cognitive dissonance:** The mental discomfort you feel when holding contradictory beliefs or when your behavior conflicts with your beliefs. Knowing smoking is harmful while continuing to smoke creates this tension.

**Belief perseverance:** You maintain beliefs even after the evidence supporting them has been discredited. If you hear a rumor and later learn it was completely fabricated, the original belief often sticks.
Compare: Confirmation bias vs. Cognitive dissonance: confirmation bias operates before contradictory information is processed (filtering it out), while cognitive dissonance operates after (resolving the discomfort it creates). Both protect existing beliefs but through different mechanisms.
These biases affect how accurately we evaluate our own knowledge, abilities, and predictions. The pattern is consistent: humans systematically overestimate themselves, especially when they lack expertise.
**Overconfidence bias:** You overestimate the accuracy of your own knowledge or predictions. People who say they're 90% confident in their answers typically get them right only about 70% of the time.

**Dunning-Kruger effect:** Low performers dramatically overestimate their ability, while high performers slightly underestimate theirs. The core idea is that incompetence prevents you from recognizing your own incompetence.

**Hindsight bias:** The "knew-it-all-along" effect. Once you learn an outcome, it seems obvious in retrospect. After an election result, everyone says they "saw it coming."
Compare: Overconfidence bias vs. Dunning-Kruger effect: both involve inflated self-assessment, but overconfidence is a general tendency affecting everyone, while Dunning-Kruger specifically describes how lack of competence prevents accurate self-evaluation. Dunning-Kruger explains why low performers are the most overconfident.
Our judgments aren't made in isolation. Social context systematically biases individual cognition, connecting these biases to broader social psychology concepts.
**Bandwagon effect:** You adopt beliefs or behaviors because others have adopted them. A crowded restaurant must be good; a trending opinion must be correct.

**Halo effect:** A positive global impression of someone biases your judgment of their specific traits. You might assume an attractive person is also intelligent and kind, with no evidence for either.

**In-group bias:** You favor members of your own group over out-group members. This connects to social identity theory (Tajfel & Turner), which argues that group membership becomes part of your self-concept.
Compare: Bandwagon effect vs. In-group bias: both involve social influence on judgment, but the bandwagon effect is about following the majority regardless of group membership, while in-group bias is about favoring your own group regardless of majority opinion.
The same information can lead to different conclusions depending on how it's presented. Context isn't neutral; it actively shapes cognition.
**Framing effect:** Your decisions change based on how options are described. "90% survival rate" feels safer than "10% mortality rate," even though they convey identical information.

**Sunk cost fallacy:** Past investments irrationally influence your future decisions. You keep watching a bad movie because you paid for the ticket, or you stick with a failing project because of the time already spent.

**Negativity bias:** Negative information carries more psychological weight than equivalent positive information. One harsh criticism stings more than one compliment helps.
Compare: Framing effect vs. Anchoring bias: both show that context shapes judgment, but framing involves how the same information is described (gain vs. loss frame) while anchoring involves what information comes first (initial reference point). Both demonstrate that human judgment isn't purely rational.
| Concept | Best Examples |
|---|---|
| Mental shortcuts (heuristics) | Availability heuristic, Representativeness heuristic, Anchoring bias |
| Attribution errors | Fundamental attribution error, Self-serving bias, Actor-observer difference |
| Belief protection | Confirmation bias, Cognitive dissonance, Belief perseverance |
| Self-assessment distortions | Overconfidence bias, Dunning-Kruger effect, Hindsight bias |
| Social influence on judgment | Bandwagon effect, Halo effect, In-group bias |
| Context/framing effects | Framing effect, Sunk cost fallacy, Negativity bias |
| Attitude formation (CED 4.2) | Confirmation bias, Cognitive dissonance, Stereotype-related biases |
| Decision-making errors | Anchoring, Framing effect, Sunk cost fallacy |
Both the availability heuristic and representativeness heuristic are mental shortcuts. What distinguishes how each one leads to judgment errors?
A student blames her poor exam grade on an unfair test but credits her good grade to hard work. Which bias is operating, and how does it differ from fundamental attribution error?
If someone continues to believe a political claim even after seeing evidence that it was based on fabricated data, which two biases might explain this persistence?
FRQ-style: Design a study to test whether the framing effect or anchoring bias has a stronger influence on consumer purchasing decisions. Identify your independent variable, dependent variable, and one potential confound.
Compare cognitive dissonance and confirmation bias: at what stage of information processing does each bias primarily operate, and how do they work together to maintain existing attitudes?