Cognitive biases are central to understanding how cognition actually works in the real world. In Intro to Cognitive Science, you're tested on how mental processes like attention, memory, reasoning, and decision-making can systematically deviate from rational ideals. These biases reveal the underlying architecture of human thought: our reliance on heuristics, our limited cognitive resources, and the shortcuts our brains take to navigate complex environments.
When you study cognitive biases, you're really studying the boundaries and trade-offs of human information processing. Exam questions will ask you to identify which cognitive mechanism explains a given bias, compare biases that stem from similar processes, and apply these concepts to real-world scenarios. Don't just memorize definitions. Know what each bias tells you about how the mind organizes, retrieves, and weighs information.
Biases of selective information processing affect how we seek out, filter, and make sense of new information. The underlying mechanism pairs selective attention with motivated reasoning: our cognitive systems aren't neutral processors; they actively shape what we notice and how we interpret it.
Confirmation bias is the tendency to search for, favor, and recall information that supports what you already believe, while ignoring or downplaying contradictory evidence. It's not just about seeking agreeable facts. Memory distortion plays a role too: you'll recall confirming evidence more easily than disconfirming evidence.
The framing effect shows that identical information leads to different choices depending on how it's presented. A medical treatment described as having a "90% survival rate" feels very different from one with a "10% mortality rate," even though the numbers are the same.
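A one-line check makes the equivalence concrete. The patient count is arbitrary; the point is that both frames describe the same distribution of outcomes:

```python
# Two frames, one distribution: expected survivors out of 100 patients.
survival_frame = 0.90    # "90% survival rate"
mortality_frame = 0.10   # "10% mortality rate"

patients = 100
survivors_a = patients * survival_frame          # 90.0
survivors_b = patients * (1 - mortality_frame)   # 90.0

assert survivors_a == survivors_b  # identical information, different framing
```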
Compare: Confirmation Bias vs. Framing Effect. Both show that information processing isn't objective, but confirmation bias involves active seeking of certain information, while the framing effect involves passive reception of differently presented information. If an exam question asks about how context shapes reasoning, the framing effect is your go-to example.
Memory-based judgment biases arise from how we retrieve information from memory when making judgments. The key mechanism is accessibility: whatever comes to mind most easily has a disproportionate influence on our thinking.
The availability heuristic is a mental shortcut where you judge the likelihood of an event based on how easily examples come to mind. If you can quickly think of plane crashes, you'll overestimate how common they are, even though car accidents kill far more people each year (roughly 40,000 annually in the U.S. vs. fewer than 500 in commercial aviation worldwide).
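The gap between felt risk and actual risk is easy to quantify with the rough figures cited above:

```python
# Rough annual figures from the text: U.S. car-accident deaths vs.
# worldwide commercial-aviation deaths.
car_deaths = 40_000
plane_deaths = 500

ratio = car_deaths / plane_deaths
print(ratio)  # 80.0 — cars kill ~80x more people per year
```

Yet plane crashes are vivid and heavily reported, so examples come to mind easily, and the availability heuristic inflates their perceived likelihood.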
Hindsight bias is the "I knew it all along" phenomenon. After learning an outcome, you misremember your prior predictions as having been more accurate than they actually were.
Compare: Availability Heuristic vs. Hindsight Bias. Both involve memory distorting judgment, but availability affects prospective estimates (predicting future likelihood), while hindsight affects retrospective assessments (evaluating past predictions). Both illustrate that memory is reconstructive, not reproductive.
Adjustment biases occur when initial information disproportionately shapes later judgments. The mechanism is insufficient adjustment: you start from a reference point and fail to move far enough away from it, even when that reference point is irrelevant.
Anchoring bias means that the first piece of information you encounter on a topic heavily influences your subsequent estimates. In one classic study, participants who were first asked whether Gandhi died before or after age 140 gave significantly higher estimates of his actual age at death than those anchored at age 9. The anchor doesn't even need to be plausible.
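One common way to formalize this is an anchoring-and-adjustment sketch: start at the anchor and adjust only part of the way toward your actual belief. The adjustment factor below is purely illustrative, not an empirically fitted value:

```python
# Toy anchoring-and-adjustment model (illustrative parameter only):
# people start at the anchor and adjust toward their true belief,
# but stop short, so the adjustment factor k is less than 1.
def anchored_estimate(anchor, true_belief, k=0.6):
    """Estimate produced by insufficient adjustment away from the anchor."""
    return anchor + k * (true_belief - anchor)

true_age = 78  # Gandhi actually died at 78
high = anchored_estimate(anchor=140, true_belief=true_age)  # ~102.8
low = anchored_estimate(anchor=9, true_belief=true_age)     # ~50.4
```

Even with the same underlying belief, the two anchors pull the estimates far apart, reproducing the pattern from the study.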
The sunk cost fallacy is the tendency to continue investing in something because of what you've already put in, rather than evaluating whether continuing makes sense going forward. You stay in a bad movie because you paid for the ticket, or you keep pouring money into a failing project because you've "come this far."
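A forward-looking cost-benefit comparison shows why sunk costs should be ignored. The dollar amounts here are hypothetical, chosen to match the failing-project example:

```python
# Rational choice looks only at future costs and benefits.
sunk = 50_000          # already spent — unrecoverable, so irrelevant
finish_cost = 30_000   # additional cost to finish the project
finish_value = 20_000  # value the finished project would deliver

net_going_forward = finish_value - finish_cost  # -10_000
should_abandon = net_going_forward < 0
print(should_abandon)  # True: continuing loses another 10,000
# The fallacy: reasoning "we'd waste the 50,000" and continuing anyway,
# even though the 50,000 is lost either way.
```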
Compare: Anchoring Bias vs. Sunk Cost Fallacy. Both involve being "stuck" on initial information, but anchoring affects estimates and judgments while sunk cost affects behavioral commitment. Anchoring is about numbers; sunk cost is about actions.
Metacognitive biases involve faulty evaluation of your own knowledge, abilities, and predictions. The underlying mechanism is poor metacognition: your ability to accurately monitor your own cognitive processes is more limited than you'd expect.
The Dunning-Kruger effect describes a specific pattern: people with low ability in a domain tend to overestimate their performance, while people with high ability tend to slightly underestimate theirs. The core insight is that the skills needed to perform well in a domain are the same skills needed to recognize good performance in that domain.
Overconfidence bias is the general tendency for people's confidence in their judgments to exceed their actual accuracy. In calibration studies, when people say they're "90% sure" of an answer, they're typically correct only about 70-80% of the time.
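Calibration is straightforward to compute: among all answers given at a stated confidence level, what fraction were actually correct? The data below are made up to match the typical pattern described above:

```python
# Hypothetical calibration data: (stated confidence, answer was correct)
answers = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True), (0.9, False),
    (0.9, True), (0.9, True), (0.9, False), (0.9, True), (0.9, True),
]

hits = sum(correct for _, correct in answers)
hit_rate = hits / len(answers)
print(hit_rate)  # 0.7 — well below the stated 0.9: overconfidence
```

A well-calibrated judge's hit rate would match the stated confidence; the gap between the two is the standard measure of overconfidence.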
Compare: Dunning-Kruger Effect vs. Overconfidence Bias. Both involve inflated self-assessment, but Dunning-Kruger is specifically about skill-dependent metacognitive failure (the unskilled don't know they're unskilled), while overconfidence is a general tendency affecting people across skill levels. Know this distinction for exam questions about metacognition.
Social and affective biases show how social context and emotional valence shape information processing. The mechanism is the integration of affective and social information into judgments that might otherwise seem purely cognitive.
The bandwagon effect is the tendency to adopt beliefs or behaviors because many other people hold them. You use others' behavior as a signal about what's correct or desirable, especially when you lack direct information yourself.
Negativity bias is the tendency to weigh negative information more heavily than equally strong positive information. Losses feel roughly twice as painful as equivalent gains feel good, criticism stings more than praise pleases, and threatening faces in a crowd grab your attention faster than friendly ones.
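The roughly two-to-one weighting of losses over gains is captured by Kahneman and Tversky's prospect-theory value function. The parameters below are their standard fitted values (alpha = beta = 0.88, lambda = 2.25):

```python
# Prospect-theory value function: gains are raised to a power alpha < 1
# (diminishing sensitivity); losses get the same curvature but are
# multiplied by lambda > 1 (loss aversion).
def value(x, alpha=0.88, lam=2.25):
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

gain = value(100)    # subjective value of winning $100
loss = value(-100)   # subjective value of losing $100
print(abs(loss) / gain)  # ~2.25: the loss feels over twice as intense
```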
Compare: Bandwagon Effect vs. Negativity Bias. Both show that cognition isn't purely individual or rational, but the bandwagon effect emphasizes social information while negativity bias emphasizes emotional valence. Both can be understood as adaptive heuristics that sometimes misfire in modern contexts.
| Concept | Best Examples |
|---|---|
| Selective Information Processing | Confirmation Bias, Framing Effect |
| Memory-Based Judgment | Availability Heuristic, Hindsight Bias |
| Insufficient Adjustment | Anchoring Bias, Sunk Cost Fallacy |
| Metacognitive Failure | Dunning-Kruger Effect, Overconfidence Bias |
| Social Influence on Cognition | Bandwagon Effect |
| Affective Influence on Cognition | Negativity Bias, Framing Effect |
| Heuristics (Adaptive Shortcuts) | Availability Heuristic, Bandwagon Effect |
| Violations of Rational Choice | Sunk Cost Fallacy, Framing Effect, Anchoring Bias |
Both the availability heuristic and hindsight bias involve memory distortion. What distinguishes when each bias operates (prospective vs. retrospective judgment)?
A student continues studying for a major they hate because they've "already put in two years." Which bias explains this, and why does it violate principles of rational decision-making?
Compare and contrast the Dunning-Kruger effect and overconfidence bias. Under what conditions would you expect each to occur, and what do both reveal about human metacognition?
If a researcher wants to demonstrate that how a problem is presented matters more than its objective content, which bias should they study? Design a simple experiment to test it.
Which two biases best illustrate that human cognition relies on mental shortcuts (heuristics) that are often adaptive but can lead to systematic errors? Explain the trade-off between efficiency and accuracy that each demonstrates.