Intro to Cognitive Science

Cognitive Biases Examples


Why This Matters

Cognitive biases are central to understanding how cognition actually works in the real world. In Intro to Cognitive Science, you're tested on how mental processes like attention, memory, reasoning, and decision-making can systematically deviate from rational ideals. These biases reveal the underlying architecture of human thought: our reliance on heuristics, our limited cognitive resources, and the shortcuts our brains take to navigate complex environments.

When you study cognitive biases, you're really studying the boundaries and trade-offs of human information processing. Exam questions will ask you to identify which cognitive mechanism explains a given bias, compare biases that stem from similar processes, and apply these concepts to real-world scenarios. Don't just memorize definitions. Know what each bias tells you about how the mind organizes, retrieves, and weighs information.


Biases in Information Search and Interpretation

These biases affect how we seek out, filter, and make sense of new information. The underlying mechanism involves selective attention and motivated reasoning. Our cognitive systems aren't neutral processors; they actively shape what we notice and how we interpret it.

Confirmation Bias

Confirmation bias is the tendency to search for, favor, and recall information that supports what you already believe, while ignoring or downplaying contradictory evidence. It's not just about seeking agreeable facts. Memory distortion plays a role too: you'll recall confirming evidence more easily than disconfirming evidence.

  • Active, directional search: You don't passively stumble into bias. You go looking for support, whether that's choosing which news sources to read or which search terms to type.
  • Reinforces misconceptions over time, making this bias central to understanding belief perseverance and why people resist changing their minds even when presented with strong counter-evidence.

Framing Effect

The framing effect shows that identical information leads to different choices depending on how it's presented. A medical treatment described as having a "90% survival rate" feels very different from one with a "10% mortality rate," even though the numbers are the same.

  • Demonstrates that cognition isn't purely logical: Context and language activate different mental representations, which then drive different decisions.
  • Tversky and Kahneman's research on this effect is foundational to behavioral economics and prospect theory. Their classic "Asian disease problem" is a textbook example worth knowing.
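Since exam questions often hinge on the two frames being objectively equivalent, it helps to check the arithmetic. Below is a minimal Python sketch using the numbers from the classic 1981 problem (600 people at risk); it shows that the sure option and the gamble have identical expected outcomes in both frames:

```python
# Expected outcomes in Tversky & Kahneman's (1981) "Asian disease" problem.
# All four programs concern the same 600 people at risk.
TOTAL = 600

# Gain frame ("lives saved")
saved_A = 200                    # Program A: 200 saved for sure
saved_B = (1 * 600 + 2 * 0) / 3  # Program B: 1/3 chance all saved, 2/3 none

# Loss frame (the same programs, reworded as "lives lost")
lost_C = 400                     # Program C: 400 die for sure
lost_D = (2 * 600 + 1 * 0) / 3   # Program D: 2/3 chance all die, 1/3 none

print(saved_A, saved_B)                # 200 200.0
print(TOTAL - lost_C, TOTAL - lost_D)  # 200 200.0
```

Most respondents choose the sure option (A) in the gain frame but the gamble (D) in the loss frame, even though the expected number of survivors is identical in every case: people tend to be risk-averse for gains and risk-seeking for losses.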

Compare: Confirmation Bias vs. Framing Effect. Both show that information processing isn't objective, but confirmation bias involves active seeking of certain information, while the framing effect involves passive reception of differently presented information. If an exam question asks about how context shapes reasoning, the framing effect is your go-to example.


Memory-Based Heuristics

These biases arise from how we retrieve information from memory to make judgments. The key mechanism is accessibility: whatever comes to mind most easily has a disproportionate influence on our thinking.

Availability Heuristic

The availability heuristic is a mental shortcut where you judge the likelihood of an event based on how easily examples come to mind. If you can quickly think of plane crashes, you'll overestimate how common they are, even though car accidents kill far more people each year (roughly 40,000 annually in the U.S., versus typically a few hundred worldwide in commercial aviation).

  • Recency and vividness inflate perceived frequency. A dramatic news story can shift your risk estimates even when the underlying statistics haven't changed.
  • Often adaptive: In environments without mass media, ease of recall is actually a decent proxy for frequency. The bias emerges when media exposure or unusual personal experience skews what's memorable.

Hindsight Bias

Hindsight bias is the "I knew it all along" phenomenon. After learning an outcome, you misremember your prior predictions as having been more accurate than they actually were.

  • Memory reconstruction is the mechanism. You don't just add the new outcome to your memory; you update your earlier beliefs to be consistent with what you now know happened.
  • Problematic for learning from mistakes, because it creates the illusion that outcomes were predictable all along. This reduces your motivation to improve your actual forecasting ability.

Compare: Availability Heuristic vs. Hindsight Bias. Both involve memory distorting judgment, but availability affects prospective estimates (predicting future likelihood), while hindsight affects retrospective assessments (evaluating past predictions). Both illustrate that memory is reconstructive, not reproductive.


Anchoring and Adjustment Failures

These biases occur when initial information disproportionately shapes later judgments. The mechanism involves insufficient adjustment: you start from a reference point and fail to move far enough away from it, even when that reference point is irrelevant.

Anchoring Bias

Anchoring bias means that the first piece of information you encounter on a topic heavily influences your subsequent estimates. In one classic study, participants who were first asked whether Gandhi died before or after age 140 gave significantly higher estimates of his actual age at death than those anchored at age 9 (he in fact died at 78). The anchor doesn't even need to be plausible.

  • Adjustment is cognitively effortful, so you tend to stop adjusting before reaching an unbiased estimate.
  • Robust across domains, including salary negotiations, pricing judgments, and legal sentencing decisions.
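The insufficient-adjustment account can be made concrete with a toy model. The adjustment fraction below is purely illustrative, not an empirical estimate; only Gandhi's actual age at death (78) is a real figure:

```python
def anchored_estimate(anchor, truth, adjustment=0.6):
    """Start at the anchor and move toward the truth, but stop partway.
    adjustment=1.0 would be full (unbiased) adjustment; the toy value 0.6
    models stopping early because adjustment is cognitively effortful."""
    return anchor + adjustment * (truth - anchor)

TRUE_AGE = 78  # Gandhi's actual age at death

high = anchored_estimate(anchor=140, truth=TRUE_AGE)  # ≈ 102.8
low = anchored_estimate(anchor=9, truth=TRUE_AGE)     # ≈ 50.4
print(high, low)  # the high-anchor group ends up far above the low-anchor group
```

Even though both groups run the same adjustment process toward the same truth, their final estimates stay far apart, which is the signature pattern of anchoring.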

Sunk Cost Fallacy

The sunk cost fallacy is the tendency to continue investing in something because of what you've already put in, rather than evaluating whether continuing makes sense going forward. You stay in a bad movie because you paid for the ticket, or you keep pouring money into a failing project because you've "come this far."

  • Anchored to prior commitment rather than evaluating current costs and future benefits objectively.
  • Violates rational choice theory, which says only future consequences should matter for decisions. Past costs are gone regardless of what you do next.
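The contrast with rational choice can be written as two decision rules; the numbers here are made up for illustration:

```python
def rational_continue(future_benefit, future_cost):
    """Rational rule: only future consequences matter to the decision."""
    return future_benefit > future_cost

def sunk_cost_continue(future_benefit, future_cost, sunk):
    """Biased rule: past investment is (wrongly) counted as a reason
    to keep going."""
    return future_benefit + sunk > future_cost

# A failing project: finishing costs 50 more for an expected payoff of 30.
print(rational_continue(30, 50))        # False: stop now
print(sunk_cost_continue(30, 50, 100))  # True: "we've come this far"
```

Both rules see exactly the same future; only the biased one lets the unrecoverable 100 already spent tip the decision.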

Compare: Anchoring Bias vs. Sunk Cost Fallacy. Both involve being "stuck" on initial information, but anchoring affects estimates and judgments while sunk cost affects behavioral commitment. Anchoring is about numbers; sunk cost is about actions.


Self-Assessment and Metacognitive Errors

These biases involve faulty evaluation of your own knowledge, abilities, and predictions. The underlying mechanism is poor metacognition: your ability to accurately monitor your own cognitive processes is more limited than you'd expect.

Dunning-Kruger Effect

The Dunning-Kruger effect describes a specific pattern: people with low ability in a domain tend to overestimate their performance, while people with high ability tend to slightly underestimate theirs. The core insight is that the skills needed to perform well in a domain are the same skills needed to recognize good performance in that domain.

  • Metacognitive deficit is the central problem. If you lack expertise, you also lack the tools to evaluate expertise, including your own.
  • Double burden of incompetence: Poor performers are both unskilled and unaware of being unskilled, which makes self-correction difficult without external feedback.

Overconfidence Bias

Overconfidence bias is the general tendency for people's confidence in their judgments to exceed their actual accuracy. In calibration studies, when people say they're "90% sure" of an answer, they're typically correct only about 70-80% of the time.

  • This affects experts and novices alike, which is what distinguishes it from the Dunning-Kruger effect.
  • It's particularly well-documented in fields like medicine, law, and financial forecasting, where high-stakes decisions are made under uncertainty.
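Calibration is easy to compute from confidence-and-accuracy data. The Python sketch below uses invented responses (not data from any real study) to show what a 20-point overconfidence gap looks like:

```python
# Ten invented answers, each stated with 90% confidence.
# Each pair is (stated_confidence, was_the_answer_correct).
predictions = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True), (0.9, False),
    (0.9, True), (0.9, True), (0.9, False), (0.9, True), (0.9, True),
]

mean_conf = sum(c for c, _ in predictions) / len(predictions)
hit_rate = sum(ok for _, ok in predictions) / len(predictions)

print(f"mean confidence: {mean_conf:.0%}")  # 90%
print(f"actual accuracy: {hit_rate:.0%}")   # 70%
# A well-calibrated judge would have these two numbers match.
```

A positive gap between mean confidence and hit rate is the standard operational definition of overconfidence in calibration studies.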

Compare: Dunning-Kruger Effect vs. Overconfidence Bias. Both involve inflated self-assessment, but Dunning-Kruger is specifically about skill-dependent metacognitive failure (the unskilled don't know they're unskilled), while overconfidence is a general tendency affecting people across skill levels. Know this distinction for exam questions about metacognition.


Social and Emotional Influences on Cognition

These biases show how social context and emotional valence shape information processing. The mechanism involves the integration of affective and social information into what might otherwise seem like purely cognitive judgments.

Bandwagon Effect

The bandwagon effect is the tendency to adopt beliefs or behaviors because many other people hold them. You use others' behavior as a signal about what's correct or desirable, especially when you lack direct information yourself.

  • Cognitive efficiency explanation: Following the crowd is a reasonable heuristic in many situations. If 50 people are running away from something, it makes sense to run too before investigating.
  • Can produce groupthink and information cascades, where early adopters disproportionately influence later decisions, even if the early adopters were wrong.

Negativity Bias

Negativity bias is the tendency to weigh negative information more heavily than equally strong positive information. Losses feel roughly twice as painful as equivalent gains feel good, criticism stings more than praise pleases, and threatening faces in a crowd grab your attention faster than friendly ones.

  • Evolutionary explanation: The costs are asymmetric. Missing a threat (a predator) could be fatal, while missing an opportunity (some extra food) is merely inconvenient. Natural selection favored brains that prioritize bad news.
  • Affects attention, memory, and decision-making: Negative stimuli capture attention faster, are processed more thoroughly, and are remembered longer.
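The "roughly twice as painful" figure is usually formalized with the prospect-theory value function; the parameters below are Tversky and Kahneman's (1992) median estimates:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Tversky & Kahneman (1992) value function: concave for gains,
    steeper for losses (lam is the loss-aversion coefficient)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain = prospect_value(100)   # subjective value of winning $100
loss = prospect_value(-100)  # subjective value of losing $100
print(abs(loss) / gain)      # ≈ 2.25: the loss looms about twice as large
```

For equal-magnitude outcomes the curvature cancels and the ratio is just the loss-aversion coefficient itself, which is why most people refuse an even-odds bet unless the potential win comfortably exceeds the potential loss.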

Compare: Bandwagon Effect vs. Negativity Bias. Both show that cognition isn't purely individual or rational, but the bandwagon effect emphasizes social information while negativity bias emphasizes emotional valence. Both can be understood as adaptive heuristics that sometimes misfire in modern contexts.


Quick Reference Table

Concept | Best Examples
Selective Information Processing | Confirmation Bias, Framing Effect
Memory-Based Judgment | Availability Heuristic, Hindsight Bias
Insufficient Adjustment | Anchoring Bias, Sunk Cost Fallacy
Metacognitive Failure | Dunning-Kruger Effect, Overconfidence Bias
Social Influence on Cognition | Bandwagon Effect
Affective Influence on Cognition | Negativity Bias, Framing Effect
Heuristics (Adaptive Shortcuts) | Availability Heuristic, Bandwagon Effect
Violations of Rational Choice | Sunk Cost Fallacy, Framing Effect, Anchoring Bias

Self-Check Questions

  1. Both the availability heuristic and hindsight bias involve memory distortion. What distinguishes when each bias operates (prospective vs. retrospective judgment)?

  2. A student continues studying for a major they hate because they've "already put in two years." Which bias explains this, and why does it violate principles of rational decision-making?

  3. Compare and contrast the Dunning-Kruger effect and overconfidence bias. Under what conditions would you expect each to occur, and what do both reveal about human metacognition?

  4. If a researcher wants to demonstrate that how a problem is presented matters more than its objective content, which bias should they study? Design a simple experiment to test it.

  5. Which two biases best illustrate that human cognition relies on mental shortcuts (heuristics) that are often adaptive but can lead to systematic errors? Explain the trade-off between efficiency and accuracy that each demonstrates.